Join us in improving the quality, observability, and maintainability of Data and AI systems in modern cloud environments built on technologies such as Azure, Databricks, and MLflow. You will build lifecycle management for production models and develop quality assurance practices for both traditional data projects and Generative AI initiatives.
Your Responsibilities Include:
- Improving the quality, observability, and maintainability of Data and AI products.
- Planning and implementing QA and test automation processes in data projects.
- Qualitative testing and evaluation of GenAI solutions.
- Building automated data verification and monitoring systems.
- Collaborating with development teams and stakeholders, and rolling out best practices.
What We Expect From You:
- Strong experience in a QA/automation role within data and/or AI projects.
- A good understanding of how to test data and ML models.
- Technology expertise: Azure, Databricks, MLflow, Octopus, or similar.
- Good communication skills in English (working language).
- Willingness to work mostly remotely, while joining a weekly team office day in Espoo and attending on-site orientation when necessary.
The start date is flexible during Q1, and the work allocation is negotiable between 40% and 60%.