Our customer, which develops AI-driven tools and platforms for marketing and advertising, is looking for an interim Principal Data Engineer to lead a small team of in-house data engineers.

In the assignment you will design and develop scalable data pipelines and data models on Google BigQuery to support analytics and AI use cases. You will further develop the existing data architecture design and roadmap, and manage the implementation of the target architecture. You will also mature the DevOps practices and tooling of the data engineering team to improve the quality and cadence of development work. As BigQuery is a relatively new platform for the team, you will coach the less experienced team members both in its use for data pipeline implementation and in efficient ways of working in general.
To succeed in the assignment, you should have:
- Proven experience in designing and implementing data, analytics, and AI platforms in a hands-on tech lead or principal role
- A deep understanding of and experience with Google BigQuery
- Hands-on experience with, and a conceptual understanding of, setting up DevOps practices and tooling to improve and automate data engineering work
- Experience in target data architecture design and roadmapping
- Experience in processing large-scale event data for analytics and AI use cases; Kafka experience is a plus
- Interest in and experience with coaching more junior specialists
- Good communication and collaboration skills.
The team operates in a hybrid mode, with approximately three on-site days per week at the Espoo office.