Description:
Join an agile, distributed team of software engineers focused on data engineering, setting the strategy and guidelines for streaming and data warehouse technologies.
Responsible for engaging with engineers in product teams, building integration pipelines, transforming streaming data, and influencing technology selection.
Establish strong data engineering practices, including infrastructure as code, automated testing, monitoring tooling, and CI/CD.
Collaborate with stakeholders to understand data needs and deliver high-quality solutions.
Requirements:
4+ years of experience in designing, building, monitoring, and managing large-scale data products, pipelines, tooling, and platforms.
Proven track record as a Data Engineer, with experience building streaming ETL and cloud-based solutions on GCP, AWS, or Azure.
Proficiency in programming languages such as Scala, Python, TypeScript, Java, or Kotlin.
Experience with Dataform/Data Fusion and with building data pipelines.
Strong understanding of software quality, automation, and continuous delivery.
Ability to work in an agile environment and familiarity with CI/CD and deployment strategies.
Excellent communication skills in English and a degree in computer science or equivalent.
Benefits:
Engage in challenging projects within a professional and supportive environment.
Work in small, skilled teams with opportunities for long-term professional growth.
Competitive compensation based on experience and skills.
Support for professional, family, and personal goals.