Description:
Design and develop custom ingestion pipelines that integrate and aggregate Call Detail Records (CDRs) for accurate per-eSIM/per-user usage measurement (see the sketch after this list)
Identify and implement process optimizations within data engineering workflows to enhance efficiency and reliability
Orchestrate and monitor data pipelines to ensure seamless data flow
Manage and optimize data warehouse performance for fast and reliable data access
Implement and enforce data governance and security policies to ensure data integrity, privacy, and compliance with regulations
Stay current with industry trends and technologies, recommending new techniques and tools that keep the team ahead
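For illustration only, here is a minimal PySpark sketch of the kind of CDR aggregation described above; the storage paths and the column names (esim_id, bytes_used, event_time) are hypothetical, not part of this posting.

```python
# Minimal sketch: roll up raw Call Detail Records into daily per-eSIM usage totals.
# Paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cdr_usage_rollup").getOrCreate()

# Read a raw CDR dump (hypothetical location).
cdr = spark.read.parquet("s3://example-bucket/cdr_raw/")

# Aggregate usage per eSIM per day.
daily_usage = (
    cdr
    .withColumn("usage_date", F.to_date("event_time"))
    .groupBy("esim_id", "usage_date")
    .agg(
        F.sum("bytes_used").alias("total_bytes"),
        F.count("*").alias("record_count"),
    )
)

# Write the rollup partitioned by day for fast downstream access.
daily_usage.write.mode("overwrite").partitionBy("usage_date").parquet(
    "s3://example-bucket/cdr_usage_daily/"
)
```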
Requirements:
4+ years of hands-on experience with modern data warehousing solutions such as Snowflake or BigQuery
Proven track record of building and automating robust, scalable data pipelines
Deep understanding of data architecture, governance, and security
Familiarity with orchestrators such as Airflow, Prefect, or Mage.AI (see the sketch after this list)
Expertise in SQL and Spark, and proficiency in Python, Scala, or Java
Working knowledge of Docker, Git, and the command line
Excellent communication skills for complex technical topics
Strong passion for structure and efficiency, with keen attention to detail
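As a hedged sketch of the orchestration skills listed above, the Airflow 2.x DAG below wires an ingestion step to an aggregation step; the DAG id, task names, and schedule are hypothetical examples, not a description of this company's actual pipelines.

```python
# Minimal Airflow 2.x DAG sketch: ingest CDRs, then aggregate per-eSIM usage.
# All identifiers here are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_cdrs():
    """Placeholder for the CDR ingestion step."""


def aggregate_usage():
    """Placeholder for the per-eSIM aggregation step."""


with DAG(
    dag_id="cdr_usage_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_cdrs", python_callable=ingest_cdrs)
    aggregate = PythonOperator(task_id="aggregate_usage", python_callable=aggregate_usage)

    # Run aggregation only after ingestion succeeds.
    ingest >> aggregate
```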
Benefits:
Health insurance, work-from-anywhere stipend, annual wellness & learning credits
Annual all-expenses-paid company retreat in a beautiful destination