Description:
Design, develop, and implement scalable, secure, and efficient data solutions to meet organizational needs.
Create and maintain logical and physical data models for business intelligence, analytics, and reporting.
Design, build, and optimize ETL processes and data pipelines for reliable, efficient data flow.
Integrate diverse data sources into a unified data platform.
Monitor and optimize data systems and pipelines for performance.
Implement data quality checks, validation processes, and governance frameworks.
Collaborate with stakeholders to understand data requirements and deliver solutions.
Maintain comprehensive documentation of data architectures, models, and pipelines.
Monitor and manage the production environment to deliver data within defined SLAs.
Requirements:
3+ years of experience building scalable data pipelines.
3+ years of experience with Airflow or a Python-based data pipeline codebase.
Experience in backend software development and distributed computing.
Proficiency in SQL, relational databases, and MPP databases.
Proficiency in Python, Java, or Scala.
Knowledge of data modeling techniques and tools.
Familiarity with cloud platforms such as Amazon Web Services (AWS).
Excellent communication and collaboration skills.
Bachelor’s degree in Computer Science or equivalent industry experience.
Benefits:
Competitive salary with generous annual cash bonus.
Stock options.
Remote-first, work-from-home culture.
Flexible vacation policy.
Generous parental leave.
Health, dental, and vision insurance with above-market employer contributions.