Remote Senior Data Engineer

Description:

  • Docker is a remote-first company that simplifies the lives of developers creating world-changing applications.
  • The company is seeking a Senior Data Engineer to join the Data Engineering team, which is led by the Director of Data Engineering.
  • The team transforms billions of data points from Docker products and services into actionable insights to influence product strategy and development.
  • The role involves designing and implementing event ingestion, data models, and ETL processes for mission-critical reporting and analysis while ensuring privacy and compliance.
  • The Senior Data Engineer will help lay the foundation for ML infrastructure to support data scientists and enhance analytics capabilities.
  • The data stack includes Snowflake as the central data warehouse, dbt for transformation, Airflow for orchestration, and Looker for visualization and reporting (a minimal sketch of how these pieces fit together follows this list).
  • Data sources include Segment, Fivetran, S3, Kafka, and various other cloud systems.
  • Responsibilities include managing ETL jobs, building the Central Data Model, integrating new methodologies, and supporting stakeholders across the company.
  • The first 30 days will involve onboarding, understanding current data architecture, and identifying quick wins for improvement.
  • By the end of the first 90 days, the engineer is expected to contribute to data engineering projects and recommend improvements.
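
To make the stack described above concrete, here is a minimal orchestration sketch: an Airflow DAG that loads raw events and then runs dbt models against Snowflake. Every name, schedule, and command in it is an illustrative assumption, not Docker's actual pipeline.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical daily pipeline: land raw events, then rebuild dbt models
    # in Snowflake. DAG id, schedule, and commands are placeholders.
    with DAG(
        dag_id="central_data_model",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load_raw_events = BashOperator(
            task_id="load_raw_events",
            # Placeholder: in practice this might be a Snowflake COPY INTO from S3
            bash_command="python load_events.py",
        )
        run_dbt_models = BashOperator(
            task_id="run_dbt_models",
            bash_command="dbt run --project-dir /opt/dbt",
        )
        load_raw_events >> run_dbt_models  # load first, then transform

The ordering operator (>>) is the key design point: dbt transformations only run once the day's raw events have landed, which is the usual shape of a warehouse-centric ELT pipeline.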

Requirements:

  • A minimum of 4 years of relevant industry experience is required.
  • Candidates must have experience in data modeling and building scalable data pipelines with complex transformations.
  • Proficiency with a data warehouse platform, preferably Snowflake or BigQuery, is necessary.
  • Experience with data governance, access, and security controls is required; hands-on Snowflake and dbt experience is strongly preferred.
  • Candidates should have experience creating production-ready ETL scripts and pipelines in Python and SQL, as well as with orchestration frameworks such as Airflow, Dagster, or Prefect (see the ETL sketch after this list).
  • Experience in designing and deploying high-performance systems with reliable monitoring and logging practices is essential.
  • Familiarity with at least one cloud ecosystem (AWS, Azure, or Google Cloud) is required.
  • Experience with a comprehensive BI and visualization framework, such as Tableau or Looker, is necessary.
  • Candidates should have experience working in an agile environment on multiple projects and prioritizing work based on organizational needs.
  • Strong verbal and written English communication skills are required.
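
As a deliberately simplified example of the production-ready Python-plus-SQL work the list above describes, the sketch below wraps an idempotent Snowflake MERGE in basic logging. All table, column, and credential values are invented for illustration, and real pipelines would source credentials from a secret store rather than hard-coding them.

    import logging

    import snowflake.connector  # assumes the snowflake-connector-python package

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("etl.dim_users")

    # Hypothetical incremental upsert; database, schema, and column names are invented.
    MERGE_SQL = """
    MERGE INTO analytics.dim_users AS tgt
    USING staging.users_updates AS src
      ON tgt.user_id = src.user_id
    WHEN MATCHED THEN UPDATE SET tgt.plan = src.plan, tgt.updated_at = src.updated_at
    WHEN NOT MATCHED THEN
      INSERT (user_id, plan, updated_at) VALUES (src.user_id, src.plan, src.updated_at)
    """

    def run_merge(conn) -> None:
        # MERGE keeps the step idempotent: re-running it after a failure
        # converges to the same end state instead of duplicating rows.
        cur = conn.cursor()
        try:
            cur.execute(MERGE_SQL)
            log.info("dim_users merge touched %s rows", cur.rowcount)
        finally:
            cur.close()

    if __name__ == "__main__":
        # Placeholder credentials for illustration only.
        conn = snowflake.connector.connect(
            account="my_account", user="ETL_USER", password="***"
        )
        try:
            run_merge(conn)
        finally:
            conn.close()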

Benefits:

  • Employees enjoy freedom and flexibility to fit work around their lives.
  • A home office setup is provided to ensure comfort while working.
  • The company offers 16 weeks of paid parental leave.
  • A technology stipend of $100 net per month is available.
  • The PTO plan encourages employees to take time off for personal enjoyment.
  • Quarterly company-wide hackathons are organized.
  • A training stipend is provided for conferences, courses, and classes.
  • Employees receive equity in the company, allowing them to share in its success.
  • Docker swag is provided to employees.
  • Medical benefits, retirement plans, and holidays vary by country.