Remote Data Engineer

This job is closed

This job post is closed and the position has likely been filled. Please do not apply. The listing was automatically closed after the apply link was detected as broken.

Description:

  • Remo is seeking a highly skilled Data Engineer to develop and maintain its Data and Analytics Platform.
  • The role involves managing the full lifecycle of data, from ingestion to transformation and downstream consumption.
  • Responsibilities include designing, developing, and maintaining data pipelines and ETL processes using platforms like Fivetran and Google Cloud Composer.
  • The Data Engineer will manage the Customer Data Platform and lifecycle, as well as GCP services such as BigQuery and Dataflow for data processing and model deployment.
  • The position requires implementing and managing tools like dbt for data transformations and modeling.
  • The engineer will optimize SQL queries for data warehousing in BigQuery and develop Python scripts for data processing and automation tasks.
  • The role includes acting as a lead analyst to support analytical roles across various departments and ensuring compliance with data governance policies.
  • The Data Engineer will handle Protected Health Information (PHI) and Personally Identifiable Information (PII) securely and collaborate with data scientists to deploy machine learning models.

Requirements:

  • Candidates must have 3+ years of experience as a Data Engineer and 3+ years in an Analytics Engineering role within the Healthcare industry.
  • Experience in building, maintaining, and monitoring robust ETL pipelines is required.
  • A lead analytical role in forecasting, cohort analysis, and time-series wrangling using healthcare and product data is necessary.
  • Candidates should have experience developing a strategy for a data semantic layer in a healthcare setting.
  • Mastery of BI tools such as Tableau or Power BI is required.
  • Hands-on experience with Google Cloud services, specifically BigQuery and Dataflow, is essential.
  • Expert-level SQL skills in Postgres and BigQuery are required.
  • Experience in developing and maintaining Python packages for data transformation pipelines is necessary.
  • Strong experience with transformation and modeling tools such as dbt is required.
  • Proficiency in leveraging platforms like Segment and Fivetran is necessary.
  • Knowledge of version control for data models using Git is required.
  • A strong understanding of change data capture (CDC) is essential.
  • Excellent problem-solving, analytical, and communication skills are required.
  • Knowledge of HIPAA/HITRUST compliance and related security best practices is necessary.

Benefits:

  • The position offers a 401(k) plan with employer matching up to 4%.
  • Employees receive 100% employer-paid health insurance benefits for themselves and their families.
  • Dental and vision benefits are included.
  • There is a monthly reimbursement for Wi-Fi and cell phone expenses.
  • Employees have access to Talkspace for mental health support.
  • Fertility benefits are provided.
  • The company offers 20 days of paid time off (PTO) plus 11 company holidays.
  • Parental leave includes 16 weeks for birthing parents and 8 weeks for non-birthing parents.
  • Additional leave is available for pregnancy loss and bereavement.
Salary:

  • $175,000 - 200,000 USD per year