Please let OneImaging know you found this job on RemoteYeah. This helps us grow 🌱.
Description:
We are seeking a Data Engineer to join our dynamic team.
The ideal candidate is an enthusiastic problem-solver who excels at building scalable data systems and has hands-on experience with Databricks, Looker, AWS, MongoDB, PostgreSQL, and Terraform.
You will work alongside sales, customer success, and engineering to design, implement, and maintain the operational data infrastructure that powers our analytics and platform offerings.
Key Responsibilities include:
Designing, building, and maintaining end-to-end data pipelines in Databricks (Spark SQL, PySpark) for data ingestion, transformation, and processing.
Integrating data from structured and unstructured sources, including medical imaging systems, EMRs, change-data-capture (CDC) feeds from SQL databases, and external APIs.
Collaborating with the analytics team to create, optimize, and maintain dashboards in Looker.
Implementing best practices in data modeling and visualization for operational efficiency.
Deploying and managing cloud-based solutions on AWS (e.g., S3, EMR, Lambda, EC2) to ensure scalability, availability, and cost-efficiency using IaC tooling (Terraform and Databricks Asset Bundles).
Developing and maintaining CI/CD pipelines for data-related services and applications.
Overseeing MongoDB and PostgreSQL databases, including schema design, indexing, and performance tuning.
Ensuring data integrity, availability, and optimized querying for both transactional and analytical workloads.
Adhering to healthcare compliance requirements (e.g., HIPAA) and best practices for data privacy and security.
Implementing error handling, logging, and monitoring frameworks to ensure reliability and transparency.
Implementing data governance frameworks to maintain data integrity and confidentiality.
Working cross-functionally with data scientists, product managers, and other engineering teams to gather requirements and define data workflows.
Documenting data pipelines, system architecture, and processes for internal and external stakeholders.
Requirements:
Education & Experience:
A Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
At least 3 years of professional experience in data engineering or a similar role.
Technical Skills:
Proven expertise in Databricks (Spark) for building large-scale data pipelines.
Experience creating dashboards, data models, and self-service analytics solutions in Looker.
Proficiency with core AWS services such as S3, EMR, Lambda, IAM, and EC2.
Demonstrated ability to design schemas, optimize queries, and manage high-volume databases in MongoDB and PostgreSQL.
Strong SQL skills, plus familiarity with Python, Scala, or Java for data-related tasks.
Soft Skills:
Excellent communication and team collaboration abilities.
Strong problem-solving aptitude and analytical thinking.
Attention to detail, with a focus on delivering reliable, high-quality solutions.
Preferred:
Experience in healthcare or imaging (e.g., DICOM, HL7/FHIR).
Familiarity with DevOps tools (Docker, Kubernetes, Terraform) and CI/CD pipelines.
Knowledge of machine learning workflows and MLOps practices.
Benefits:
Comprehensive health care plan, including medical, dental, and vision coverage.
Retirement plan with 401(k) and IRA options.
Paid time off, including vacation, sick leave, and public holidays.
Training and development opportunities to support skills and career growth.
Work-from-home option to promote work-life balance.
Apply now