Remote Data Engineer (Databricks)

This job is closed

This job post is closed and the position has probably been filled. Please do not apply. The posting was closed automatically by a robot after the apply link was detected as broken.

Description:

  • Design, develop, and maintain scalable data pipelines using Databricks and other relevant technologies.
  • Integrate data from various sources, ensuring data quality, consistency, and reliability.
  • Implement and manage ETL (Extract, Transform, Load) processes to support data warehousing and analytics.
  • Optimize and tune data processing jobs for performance and efficiency.
  • Work closely with data scientists, analysts, and other stakeholders to understand data needs and deliver solutions.
  • Ensure compliance with data governance and security policies.
  • Maintain clear and comprehensive documentation of data processes, workflows, and systems.
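As a rough, illustrative sketch of the kind of Databricks pipeline and ETL work described above (not a prescribed implementation), a minimal PySpark job might look like the following; the source path and table name (/mnt/landing/orders/, analytics.orders_clean) are hypothetical placeholders.

    # Minimal PySpark ETL sketch for Databricks; paths and table names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw JSON events from a hypothetical landing zone.
    raw = spark.read.json("/mnt/landing/orders/")

    # Transform: deduplicate, drop null keys, and enforce types for downstream analytics.
    cleaned = (
        raw.dropDuplicates(["order_id"])
           .filter(F.col("order_id").isNotNull())
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    )

    # Load: append to a Delta table that serves the warehouse/analytics layer.
    cleaned.write.format("delta").mode("append").saveAsTable("analytics.orders_clean")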

Requirements:

  • Experience with Databricks.
  • 5+ years in Data Engineering.
  • Proficiency in Python and SQL.
  • Strong understanding of data engineering principles and best practices.
  • Familiarity with cloud-based platforms (Azure/AWS/GCP).
  • Excellent technical advisory and guidance skills.
  • Ability to work collaboratively with clients and internal teams.
  • Strong communication and problem-solving skills.

Benefits:

  • Full-time remote position.
  • Opportunity to work with a dynamic team at a leading AI client.
  • Chance to contribute to building and maintaining robust data pipelines and analytics infrastructure.
  • Collaborate with data scientists, analysts, and stakeholders to deliver data solutions.
  • Competitive salary and benefits package.