Remote Senior Data Engineer - Tech Lead (immediate role)

Description:

  • The Senior Data Engineer - Tech Lead will participate in business discussions and assist in gathering data requirements.
  • The role requires strong analytical and problem-solving skills to address data challenges while leading a team.
  • Proficiency in writing complex SQL queries for data extraction, transformation, and analysis is essential, including knowledge of SQL functions, joins, subqueries, and performance tuning.
  • The candidate should be able to navigate source systems with minimal guidance to understand data relationships and utilize data profiling for better data comprehension.
  • Hands-on experience with Spark SQL/PySpark is required (a brief illustrative sketch follows this list).
  • The position involves creating and managing data pipelines using Azure Data Factory, with an understanding of data integration, transformation, and workflow orchestration in Azure environments.
  • Knowledge of data engineering workflows and best practices in Databricks is necessary, along with the ability to understand existing templates and patterns for development.
  • Hands-on experience with Unity Catalog and Databricks Workflows is also required.
  • Proficiency in using Git for version control and collaboration in data projects is essential.
  • The candidate must possess clear and effective communication skills to articulate findings and recommendations to team members.
  • The ability to document processes, workflows, and data analysis results effectively is important.
  • A willingness to learn new tools, technologies, and techniques as the field of data analytics evolves is expected, along with adaptability to changing project requirements and priorities.
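
For illustration only, here is a minimal PySpark sketch of the kind of transformation and Delta Lake work described above. The catalog, table, and column names are hypothetical placeholders, not part of any actual project for this role.

```python
# Minimal PySpark sketch of the transformation work described above.
# Table names, columns, and the target schema are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_summary").getOrCreate()

# Read two hypothetical source tables registered in the metastore / Unity Catalog.
orders = spark.table("sales.raw.orders")
customers = spark.table("sales.raw.customers")

# Join, filter, and aggregate -- the SQL-style logic (joins, functions,
# aggregation) the posting asks candidates to be comfortable with.
daily_summary = (
    orders.join(customers, "customer_id", "inner")
    .where(F.col("order_status") == "COMPLETED")
    .groupBy("order_date", "customer_region")
    .agg(
        F.count("order_id").alias("order_count"),
        F.sum("order_amount").alias("total_amount"),
    )
)

# Persist the result as a Delta table, the storage format used across the
# Databricks / Data Lakehouse stack mentioned in the posting.
(
    daily_summary.write.format("delta")
    .mode("overwrite")
    .saveAsTable("sales.curated.daily_order_summary")
)
```

In a Databricks environment this logic would typically run as a scheduled job or as an activity orchestrated by Azure Data Factory, with the code versioned in Git.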

Requirements:

  • Candidates must have 9+ years of overall experience, with more than 5 years of expertise in Azure technologies.
  • The ability to envision and lead end-to-end solutions and solve technical issues during offshore operations is required.
  • Expertise in Azure Databricks, Data Lakehouse architectures, and Azure Data Factory is necessary.
  • Candidates should have experience in optimizing data workflows and predictive modeling.
  • Designing and implementing data pipelines using Databricks and Spark is essential.
  • Expertise in batch and streaming data solutions, automating workflows with CI/CD tools such as Jenkins and Azure DevOps, and ensuring data governance with Delta Lake is required (see the streaming sketch after this list).
  • Proficiency in Spark, PySpark, Delta Lake, Azure DevOps, and Python is necessary.
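
For illustration only, a minimal Structured Streaming sketch of the batch-and-streaming Delta work listed above; the table names, checkpoint path, and trigger settings are hypothetical.

```python
# Minimal Structured Streaming sketch of incremental Delta processing.
# Paths, table names, and trigger settings are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream_to_delta").getOrCreate()

# Incrementally read new rows from a hypothetical raw Delta table.
events = spark.readStream.table("telemetry.raw.events")

# Light cleanup before landing the data in a curated table.
cleaned = (
    events.withColumn("event_date", F.to_date("event_ts"))
    .dropDuplicates(["event_id"])
)

# Write continuously to a curated Delta table; the checkpoint location gives
# restartable, exactly-once processing.
query = (
    cleaned.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events_curated")
    .outputMode("append")
    .trigger(availableNow=True)  # process available data, then stop (incremental batch)
    .toTable("telemetry.curated.events")
)
query.awaitTermination()
```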

Benefits:

  • This position offers a remote work environment, allowing for flexibility in work location.
  • It is a full-time role, providing job security and stability.
  • The opportunity to work in an experienced team and lead data engineering projects is available.
  • The role encourages continuous learning and adaptation to new tools and technologies in the evolving field of data analytics.