Remote Senior Cloud Data Engineer (Azure and Databricks)

This job is closed

This job post is closed and the position has likely been filled; please do not apply. (It was automatically closed after the application link was detected as broken.)

Description:

  • We are looking for a Senior Cloud Data Engineer with expertise in Azure for our Data Solutions business line.
  • The candidate should have a minimum of 5 years of experience in IT, including at least 3.5 years working with data in the Azure cloud, confirmed by commercial projects implemented in production.
  • The role requires commercial experience in processing large data sets using Databricks.
  • Advanced SQL skills are necessary, with the ability to apply them within Microsoft technologies and beyond.
  • Proficiency in Git and CI/CD methodologies is required.
  • Experience with the Microsoft Fabric platform is essential.
  • The candidate will create and optimize data processing solutions (ETL, ELT, etc.) based on technical projects and alternative solutions.
  • The candidate should be comfortable with monitoring, diagnostics, and troubleshooting in the cloud, and should know how to plan infrastructure and estimate its costs.
  • Familiarity with Delta Lake and Data Lakehouse concepts is expected.
  • Knowledge of SMP and MPP architecture, along with examples of solutions based on these architectures, is required.
  • The candidate should understand the migration of on-premises solutions to the cloud and know the basic migration types.
  • Knowledge of mechanisms related to the secure storage and processing of data in the cloud is necessary.
  • A strong understanding of data storage and processing services offered by Azure is required.
  • Experience in direct client collaboration is essential.
  • The candidate should have an intermediate level of English (minimum B2).

Requirements:

  • A minimum of 5 years of experience in IT is required, with at least 3.5 years in cloud data work specifically with Azure.
  • Commercial experience in processing large datasets using Databricks is necessary.
  • Advanced SQL skills are required, with experience in Microsoft technologies.
  • Proficiency in Git and CI/CD methodologies is essential.
  • Experience with Microsoft Fabric is required.
  • The candidate must be able to create and optimize data processing solutions (ETL, ELT, etc.) based on technical projects.
  • Strong skills in monitoring, diagnostics, and problem-solving in cloud environments are necessary.
  • Familiarity with Delta Lake and Data Lakehouse concepts is required.
  • Knowledge of SMP and MPP architecture is essential, along with examples of related solutions.
  • Understanding of on-premises-to-cloud migration processes and types is necessary.
  • Knowledge of secure data storage and processing mechanisms in the cloud is required.
  • A strong understanding of Azure's data storage and processing services is essential.
  • Experience in direct collaboration with clients is required.
  • The candidate must possess an intermediate level of English (minimum B2).

Benefits:

  • The position offers a competitive salary ranging from 125 to 190 PLN/h net + VAT (B2B).
  • The role provides the opportunity to take responsibility for the entire solution developed in collaboration with the team.
  • The candidate will have the chance to create or modify cloud data processing solutions.
  • There will be opportunities to create and modify documentation related to the solutions.
  • The role involves analyzing and optimizing existing or planned systems.
  • The candidate will analyze client requirements to deliver optimal business solutions.
  • The position includes assessing potential risks and adapting solutions to meet business requirements.