Remote Senior Cloud Data Engineer (Azure and Databricks)
Posted
This job post is closed and the position has likely been filled; please do not apply.
Description:
We are looking for a Senior Cloud Data Engineer with expertise in Azure for our Data Solutions business line.
The role centers on creating and optimizing data processing solutions (ETL, ELT, etc.) based on technical designs, including proposing alternative approaches where appropriate.
Day-to-day work includes monitoring, diagnosing, and troubleshooting cloud issues, as well as planning infrastructure and estimating its costs.
Direct collaboration with clients is part of the role.
The required skills and experience are listed in full under Requirements below.
Requirements:
A minimum of 5 years of experience in IT is required, including at least 3.5 years working with data in the Azure cloud, confirmed by commercially implemented projects.
Commercial experience in processing large datasets using Databricks is essential.
Advanced SQL skills are necessary, with experience in Microsoft technology solutions.
Proficiency in Git and CI/CD methodologies is required.
Experience with the Microsoft Fabric platform is a must.
The candidate should be capable of creating and optimizing data processing solutions (ETL, ELT, etc.) based on technical designs.
Strong skills in monitoring, diagnosing, and troubleshooting cloud issues are required, along with the ability to plan infrastructure and calculate costs.
Familiarity with Delta Lake and Data Lakehouse concepts is necessary.
Knowledge of SMP and MPP architecture, including examples of solutions based on these architectures, is required.
An understanding of migrating on-premises solutions to the cloud, including the basic types of migration, is essential.
Knowledge of secure data storage and processing mechanisms in the cloud is necessary.
A strong understanding of Azure's data storage and processing services is required.
Experience in direct client collaboration is essential.
Proficiency in English at an advanced level (minimum B2) and fluent communication in Polish is required.
Benefits:
The position offers a competitive rate of 125 to 190 PLN per hour net + VAT (B2B contract).
The role provides the opportunity to take ownership of entire solutions developed in collaboration with the team.
The candidate will have the chance to create or modify cloud data processing solutions.
The position includes the creation and modification of documentation.
The role involves analyzing and optimizing solutions for existing or planned systems.
The candidate will analyze client requirements to deliver optimal solutions for their business needs.
The position includes analyzing potential risks and adjusting solutions according to business requirements.