Remote Senior Cloud Data Engineer (AWS and Databricks)


This job is closed

This job post is closed and the position has likely been filled. Please do not apply. The listing was closed automatically after the application link was detected as broken.

Description:

  • We are looking for a Senior Cloud Data Engineer with AWS expertise to join our Data Solutions business line.
  • Responsibilities include collaborating with the team on solution design, creating and modifying cloud data processing solutions (ETL, ELT, etc.) based on technical designs and alternative approaches, documenting processes, analyzing and optimizing existing and planned systems, and assessing client requirements to propose optimal business solutions.
  • The full list of required skills and experience is given in the Requirements section below.

Requirements:

  • A minimum of 5 years of experience in IT, including at least 3.5 years working with data in the AWS cloud, confirmed by commercial projects deployed to production.
  • Commercial experience in processing large data sets using Databricks and JSON is required.
  • Proficiency in the Pandas library is necessary.
  • Experience with Terraform is essential.
  • Advanced SQL skills are required, applied across a variety of technology stacks, including Microsoft technologies.
  • Familiarity with Git and CI/CD methodologies is necessary.
  • Ability to create and optimize data processing solutions (ETL, ELT, etc.) based on technical designs and alternative approaches is required.
  • Programming skills in Python are essential.
  • Knowledge of AWS technologies such as Glue, Athena, and DynamoDB is necessary.
  • Experience with agile practices and tools like Azure DevOps is expected.
  • Knowledge of secure data storage and processing mechanisms in the cloud is required.
  • Familiarity with migrating on-premises solutions to the cloud and an understanding of the basic migration types is necessary.
  • Proficiency in English at an advanced level (minimum B2) and fluency in Polish is essential.

Benefits:

  • The position offers a competitive salary ranging from 125 to 190 PLN per hour net + VAT (B2B).
  • The role allows for remote work flexibility.
  • The opportunity to work in a dynamic team focused on innovative data solutions.
  • Involvement in the development of new plugins and AI/ML models, enhancing professional growth and expertise.
  • The chance to contribute to significant projects that impact business needs and client interactions.