Remote Senior Cloud Data Engineer (AWS and Databricks)

This job is closed

This job post is closed and the position is probably filled. Please do not apply. It was automatically closed by a robot after the apply link was detected as broken.

Description:

  • We are looking for a Senior Cloud Data Engineer with expertise in AWS to join our Data Solutions business line.
  • The candidate should have a minimum of 5 years of experience in IT, including at least 3.5 years working with data in the AWS cloud, confirmed by commercial projects deployed to production.
  • The role involves processing large data sets using Databricks and JSON.
  • Proficiency in using the Pandas library is required.
  • Experience with Terraform is necessary.
  • The candidate should have advanced SQL skills and apply them across various technology stacks, including Microsoft technologies.
  • Familiarity with Git workflows and CI/CD practices is essential.
  • The role includes creating and optimizing data processing solutions (ETL, ELT, etc.) based on technical designs and alternative solutions.
  • Programming skills in Python are required.
  • Knowledge of AWS technologies such as Glue, Athena, and DynamoDB is expected.
  • Experience with agile practices and tools used in software development processes, such as Azure DevOps, is important.
  • The candidate should understand mechanisms for secure data storage and processing in the cloud.
  • Familiarity with migrating on-premises solutions to the cloud and understanding the basic migration types is necessary.
  • Proficiency in English at minimum B2 (upper-intermediate) level is required.

Requirements:

  • A minimum of 5 years of experience in IT is required, with at least 3.5 years of AWS cloud data work confirmed by commercial projects.
  • Commercial experience in processing large data sets using Databricks is necessary.
  • Experience in processing large data sets using JSON is required.
  • Proficiency in the Pandas library is essential.
  • Experience with Terraform is required.
  • Advanced SQL skills are necessary, with experience in various technological solutions.
  • Familiarity with Git workflows and CI/CD practices is essential.
  • The candidate should be able to create and optimize data processing solutions (ETL, ELT, etc.) based on technical designs.
  • Programming skills in Python are required.
  • Knowledge of AWS technologies such as Glue, Athena, and DynamoDB is expected.
  • Experience with agile practices and tools like Azure DevOps is important.
  • Understanding of secure data storage and processing mechanisms in the cloud is necessary.
  • Familiarity with migrating on-premises solutions to the cloud is required.
  • Proficiency in English at minimum B2 (upper-intermediate) level is necessary.

Benefits:

  • The position offers a competitive salary ranging from 125 to 190 PLN/h net + VAT (B2B).
  • The role allows for remote work flexibility.
  • The candidate will share responsibility for the overall solutions co-created with the team.
  • Opportunities to create or modify cloud data processing solutions are available.
  • The role includes creating and modifying documentation.
  • The candidate will analyze and optimize solutions for existing or designed systems.
  • There will be opportunities to analyze client requirements to deliver optimal business solutions.
  • The role involves analyzing potential risks and adapting solutions to business requirements.
  • The candidate will analyze needs and source data, including how CRM data can be used for customer interactions.
  • Opportunities to develop new plugins and test them will be available.
  • The role includes the development of artificial intelligence and machine learning models and their integration.