Description:
The position requires a minimum of 5 years of experience in IT, including at least 3.5 years working with data in the AWS cloud, confirmed by commercial projects deployed to production.
Experience with data in the GCP cloud is a nice-to-have.
The candidate should have experience working with Terraform.
Commercial experience in processing large volumes of data in JSON format is required.
The role requires commercial experience with PySpark.
Advanced SQL skills are necessary, including experience with Microsoft and other database technologies.
The candidate should be familiar with software development methodologies and proficient in using Git and CI/CD.
Proficiency in using the Pandas and/or NumPy libraries is required.
The role involves creating and optimizing data processing solutions (ETL, ELT, etc.), preceded by technical design and evaluation of alternative solutions (e.g., Apache Airflow).
Programming skills in Python are essential.
Familiarity with AWS technologies such as Glue, Athena, and DynamoDB is required.
Experience with agile practices and knowledge of tools used in software development processes, including Azure DevOps, are necessary.
Knowledge of mechanisms for secure data storage and processing in the cloud is required.
Understanding of migrating on-premise solutions to the cloud and basic types of migration is necessary.
The candidate should have English proficiency at a minimum B2 level.
Requirements:
A minimum of 5 years of experience in IT is required, with at least 3.5 years working with data in the AWS cloud, confirmed by commercial projects.
Experience with GCP cloud data is a nice-to-have.
Proficiency in Terraform is necessary.
Commercial experience in processing large volumes of data in JSON format is required.
Experience with PySpark is essential.
Advanced SQL skills are required, including experience with Microsoft and other database technologies.
Familiarity with Git and CI/CD methodologies is necessary.
Proficiency in using Pandas and/or NumPy libraries is required.
The candidate must be able to create and optimize data processing solutions (ETL, ELT, etc.) based on technical project planning.
Programming skills in Python are essential.
Familiarity with AWS technologies such as Glue, Athena, and DynamoDB is required.
Experience with agile practices and knowledge of software development tools, including Azure DevOps, are necessary.
Knowledge of secure data storage and processing mechanisms in the cloud is required.
Understanding of on-premise to cloud migration and basic migration types is necessary.
English proficiency at a minimum B2 level is required.
Benefits:
The position offers a competitive salary ranging from 125 to 190 PLN per hour net + VAT (B2B).
The role provides the opportunity to work on data solutions in a collaborative team environment.
The candidate will be responsible for the entire solution development process alongside the team.
There is an opportunity to create and modify cloud data processing solutions.
The role includes the development and modification of documentation.
The candidate will analyze and optimize existing or proposed system solutions.
The position involves analyzing client requirements to deliver optimal business solutions.
The candidate will assess potential risks and adjust solutions according to business requirements.
The role includes analyzing needs and source data, for example how CRM data can be used to support customer interactions.
There is an opportunity to develop new plugins or extensions that optimize evaluation and prioritization within the CRM system.
The candidate will test new plugins and develop AI and machine learning models for integration.