This job post is closed and the position is probably filled. Please do not apply.
Description:
Minimum 3 years of IT experience, including at least 2 years working with data in the AWS cloud (confirmed by at least 1 commercial project implemented in production).
Intermediate proficiency in SQL and experience applying it in MS and other technology solutions.
Ability to create and optimize data processing solutions (ETL, ELT, and others), with basic knowledge of the cloud tools that support them.
Familiarity with SMP and MPP architectures, along with examples of solutions based on them.
Knowledge of tools for processing and analyzing medium/large datasets (e.g., Apache Spark, Synapse, Snowflake, Databricks).
Understanding of the Delta Lake concept, basic data formats (CSV, Parquet, Delta), and the Modern Data Warehouse approach.
Familiarity with cloud CI/CD concepts and tools.
Good understanding of AWS data storage and processing services.
Intermediate level of English proficiency (minimum B2).
Requirements:
2-5 years of experience.
Proficiency in SQL, Apache Spark, Synapse, Delta Lake, Data Lakehouse, AWS.
Responsibilities:
Collaborating with a team to develop business-focused solutions.
Creating or modifying data processing solutions in the cloud.
Creating and modifying documentation.
Analyzing and optimizing solutions within an existing or planned system.
Working with clients to meet their expectations efficiently.