Description:
Design, develop, and maintain data pipelines and ETL/ELT processes that move data between various sources and platforms.
Perform data modeling and data cleansing tasks.
Automate data processing workflows with workflow orchestrators such as Airflow (see the DAG sketch after this list).
Optimize database performance through well-designed data structures and appropriate use of indexes (see the indexing example after this list).
Implement data quality and data governance processes (see the quality-check sketch after this list).
Work on a modern platform connecting professionals in different fields with service seekers.
Work with big data, data migration, and ELT pipeline development.
Utilize technologies such as MS SQL, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, dbt, S3, Airflow, and Python.
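
A minimal DAG sketch of the Airflow automation mentioned above, assuming Airflow 2.4+ with the standard PythonOperator; the DAG id, schedule, and task callables are hypothetical placeholders, not a prescribed implementation:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw records from a source system (placeholder)."""
    print("extracting...")


def transform():
    """Cleanse and model the extracted data (placeholder)."""
    print("transforming...")


def load():
    """Write the modeled data to the warehouse (placeholder)."""
    print("loading...")


with DAG(
    dag_id="daily_etl",              # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # run once per day
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task  # linear ETL dependency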
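
The indexing example: a minimal sketch of index-driven query tuning on MS SQL, assuming pyodbc; the server, database, table, and column names are hypothetical placeholders:

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.example.com;DATABASE=sales;"  # hypothetical server/database
    "Trusted_Connection=yes;"
)
cursor = conn.cursor()

# A nonclustered index covering a common filter + sort pattern, so the
# frequent "recent orders per customer" query can avoid a full table scan.
cursor.execute(
    """
    CREATE NONCLUSTERED INDEX ix_orders_customer_date
        ON dbo.orders (customer_id, order_date DESC)
        INCLUDE (total_amount);
    """
)
conn.commit()

# The query this index is designed to serve.
cursor.execute(
    """
    SELECT TOP 10 order_date, total_amount
    FROM dbo.orders
    WHERE customer_id = ?
    ORDER BY order_date DESC;
    """,
    (42,),
)
print(cursor.fetchall())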
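
The quality-check sketch: a minimal data quality gate, assuming pandas; the column names and rules are hypothetical placeholders for whatever checks the pipeline actually needs:

import pandas as pd


def check_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable rule violations for a batch."""
    problems = []
    if df["order_id"].isna().any():        # completeness: no null keys
        problems.append("null order_id values found")
    if df["order_id"].duplicated().any():  # uniqueness: one row per order
        problems.append("duplicate order_id values found")
    if (df["total_amount"] < 0).any():     # validity: amounts non-negative
        problems.append("negative total_amount values found")
    return problems


batch = pd.DataFrame(
    {"order_id": [1, 2, 2], "total_amount": [9.99, -5.0, 12.5]}
)
for issue in check_quality(batch):
    print("FAILED:", issue)  # a real pipeline might halt the load here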
Requirements:
3+ years of experience as a Data Engineer.
Experience developing and administering ETL processes in a cloud environment (Azure, AWS, or GCP).
Experience developing data lake/data warehouse solutions.
Strong programming skills in Python and SQL.
Experience with ETL tools and data integration.
Strong problem-solving and analytical skills.
Benefits:
Diverse and technically challenging projects.
Remote workplace model.
Flexible schedule and an Agile/Scrum environment.