This job post is closed and the position is probably filled. Please do not apply.
🤖 Automatically closed by a robot after the apply link was detected as broken.
Description:
Design, develop, and maintain ELT pipelines using Python, Airflow, and SQL in an AWS environment.
Create and manage data lake and data warehouse solutions on AWS.
Develop and maintain data-driven dashboards and reporting solutions in Power BI.
Connect Power BI to SQL Server and Amazon Aurora PostgreSQL databases via an on-premises data gateway.
Perform data profiling and source system analysis to ensure data quality and integrity.
Collaborate with business stakeholders to capture and understand data requirements.
Implement industry best practices for data engineering and visualization.
Participate in architectural decisions and contribute to the continuous improvement of data solutions.
Follow agile practices and a Lean approach in project development.
Optimize SQL queries for performance and ensure efficient database operations.
Perform database tuning and optimization as needed.
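The ELT duties above follow a common pattern: land raw data first, then transform it with SQL inside the warehouse. A toy sketch of that pattern, with hypothetical table names (`stg_orders`, `rpt_revenue`) and stdlib `sqlite3` standing in for the AWS warehouse (Airflow orchestration omitted):

```python
import sqlite3

def run_elt(conn: sqlite3.Connection, raw_rows: list[tuple]) -> None:
    """Toy ELT step: load raw records untouched, then transform in-database."""
    cur = conn.cursor()
    # Extract + Load: land raw records as-is in a staging table.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS stg_orders (id INTEGER, amount REAL, status TEXT)"
    )
    cur.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)", raw_rows)
    # Transform: derive a reporting table with SQL, ELT-style.
    cur.execute("DROP TABLE IF EXISTS rpt_revenue")
    cur.execute(
        """CREATE TABLE rpt_revenue AS
           SELECT status, SUM(amount) AS total
           FROM stg_orders
           GROUP BY status"""
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
run_elt(conn, [(1, 10.0, "paid"), (2, 5.0, "paid"), (3, 7.5, "refunded")])
print(sorted(conn.execute("SELECT status, total FROM rpt_revenue")))
```

In a production Airflow DAG, the load and transform steps would typically be separate tasks so each can be retried independently.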
Requirements:
Expert-level proficiency in SQL (T-SQL / Microsoft SQL Server), with a strong focus on optimizing queries for performance.
At least 7 years of relevant experience is preferred.
Extensive experience with Python and Airflow for ELT processes.
Proven experience in designing and developing data warehousing solutions on the AWS cloud platform.
Strong expertise in Power BI for data visualization and dashboard creation.
Familiarity with connecting Power BI to SQL Server and Amazon Aurora PostgreSQL databases.
Experience with REST APIs and JSON.
Agile development experience with a focus on continuous delivery and improvement.
Excellent problem-solving skills and a proactive “can-do” attitude.
Strong communication skills and the ability to work collaboratively in a team environment.
Curiosity and eagerness to learn and adapt to new technologies and methodologies.
Ability to perform database tuning and optimization to ensure efficient data operations.
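A minimal illustration of the tuning requirement above, with a hypothetical `orders` schema and stdlib `sqlite3` standing in for the production engine: adding an index on the filtered column changes the query plan from a full table scan to an index search.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = ?"

def plan(conn: sqlite3.Connection, sql: str) -> str:
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index;
    # exact wording varies by SQLite version.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql, (42,)))

p_before = plan(conn, query)  # full table scan, e.g. "SCAN orders"
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
p_after = plan(conn, query)   # index search mentioning idx_orders_customer
print(p_before)
print(p_after)
```

The same workflow applies on SQL Server or Aurora PostgreSQL with their respective plan tools (`SET SHOWPLAN` / `EXPLAIN`): measure the plan, add or adjust an index, and confirm the scan becomes a seek.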