This job post is closed and the position is probably filled. Please do not apply.
Description:
The Senior Data Engineer will be responsible for designing and developing Data Engineering projects that handle large volumes of data.
The role requires at least 5 years of experience using cloud data engineering services, preferably on AWS.
The candidate should have a minimum of 5 years of experience as a Big Data architect/solution architect on Big Data platforms.
A minimum of 3 years of proven experience as an architect/solution architect on the Databricks platform is necessary.
The candidate should have designed and implemented two to three end-to-end projects in Databricks.
Expertise in E2E architecture of unified data platforms covering data lifecycle aspects like Data Ingestion, Transformation, Serving, and Consumption is required.
Experience in composable architecture to fully leverage Databricks capabilities is essential.
The candidate should have implemented and configured Databricks environments for optimal performance and resource utilization.
Experience in integrating data from various sources into Databricks for processing and analysis is necessary.
The role involves designing and developing scalable batch and streaming data pipelines, data lake architectures, and data warehousing solutions on the Databricks platform using Spark and Delta Lake.
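For illustration, the core upsert behavior that Delta Lake's MERGE INTO provides in such batch pipelines can be sketched in plain Python; the function and record names below are hypothetical and stand in for the real Spark/Delta API:

```python
# Plain-Python sketch of the upsert (MERGE) semantics a Delta Lake
# batch pipeline relies on; names are illustrative, not a Databricks API.

def merge_upsert(target, updates, key="id"):
    """Merge `updates` into `target`: rows with matching keys are
    overwritten, new keys are appended -- mirroring MERGE INTO."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row  # update if present, insert otherwise
    return sorted(merged.values(), key=lambda r: r[key])

# Example: a daily batch arriving on top of an existing table.
silver = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
batch = [{"id": 2, "amount": 25}, {"id": 3, "amount": 30}]
result = merge_upsert(silver, batch)
```

In a real Databricks pipeline the same logic is expressed declaratively with `MERGE INTO` against a Delta table rather than in driver-side Python.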
Knowledge of the Databricks Lakehouse concept and its enterprise implementation is required.
Strong understanding of data warehousing, governance, and security standards related to Databricks is essential.
Proficiency in writing unit and integration tests, and setting best practices for Databricks CI/CD is necessary.
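One common way to keep Databricks pipelines unit-testable for CI/CD is to factor transformation logic into pure functions that run locally, without a cluster, before deployment. A minimal sketch of that pattern (function and field names are hypothetical):

```python
# Sketch of a unit-testable transformation: pure logic kept separate
# from Spark I/O so a CI suite can exercise it without a cluster.

def dedupe_latest(events):
    """Keep only the most recent event per user_id, by timestamp."""
    latest = {}
    for e in events:
        prev = latest.get(e["user_id"])
        if prev is None or e["ts"] > prev["ts"]:
            latest[e["user_id"]] = e
    return sorted(latest.values(), key=lambda e: e["user_id"])

# A unit test as it might appear in the CI suite:
events = [
    {"user_id": "a", "ts": 1, "action": "view"},
    {"user_id": "a", "ts": 3, "action": "buy"},
    {"user_id": "b", "ts": 2, "action": "view"},
]
assert dedupe_latest(events) == [
    {"user_id": "a", "ts": 3, "action": "buy"},
    {"user_id": "b", "ts": 2, "action": "view"},
]
```

Integration tests then run the same function inside a Spark job against a small Delta table, and the CI/CD pipeline gates deployment on both layers passing.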
Requirements:
8+ years of experience in Data Engineering project development/design.
5+ years using cloud data engineering services, preferably on AWS.
5+ years as a Big Data architect/solution architect on Big Data platforms.
3+ years of proven experience as an architect/solution architect on the Databricks platform.
Designed and implemented two to three end-to-end projects in Databricks.
Expertise in E2E architecture of unified data platforms.
Experience in composable architecture to fully leverage Databricks capabilities.
Experience in integrating data from various sources into Databricks for processing and analysis.
Strong understanding of data warehousing, governance, and security standards related to Databricks.
Proficient in writing unit and integration tests, and setting best practices for Databricks CI/CD.
Benefits:
Opportunity to work remotely.
Engage in challenging projects with large volumes of data.
Work with cutting-edge technologies like Databricks, Spark, and Delta Lake.
Competitive salary and benefits package.
Continuous learning and professional development opportunities.