Remote KF - Data Engineer with Databricks - Job9999
Apply now
Please let Taller Technologies know you found this job on RemoteYeah. This helps us grow 🌱.
Description:
We are looking for a highly experienced Data Solution Architect to design, implement, and optimize a scalable, secure, and high-performing data architecture.
This role is crucial in supporting business intelligence, analytics, and AI-driven initiatives, ensuring seamless integration across Databricks, SQL Server, and Azure Data Lake Storage (ADLSv2) environments.
The ideal candidate will own end-to-end data solutions, define strategic roadmaps, and collaborate with cross-functional teams to optimize data workflows while maintaining governance, security, and scalability.
Responsibilities include setting up, configuring, and deploying a working Databricks environment in Azure: provisioning resources, configuring compute clusters, and managing security (access controls, authentication, and network security).
The role involves integrating Databricks with data lakes, databases, and cloud storage, installing necessary libraries, ML frameworks, and Python dependencies, and creating notebooks and workflows for data processing, analytics, and machine learning.
The candidate will design and implement robust data pipelines for ingestion, transformation, and synchronization.
They will work with Azure SQL Server, Azure Data Factory, and ADLSv2 to optimize data movement, leveraging PySpark and SQL to enhance data processing efficiency.
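A pipeline like the one described above typically ingests changed rows from a source system and merges them incrementally into a target table. As a minimal, stdlib-only sketch of that watermark-plus-upsert pattern (the record shape, `id` key, and `updated_at` column are hypothetical placeholders, not from the posting):

```python
# Hypothetical example records; in a real pipeline these would arrive via
# Azure Data Factory from Azure SQL Server, not an in-memory list.
source_rows = [
    {"id": 1, "amount": 10.0, "updated_at": "2024-01-02"},
    {"id": 2, "amount": 25.0, "updated_at": "2024-01-03"},
    {"id": 1, "amount": 12.5, "updated_at": "2024-01-04"},  # later revision of id 1
]

def incremental_upsert(target: dict, rows: list, watermark: str) -> str:
    """Merge rows newer than `watermark` into `target` (keyed by id),
    keeping only the latest revision per key. Returns the new watermark."""
    new_watermark = watermark
    for row in rows:
        ts = row["updated_at"]
        if ts <= watermark:
            continue  # already ingested in a previous run
        current = target.get(row["id"])
        # Upsert: insert new keys, overwrite only with a newer revision.
        if current is None or current["updated_at"] < ts:
            target[row["id"]] = row
        new_watermark = max(new_watermark, ts)
    return new_watermark

target_table = {}
wm = incremental_upsert(target_table, source_rows, watermark="2024-01-01")
# target_table now holds the latest revision per id; wm is "2024-01-04"
```

In Databricks the same logic is usually expressed as a Delta Lake `MERGE INTO` over a DataFrame, with the watermark persisted between runs rather than passed in memory.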
The role requires evaluating and recommending emerging data technologies to enhance the data ecosystem.
The candidate will work closely with data engineers, analysts, and business stakeholders to align solutions with business objectives.
They will also develop and maintain comprehensive documentation and best practices.
Requirements:
Must have expert SQL skills, ideally including data warehouse experience and query optimization.
Proven experience with Azure Databricks & PySpark for big data processing and analytics is required.
Knowledge of Azure Data Factory and Data Engineering Fundamentals is essential.
Experience with Azure Data Lake (ADLSv2), specifically with Parquet/Delta Tables, is necessary.
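Parquet/Delta tables in ADLSv2 are conventionally laid out under Hive-style partition directories, which is what enables partition pruning during reads. A small sketch of that path convention (the storage account, container, table, and partition column names are illustrative only):

```python
def partition_path(base: str, table: str, **partitions) -> str:
    """Build a Hive-style partitioned path of the kind Parquet/Delta
    tables use in a data lake, e.g. .../sales/year=2024/month=01."""
    parts = "/".join(f"{k}={v}" for k, v in partitions.items())
    return f"{base}/{table}/{parts}"

# abfss:// is the ADLSv2 URI scheme; account and container are placeholders.
path = partition_path(
    "abfss://lake@example.dfs.core.windows.net", "sales",
    year="2024", month="01",
)
```

In practice the layout is produced by a writer (for example PySpark's `partitionBy`) rather than built by hand; the point is that partition values are encoded in the directory names themselves.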
Strong knowledge of data governance and security, including access control, compliance, and best practices in cloud environments, is required.
Experience in enterprise data integration, particularly with MS Dynamics and Workday, ensuring seamless data ingestion and processing, is a must.
Nice-to-have skills include experience with Dynamics Data and large-scale ERP migrations, understanding of data visualization and BI tools such as Tableau and Power BI, and knowledge of Python, Scala, or Terraform for infrastructure automation and orchestration.
The candidate should possess strategic thinking abilities to align technical solutions with business goals, strong problem-solving skills to tackle complex data architecture challenges, and a passion for continuous learning to stay updated on data technologies and industry best practices.
Benefits:
The position offers the flexibility of remote work, allowing for a better work-life balance.
As a contracted role, it provides opportunities for experienced professionals to engage in challenging projects.
The role supports professional growth through exposure to cutting-edge data technologies and collaboration with cross-functional teams.
The company fosters a culture of continuous learning, encouraging employees to stay updated on industry best practices and advancements.