Please let Scopeworker know you found this job on RemoteYeah. This helps us grow 🌱.
Description:
The Data Architect will be responsible for developing and maintaining robust and reliable data pipelines to support 24x7 business operations.
They will work on building components for enterprise central data platforms such as data warehouses, Operational Data Stores, and Access layers with APIs.
The role involves innovating, designing, building, and automating scalable solutions for massive data sets.
The Data Architect will lead big data challenges in an agile way and build data models to deliver insightful analytics.
They will ensure the highest standards of data integrity and apply software engineering practices to the complex work of data processing and data pipeline development.
Requirements:
Bachelor's or Master's degree in Computer Science, Mathematics, or a similar field (PhD preferred).
Minimum of 7 years of experience working with massive data sets.
Strong knowledge of public cloud platforms, specifically AWS.
Experience with data acquisition through API calls/FTP downloads, ETL, and transformation/normalization.
Proficiency in ETL processes is mandatory.
Hands-on experience with SQL, Python, Spark, and Kafka.
Excellent communication skills, including proficiency in verbal and written English.
Benefits:
Opportunity to work with Fortune 100 users and stakeholders, providing real-time actionable business intelligence.
Chance to work on innovative solutions and lead big data challenges in an agile environment.
Exposure to building components for enterprise central data platforms and ensuring data integrity.
Work in a dynamic environment where you can bring software engineering experience to data processing and pipeline development.
Apply now