Remote Data Engineer | Turning Raw Data into Gold (B2B or CIM)
Description:
Tecknoworks is a global technology consulting company that values curiosity, fearlessness, aspiration, and collaboration.
The company is seeking highly skilled and motivated Data Engineers to join its growing data team.
The Data Engineers will be responsible for building and maintaining scalable data pipelines, managing data architecture, and enabling data-driven decision-making across the organization.
Candidates should have hands-on experience with cloud platforms, specifically AWS and/or Azure, and proficiency in their respective data and analytics services.
For AWS, experience with AWS Glue for ETL/ELT processes; familiarity with Amazon Redshift, Athena, S3, and Lake Formation; and use of AWS Lambda, Step Functions, and CloudWatch for data pipeline orchestration and monitoring are required (an illustrative sketch of such a setup follows this description).
For Azure, experience with Azure Data Factory (ADF)/Synapse for data integration and orchestration, together with familiarity with Azure Synapse Analytics, Azure Data Lake Storage (ADLS), and Azure SQL Database, is necessary.
The position is remote and based in Romania, with options for employment or collaboration contracts.
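For illustration only, and not part of the role description: a minimal sketch of how that AWS tooling is often wired together, assuming a hypothetical Glue job started from a Lambda function, with logs landing in CloudWatch. The job name and arguments below are invented placeholders, not details from this posting.

```python
# Hypothetical AWS Lambda handler that kicks off a Glue ETL job run.
# The job name and arguments are illustrative assumptions, not taken from the posting.
import json
import logging

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

glue = boto3.client("glue")


def lambda_handler(event, context):
    # Start a Glue job run; a Step Functions state or an S3 event could supply `event`.
    response = glue.start_job_run(
        JobName="daily-sales-etl",  # hypothetical Glue job name
        Arguments={"--source_prefix": event.get("prefix", "raw/")},
    )
    run_id = response["JobRunId"]
    # Lambda sends stdout/logging output to CloudWatch Logs, which covers the monitoring side.
    logger.info("Started Glue job run %s", run_id)
    return {"statusCode": 200, "body": json.dumps({"job_run_id": run_id})}
```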
Requirements:
Candidates must design, develop, and maintain robust and scalable data pipelines to ingest, transform, and store data from diverse sources.
They should optimize data systems for performance, scalability, and reliability in a cloud-native environment.
Collaboration with data analysts, data scientists, and other stakeholders to ensure high data quality and availability is essential.
Development and management of data models using DBT, ensuring modular, testable, and well-documented transformation layers, are required.
Implementation and enforcement of data governance, security, and privacy standards are necessary.
Candidates must manage and optimize cloud data warehouses, especially Snowflake, for performance, cost-efficiency, and scalability.
Monitoring, troubleshooting, and improving data workflows and ETL/ELT processes are expected.
Collaboration in the design and deployment of data lakes, warehouses, and lakehouse architectures is required.
A minimum of 3 years of experience as a Data Engineer or in a similar role is necessary.
Strong proficiency in SQL and Python is required.
Candidates should have a solid understanding of data modeling, ETL/ELT processes, and pipeline orchestration.
Experience working in DevOps environments using CI/CD tools (e.g., GitHub Actions, Azure DevOps) is necessary.
Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes, Airflow) is required (see the orchestration sketch after this list).
Familiarity with data cataloging tools like AWS Glue Data Catalog or Azure Purview is necessary.
Strong interpersonal and communication skills are essential for collaboration with cross-functional teams and external clients.
Candidates should demonstrate adaptability in fast-paced environments with shifting client needs and priorities.
An analytical mindset with attention to detail and a commitment to delivering quality results is required.
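To illustrate the orchestration side of these requirements, here is a minimal, hypothetical Airflow DAG (not part of the posting) that chains an extract, a warehouse load, and a dbt transformation step. The DAG id, schedule, and task callables are assumptions for illustration only.

```python
# Hypothetical Airflow DAG sketching the kind of ELT orchestration described above.
# DAG id, schedule, and the task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_lake():
    print("Copy source files into the data lake (e.g., S3 or ADLS).")


def load_to_warehouse():
    print("Load staged files into the warehouse (e.g., Snowflake).")


def run_dbt_models():
    print("Trigger dbt to build the transformation layer.")


with DAG(
    dag_id="daily_elt_pipeline",      # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_lake", python_callable=extract_to_lake)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    transform = PythonOperator(task_id="run_dbt_models", python_callable=run_dbt_models)

    extract >> load >> transform
```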
Benefits:
The position offers the opportunity to work in a dynamic and innovative environment at a global technology consulting company.
Employees will have the chance to collaborate with a diverse team and contribute to significant outcomes for clients.
The role provides flexibility with remote work options.
Candidates will gain hands-on experience with leading cloud platforms and cutting-edge data technologies.
The company encourages continuous learning and professional development within the data engineering field.
Apply now
Please let Tecknoworks Europe know you found this job on RemoteYeah. This helps us grow 🌱.