Description:
Kpler is seeking a Data Engineer to join its Commodities Market Data team, which is responsible for the ingestion, filtering, enrichment, and distribution of externally sourced data related to maritime trade.
The role involves developing and implementing core data ingestion and distribution pipelines, along with the associated back-end systems, according to project requirements and design specifications.
The Data Engineer will troubleshoot and optimize existing workflows to improve performance and efficiency, ensuring data integrity and accuracy across various pipelines.
Candidates should demonstrate strong analytical and debugging skills, along with a proactive approach to learning.
Requirements:
Proficiency in Python, with experience in data manipulation and transformation, particularly using Pandas.
Knowledge of data integration, ETL processes, and batch/streaming data processing is essential.
An understanding of containerization and orchestration tools, such as Docker, Kubernetes, and Airflow, is required.
Familiarity with SQL and relational databases (RDBMS), or with Big Data technologies, is necessary.
Comfort with Git, code reviews, and Agile methodologies is expected.
Experience with AWS or another cloud provider, along with Terraform, is a nice-to-have.
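As a purely illustrative sketch of the kind of Pandas-based ingestion, filtering, and enrichment step described above (the function name, file paths, and column names below are hypothetical and not part of Kpler's actual data model):

import pandas as pd

def enrich_vessel_positions(raw_path, reference_path, out_path):
    # Ingest: read an externally sourced batch of position reports (hypothetical CSV layout).
    positions = pd.read_csv(raw_path, parse_dates=["timestamp"])
    # Filter: drop rows with missing vessel identifiers or out-of-range coordinates.
    positions = positions.dropna(subset=["imo"])
    positions = positions[
        positions["latitude"].between(-90, 90)
        & positions["longitude"].between(-180, 180)
    ]
    # Enrich: join static vessel attributes from a reference table.
    vessels = pd.read_csv(reference_path)
    enriched = positions.merge(vessels, on="imo", how="left")
    # Distribute: write the cleaned, enriched batch for downstream consumers.
    enriched.to_csv(out_path, index=False)
    return enriched

In practice, a batch step like this would typically be scheduled and monitored with an orchestrator such as Airflow, as noted in the requirements.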
Benefits:
Kpler fosters a dynamic company culture that emphasizes nurturing connections and innovating solutions to market challenges.
The company is committed to providing a fair, inclusive, and diverse work environment, welcoming individuals from various backgrounds and experiences.
Kpler values customer satisfaction and encourages employees to turn ideas into reality, promoting a supportive and friendly approach among colleagues and clients.