Remote Python Developer - Data Engineering

Description:

  • Design, develop, and maintain scalable and efficient data pipelines
  • Implement ETL/ELT processes to collect, transform, and load data from various sources into data warehouses and data lakes (see the sketch after this list)
  • Collaborate with solution architects and other developers to optimize data models and schemas for performance and scalability
  • Develop and maintain data storage solutions, including databases, data warehouses, and data lakes
  • Ensure data quality, consistency, and reliability through validation, cleansing, and monitoring processes
  • Optimize data workflows and processes for performance, scalability, and cost-efficiency
  • Design, develop, and maintain scalable, robust, and high-performance applications using Python
  • Gather and analyze requirements, translating them into technical specifications
  • Write clean, maintainable, and efficient code following best practices and coding standards
  • Test and debug applications to ensure functionality and performance
  • Participate in code reviews to maintain code quality
  • Collaborate with client data teams and analysts to deliver solutions
  • Stay updated with industry trends, tools, and technologies in data engineering and Python development
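
To make the pipeline responsibilities above concrete, below is a minimal sketch of the kind of ETL flow this role involves. It is illustrative only: the source file, column names, and target table are hypothetical, and SQLite stands in for the warehouse so the example stays self-contained; in practice the work described targets platforms such as PostgreSQL or Spark behind an orchestrator.

    # Illustrative ETL sketch; file, schema, and table names are hypothetical.
    import csv
    import sqlite3
    from datetime import datetime

    def extract(path: str) -> list[dict]:
        """Extract: read raw rows from a CSV source."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows: list[dict]) -> list[tuple]:
        """Transform: validate and cleanse rows, dropping any that fail."""
        clean = []
        for row in rows:
            try:
                clean.append((
                    int(row["order_id"]),
                    row["customer"].strip().lower(),
                    float(row["amount"]),
                    datetime.fromisoformat(row["ordered_at"]).date().isoformat(),
                ))
            except (KeyError, TypeError, ValueError):
                continue  # in production: log and route to a dead-letter store
        return clean

    def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
        """Load: upsert the cleaned rows into the target table."""
        with sqlite3.connect(db_path) as conn:
            conn.execute(
                """CREATE TABLE IF NOT EXISTS orders (
                       order_id INTEGER PRIMARY KEY,
                       customer TEXT NOT NULL,
                       amount REAL NOT NULL,
                       ordered_at TEXT NOT NULL
                   )"""
            )
            conn.executemany(
                "INSERT OR REPLACE INTO orders VALUES (?, ?, ?, ?)", rows
            )

    if __name__ == "__main__":
        load(transform(extract("orders.csv")))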

Requirements:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field
  • Proven experience as a Python Developer focusing on data engineering
  • Strong proficiency in Python and relevant libraries
  • Solid understanding of SQL and NoSQL databases (e.g., PostgreSQL, Elasticsearch)
  • Experience with ETL tools and frameworks
  • Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS)
  • Knowledge of data modeling, data warehousing, and data lake architectures
  • Experience with version control systems (e.g., Git) and CI/CD pipelines
  • Strong problem-solving skills and ability to work independently and collaboratively
  • Excellent communication and teamwork skills

Preferred Skills:

  • Experience with containerization and orchestration tools (e.g., Docker, Kubernetes)
  • Familiarity with machine learning and data science concepts
  • Professional certifications such as Certified Data Management Professional (CDMP), AWS Certified Big Data, or Google Certified Professional Data Engineer will strengthen your application

Equal Opportunity:

  • Aretum is an equal opportunity employer, committed to diversity and inclusion
  • All qualified candidates will receive equal consideration for employment without regard to disability, race, color, religious creed, national origin, sexual orientation/gender identity, or age
  • Aretum utilizes E-Verify to check employment authorization
  • EEO/AA/F/M/Vet/Disabled