
Remote Data Engineer

at Particle41


Description:

  • Particle41 is seeking a talented and versatile Data Engineer to join their innovative team.
  • The Data Engineer will play a key role in designing, building, and maintaining robust data pipelines and infrastructure to support clients' data needs.
  • Responsibilities include working on end-to-end data solutions and collaborating with cross-functional teams to ensure high-quality, scalable, and efficient data delivery.
  • The role offers an exciting opportunity to contribute to impactful projects, solve complex data challenges, and grow skills in a supportive and dynamic environment.
  • Key responsibilities include designing, developing, and maintaining scalable ETL (Extract, Transform, Load) pipelines to process large volumes of data from diverse sources (a minimal pipeline sketch follows this list).
  • The Data Engineer will build and optimize data storage solutions, such as data lakes and data warehouses, to ensure efficient data retrieval and processing.
  • Integrating structured and unstructured data from internal and external systems into a unified view for analysis is essential.
  • Ensuring data accuracy, consistency, and completeness through rigorous validation, cleansing, and transformation processes is required.
  • Comprehensive documentation for data processes, tools, and systems must be maintained while promoting best practices for efficient workflows.
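
As a rough illustration of this pipeline work, here is a minimal ETL sketch in Python using Pandas and SQLAlchemy. It is a sketch, not Particle41's code: the source file orders.csv, the clean_orders table, the column names, and the connection string are all hypothetical.

```python
"""Minimal ETL sketch: extract a CSV, cleanse/validate it, load it to Postgres.

All names here (orders.csv, clean_orders, the connection string) are hypothetical.
"""
import pandas as pd
from sqlalchemy import create_engine


def extract(path: str) -> pd.DataFrame:
    # Extract: read raw records from a CSV source.
    return pd.read_csv(path)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Cleanse: drop exact duplicates and rows missing the key column.
    df = df.drop_duplicates().dropna(subset=["order_id"])
    # Validate: coerce amounts to numeric, discarding rows that fail conversion.
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    df = df.dropna(subset=["amount"])
    # Transform: normalize timestamps to UTC for consistent downstream analysis.
    df["created_at"] = pd.to_datetime(df["created_at"], utc=True)
    return df


def load(df: pd.DataFrame, table: str, conn_str: str) -> None:
    # Load: append the cleansed batch into a warehouse table.
    engine = create_engine(conn_str)
    df.to_sql(table, engine, if_exists="append", index=False)


if __name__ == "__main__":
    frame = transform(extract("orders.csv"))
    load(frame, "clean_orders", "postgresql://user:pass@localhost:5432/warehouse")
```

The same extract/transform/load split scales up to orchestrated jobs; the validation and cleansing steps are the part the posting emphasizes.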

Requirements:

  • A Bachelor's degree in Computer Science, Engineering, or a related field is required.
  • A minimum of 3 years of proven experience as a Data Engineer is necessary.
  • Proficiency in the Python programming language is essential.
  • Experience with database technologies such as SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB) databases is required.
  • A strong understanding of libraries, frameworks, and technologies such as Flask and other API frameworks, databases and ORMs, and data analysis and machine learning tools like Databricks, Pandas, Spark, PySpark, OpenCV, and scikit-learn is valued (see the PySpark sketch after this list).
  • Familiarity with utilities and tools such as logging, requests, subprocess, regex, and pytest is necessary (see the testing sketch after this list).
  • Knowledge of the ELK stack, Redis, and distributed task queues is required (see the task-queue sketch after this list).
  • A strong understanding of data warehousing/lakehouse principles and concurrent/parallel processing concepts is essential.
  • Familiarity with at least one cloud data engineering stack (Azure, AWS, or GCP) and the ability to quickly learn and adapt to new ETL/ELT tools across various cloud providers is necessary.
  • Familiarity with version control systems like Git and collaborative development workflows is required.
  • Competence in working on Linux OS and creating shell scripts is necessary.
  • A solid understanding of software engineering principles, design patterns, and best practices is essential.
  • Excellent problem-solving and analytical skills, with a keen attention to detail, are required.
  • Effective communication skills, both written and verbal, and the ability to collaborate in a team environment are necessary.
  • Adaptability and willingness to learn new technologies and tools as needed are essential.
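
For the data analysis tooling named above, a minimal PySpark sketch of a data-lake aggregation; the S3 paths and column names are assumptions, not anything from the posting:

```python
# Hypothetical PySpark aggregation; paths and column names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-revenue").getOrCreate()

# Read a Parquet table from a (hypothetical) data lake.
orders = spark.read.parquet("s3a://lake/orders/")

# Aggregate revenue per day, excluding cancelled orders.
daily = (
    orders
    .filter(F.col("status") != "cancelled")
    .groupBy(F.to_date("created_at").alias("day"))
    .agg(F.sum("amount").alias("revenue"))
)

# Write the result back to a (hypothetical) mart location.
daily.write.mode("overwrite").parquet("s3a://lake/marts/daily_revenue/")
```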
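
For the utilities bullet, a small testing sketch combining requests, regex, and pytest; the endpoint and payload shape are assumptions, and the network call is stubbed so the test runs offline:

```python
# Hypothetical test module (e.g., test_pipeline_api.py); run with `pytest`.
import re

import requests

ISO_DATE = re.compile(r"^\d{4}-\d{2}-\d{2}$")


def fetch_latest_run(base_url: str) -> dict:
    # requests call against a hypothetical pipeline-status endpoint.
    resp = requests.get(f"{base_url}/runs/latest", timeout=10)
    resp.raise_for_status()
    return resp.json()


def test_latest_run_reports_an_iso_date(monkeypatch):
    # Stub requests.get so the test needs no network access.
    class FakeResponse:
        def raise_for_status(self):
            pass

        def json(self):
            return {"run_date": "2024-01-31", "status": "success"}

    monkeypatch.setattr(requests, "get", lambda *a, **kw: FakeResponse())
    run = fetch_latest_run("http://example.invalid")
    assert ISO_DATE.match(run["run_date"])
    assert run["status"] == "success"
```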
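
For distributed task queues backed by Redis, Celery is one common choice; a minimal sketch, assuming a local Redis broker on the default port and a hypothetical pipeline step:

```python
# Minimal Celery app backed by Redis; the broker URL assumes a local Redis.
from celery import Celery

app = Celery("etl_tasks", broker="redis://localhost:6379/0")


@app.task
def refresh_table(table_name: str) -> str:
    # Placeholder for a real pipeline step; executes on any available worker.
    return f"refreshed {table_name}"

# Enqueue from application code:  refresh_table.delay("clean_orders")
# Start a worker:                 celery -A etl_tasks worker
```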

Benefits:

  • Particle41's values of empowering leadership, innovation, teamwork, and excellence drive everything the company does to achieve the best outcomes for clients.
  • The company provides equal employment opportunities to all employees and applicants, ensuring that hiring and employment decisions are based on merit and qualifications without discrimination.
  • Particle41 encourages individuals from all backgrounds to apply and is committed to providing a supportive work environment.
  • The company welcomes applicants who embody these core values and are committed to contributing to its mission.