Remote Data Engineer (The Data Pipeline Architect)

Apply now
Please let Unreal Gigs know you found this job on RemoteYeah. This helps us grow 🌱.

Description:

  • The Data Engineer (The Data Pipeline Architect) role involves designing, building, and maintaining data pipelines that support robust data architectures and facilitate seamless data flow.
  • The position requires building scalable data systems that support data-driven decision-making and reliable analytics.
  • Responsibilities include developing and maintaining data pipelines for data ingestion, transformation, and integration using cloud technologies, automating data workflows, and ensuring seamless data movement between systems.
  • The Data Engineer will manage and optimize data storage solutions, architecting and maintaining data lakes and data warehouses using platforms like BigQuery, Redshift, or Snowflake.
  • Collaboration with data scientists, analysts, and business stakeholders is essential to understand data requirements and align data solutions with business goals.
  • The role includes ensuring data quality and reliability through processes for data validation, error handling, and consistency checks.
  • The Data Engineer will develop and automate ETL processes that handle complex data transformations, and will monitor data infrastructure health with alerting and observability tooling.
  • The role also involves implementing resource-allocation and cost-optimization strategies to keep data processing efficient and performant.
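The ingestion, validation, and transformation responsibilities above can be sketched as a minimal extract-transform-load flow. This is an illustrative sketch only: the field names, validation rules, and in-memory "warehouse" are assumptions for the example, not details from the posting.

```python
import json

# Illustrative raw records, as they might arrive from an ingestion source.
RAW_EVENTS = [
    '{"user_id": 1, "amount": "19.99", "currency": "USD"}',
    '{"user_id": 2, "amount": "bad",   "currency": "USD"}',  # fails validation
    '{"user_id": 3, "amount": "5.00",  "currency": "EUR"}',
]

def extract(lines):
    """Parse raw JSON lines; skip records that are not valid JSON."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue

def validate(record):
    """Basic data-quality check: required fields present, amount numeric."""
    try:
        return ({"user_id", "amount", "currency"} <= record.keys()
                and float(record["amount"]) >= 0)
    except ValueError:
        return False

def transform(record):
    """Normalize types so downstream consumers see a consistent schema."""
    return {
        "user_id": int(record["user_id"]),
        "amount": round(float(record["amount"]), 2),
        "currency": record["currency"].upper(),
    }

def load(records, warehouse):
    """Append clean rows to the target store (a list stands in for a warehouse)."""
    warehouse.extend(records)
    return warehouse

def run_pipeline(raw_lines):
    clean = (transform(r) for r in extract(raw_lines) if validate(r))
    return load(clean, warehouse=[])

if __name__ == "__main__":
    for row in run_pipeline(RAW_EVENTS):
        print(row)
```

In a production setting each stage would be a separate task in an orchestrator such as Apache Airflow, with the invalid records routed to a dead-letter store rather than silently dropped.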

Requirements:

  • Candidates must have experience with cloud data platforms such as AWS, GCP, or Azure, demonstrating proficiency in handling cloud-based data solutions.
  • Proficiency in programming and scripting languages such as Python, Java, or Scala is required for building data pipelines and processing tasks.
  • A proven ability to develop, maintain, and optimize ETL processes that handle large volumes of data is essential, along with experience using orchestration tools like Apache Airflow or Luigi.
  • Strong SQL skills and experience with relational and NoSQL databases are necessary.
  • Excellent problem-solving skills and a proactive approach to identifying and resolving data-related challenges are required.
  • A Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field is required; equivalent hands-on experience in data engineering and cloud technologies may be considered in lieu of a degree.
  • Certifications in cloud data engineering are a plus.
  • Candidates should have 3+ years of experience in data engineering, with a proven track record of building and managing cloud-based data systems.
  • Experience with real-time data processing frameworks and familiarity with containerization and microservices architecture are advantageous.
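As a concrete instance of the SQL and data-quality skills the requirements list, a consistency check between a staging table and a warehouse fact table can be expressed as a single anti-join query. The table names, columns, and sample rows below are hypothetical, and SQLite stands in for a real warehouse.

```python
import sqlite3

# In-memory database stands in for a warehouse; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (order_id INTEGER, amount REAL);
    CREATE TABLE fact_orders    (order_id INTEGER, amount REAL);

    INSERT INTO staging_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    -- Order 3 was dropped and order 2 was loaded with the wrong amount.
    INSERT INTO fact_orders    VALUES (1, 10.0), (2, 25.0);
""")

# Rows present in staging but missing or mismatched in the fact table
# indicate a broken load and should trigger an alert.
discrepancies = conn.execute("""
    SELECT s.order_id, s.amount AS staged, f.amount AS loaded
    FROM staging_orders AS s
    LEFT JOIN fact_orders AS f USING (order_id)
    WHERE f.order_id IS NULL OR s.amount <> f.amount
    ORDER BY s.order_id
""").fetchall()

for order_id, staged, loaded in discrepancies:
    print(f"order {order_id}: staged={staged} loaded={loaded}")
```

The same pattern, run on a schedule against BigQuery, Redshift, or Snowflake, is a common building block for the validation and consistency checks mentioned in the description.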

Benefits:

  • Comprehensive medical, dental, and vision insurance plans are offered with low co-pays and premiums.
  • Competitive vacation, sick leave, and 20 paid holidays per year are provided.
  • Flexible work schedules and telecommuting options promote work-life balance.
  • Opportunities for training, certification reimbursement, and career advancement programs are available for professional development.
  • Access to wellness programs, including gym memberships, health screenings, and mental health resources, is provided.
  • Life insurance and short-term/long-term disability coverage are included.
  • Confidential counseling and support services are available through the Employee Assistance Program (EAP).
  • Financial assistance for continuing education and professional development is offered through tuition reimbursement.
  • Opportunities to participate in community service and volunteer activities are encouraged.
  • Employee recognition programs celebrate achievements and milestones.