Remote Big Data Engineer (The Data Pipeline Innovator)

Description:

  • The Big Data Engineer will handle massive datasets and build infrastructure for complex data analysis and machine learning at scale.
  • This role involves creating robust, scalable data pipelines that support data-driven decision-making.
  • The engineer will collaborate with data scientists, analysts, and software engineers to design, implement, and optimize big data platforms.
  • Key responsibilities include architecting and implementing ETL data pipelines using tools such as Apache Spark, Kafka, and Hadoop (a minimal sketch of this kind of pipeline follows this list).
  • The engineer will develop and manage data storage solutions optimized for performance and cost-efficiency.
  • Collaboration on data strategy and integration is essential to align big data architecture with analytics goals.
  • The role requires implementing data quality and governance standards to ensure data accuracy and reliability.
  • Automation of data workflows using tools like Apache Airflow or AWS Glue is a critical responsibility (see the second sketch below).
  • Monitoring and troubleshooting data systems to maintain optimal processing capabilities is necessary.
  • Staying current with big data trends and technologies, and integrating new techniques that drive innovation, is expected.
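
For illustration, here is a minimal sketch of the kind of Spark ETL pipeline described above. The application name, S3 paths, column names, and aggregation are hypothetical examples, not a description of the employer's actual stack.

    # Minimal PySpark ETL sketch: extract raw JSON events, clean and
    # aggregate them, and load partitioned Parquet for analytics.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

    # Extract: read raw event logs (hypothetical S3 path).
    raw = spark.read.json("s3a://example-bucket/raw/events/")

    # Transform: drop rows missing key fields, derive an event date.
    clean = (
        raw.dropna(subset=["user_id", "event_time"])
           .withColumn("event_date", F.to_date("event_time"))
    )
    daily_counts = clean.groupBy("event_date", "event_type").count()

    # Load: write partitioned Parquet for downstream analytics.
    (daily_counts.write.mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3a://example-bucket/curated/daily_event_counts/"))

    spark.stop()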
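
And a second sketch, of the workflow automation mentioned above, using Apache Airflow. The DAG id, schedule, and task bodies are hypothetical placeholders, and the code assumes Airflow 2.4 or later.

    # Minimal Airflow DAG sketch: three placeholder ETL tasks chained in order.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw data")           # placeholder for a real extract step

    def transform():
        print("clean and aggregate")     # placeholder for a real transform step

    def load():
        print("write to the warehouse")  # placeholder for a real load step

    with DAG(
        dag_id="daily_etl",                 # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                  # parameter name as of Airflow 2.4+
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        # Run extract -> transform -> load sequentially.
        t_extract >> t_transform >> t_load

In practice the placeholder callables would trigger jobs in tools like Spark or AWS Glue, per the tooling named above.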

Requirements:

  • Extensive experience with big data technologies such as Apache Spark, Hadoop, Kafka, and Hive is required.
  • Proven ability to design, build, and maintain ETL processes for massive datasets is essential.
  • Proficiency in programming languages like Python, Java, or Scala for data processing and automation is necessary.
  • Familiarity with cloud platforms such as AWS, GCP, or Azure, including their big data and storage services, is required.
  • A strong understanding of data quality standards and governance practices is necessary.
  • A Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Technology, or a related field is required.
  • Equivalent experience in data engineering or big data management may be considered.
  • Certifications in big data or cloud technologies are a plus.
  • A minimum of 5 years of experience in data engineering, with at least 3 years focusing on big data technologies, is required.
  • Experience in distributed systems and large-scale data storage management is necessary.
  • Familiarity with containerization tools like Docker and Kubernetes is advantageous.

Benefits:

  • Comprehensive medical, dental, and vision insurance plans with low co-pays and premiums are provided.
  • Competitive vacation, sick leave, and 20 paid holidays per year are offered.
  • Flexible work schedules and telecommuting options promote work-life balance.
  • Opportunities for training, certification reimbursement, and career advancement programs are available.
  • Access to wellness programs, including gym memberships, health screenings, and mental health resources, is provided.
  • Life insurance and short-term/long-term disability coverage are included.
  • Confidential counseling and support services through an Employee Assistance Program (EAP) are available.
  • Financial assistance for continuing education and professional development through tuition reimbursement is offered.
  • Opportunities to participate in community service and volunteer activities are encouraged.
  • Employee recognition programs celebrate achievements and milestones.