Remote Staff Engineer - Cloudera-Hadoop - Big Data - Federal - 2nd Shift

This job is closed

This job post is closed and the position has likely been filled. Please do not apply. The listing was automatically closed after its apply link was detected as broken.

Description:

  • The position is for a Staff DevOps Engineer-Hadoop Admin on the Big Data Federal Team, providing 24x7 support for the Government Cloud infrastructure.
  • This role requires passing a ServiceNow background screening, including a credit check, criminal/misdemeanor check, and a drug test.
  • Only US citizens, US naturalized citizens, or US Permanent Residents will be considered due to Federal requirements.
  • The work schedule is a 4-day work week, either from Wednesday to Saturday or Sunday to Wednesday, with no on-call rotation.
  • This is a 2nd Shift position with work hours from 3 pm to 2 am Pacific Time.
  • The Big Data team is responsible for delivering state-of-the-art monitoring, analytics, and actionable business insights using new tools, Big Data systems, and methodologies.
  • Responsibilities include deploying, monitoring, maintaining, and supporting Big Data infrastructure and applications on ServiceNow Cloud and Azure environments.
  • The role involves automating deployment processes and CI/CD data pipelines using tools like Ansible, Puppet, Terraform, Jenkins, Docker, and Kubernetes.
  • Performance tuning and troubleshooting of various Hadoop components and data analytics tools are also part of the job.
  • The engineer will provide production support to resolve critical issues and collaborate with various teams to replicate complex issues.

Requirements:

  • Candidates must have 6+ years of experience with Cloudera-Hadoop Systems Administration.
  • A deep understanding of the Hadoop/Big Data Ecosystem is required.
  • Good knowledge of querying and analyzing large volumes of data on Hadoop HDFS using Hive and Spark Streaming is necessary.
  • Experience supporting CI/CD pipelines on Cloudera in native cloud and Azure/AWS environments is essential.
  • Candidates should have experience securing the Hadoop stack with Sentry, Ranger, LDAP, and Kerberos KDC.
  • Strong Linux Systems Administration skills are required.
  • The ability to write automation scripts in Bash and working knowledge of Python are necessary.
  • In-depth knowledge of Linux internals (CentOS 7.x) and shell scripting is required.
  • Candidates must demonstrate the ability to learn quickly in a fast-paced, dynamic team environment.

Benefits:

  • The position offers a base pay range of $158,500 - $277,500, plus equity (when applicable) and variable/incentive compensation.
  • Health plans, including flexible spending accounts, are provided.
  • A 401(k) Plan with company match is available.
  • Employees can participate in an Employee Stock Purchase Plan (ESPP) and a matching donation program.
  • A flexible time away plan and family leave programs are offered, subject to eligibility requirements.
  • The company values inclusivity and encourages candidates from diverse backgrounds to apply, even if they do not meet every qualification.