Staff Data Engineer - Messaging Data Platform (Remote)

Description:

  • Twilio is seeking a Staff Data Engineer for the Messaging Data Platform team.
  • The role involves building and maintaining highly scalable, reliable, and efficient data pipelines for messaging stacks and internal engineering solutions.
  • Responsibilities include overseeing the design, construction, testing, and maintenance of advanced data architectures and pipelines.
  • The engineer will drive the development of innovative data solutions to meet complex business requirements.
  • The role includes creating and enforcing best practices for data architecture to ensure scalability, reliability, and performance.
  • The position includes providing architectural guidance and mentorship to junior engineers.
  • The engineer will tackle challenging technical issues and provide advanced troubleshooting support.
  • Collaboration with senior leadership to align data engineering strategies with organizational goals is essential.
  • Participation in long-term planning for data infrastructure and analytics initiatives is required.
  • The engineer will lead cross-functional projects, ensuring timely delivery and alignment with business objectives.
  • Coordination with product managers, analysts, and other stakeholders to define project requirements and scope is necessary.
  • Continuous monitoring and enhancement of the performance of data systems and pipelines will be part of the role.

Requirements:

  • A minimum of 3 years of Java development experience is required.
  • At least 5 years of experience with Big Data processing tools and frameworks such as Apache Spark and SparkSQL is necessary.
  • Experience with Lakehouse technologies, including Apache Hudi, Apache Iceberg, and Databricks Delta Lake, is required.
  • The candidate should have experience in building AI/ML pipelines.
  • A deep technical understanding of ETL tools, low-latency data stores, multiple data warehouses, and data catalogs is essential.
  • Familiarity with data testing and verification tooling and best practices is required.
  • Experience with cloud services, preferably AWS, Google Cloud, or Azure, is necessary.
  • Proficiency in working with Key-Value, Streaming, and Search Database technologies, including AWS DynamoDB, Apache Kafka, and Elasticsearch, is required.
  • The candidate must be ready to participate in the on-call rotation.
  • Desired qualifications include demonstrated technical breadth and depth through papers, code, or presentations.
  • Exposure to service-oriented architectures, microservices, and REST APIs is preferred.
  • Experience with containerization and orchestration tools such as Docker and Kubernetes is a plus.
  • A Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent training, fellowship, or work experience is desired.

Benefits:

  • Twilio offers competitive pay along with generous time-off policies.
  • Employees receive ample parental and wellness leave.
  • Comprehensive healthcare benefits are provided.
  • A retirement savings program is available for employees.
  • Additional benefits may vary by location.
  • Twilio supports employees in building positive change in their communities through volunteering and donation efforts.