Remote Senior/Lead Data Engineer (Kafka, Apache Flink, Apache Spark) - #34390

Please let Manila Recruitment know you found this job on RemoteYeah. This helps us grow 🌱.

Description:

  • The company is an Australian-based data-driven trading business that relies on live sports-market information for time-critical decisions.
  • They are seeking a Senior/Lead Data Engineer to join their team in the Philippines due to continued success and expansion.
  • The role involves designing and running a multi-cluster Apache Kafka backbone capable of sustaining over 200 Mbps ingest.
  • The engineer will build stateful pipelines in Apache Flink for data enrichment, cleansing, aggregation, and feature generation.
  • Processed streams will be persisted into fast analytical stores like Redis for low-latency look-ups and dashboards.
  • The engineer will create batch back-fill tooling for historical re-processing as needed.
  • Management of object storage (Amazon S3 / MinIO / Ceph) and defining retention/lifecycle policies is required.
  • The role includes establishing observability with tools like Prometheus and Grafana, as well as CI/CD and infrastructure-as-code using Terraform.
  • The engineer will work closely with the founder on roadmap, budget, and hardware choices, and may help onboard new engineers as the team grows.

Requirements:

  • Candidates must have 5+ years of experience as a Data Engineer.
  • The role requires autonomous ownership, with the ability to plan, prioritize, and deliver without supervision.
  • Experience in Apache Kafka operations and performance tuning is essential.
  • Candidates should have at least 2 years of experience building real-time stream-processing pipelines in production.
  • Clear and concise asynchronous communication skills are necessary.
  • The candidate must remain calm under pressure and maintain composure during incidents.
  • Proficiency in Java or Python, along with solid SQL skills, is required.
  • A basic understanding of distributed systems concepts such as partitioning, watermarking, checkpointing, and back-pressure is needed.
  • Experience in Linux administration and containerization (Docker/Podman) is required.
  • Strongly preferred qualifications include hands-on experience with Apache Flink, schema-driven serialization (Avro/Protobuf), and web scraping techniques for sports data.
  • Familiarity with object storage platforms, high throughput key-value stores, and modern virtualization platforms is advantageous.
  • Experience with infrastructure-as-code tools like Terraform is preferred.
  • Candidates should demonstrate rapid self-learning, curiosity, problem-solving skills, and a bias for automation and documentation.
  • Nice-to-have skills include knowledge of Apache Pulsar, OLAP engines, observability stacks, CI/CD tools, and domain exposure to sports data or trading.

Benefits:

  • The position offers full-time work from home as an offshore contractor for an Australian-headquartered company.
  • A guaranteed 13th month pay is provided.
  • Employees receive 30 days of combined leave plus 1 day of birthday leave, available from Day 1.
  • An HMO allowance of Php5,000 is included.
  • A one-off ergonomic kit worth Php12,000 is provided on Day 1.
  • A monthly wellness allowance of Php1,000 is available.
  • Employees can choose between a UPS stipend or an LTE failover stipend.
  • Work equipment will be provided by the company.
  • The company observes Philippine regular holidays.
  • An annual performance bonus of 10-20%, based on KPIs, is pro-rated and paid in late April.
  • Employees will receive an all-expenses-paid trip to the Brisbane HQ at the end of 2025 or 2026, including attendance at an NBA regular-season game in late April 2026.