Responsibilities:
The role involves building a universal data platform that meets both real-time and offline computing and storage requirements.
Responsibilities include defining the data pipeline architecture based on demand scenarios and delivering it through multiple methods across various infrastructures.
The position also requires enhancing the data platform to improve the stability and flexibility of data assets while optimizing resource efficiency.
Requirements:
A Bachelor’s degree or higher in computer science, big data, mathematics, or a related field is required, along with 3-5 years of data development experience.
Candidates must be familiar with popular data platform components such as Hadoop, Spark, Flink, and Airflow, understand their working principles, and have experience optimizing them.
A solid understanding of distributed systems principles, including computation and storage, is necessary.
Good communication and logical thinking skills are essential, along with a strong self-drive and a commitment to continuous learning and updating knowledge.
Benefits:
The position offers a competitive salary and a medical insurance package that extends coverage to dependents.
Employees enjoy attractive annual leave entitlements, including birthday and work anniversary leave.
There is work flexibility with options for flexi-work hours and hybrid or remote setups.
The company provides opportunities for career advancement through an internal mobility program.
New employees receive a Crypto.com Visa card upon joining. Benefits packages may vary depending on regional requirements.