Please let LocalStack know you found this job on RemoteYeah. This helps us grow 🌱.
Description:
LocalStack is a fast-growing Series A startup focused on revolutionizing cloud development processes and enhancing dev and test feedback loops.
The company has successfully closed a $25 million funding round in Q4 2024, led by Notable Capital, CRV, and Heavybit.
LocalStack provides a high-fidelity emulator and local cloud development platform, allowing developers to build and test cloud applications and data pipelines on their local machines within a lightweight cloud sandbox running in Docker.
The mission of LocalStack is to empower developers to rapidly build and test their cloud applications, improving the development experience and saving time and resources.
The company has a large open-source community with over 57,000 stars on GitHub, 100,000 active users worldwide, and over 290 million downloads to date.
LocalStack serves a diverse customer base, ranging from small and medium-sized businesses to Global Fortune 500 companies.
The team is globally distributed, with headquarters in Zurich, Switzerland, a main engineering office in Vienna, Austria, and remote team members from various countries including the US, France, the UK, Canada, and Spain.
The position is for a Senior Data Engineer who will redefine and optimize data processes at LocalStack.
Requirements:
The candidate must maintain, monitor, and optimize data ingestion pipelines for the current data platform.
The role involves leading the development of the future data platform based on evolving business needs.
The candidate will shape the data team roadmap and contribute to long-term strategic planning.
The candidate will take full ownership of data ingestion from external sources, ensuring pipelines run smoothly.
The candidate must design and implement a robust data modeling and data lake solution architecture.
Providing technical leadership and mentorship to the data engineering team is essential.
Collaboration with engineering teams to define and refine ingestion pipeline requirements is necessary.
The candidate will work with stakeholders to gather business questions and data needs.
Experience working with non-technical stakeholders to gather requirements is expected.
The candidate should have the ability to define technical initiatives required to satisfy business requirements.
Excellent knowledge of Python is required.
Experience in designing real-time data ingestion solutions with massive volumes of data is necessary.
Preferred experience includes working with AWS services commonly used in Data Engineering, such as S3, ECS, Glue, and EMR.
Experience with relational databases, data warehouses, data orchestration and ingestion tools, SQL, and BI tools is required.
Experience working remotely or in asynchronous settings is a plus.
The candidate should have experience owning initiatives at the individual contributor level.
Experience providing guidance to junior engineers is also required.
Benefits:
The position is fully remote, allowing for flexibility in work location.
A competitive salary is offered to the successful candidate.
Opportunities for professional development and training are available.
The work environment is dynamic and collaborative, fostering teamwork and innovation.
Flexible work arrangements are provided to accommodate different working styles and needs.
Apply now