Please, let Token Metrics know you found this job
on RemoteYeah.
This helps us grow 🌱.
Description:
Token Metrics is seeking a multi-talented Senior Big Data Engineer to support the operations of the Data Science and Engineering teams.
The Senior Big Data Engineer will be responsible for using various tools and techniques to build frameworks that prepare data with SQL, Python, R, Java, and C++.
This role involves applying machine learning techniques to create and maintain structures that enable data analysis, while staying current with the dominant programming and deployment practices in the field.
Responsibilities include liaising with coworkers and clients to clarify the requirements for each task.
The engineer will conceptualize and generate infrastructure that allows big data to be accessed and analyzed.
Refactoring existing frameworks to improve their performance is also a key responsibility.
Testing these structures to ensure they are fit for use is required.
The engineer will build data pipelines from different data sources and formats, such as APIs, CSV, and JSON.
Preparing raw data for manipulation by Data Scientists is part of the job.
Implementing proper data validation and data reconciliation methodologies is essential.
Ensuring that work remains backed up and readily accessible to relevant coworkers is necessary.
The engineer must remain up-to-date with industry standards and technological advancements that will improve the quality of outputs.
Requirements:
A Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or a related field is required.
A Master's degree in a relevant field is an added advantage.
Candidates must have 3+ years of development experience in Python, Java, or a comparable programming language.
3+ years of SQL and NoSQL experience is required; Snowflake Cloud DW and MongoDB experience is a plus.
Candidates should have 3+ years of experience with schema design and dimensional data modeling.
Expert proficiency in SQL, NoSQL, Python, C++, Java, and R is necessary.
Candidates must be expert in building data lakes, data warehouses, or suitable equivalents.
Expertise in AWS Cloud is required.
Excellent analytical and problem-solving skills are essential.
The ability to work well both independently and in a team is important.
The capacity to successfully manage a pipeline of duties with minimal supervision is required.
Benefits:
Token Metrics helps crypto investors build profitable portfolios using artificial intelligence-based crypto indices, rankings, and price predictions.
The company has a diverse set of customers, from retail investors and traders to crypto fund managers, in more than 50 countries.
Apply now