Please, let Token Metrics know you found this job
on RemoteYeah.
This helps us grow 🌱.
Description:
Token Metrics is seeking a versatile Back End Engineer to support the Data Science and Engineering teams.
The Back End Engineer will be responsible for employing various tools and techniques to build frameworks that prepare data using SQL, Python, R, Java, and C++.
The engineer will apply machine learning techniques to create and maintain structures that enable data analysis, while staying current with the dominant programming and deployment practices in the field.
Responsibilities include liaising with coworkers and clients to clarify the requirements for each task.
The engineer will design and build infrastructure that makes big data accessible and analyzable.
Refactoring existing frameworks to improve their performance is also a key responsibility.
Testing structures to ensure that they are fit for use is essential.
The engineer will build data pipelines that ingest data from a variety of sources and formats, such as APIs, CSV files, and JSON.
Preparing raw data for manipulation by Data Scientists is part of the role.
Implementing proper data validation and data reconciliation methodologies is required.
Ensuring that work remains backed up and readily accessible to relevant coworkers is important.
The engineer must remain up-to-date with industry standards and technological advancements that will improve the quality of outputs.
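As an illustration of the pipeline responsibilities above, a minimal sketch in Python is shown below. It ingests hypothetical CSV and JSON feeds, applies basic validation, and reconciles the sources; the field names and feeds are invented for the example and are not part of the role description.

```python
import csv
import io
import json

def ingest_csv(text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def ingest_json(text):
    """Parse a JSON array into a list of row dicts."""
    return json.loads(text)

def validate(rows, required=("id", "price")):
    """Keep only rows that carry every required field (basic data validation)."""
    return [r for r in rows if all(k in r and r[k] not in ("", None) for k in required)]

def reconcile(*sources):
    """Merge rows from several sources, de-duplicating on 'id' (basic reconciliation)."""
    merged = {}
    for rows in sources:
        for row in rows:
            merged[str(row["id"])] = row  # later sources win on conflict
    return list(merged.values())

# Hypothetical feeds: a CSV export (row 2 is missing its price) and a JSON/API payload.
csv_feed = "id,price\n1,42.0\n2,\n"
json_feed = '[{"id": 2, "price": 43.5}]'

clean = reconcile(validate(ingest_csv(csv_feed)), validate(ingest_json(json_feed)))
print(clean)  # one valid row per id, across both sources
```

In practice each source would get its own ingestion adapter, with validation and reconciliation rules agreed with the Data Scientists who consume the output.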
Requirements:
A Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or a related field is required.
A Master's degree in a relevant field is an added advantage.
Candidates must have 3+ years of development experience in Python, Java, or another programming language.
3+ years of SQL and NoSQL experience is required, with Snowflake Cloud DW and MongoDB experience being a plus.
Candidates should have 3+ years of experience with schema design and dimensional data modeling.
Expert proficiency in SQL, NoSQL, Python, C++, Java, and R is necessary.
Candidates must be experts in building Data Lakes, Data Warehouses, or suitable equivalents.
Expertise in AWS Cloud is required.
Excellent analytical and problem-solving skills are essential.
The ability to work both independently and as part of a team is important.
The capacity to successfully manage a pipeline of tasks with minimal supervision is required.
Benefits:
Token Metrics offers a remote work environment, allowing flexibility in work location.
The company provides opportunities to work with cutting-edge technologies in the field of data engineering and machine learning.
Employees will have the chance to collaborate with a diverse set of customers, including retail investors and crypto fund managers from over 50 countries.
The role offers the potential for professional growth and development in a rapidly evolving industry.
Apply now