The role involves working with high-volume transactional datasets, improving data quality, and enabling data-driven enhancements in operational and analytical systems.
Responsibilities:
Perform data profiling, cleansing, quality checks, and exploratory analysis on complex transactional datasets.
Collaborate with product and technical teams to define and refine the rules and logic used in data-driven workflows.
Translate findings into actionable insights and prepare reports for both technical teams and business stakeholders.
Use Python and Git for data pipeline management, workflow automation, and documentation.
Coordinate with engineering and data science teams, including the integration of new data sources and tools.
Requirements:
A Master's degree in Data Analysis or a related discipline is required.
The candidate must have 5+ years of professional experience in data analysis, gained after completing the master's degree.
Demonstrated expertise in data profiling, validation, and quality assurance is essential.
Experience handling and analyzing structured transactional data is required.
Proficiency in Python and Git for data processing and collaboration is necessary.
Strong written and verbal communication skills in English, at a B2+ (Upper-Intermediate) level or higher, are required.
Familiarity with GCP/BigQuery and/or financial sector data systems is a plus.
Benefits:
The position is remote, allowing for flexible work arrangements.
It is a full-time role, offering long-term stability.
Working hours follow the Central European Time zone, providing a structured schedule.