This job post is closed and the position is probably filled. Please do not apply.
🤖 Automatically closed by a robot after the apply link was detected as broken.
Description:
The Data Quality Engineer will be responsible for developing a data quality layer in a large-scale data infrastructure.
They will create manual and automated tests to monitor and validate data in the warehouse, using the ClickHouse database with its standard SQL support.
The role involves investigating blockchain data structures, invariants, consensus, and protocols for data test creation.
They will also develop scripts that correct data, either automatically or semi-automatically, to ensure continuous data quality.
Success metrics include test coverage over the data and the number of client complaints about data quality.
Responsibilities include designing and implementing the data quality layer, executing a data quality testing framework, preparing and executing tests in an Agile and DevOps environment, and collaborating with subject matter experts to develop and validate test scenarios.
The role also involves analyzing, debugging, and documenting quality issues, recording and reporting test status, supporting quality-assurance initiatives, and following a shift-left testing approach.
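As an illustration of the blockchain-data invariants mentioned above, a continuity check (block numbers with no gaps or duplicates) could be sketched as below. This is a hypothetical example, not part of the posting: the table and column names (`blocks`, `block_number`) and the idea of running it over ClickHouse query results are assumptions.

```python
# Hypothetical data-quality test over warehouse rows.
# The query and schema names here are illustrative only, e.g.
#   SELECT block_number FROM blocks ORDER BY block_number

def find_gaps(block_numbers):
    """Return block numbers missing from an otherwise contiguous range."""
    present = set(block_numbers)
    lo, hi = min(present), max(present)
    return [n for n in range(lo, hi + 1) if n not in present]

def find_duplicates(block_numbers):
    """Return block numbers that appear more than once."""
    seen, dups = set(), set()
    for n in block_numbers:
        if n in seen:
            dups.add(n)
        seen.add(n)
    return sorted(dups)

# Sample rows standing in for a query result
rows = [100, 101, 101, 103, 104]

print(find_gaps(rows))        # blocks missing from the range -> [102]
print(find_duplicates(rows))  # blocks ingested more than once -> [101]
```

A test like this would run on a schedule; any non-empty result would flag data for the automated or semi-automatic fix scripts described above.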
Requirements:
5+ years of hands-on experience with databases.
Experience working with big data products.
Good knowledge of SQL, data warehousing, data analytics, APIs, etc.
Proficiency in Russian is mandatory.
Experience with ClickHouse is mandatory.
Benefits:
100% Remote Policy (Work from anywhere in the world).
Leave is not tracked (responsibility-driven culture).
Opportunity to work and collaborate with a truly global team spread across 5 countries.
Flexible work hours.
Yearly trip with the Bitquery team to a remote destination.