This job post is closed and the position is probably filled. Please do not apply.
🤖 Automatically closed by a robot after the apply link was detected as broken.
Description:
Pavago is hiring a Database Engineer to support an innovative project involving the integration of advanced algorithms and dynamic user interactions for their client.
The project emphasizes the analysis and visualization of large-scale engineering datasets, requiring cutting-edge expertise in database architecture and optimization.
Key responsibilities include creating, optimizing, and enhancing database systems to efficiently manage and process large-scale data operations.
The role involves implementing robust data pipelines for seamless data exchange between algorithms and the database.
The Database Engineer will enhance database performance by optimizing SQL queries, indexing, and architecture.
Collaboration with the development team is essential to ensure real-time interactivity between users and graph data.
The position also requires assisting in refining proprietary algorithms to improve statistical and economic insights, as well as contributing to the integration of machine learning frameworks for enhanced data processing and analysis.
Requirements:
Candidates must have strong knowledge of SQL, C#, and Python for optimizing databases and integrating algorithms, along with familiarity with big data technologies, cloud services (e.g., Azure), NoSQL databases (e.g., MongoDB), and machine learning frameworks like TensorFlow or PyTorch.
Exceptional analytical and problem-solving abilities are required, as well as strong communication skills.
Proficiency with project management and collaboration tools such as Asana, Azure Boards, Jira, or Slack is necessary.
Candidates should have 5-6 years of proven experience in database design and optimization, in developing data pipelines for large-scale operations, and in working within fast-paced environments alongside cross-functional teams.
Benefits:
The position offers the opportunity to work on a cutting-edge project at the intersection of data optimization and innovation.
It provides a remote working environment with flexible working hours.
Candidates will have the chance to enhance their skills in database architecture, machine learning integration, and real-time data processing.
The role allows for collaboration with a dynamic team and the opportunity to tackle complex data challenges.