Description:
Gorilla Logic is looking for an Expert Data Engineer to assist in organizing the client's system data into dashboards and on-demand reports.
This role is unique and highly technical, requiring strong database and reporting experience to deliver leading-edge solutions.
The successful candidate will work effectively with teammates and will be measured by their ability to couple critical thinking with self-motivation, enthusiasm, and determination.
Responsibilities include acting as a bridge between engineering and analytics to provide high-fidelity data to the business, working directly with the data team to build out a new ETL process, and collaborating with the engineering team to build a new Enterprise Data Warehouse (EDW).
The role also involves helping to build out the governance strategy on the data platform, breaking down requirements to gain clarity on critical use cases, and being self-directed while supporting the data needs of multiple teams.
Requirements:
Candidates must have 7+ years of experience as a Data Engineer or in similar roles.
Experience in designing and building data pipelines using at least one traditional ELT/ETL tool is required.
Solid experience with Snowflake, Redshift, or Databricks is necessary.
Candidates should have experience with retrieval-augmented generation (RAG) pipelines that pull from data lakes or warehouses and from vector stores.
Knowledge of modern data engineering techniques and tools, especially for data ingestion, data quality control, curation, enrichment, and distribution, is essential.
Experience with traditional relational database management systems (RDBMS) is required.
Candidates must have experience in at least one cloud platform, preferably AWS.
Familiarity with tools similar to AWS AppFlow that can automate data flows between SaaS applications and AWS services is necessary.
Experience in designing and building high-performance data pipelines is required.
Advanced experience in SQL is essential.
Candidates should have experience with event-driven architectures built on technologies such as Pub/Sub, Kafka, RabbitMQ, AMQP, and SNS.
A strong understanding of overall data governance concepts is required.
Candidates must demonstrate strong customer focus and the ability to recognize the impact of decisions on end users.
Strong verbal and written communication skills and the ability to interact with team members and external customers are essential.
Candidates should have 7+ years of experience working with Application Development teams.
Experience in owning and optimizing analytics schemas, designing the structure of data warehouses, and implementing optimization techniques to make data accessible and distributed is required.
Experience in ETL scheduling, automating tasks, data refresh, and balancing processing/cost requirements with business needs is necessary.
Experience modeling complex data sets to enable analytics teams to query and distribute business intelligence assets is required.
Bonus skills include experience with data pipeline development for AI/ML.
Benefits:
The position is full-time and salaried, with the flexibility of remote work from Colombia.
Employees will have the opportunity to work in a highly technical environment with a focus on innovative data solutions.
The role provides the chance to collaborate with a talented team and contribute to significant projects in data engineering and analytics.
Employees will gain experience with cutting-edge tools and technologies in the data and AI space.
The company promotes a culture of self-motivation and critical thinking, allowing for personal and professional growth.
Apply now