We are seeking a skilled and analytical Data Architect & Business Intelligence Specialist to design, model, and implement robust data architectures, pipelines, and reporting frameworks.
This role will be responsible for building and maintaining data models, overseeing data migrations, and developing scalable data warehouse solutions to support business intelligence and analytics initiatives.
Responsibilities:
Design and maintain the enterprise data architecture in line with business and technical requirements.
Develop logical and physical data models using industry best practices.
Establish and maintain metadata standards and data dictionaries.
Ensure data consistency, quality, and governance across all systems.
Design and build efficient, scalable data pipelines for structured and unstructured data.
Develop ETL/ELT processes using tools such as Apache Airflow, Talend, Informatica, or Azure Data Factory.
Plan and execute data migration projects from legacy systems to modern data platforms, ensuring data integrity and minimal downtime during migration activities.
Collaborate with stakeholders to map legacy data structures to the new architecture.
Design, implement, and manage modern data warehouses (e.g., Snowflake, Redshift, BigQuery, Synapse), ensuring high performance, scalability, and security of the warehousing environment.
Collaborate with business stakeholders to gather reporting and analytics requirements.
Build interactive dashboards and reports using tools such as Power BI, Tableau, Looker, or Qlik, enable self-service reporting, and ensure data accuracy in BI platforms.
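To give candidates a feel for the day-to-day ETL/ELT work described above, here is a minimal illustrative sketch in plain Python (no orchestration framework; the table, field names, and sample records are hypothetical):

```python
import sqlite3

def extract(rows):
    """Simulate extraction from a source system (here, in-memory records)."""
    return rows

def transform(rows):
    """Normalize customer names and drop records with missing amounts."""
    return [
        {"customer": r["customer"].strip().title(), "amount": float(r["amount"])}
        for r in rows
        if r.get("amount") is not None
    ]

def load(conn, rows):
    """Load transformed records into a target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales (customer, amount) VALUES (:customer, :amount)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
source = [
    {"customer": "  alice  ", "amount": "19.99"},
    {"customer": "bob", "amount": None},  # invalid record, filtered out
]
load(conn, transform(extract(source)))
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
print(total)
```

In production, each stage would typically become a task in an orchestrator such as Apache Airflow, with the same extract/transform/load separation.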
Requirements:
A Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field is required.
Candidates must have 5+ years of experience in data architecture, modeling, pipelines, and BI/reporting.
Strong expertise in SQL and data modeling (3NF, dimensional, star/snowflake schemas) is essential.
Experience with data warehouse technologies and cloud platforms (AWS, Azure, GCP) is required.
Proficiency in BI/reporting tools and data visualization best practices is necessary.
Knowledge of Python, Scala, or other programming or scripting languages is a plus.
Familiarity with data governance, security, and compliance standards is important.
Excellent problem-solving skills and attention to detail are required.
Strong communication and collaboration skills with both technical and non-technical stakeholders are essential.
The ability to translate complex technical concepts into business language is necessary.
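As an illustration of the dimensional modeling skills listed in the requirements, a minimal star schema might look like the following sketch (table names, columns, and data are hypothetical; sqlite3 is used only so the example is self-contained):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension tables hold descriptive attributes
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
-- The fact table holds measures plus foreign keys to each dimension
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER,
    revenue REAL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
conn.execute("INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97)")

# A typical BI query joins the fact table to its dimensions (a "star join")
row = conn.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d ON d.date_key = f.date_key
    WHERE d.year = 2024
    GROUP BY p.category
""").fetchone()
print(row)
```

A snowflake schema would further normalize the dimensions (e.g., splitting category into its own table), trading query simplicity for reduced redundancy.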
Benefits:
The position offers competitive compensation and benefits packages.
Opportunities for professional development and continuous learning are provided.
Employees will have access to a collaborative and innovative work environment.
The role includes flexible working arrangements to promote work-life balance.
Health and wellness programs are available to support employee well-being.