Posted on: 14/07/2025
Innovatily is looking to hire a Data Architect who is available to spend a few hours a week and will be responsible for designing, implementing, and maintaining scalable, high-performance data solutions that support the organization's strategic objectives. This role combines deep technical expertise in data modeling, data integration, and database technologies with an understanding of business processes and data governance best practices.
Key Responsibilities :
- Database & Data Warehouse Management : Design and oversee the implementation of relational and non-relational databases. Plan and manage data warehouse and data lake architectures. Optimize database performance, partitioning, and indexing strategies.
- Data Integration & ETL : Design data integration workflows to connect disparate systems (ERP, CRM, third-party APIs). Oversee the development of ETL pipelines, ensuring data accuracy and consistency. Work closely with data engineers and BI developers to enable reporting and analytics.
- Data Governance & Quality : Define and implement data governance policies, metadata management, and data lineage tracking. Establish data quality frameworks and monitor compliance. Ensure adherence to data privacy regulations (GDPR, CCPA, etc.).
- Collaboration & Stakeholder Engagement : Partner with business units, analysts, and engineering teams to understand data requirements. Translate business needs into scalable technical solutions. Provide technical leadership, mentorship, and guidance to data engineering teams.
- Emerging Technologies & Continuous Improvement : Evaluate and recommend new tools, platforms, and technologies. Drive the adoption of modern data architectures (e.g., cloud-native, serverless, streaming). Promote best practices for continuous integration, deployment, and monitoring.
Qualifications :
Education & Experience :
- 8+ years of experience in data architecture, data engineering, or database design roles.
- Proven experience with cloud platforms (AWS, Azure, GCP) and data services (Redshift, BigQuery, Snowflake).
- Expertise in SQL and NoSQL databases, data modeling, and performance tuning.
Skills & Competencies :
- Strong understanding of data warehousing concepts, including star and snowflake schemas.
- Proficiency in ETL tools (e.g., Informatica, Talend, Apache NiFi, dbt).
- Knowledge of data governance frameworks and tools (Collibra, Alation, Apache Atlas).
- Familiarity with big data technologies (Spark, Hadoop, Kafka).
- Excellent communication skills to translate complex technical concepts for business stakeholders.
- Problem-solving mindset with a focus on delivering high-quality, scalable solutions.
Preferred Qualifications :
- Experience designing real-time streaming pipelines and event-driven architectures.
- Exposure to machine learning data pipelines and MLOps.
Key Attributes for Success :
- Ability to work cross-functionally in a fast-paced environment.
- Strong leadership and mentoring capabilities.
- Detail-oriented and committed to high standards of data integrity.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1512689