Posted on: 03/09/2025
About the Role:
We are seeking an experienced Data Architect to design and oversee the architecture of our enterprise data ecosystem. This role requires a deep understanding of data modeling, data integration, and cloud-based data solutions. You will collaborate with business and technology teams to create scalable, secure, and high-performing data platforms that enable analytics, reporting, and advanced insights.
Key Responsibilities:
- Design and implement enterprise-level data architecture strategies aligned with business needs.
- Define and maintain data models (conceptual, logical, and physical) for transactional, analytical, and big data systems.
- Lead the design of data pipelines, data lakes, and data warehouses, ensuring scalability, performance, and cost optimization.
- Establish data governance, quality, and security standards across the organization.
- Collaborate with stakeholders to understand data requirements and translate them into architectural blueprints.
- Evaluate and recommend tools, technologies, and cloud platforms (AWS, Azure, GCP, Snowflake, etc.) to optimize data solutions.
- Guide ETL/ELT developers, data engineers, and BI teams in implementing best practices.
- Monitor, troubleshoot, and optimize data systems for performance and reliability.
- Act as a thought leader, mentoring team members and contributing to the company's overall data strategy.
Key Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 8+ years of experience in data engineering, data warehousing, or database development.
- Proven experience in designing and implementing data architectures at scale.
- Strong expertise in SQL, relational databases, and data modeling techniques (OLTP, OLAP, dimensional modeling).
- Hands-on experience with ETL/ELT tools (Informatica, Talend, DataStage, SSIS, etc.).
- Proficiency in cloud platforms (AWS Redshift, Azure Synapse, GCP BigQuery, Snowflake).
- Solid understanding of data governance, MDM, metadata management, and security frameworks.
- Strong analytical, problem-solving, and communication skills.
Good to Have:
- Familiarity with big data technologies (Hadoop, Spark, Kafka).
- Experience in NoSQL databases (MongoDB, Cassandra, etc.).
- Exposure to machine learning data pipelines and real-time analytics.
- Knowledge of DevOps, CI/CD, and infrastructure-as-code for data solutions.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1540514