hirist

Job Description

Position : Data Architect - Data Engineering Solutions

Experience : 10+ Years (7+ in Data Engineering/Architecture)


Location : Bengaluru, Telangana, Pune, Noida, Chennai, Faridabad, Hyderabad


Employment Type : Full-Time


Job Description :


We are looking for an experienced Data Architect to design and optimize cloud-native, enterprise-scale data platforms. The role involves driving data strategy, building scalable pipelines, and enabling analytics and AI/ML initiatives.


Key Skills :


- 10+ years of IT experience (7+ in Data Engineering/Architecture)

- Cloud platforms : AWS, Azure, GCP (S3, Glue, Redshift, Synapse, BigQuery, etc.)

- Big Data : Spark, Hadoop, Databricks

- SQL & NoSQL, Data Modeling, Schema Optimization

- Python/Java/Scala programming

- Streaming : Kafka, Kinesis, Pub/Sub

- ETL tools : Informatica, Talend, Airflow, dbt

- Governance, lineage, data security frameworks

Preferred : Data Mesh/Fabric, Lakehouse, Docker/Kubernetes, MDM, Cloud certifications


Role Overview :


We are seeking an accomplished Data Architect with deep expertise in designing, building, and optimizing cloud-native, enterprise-scale data platforms. The ideal candidate will define and drive the organization's data strategy, ensure scalable and secure data architecture, and enable advanced analytics and AI/ML initiatives. This role requires strong technical depth, architectural leadership, and the ability to collaborate with diverse stakeholders to deliver high-performance data solutions.


Key Responsibilities :


- Define and own the data architecture vision, strategy, and roadmap for enterprise-level platforms.

- Design cloud-native, scalable, and secure data solutions across AWS, Azure, and/or GCP.

- Establish data modeling standards, schema optimization techniques, and data design patterns.

- Architect data lakes, data warehouses, lakehouses, and data mesh/fabric solutions.

- Build and optimize ETL/ELT pipelines using tools such as Informatica, Talend, Airflow, dbt, and Glue.

- Leverage big data technologies (Spark, Hadoop, Databricks) for large-scale batch and streaming workloads.

- Implement real-time streaming pipelines with Kafka, Kinesis, or Pub/Sub.

- Guide teams in data ingestion, transformation, storage, and consumption frameworks.

- Implement data governance frameworks, metadata management, and lineage tracking.

- Define policies for data privacy, security, and compliance (GDPR, HIPAA, etc.).

- Enforce data quality standards, validation rules, and stewardship practices.

- Partner with business stakeholders, data scientists, and application teams to translate requirements into architectural solutions.

- Provide technical leadership and mentorship to data engineers and solution architects.

- Lead architectural reviews, POCs, and evaluations of new tools and technologies.

- Drive adoption of modern paradigms such as data mesh, data fabric, and lakehouse architectures.

- Stay updated on emerging trends in cloud, AI/ML, containerization, and data ecosystems.

- Continuously optimize cost, performance, and scalability of cloud-based data platforms.


Required Skills & Qualifications :


- 10-15 years in IT, with at least 7 years in data engineering and architecture.

- Proven track record of designing and delivering enterprise-scale, cloud-native data solutions.

- Cloud Platforms : AWS (S3, Glue, Redshift), Azure (Synapse, Data Lake, Data Factory), GCP (BigQuery, Pub/Sub).

- Big Data & Processing : Spark, Hadoop, Databricks.

- Databases : Strong SQL (RDBMS), NoSQL (MongoDB, Cassandra, DynamoDB).

- Programming : Proficiency in Python, Java, or Scala.

- Streaming : Kafka, Kinesis, or Google Pub/Sub.

- ETL/ELT Tools : Informatica, Talend, Airflow, dbt, or equivalent.

- Data Governance & Security : Data lineage, cataloging, encryption, access control.

- Experience with Data Mesh, Data Fabric, or Lakehouse architectures.

- Familiarity with Docker, Kubernetes, and container-based deployments.

- Exposure to MDM (Master Data Management) practices.

- Cloud certifications (AWS, Azure, or GCP) strongly preferred.

- Strong leadership and stakeholder management.

- Excellent communication and presentation abilities.

- Analytical mindset with problem-solving skills.

- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

What We Offer :


- Competitive salary as per industry standards.

- Opportunity to architect solutions for large-scale, global enterprises.

- Work with cutting-edge cloud, big data, and AI/ML technologies.

- Leadership role with career advancement opportunities.

