Posted on: 02/12/2025
Description:
Role: Data Architect
Total Experience: 10-15 Years
Location: Bhilai / Indore
Job Description:
We are looking for a Data Architect with 10+ years of experience in designing, implementing, and governing modern data platforms.
The ideal candidate should have strong experience working with enterprise clients, hands-on expertise in Microsoft Fabric, and the ability to conceptualize and lead end-to-end Data Engineering and AI solution engagements.
This role is both consultative and technical, involving architecture design, solution roadmapping, client workshops, POC leadership, and guiding teams toward high-quality delivery.
Key Responsibilities:
Data Architecture & Engineering Leadership:
- Design scalable enterprise data architectures including lakehouse, data warehouse, Fabric-based analytics, and real-time processing layers.
- Architect and implement ingestion, transformation, modeling, governance, and data quality frameworks.
- Build reusable frameworks for metadata-driven pipelines, incremental loads, and large-scale data processing at enterprise scale.
- Define best practices for data modeling (OLTP/OLAP), data cataloging, security, storage optimization, and cost governance.
Microsoft Fabric & Modern Data Stack:
- Lead solutions leveraging Microsoft Fabric including Lakehouse, Data Factory, Data Pipelines, Power BI, OneLake, Direct Lake, and governance layers.
- Leverage experience with analogous platforms such as Azure Databricks, Synapse, and Data Factory for building high-performance pipelines.
- Fabric certification is a strong plus.
Consultative Client Engagement:
- Run client workshops, requirement discovery sessions, architecture reviews, and modernization roadmap discussions.
- Translate business needs into solution blueprints and work with technical teams for implementation.
- Advise clients on modernization strategies such as migrations from legacy systems to cloud platforms, lakehouses, or Fabric-based ecosystems.
- Present architectural options, tradeoffs, cost models, and delivery approaches.
POC Leadership & Technical Evangelism:
- Lead POCs and accelerated solution pilots focused on Data Engineering, Lakehouse modernization, and AI/ML adoption.
- Work directly with prospects and existing clients to demonstrate feasibility, value realization, and platform readiness.
- Stay current with emerging trends such as streaming, Delta Live Tables (DLT), governance, feature stores, and GenAI pipeline integration.
Hands-on Engineering Capability:
- Strong proficiency with Python, SQL, Spark, Fabric Data Factory, Delta/Parquet-based data processing, and cloud-native orchestration frameworks.
- Experience with enterprise-grade ETL/ELT pipelines (Spark, ADF, Databricks, Airflow, dbt) that handle large-scale volumes.
- Ability to get hands-on when needed during architecture validation, POCs, or team escalation scenarios.
Governance, Security & Compliance:
- Implement data governance, lineage, access controls, quality frameworks, and auditing consistent with enterprise standards.
- Familiarity with governance tools and practices across cloud platforms, such as Azure Purview, Unity Catalog, RBAC, and IAM.
Team Leadership:
- Mentor and guide engineers on architecture patterns, coding practices, cloud/data standards, and modern engineering approaches.
- Lead cross-functional Data Engineering squads through sprints, design sessions, and architecture boards.
- Establish engineering excellence practices across code quality, documentation, testing, and observability.
Required Skills & Experience:
- 10+ years of hands-on Data Engineering and Data Architecture experience.
- Strong experience working for enterprise clients across domains such as finance, consulting, insurance, retail, or supply chain.
- Deep expertise in at least one cloud (Azure preferred) with strong Spark-based engineering background.
- Hands-on skills with Microsoft Fabric (Data Factory, Lakehouse, Direct Lake, Pipelines, Power BI).
- Strong experience with modern data ecosystems: Databricks, Synapse, ADF, Snowflake, BigQuery, Delta Lake, Kafka, APIs, and distributed data systems.
- Solid understanding of data modeling, warehousing, and streaming architectures.
- Proven ability to lead client discussions, technical presentations, and architectural whiteboarding sessions.
- Strong problem-solving ability with the capability to design scalable, secure, and cost-optimized systems.
- Ability to work onsite from Bhilai or Indore as needed.
Preferred Qualifications:
- Microsoft Fabric certification (completed or in progress).
- Databricks or Azure Data Engineering certifications.
- Experience integrating AI/ML or Generative AI pipelines into ETL workflows.
- Experience in leading migrations from legacy systems to modern platforms (Fabric, Databricks, Azure).
Posted in: Data Engineering
Functional Area: Technical / Solution Architect
Job Code: 1583627