Posted on: 27/01/2026
Description :
Data Engineering Solution Architect
Core Mission : Designing the Future of Fact-Based Intelligence
Location : Hyderabad (Strategic Hub - 5 Days On-site)
Notice Period : Immediate to 30 Days (Non-negotiable)
Key Tech Pillars : Snowflake, Databricks, dbt, Python, Airflow, & Multi-Cloud (AWS/Azure/GCP)
I. The Mandate : Architecting at Scale :
As a Solution Architect, you are the primary technical visionary for our Data Practice. You will not just oversee pipelines; you will engineer the "Digital Sovereignty" of our global clients. You will bridge the gap between complex business problems and elegant, scalable technical solutions, designing architectures that support millions of transactions and power real-time AI/ML insights. This is a role for a "Leader-Architect" who thrives in the high-pressure environment of strategic consulting.
II. Strategic Pillars of Responsibility :
1. End-to-End Architectural Governance :
- Modern Data Stack Design : Lead the blueprinting of high-performance Snowflake and Databricks ecosystems. You will own the transition from traditional ETL to high-velocity ELT frameworks using dbt and Matillion.
- Medallion & Layered Frameworks : Architect multi-layered data lakes (Bronze, Silver, Gold) following the Medallion Architecture, ensuring data quality, lineage, and governance are embedded at every stage (a minimal PySpark sketch of this layering follows this list).
- Hybrid & Multi-Cloud Orchestration : Design resilient data movement strategies across AWS, Azure, and GCP, leveraging cloud-native services like BigQuery, Redshift, and Athena.
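By way of illustration only, the Bronze/Silver/Gold layering referenced above might be sketched in PySpark as follows. The paths, table names, and schema are hypothetical, and a Delta Lake-enabled Spark session is assumed:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

# Bronze: land raw data as-is, stamped with ingestion metadata for lineage.
bronze = (
    spark.read.json("s3://example-lake/raw/orders/")  # hypothetical source path
    .withColumn("_ingested_at", F.current_timestamp())
    .withColumn("_source_file", F.input_file_name())
)
bronze.write.format("delta").mode("append").save("s3://example-lake/bronze/orders/")

# Silver: enforce types, deduplicate, and filter out invalid records.
silver = (
    spark.read.format("delta").load("s3://example-lake/bronze/orders/")
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("s3://example-lake/silver/orders/")

# Gold: business-level aggregate ready for reporting and ML features.
gold = silver.groupBy("customer_id").agg(F.sum("amount").alias("lifetime_value"))
gold.write.format("delta").mode("overwrite").save("s3://example-lake/gold/customer_ltv/")
```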
2. Streaming, APIs, and Real-Time Integration :
- Event-Driven Ingestion : Architect solutions for real-time and micro-batch data streaming using Kafka, Kinesis, or Pub/Sub.
- API Sovereignty : Lead the design for API-based data integration, ensuring sub-second latency for critical reporting and AI-driven applications.
- Orchestration Mastery : Define global standards for workflow automation using Airflow, Control-M, or equivalent enterprise schedulers (an illustrative Airflow DAG follows below).
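As a minimal sketch of such orchestration standards, an Airflow DAG chaining ingestion, dbt transformation, and testing might look like the following. The DAG id, scripts, and dbt selectors are hypothetical, and Airflow 2.4+ is assumed (where the `schedule` argument replaces `schedule_interval`):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily ELT pipeline: ingest raw data, transform with dbt, then test.
with DAG(
    dag_id="daily_elt_example",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(task_id="ingest", bash_command="python ingest.py")
    transform = BashOperator(task_id="dbt_run", bash_command="dbt run --select staging+")
    test = BashOperator(task_id="dbt_test", bash_command="dbt test")

    # Linear dependency: each stage runs only after the previous one succeeds.
    ingest >> transform >> test
```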
3. Thought Leadership & Team Multiplication :
- Capability Building : Act as the "Engineering Standard-Bearer," defining reusable assets and frameworks to improve delivery velocity across the organization.
- Mentorship : Directly mentor senior data engineers, fostering a culture of technical excellence and continuous learning.
- Visualization Strategy : Oversee the delivery of sophisticated reporting layers using Streamlit, Power BI, or Tableau, turning cold data into hot insights (a minimal Streamlit sketch follows below).
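For flavor, a minimal Streamlit reporting layer over a gold-layer extract might look like the sketch below. The inline DataFrame stands in for a Snowflake or Databricks query, and all names are hypothetical:

```python
import pandas as pd
import streamlit as st

# Hypothetical gold-layer extract; in practice this would be pulled from
# Snowflake or Databricks via a connector rather than hard-coded.
@st.cache_data
def load_customer_ltv() -> pd.DataFrame:
    return pd.DataFrame(
        {"customer_id": ["C1", "C2", "C3"], "lifetime_value": [1200.0, 850.0, 430.0]}
    )

df = load_customer_ltv()
st.title("Customer Lifetime Value")
st.metric("Customers", len(df))
st.bar_chart(df.set_index("customer_id")["lifetime_value"])
```

Run with `streamlit run app.py` to serve the dashboard locally.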
III. The Pedigree (Candidate Blueprint) :
- Experience : 8-12 years in core Data Engineering and Architecture. You must have led at least 2-3 large-scale migrations from legacy to modern cloud platforms.
- Technical Depth :
- Expert-level Python/PySpark and SQL.
- Deep expertise in Snowflake or Databricks (Certifications like SnowPro or Databricks Architect are highly preferred).
- Familiarity with "Traditional" ETL (Informatica/SSIS) to manage legacy-to-cloud transitions.
- AI/ML Integration : Hands-on exposure to integrating AI/ML modules within data pipelines is mandatory for this role.
IV. Performance Benchmarks :
- Architectural ROI : Reducing client data latency and cloud compute costs through efficient performance tuning.
- Delivery Excellence : Scaling the internal knowledge base of dbt macros, RBAC hierarchies, and CI/CD templates.
- Stakeholder Confidence : Leading successful client demos and technical defense of proposed blueprints.
V. Why Join the Practice ?
- C-Suite Influence : You will report directly to practice leadership and have a direct say in the practice's technology roadmap.
- Complexity : Solve "Big Four" level problems with the agility of a specialized tech firm.
- Wealth & Growth : A premium compensation package with aggressive career growth into Practice Leadership.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1606329