Posted on: 27/01/2026
Data Engineer (Snowflake & Analytics Engineering)
Core Mission: Architecting the Modern Data Stack
Primary Engine: Snowflake + dbt / Matillion
Location: Ahmedabad (Strategic Hub)
Availability: Immediate to 30 Days (Critical Path)
Strategic Context:
In an era where "Data is the new Oil," our firm acts as the refinery. We don't just move bytes; we engineer intelligence. As a Data Engineer, you are joining a high-performance consulting squad dedicated to transforming messy, fragmented data into high-fidelity, fact-based assets.
You will be the architect of ELT pipelines that empower global organizations to make multi-million dollar decisions with absolute confidence.
The Technical Sovereignty (What You Own):
1. The Modern ELT Stack:
- Snowflake Mastery: You are the custodian of the cloud data warehouse. You will optimize warehouse sizing, implement RBAC security, and perform deep-dive query profiling to ensure sub-second response times.
- Transformation Logic (dbt/Matillion): You will move away from brittle ETL, building modular, version-controlled transformations with dbt or Matillion (DPC) that follow the Medallion (Bronze/Silver/Gold) architecture.
- Hybrid Integration: While we lead with the cloud, you will bridge the gap to legacy systems using SSIS, ensuring a 360-degree data view.
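As a flavor of the Medallion-style transformation work described above, here is a minimal dbt model sketch promoting raw (Bronze) records into a cleaned Silver layer. All names (`raw_crm`, `customers`, `customer_id`, `loaded_at`) are illustrative assumptions, not references to any real client project:

```sql
-- models/silver/stg_customers.sql -- illustrative dbt model; names are hypothetical
-- Bronze holds raw, append-only loads; this Silver model deduplicates
-- and standardizes before anything reaches the Gold/reporting layer.
with bronze as (
    select * from {{ source('raw_crm', 'customers') }}
),

deduped as (
    select
        *,
        row_number() over (
            partition by customer_id
            order by loaded_at desc
        ) as rn
    from bronze
)

select
    customer_id,
    lower(trim(email)) as email,
    loaded_at
from deduped
where rn = 1   -- keep only the most recent record per customer
```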
2. Orchestration & Cloud Ecosystem:
- Azure Integration: You will leverage Azure Data Factory and Azure Data Lake Storage to orchestrate complex data movements from APIs, flat files, and on-premise databases.
- Analytics Engineering: You will treat data as code, applying CI/CD, Git-based workflows, and automated testing (dbt tests) so that data quality is never compromised.
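The "automated testing (dbt tests)" mentioned above typically lives in a schema YAML file alongside the models. A minimal sketch, with the model and column names purely illustrative:

```yaml
# models/staging/schema.yml -- model and column names are hypothetical
version: 2

models:
  - name: stg_orders
    description: "Staged orders from the source system"
    columns:
      - name: order_id
        tests:
          - unique      # fail the build on duplicate keys
          - not_null    # fail the build on missing keys
      - name: status
        tests:
          - accepted_values:
              values: ['placed', 'shipped', 'returned']
```

Running `dbt test` in CI then blocks bad data from ever reaching downstream dashboards.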
Candidate Blueprint (The Pedigree):
- Experience: 3-5 years in Data Engineering. You must have "survived" at least 2 years of hands-on Snowflake and dbt/Matillion deployment in production.
Technical DNA:
- Expert-level SQL (the foundation of everything we do).
- Proficiency in Python (for automation and notebook-driven engineering).
- A deep understanding of Kimball/dimensional modeling.
- The Consultant Mindset: You are comfortable with ambiguity, can handle high-pressure client demos, and can explain a "partitioning strategy" to a non-technical stakeholder.
Performance Benchmarks (KPIs):
- Pipeline Reliability: Ensuring 99% uptime for scheduled dbt/Matillion jobs.
- Performance Optimization: Reducing Snowflake credit consumption through efficient SQL and clustering strategies.
- Data Quality: Implementing automated tests that catch "bad data" before it reaches client-facing dashboards.
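The credit-optimization work above usually combines clustering keys (for micro-partition pruning) with disciplined warehouse settings. A minimal Snowflake sketch, in which every table, column, and warehouse name is a hypothetical placeholder:

```sql
-- Illustrative only: object names are hypothetical.
-- A clustering key lets Snowflake prune micro-partitions on common
-- filter columns, reducing scan time and credit consumption.
ALTER TABLE analytics.fct_orders CLUSTER BY (order_date, region);

-- Inspect how well the table is clustered on those columns.
SELECT SYSTEM$CLUSTERING_INFORMATION('analytics.fct_orders', '(order_date, region)');

-- Right-size the warehouse and suspend quickly when idle to save credits.
ALTER WAREHOUSE transform_wh SET
  WAREHOUSE_SIZE = 'SMALL',
  AUTO_SUSPEND = 60,     -- seconds of idle time before suspending
  AUTO_RESUME = TRUE;
```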
Why This Role?
- The "Joining Bonus" Edge: For candidates who can hit the ground running immediately, we offer a ₹1 Lakh joining bonus.
- Consulting Variety: You won't be bored. You will solve different problems for different global clients every quarter.
- Office Synergy: This is a 5-day-a-week collaborative office environment in Ahmedabad, designed for rapid peer-to-peer learning.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1606273