hirist

Solution Architect - Data Engineering

Bluglint solutions
8 - 12 Years
Ahmedabad

Posted on: 24/02/2026

Job Description

We are looking for an experienced Data Engineering Solutions Architect to join our growing Data Practice.


The ideal candidate will have 8-12 years of hands-on experience designing, architecting, and delivering large-scale data warehousing, data lake, ETL, and reporting solutions across modern and traditional data platforms.


You will play a key role in defining scalable, secure, and cost-effective architectures that enable advanced analytics and AI-driven insights for our clients.

This role demands a balance of technical depth, solution leadership, and a consulting mindset - helping customers solve complex data engineering challenges while also building internal capability and best practices within the organization.

Key Responsibilities :

- Design and architect end-to-end data solutions using technologies like Snowflake, Databricks, dbt, Matillion, Python, Airflow, Control-M, and cloud-native services on AWS/Azure/GCP.

- Define and implement data ingestion, transformation, integration, and orchestration frameworks for structured and semi-structured data.

- Architect data lakes and data warehouses with an emphasis on scalability, cost optimization, performance, and governance.

- Support real-time and API-based data integration scenarios; design solutions for streaming, micro-batch, and event-driven ingestion.

- Lead design and delivery of data visualization and reporting solutions using tools such as Power BI, Tableau, and Streamlit.

- Collaborate with business and technical stakeholders to define requirements, design architecture blueprints, and ensure alignment with business objectives.

- Establish and enforce engineering standards, frameworks, and reusable assets to improve delivery efficiency and solution quality.

- Mentor data engineers and help build internal capability on emerging technologies.

- Provide thought leadership around modern data platforms, AI/ML integration, and data modernization strategies.

Required Qualifications :


- 8 to 12 years of experience in data engineering and architecture, including hands-on solution delivery.

- Deep expertise with Snowflake or Databricks, with strong working knowledge of tools like dbt, Matillion, SQL, and Python or PySpark.

- Experience designing and implementing data pipelines and orchestration using tools like Airflow, Control-M, or equivalent.

- Familiarity with cloud-native data engineering services (such as AWS Glue, Redshift, Athena, GCP BigQuery, Dataflow, and Pub/Sub) or similar.

- Strong understanding of data modelling, ELT/ETL design, and modern architecture frameworks (medallion, layered, or modular architectures).

- Experience integrating and troubleshooting APIs and real-time data ingestion technologies (Kafka, Kinesis, Pub/Sub, REST APIs).

- Familiarity with traditional ETL and data integration tools (Informatica, SSIS, Oracle Data Integrator, etc.).

- Excellent understanding of data governance, performance tuning, and DevOps for data (CI/CD, version control, monitoring).

- Strong communication, problem-solving, and stakeholder management skills.

