hirist

Sasken Technologies - Senior Data Architect - ETL/Data Warehousing

Sasken Technologies Limited
12 - 20 Years
Multiple Locations

Posted on: 12/02/2026

Job Description

Role: Senior Data Architect

Job Description:

The Senior Data Architect will work with Product Managers, Project Teams, and Business Analysts to finalize and improve data architecture, data lake architecture, data architecture modernization, data migration strategy and approach, and data models for highly available data platforms. The Data Architect will also be involved in data architecture and data modelling for IIoT application projects that require data architecture built for future scale.

Responsibilities include the following:

- Create data architecture for digital platforms and applications using Microsoft Azure/AWS services.

- Architect analytics and reporting/BI applications on top of the data lake.

- Identify performance bottlenecks from the data ingestion layer through to the reporting layer and make improvements.

- Design and oversee the migration of data from legacy systems to Microsoft Fabric and Databricks, creating modernization roadmaps.

- Architect data solutions: Define and implement enterprise data architectures on Azure/AWS, including data lakes, warehouses, and lakehouses, using Microsoft Fabric, Databricks, and Amazon Redshift/Lake Formation.

- Design data pipelines: Develop and optimize ETL/ELT pipelines using tools like Azure Data Factory and Databricks for data integration and transformation.

- Implement data governance and security: Establish and enforce data governance, security, and compliance best practices within the Azure or Databricks environment.

- Ensure performance and scalability: Optimize the performance and cost-efficiency of data workflows, ensuring solutions are scalable and meet business needs.

- Provide technical leadership: Mentor junior team members, lead technical discussions, and collaborate with stakeholders to align technical solutions with business goals.

- Define data models and data architecture.

- Technology selection for data stores (e.g., SQL vs. NoSQL), data pipelines, and data transformation.

- Create data pipelines and architecture for IIoT applications based on the Azure/AWS IoT stack.

- Define security and data governance methods.

Skills:

- Experienced with the Azure/AWS stacks for data processing, data stores, and data transformations.

- Experienced with cloud data platforms such as Databricks and Snowflake.

- Good understanding of data pipelines and data architecture, including ingestion, transformation, processing, storage, and providing API interfaces to applications for data access.

- Understanding of different data architecture patterns such as the Lambda architecture.

- Databases - Relational DBs, NoSQL DBs, and Time-Series DBs.

- Good understanding of AWS/Azure Cloud databases.

- Programming Languages - SQL, Python

- Real-time and batch ingestion of data.

- Azure/AWS Data Certification is a must.

Location: Bangalore or Hyderabad, India

Experience Range: 12+ years

Educational Qualification: Bachelor's or Master's degree in Computer Science or an equivalent discipline
