Job Description

Job Title : Senior Data Engineer (CTH)


Experience : 8 - 10 Years


Work Mode : Onsite (1st Month), then Hybrid


Location : Trivandrum


Joining Preference : Immediate Joiners


About the Role :


We are seeking a highly experienced and results-driven Senior Data Engineer to join our team. This is a critical Contract-to-Hire (CTH) role for an individual with extensive experience designing, building, and optimizing large-scale data pipelines and platforms in the cloud. You will be responsible for ensuring data availability, quality, and performance for analytical and operational use cases, working within a complex, multi-cloud (Azure/AWS) environment. Immediate availability is strongly preferred.


Key Responsibilities :


Data Pipeline Development & Architecture :


- Design, build, and maintain robust, scalable, and efficient ETL/ELT data pipelines across Azure and/or AWS cloud ecosystems.


- Utilize Azure Data Factory (ADF), Databricks, and Azure Synapse Analytics to ingest, transform, and store large volumes of structured and unstructured data.


- Implement complex data transformation logic, including Slowly Changing Dimensions (SCD) types, data cleansing, and validation routines.


Coding & Scripting :


- Write highly optimized SQL and PL/SQL queries for data manipulation, stored procedures, and complex joins.


- Develop robust data processing scripts and automation workflows using Python.


BI & Data Modeling :


- Design and optimize data models for data marts and data warehouses, ensuring high performance for reporting.


- Develop and manage reporting structures using SSIS (SQL Server Integration Services) and SSAS (SQL Server Analysis Services).


- Collaborate with BI teams to ensure data is efficiently exposed for visualization via tools like Power BI.


DevOps & Operations :


- Utilize job scheduling tools like UC4 and version control systems like TFS for process management.


- Work with CI/CD tools such as Jenkins to automate deployment and release processes. Experience with infrastructure-as-code tools like Terraform is a plus.


Performance & Quality :


- Apply a deep understanding of data structures, algorithms, and indexing techniques for performance optimization of data processing and query execution.


- Ensure high standards of data quality, reliability, and governance.


Collaboration & Delivery :


- Act as a technical leader, working effectively within an onshore-offshore delivery model.


- Exhibit strong interpersonal and communication skills to interact with global teams and business stakeholders.


- Demonstrate the ability to work independently, navigating and resolving issues in ambiguous situations.


Required Skills & Experience :


Experience :


- 8 - 10 years of progressive experience in Data Engineering, ETL development, or Business Intelligence.


Core Technology Stack :


- Expert-level proficiency in SQL and PL/SQL for complex database operations.


- Strong proficiency in Python for scripting and data manipulation.


- Proven, hands-on experience with key Azure services : Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.


BI/MS Stack Experience :


- Solid experience with Microsoft data tools : SSIS and SSAS.


Data Concepts :


- Deep understanding of data warehousing concepts, dimensional modeling, and implementing Slowly Changing Dimensions (SCD).


Tooling & DevOps :


- Familiarity with job scheduling tools like UC4.


- Experience with version control systems (e.g., TFS) and CI/CD tools (e.g., Jenkins).


Technical Acumen :


- Excellent grasp of data structures, algorithms, and techniques for performance tuning large data systems.


Soft Skills :


- Proven ability to work effectively in a global onshore-offshore delivery model.


- Exceptional analytical, problem-solving, and communication skills.