Posted on: 16/12/2025
Description:
About The Role
IKCON Digital IT Services is looking for a highly experienced Senior Snowflake Data Engineer to lead the design, development, and optimization of our modern cloud data platforms. The role involves driving end-to-end data engineering initiatives, mentoring a high-performing team, and partnering with business and technology stakeholders to enable data-driven decision making.
Key Responsibilities:
- Design, build, and optimize scalable data pipelines and Snowflake-based data warehouses to support analytics and reporting needs.
- Architect and implement robust ELT/ETL workflows using frameworks such as Airflow, dbt, Informatica, and Matillion.
- Lead multi-cluster Snowflake implementations including performance tuning, cost optimization, Snowpark, UDFs, and advanced security features.
- Design and maintain enterprise-grade data models, data marts, and medallion/lakehouse architectures aligned with business requirements.
- Implement strong data governance, data quality, lineage, and security controls across cloud and on-prem ecosystems.
- Build and maintain data solutions on multi-cloud platforms (AWS, Azure, GCP) using Infrastructure as Code (Terraform, CloudFormation).
- Collaborate closely with data architects, analysts, data scientists, and application teams to deliver reliable, high-quality data products.
- Provide technical leadership to a team of engineers, driving best practices, code reviews, CI/CD, monitoring, and observability.
- Troubleshoot complex data issues, optimize SQL and pipeline performance, and ensure high availability and reliability of data platforms.
- Stay updated with Snowflake, data engineering, and cloud technology advancements and champion their adoption within the team.
Required Skills and Qualifications:
- 8 to 10 years of overall experience in Data Engineering, with 3 to 5 years of strong, hands-on Snowflake experience.
- 2 to 3 years of experience in leading and mentoring data engineering teams.
- Expert-level knowledge of Snowflake including multi-cluster warehouses, performance tuning, cost optimization, Snowpark, UDFs, and advanced security.
- Strong experience in ETL/ELT and data pipeline architecture using tools such as Airflow, dbt, Informatica, and Matillion.
- 5+ years of experience in cloud platforms (AWS, Azure, GCP) with solid understanding of VPC, networking, security, and storage.
- Hands-on experience with Infrastructure as Code (Terraform, CloudFormation) and CI/CD processes.
- Deep expertise in SQL, data modeling (Data Vault, Kimball), and building enterprise data warehouses and data marts.
- Strong programming skills in Python and/or Java, and experience with Big Data and streaming technologies (Spark, Kafka).
- Solid understanding of DevOps concepts (Docker, Kubernetes, monitoring, logging, alerting).
- Excellent communication, stakeholder management, and problem-solving skills, with the ability to drive architectural decisions.
Nice To Have:
- Snowflake certifications such as SnowPro Core and Advanced Data Engineer.
- Experience with data mesh, data lakehouse, and modern analytics stacks.
- Exposure to data governance tools and frameworks.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1590607