Posted on: 28/11/2025
Senior Data Engineer (Snowflake / BigQuery / SQL / DBT) - Coimbatore | Onsite Only
Experience : 5+ Years
Job Location : Hybrid - Coimbatore, India (Onsite Modern Workspace)
Shift : 2:00 PM - 11:00 PM IST or 1:00 PM - 10:00 PM IST (based on client requirement)
Notice Period : Immediate Joiners Only
Job Type : Full-time (Permanent)
Soft Skills : Excellent Communication & Collaboration Skills
Why Join Us :
- Work with modern, cloud-native data stacks: Snowflake, BigQuery, DBT, Prefect/Airflow, Python.
- Grow your career with enterprise-level projects in a fast-scaling engineering environment.
- Great work culture with supportive leadership and a tech-first mindset.
Onsite Coimbatore advantages :
- No big-city congestion or stress
- Affordable lifestyle
- Strong tech community
- Peaceful work-life environment
- Opportunity to own end-to-end data solutions and work on high-impact initiatives.
- Chance to upskill on GCP + modern orchestration frameworks.
- Stable, long-term role with clear career growth pathways.
Job Description :
We are seeking highly skilled Data Engineers with hands-on experience building scalable and reliable data pipelines. The ideal candidates will have solid expertise with modern cloud data platforms and orchestration tools, combined with strong problem-solving abilities and a proactive mindset.
Required Technical Skills :
Primary Skills (Must-Have) :
- Snowflake
- BigQuery
- SQL (Advanced)
- DBT
Secondary Skills (Good-to-Have) :
- Airflow / Prefect
- Python
- GCP (Google Cloud Platform)
- Terraform / IaC
- Git
Key Responsibilities :
- Design, build, and deploy scalable and high-performance data pipelines.
- Develop and manage data transformation workflows using DBT and Airflow/Prefect.
- Work extensively with Snowflake, BigQuery, and SQL for data modeling and warehousing.
- Write clean and efficient Python scripts for data processing and automation.
- Collaborate with multiple teams to deliver high-quality, reliable data solutions.
- Monitor, troubleshoot, and optimize production data pipelines.
- Ensure data quality, governance, and consistency across all systems.
Qualifications :
- 5+ years of hands-on experience as a Data Engineer or in related roles.
- Strong expertise in SQL, Snowflake, BigQuery, DBT, and Python.
- Deep understanding of data modeling, warehousing, and ETL/ELT processes.
- Experience working with GCP (AWS/Azure knowledge is a bonus).
- Familiarity with Terraform / IaC for environment automation.
- Experience with Git for version control.
- Excellent communication and teamwork skills.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1581368