Posted on: 08/12/2025
About this Position :
Position : Senior Data Engineer (Tableau)
Experience : 7+ Years
Notice Period : Immediate to 30 Days
Work Location : Hyderabad - Hybrid
About the Role :
We are seeking a highly skilled Senior Data Engineer to design, develop, and optimize our data systems and pipelines.
The ideal candidate will have strong expertise in SQL, Python, AWS, and Tableau, and a passion for transforming raw data into actionable insights that drive business decisions.
Key Responsibilities :
- Design, build, and maintain scalable ETL / ELT data pipelines for ingestion, transformation, and storage.
- Work with business stakeholders to identify data requirements and translate them into effective data solutions.
- Develop and optimize SQL queries, stored procedures, and data models for high performance.
- Implement data integration solutions using AWS services such as S3, Glue, Redshift, Lambda, and RDS.
- Collaborate with analysts and business users to enable data visualization and reporting in Tableau.
- Ensure data accuracy, quality, and consistency through validation, testing, and monitoring.
- Automate workflows and data quality checks using Python scripting.
- Support data governance, documentation, and adherence to security and compliance standards.
Required Skills & Qualifications :
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5+ years of hands-on experience in Data Engineering or similar roles.
- Strong expertise in SQL for data extraction, transformation, and optimization.
- Proficiency in Python for data manipulation, automation, and scripting.
- Solid experience with the AWS cloud ecosystem (S3, Glue, Redshift, Lambda, IAM, etc.).
- Hands-on experience with Tableau for dashboard development and data visualization support.
- Deep understanding of data warehousing, modeling, and ETL design patterns.
- Strong problem-solving skills and ability to work in agile, collaborative environments.
Good to Have :
- Experience with Airflow or other workflow orchestration tools.
- Exposure to Snowflake, Athena, or Data Lake architectures.
- Knowledge of API integration and data streaming tools (Kafka, Kinesis).
- Understanding of CI/CD, Git, and modern DevOps practices in data environments.
Posted by : Naheda Begum, Senior Talent Acquisition Executive at BLUMETRA SOLUTIONS INDIA PRIVATE LIMITED
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1586676