Posted on: 12/12/2025
Description:
Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT data pipelines using Python and SQL (an illustrative sketch follows this list).
- Build, optimize, and manage Tableau dashboards and data visualizations.
- Work with data architects and analysts to design efficient data models and datasets for reporting.
- Manage and optimize data workflows on AWS services (e.g., S3, Redshift, Lambda, Glue, Athena).
- Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.
- Ensure data quality, governance, performance, and reliability across data systems.
- Troubleshoot production issues, perform root cause analysis, and implement robust fixes.
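For illustration only, the sketch below shows what one minimal Python step of such an S3-based pipeline could look like. The bucket name, object keys, and cleaning rules are hypothetical placeholders, not anything specified in this posting.

```python
# Minimal ETL sketch: extract a CSV from S3, transform it with pandas,
# and stage the cleaned result back to S3 for a later Redshift COPY.
# Bucket, keys, and cleaning rules are hypothetical placeholders.
import io

import boto3
import pandas as pd

S3_BUCKET = "example-raw-bucket"        # hypothetical bucket
RAW_KEY = "sales/2025/12/orders.csv"    # hypothetical source key
STAGED_KEY = "staged/orders_clean.csv"  # hypothetical staging key


def extract(bucket: str, key: str) -> pd.DataFrame:
    """Download a CSV object from S3 into a DataFrame."""
    obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
    return pd.read_csv(io.BytesIO(obj["Body"].read()))


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Simple cleaning: drop duplicate rows and normalise column names."""
    df = df.drop_duplicates()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    return df


def load(df: pd.DataFrame, bucket: str, key: str) -> None:
    """Write the cleaned data back to S3, ready for a Redshift COPY."""
    buffer = io.StringIO()
    df.to_csv(buffer, index=False)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buffer.getvalue())


if __name__ == "__main__":
    load(transform(extract(S3_BUCKET, RAW_KEY)), S3_BUCKET, STAGED_KEY)
```

In practice a step like this would typically finish with a Redshift COPY of the staged file and be scheduled by an orchestrator (see the Airflow sketch under Nice-to-Have Skills).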
Required Skills & Qualifications:
- 7+ years of experience as a Data Engineer or in a similar role.
- Strong hands-on experience with Tableau (dashboarding, performance tuning, custom calculations).
- Proficiency in Python for data processing, automation, and scripting.
- Advanced SQL skills including query optimization, stored procedures, and data modeling (a small set-based example follows this list).
- Experience working with AWS cloud services for data engineering.
- Strong understanding of data warehousing and ETL concepts.
- Ability to work in a hybrid environment and collaborate with global teams.
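As a hedged illustration of the SQL expectation, the snippet below replaces per-customer point lookups with a single set-based window-function query, executed from Python via psycopg2. The connection parameters and the orders table are hypothetical placeholders.

```python
# Hedged illustration of "query optimization": one set-based window-function
# query (latest order per customer) instead of N per-customer lookups.
# Connection parameters and the `orders` table are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # hypothetical
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="***",  # placeholder; use a secrets manager in practice
)

LATEST_ORDER_SQL = """
    SELECT customer_id, order_id, order_ts
    FROM (
        SELECT customer_id,
               order_id,
               order_ts,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY order_ts DESC
               ) AS rn
        FROM orders
    ) ranked
    WHERE rn = 1;
"""

# The connection context manager wraps the query in a transaction;
# the cursor context manager releases resources when done.
with conn, conn.cursor() as cur:
    cur.execute(LATEST_ORDER_SQL)
    latest_orders = cur.fetchall()
```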
Nice-to-Have Skills:
- Experience with Snowflake / Redshift / BigQuery.
- Knowledge of CI/CD pipelines for data workflows.
- Familiarity with Airflow or other orchestration tools (a minimal DAG sketch follows this list).
- Exposure to advanced Tableau features (Extensions, Prep, Server Admin).
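Since Airflow is listed as a nice-to-have, here is a minimal sketch, assuming Airflow 2.4+ (for the `schedule` argument), of chaining extract/transform/load tasks in a daily DAG. The DAG id, schedule, and task bodies are hypothetical placeholders.

```python
# Minimal Airflow sketch: a daily DAG chaining extract -> transform -> load.
# DAG id, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder: pull raw data, e.g. the S3 read sketched earlier."""
    print("extract")


def transform():
    """Placeholder: clean and reshape the extracted data."""
    print("transform")


def load():
    """Placeholder: stage or load the result for reporting."""
    print("load")


with DAG(
    dag_id="orders_daily",             # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```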
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1588661