Posted on: 17/09/2025
Mode of Work: On-site
Experience: 5-10 Years
Location: Hyderabad
Shift: 2 PM - 11 PM
Job Summary:
- Design, develop, and optimize ETL workflows using Datastream, Cloud Composer, and Dataplex.
- Integrate and transform data from diverse sources into data warehouses/lakes.
- Ensure data quality, governance, and performance tuning of pipelines.
- Collaborate with cross-functional teams to define requirements and deliver solutions.
- Support deployment, monitoring, and troubleshooting of data integration processes.
Key Responsibilities:
- Infrastructure support & monitoring of ETL tasks.
- Performance optimization and proactive production support.
- Develop automation scripts (Python, Bash, PowerShell).
- Manage security & compliance for data/report access.
- Build ad-hoc Tableau dashboards for executive management.
- Ensure cost-effective cloud implementations (GCP BigQuery).
Required Skills & Qualifications:
- Hands-on with Datastream, Cloud Composer, Dataplex.
- Strong PL/SQL expertise (BigQuery preferred).
- Proficiency in Google Cloud BigQuery.
- Hands-on with Python, Tableau, SAP Business Objects (Webi).
- Strong problem-solving, analytical, and communication skills.
- Education: Bachelor's in CS/Engineering (8+ years in Data Analytics).
- Experience in data pipelines & cloud migration (on-prem to cloud DWs).
Key Skills:
- Python
- PL/SQL (BigQuery preferred)
- GCP services: Datastream, Cloud Composer, Dataplex
Posted in: Data Analytics & BI
Functional Area: Data Analysis / Business Analysis
Job Code: 1548165