Posted on: 24/09/2025
Roles & Responsibilities:
- Design, develop, and maintain data pipelines for ingestion, processing, and storage using SQL, Python, and Big Data technologies (a minimal ingestion sketch follows this list).
- Build scalable solutions on Google BigQuery and manage large datasets efficiently.
- Collaborate with Data Scientists, Analysts, and Business teams to understand requirements and deliver actionable data insights.
- Optimize SQL queries, Python scripts, and BigQuery jobs for performance and cost efficiency.
- Implement data governance, quality checks, and security standards across all data assets.
- Monitor and troubleshoot ETL pipelines and data workflows to ensure reliability.
- Participate in architecture discussions and recommend best practices for Big Data solutions.
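As a concrete illustration of the pipeline work above, here is a minimal Python sketch that loads a CSV extract from Cloud Storage into BigQuery using the google-cloud-bigquery client. The project, dataset, table, and bucket names are hypothetical placeholders, and a production pipeline would pin an explicit schema rather than autodetect one.

    # A minimal sketch, assuming the google-cloud-bigquery library and
    # hypothetical project/dataset/bucket names.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical project id
    table_id = "example-project.analytics.events"        # hypothetical target table

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,  # a managed pipeline would pin an explicit schema instead
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # Ingest a daily extract from Cloud Storage into the warehouse table.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/events/2025-09-24.csv",  # hypothetical source file
        table_id,
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes or raises

    print(f"Target table now has {client.get_table(table_id).num_rows} rows")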
Required Skills:
- Strong experience in SQL, Python, and Big Data frameworks (Hadoop, Spark, or similar).
- Hands-on experience with Google BigQuery and cloud-based data platforms (a cost-estimation sketch follows this list).
- Expertise in ETL/ELT processes and data warehousing concepts.
- Experience with data modeling, data pipelines, and large-scale data processing.
- Knowledge of cloud platforms (GCP, AWS, or Azure) is a plus.
- Strong problem-solving, analytical, and communication skills.
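On the BigQuery side, cost efficiency usually comes down to scanning fewer bytes. The sketch below (table and project names are hypothetical) combines a narrow column list and a partition filter with a dry run, which estimates the bytes a query would scan without actually running it.

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical project id

    # Select only the needed columns and filter on the partition column so
    # BigQuery prunes partitions instead of scanning the whole table.
    sql = """
        SELECT user_id, event_name
        FROM `example-project.analytics.events`
        WHERE event_date = '2025-09-23'
    """

    # A dry run validates the query and reports scanned bytes without billing a scan.
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    job = client.query(sql, job_config=job_config)
    print(f"Query would scan {job.total_bytes_processed:,} bytes")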
Desired Skills:
- Familiarity with Airflow, Dataflow, or other workflow orchestration tools (a minimal DAG sketch follows this list).
- Experience with BI tools like Tableau, Looker, or Power BI.
- Certification in Google Cloud Data Engineering is a plus.
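For orchestration, a minimal Airflow sketch follows. The DAG id, schedule, and check function are hypothetical, and the PythonOperator body stands in for real quality checks such as row counts or freshness tests.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_quality_checks():
        # Placeholder for real checks (row counts, null rates, freshness).
        print("running data quality checks")

    # Hypothetical DAG that schedules a daily pipeline run.
    with DAG(
        dag_id="daily_events_pipeline",
        start_date=datetime(2025, 9, 1),
        schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="quality_checks",
            python_callable=run_quality_checks,
        )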
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1551557