Posted on: 03/10/2025
Key Responsibilities:
- Design, build, and maintain scalable data pipelines and platform solutions on Google Cloud Platform.
- Develop and manage workflows using Cloud Composer (Apache Airflow).
- Manage and optimize datasets in BigQuery using complex SQL and procedural (scripting) logic.
- Work with Cloud Storage for data ingestion, staging, and archival processes.
- Create and maintain tables, views, materialized views, and other database objects.
- Develop and optimize stored procedures and functions for application support.
- Ensure data quality, integrity, and consistency across systems.
- Perform query performance tuning using execution plans and best practices.
- Troubleshoot and resolve database performance issues.
- Write clean, maintainable, and well-documented Python scripts for automation and orchestration.
- Collaborate with developers, analysts, and stakeholders to define and deliver robust platform solutions.
- Document build processes and support materials for deployment and maintenance.
- Create test/sample data and perform bug fixes as needed.
- Follow and implement best practices for platform development, data design, and security.
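The ingest/stage/archive flow described in the responsibilities above can be sketched in plain Python. This is illustrative only: the task names and dependency order are assumptions for the example, and simple functions stand in for the Cloud Composer (Airflow) operators a real deployment would use.

```python
# Illustrative sketch: an ingest -> stage -> archive flow of the kind a
# Cloud Composer (Airflow) DAG might orchestrate. Plain functions stand in
# for operators; names and ordering are assumptions, not from the posting.

def ingest(records):
    """Simulate pulling raw records from a Cloud Storage landing bucket."""
    return [r.strip() for r in records if r.strip()]

def stage(records):
    """Simulate loading cleaned records into a staging table."""
    return [{"value": r, "status": "staged"} for r in records]

def archive(rows):
    """Simulate moving processed rows to archival storage."""
    return {"archived": len(rows)}

def run_pipeline(raw):
    # Tasks run in dependency order, as a scheduler would execute them.
    return archive(stage(ingest(raw)))

print(run_pipeline(["a", " b ", "", "c"]))  # → {'archived': 3}
```

In an actual DAG, each function would become an operator or `@task`-decorated callable, with the dependency chain declared explicitly rather than via direct calls.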
Required Skills:
- Strong SQL and PL/SQL skills, including:
  - Advanced joins, inline views, and correlated subqueries
  - Analytical and arithmetic functions
- Solid understanding of data warehousing concepts and ETL fundamentals
- Proficiency in Python programming
- Ability to read and optimize query execution plans
- Experience with database performance tuning
- Good documentation practices for build and deployment processes
- Strong problem-solving, analytical thinking, and debugging skills
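The SQL techniques listed above (correlated subqueries, analytical functions) can be demonstrated with Python's built-in sqlite3 module. The table and column names here are invented for the example, and production work would of course target BigQuery rather than SQLite:

```python
import sqlite3

# Illustrative demo of a correlated subquery and an analytical (window)
# function. Table/column names are invented; requires SQLite >= 3.25 for
# window-function support.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('east', 100), ('east', 300), ('west', 200), ('west', 50);
""")

# Correlated subquery: rows whose amount exceeds their region's average.
above_avg = conn.execute("""
    SELECT region, amount FROM sales s
    WHERE amount > (SELECT AVG(amount) FROM sales WHERE region = s.region)
    ORDER BY region
""").fetchall()
print(above_avg)  # → [('east', 300), ('west', 200)]

# Analytical function: running total per region.
running = conn.execute("""
    SELECT region, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY amount)
               AS running_total
    FROM sales ORDER BY region, amount
""").fetchall()
print(running)  # → [('east', 100, 100), ('east', 300, 400),
                #    ('west', 50, 50), ('west', 200, 250)]
```

The same query shapes carry over to BigQuery's standard SQL dialect, where `PARTITION BY` windows and correlated subqueries are also supported.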
Good to Have:
- Experience with Google Dataflow
- Exposure to CI/CD pipelines
- Familiarity with data modeling and data governance
- Prior work in agile environments or DevOps cultures
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1555061