
Job Description

Key Responsibilities:

- Build and maintain scalable ETL/ELT data pipelines using Python and cloud-native tools.

- Design and optimize data models and queries on Google BigQuery for analytical workloads.

- Develop, schedule, and monitor workflows using orchestration tools such as Apache Airflow or Cloud Composer (a minimal sketch of such a workflow follows this list).

- Ingest and integrate data from multiple structured and semi-structured sources, including MySQL, MongoDB, APIs, and cloud storage.

- Ensure data integrity, security, and quality through validation, logging, and monitoring systems.

- Collaborate with analysts and data consumers to understand requirements and deliver clean, usable datasets.

- Implement data governance, lineage tracking, and documentation as part of platform hygiene.
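To picture the stack these responsibilities name (Python, Airflow/Cloud Composer, MySQL, BigQuery), here is a minimal, hypothetical sketch of a daily pipeline; it is not part of the posting. It assumes Airflow 2.x with the MySQL provider and the google-cloud-bigquery client installed, and the connection ID, SQL, project, dataset, and table names are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.mysql.hooks.mysql import MySqlHook
from google.cloud import bigquery


def load_orders_to_bigquery():
    """Extract recent rows from MySQL and append them to a BigQuery table."""
    # Read from the source database via an Airflow connection (hypothetical ID).
    source = MySqlHook(mysql_conn_id="orders_mysql")
    df = source.get_pandas_df(
        "SELECT * FROM orders WHERE created_at >= CURDATE() - INTERVAL 1 DAY"
    )

    # Append the extracted rows to a BigQuery table (hypothetical names).
    client = bigquery.Client(project="example-project")
    client.load_table_from_dataframe(df, "example-project.analytics.orders").result()


with DAG(
    dag_id="orders_mysql_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # run once per day
    catchup=False,
) as dag:
    PythonOperator(task_id="load_orders", python_callable=load_orders_to_bigquery)
```

In practice the extraction query, loading strategy, and retry/monitoring settings would follow the team's own conventions; this only illustrates the shape of the work.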

Must-Have Skills:

- 1 to 7 years of experience in data engineering or backend development.

- Strong experience with Google BigQuery and GCP (Google Cloud Platform).

- Proficiency in Python for scripting, automation, and data manipulation.

- Solid understanding of SQL and experience with relational databases like MySQL.

- Experience working with MongoDB and semi-structured data (e.g., JSON, nested formats); a short sketch of flattening such data follows this list.

- Exposure to data warehousing, data modeling, and performance tuning.

- Familiarity with Git-based version control and CI/CD practices.
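The MongoDB and semi-structured data skill can likewise be pictured with a brief, hypothetical sketch: reading nested JSON-like documents with pymongo and flattening them into tabular rows ready for a warehouse load. The connection string, collection, and field names are illustrative assumptions, not details from the posting.

```python
from pymongo import MongoClient


def flatten_order(doc: dict) -> dict:
    """Flatten one nested order document into a single warehouse-ready row."""
    items = doc.get("items", [])
    return {
        "order_id": str(doc["_id"]),
        "customer_email": doc.get("customer", {}).get("email"),
        "item_count": len(items),
        "total_amount": sum(item.get("price", 0) for item in items),
    }


# Hypothetical connection, database, and collection names.
client = MongoClient("mongodb://localhost:27017")
rows = [flatten_order(doc) for doc in client["shop"]["orders"].find().limit(100)]
```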

