Job Description

What We're Looking For:

- Bachelor's or master's degree in Computer Science, Data Engineering, Information Systems, or a related technical field.

- 4+ years of hands-on experience in data engineering with a focus on data integrations, warehousing, and analytics pipelines.

- Hands-on experience with (see the sketch after this list):
  - Google BigQuery as a centralized data warehousing and analytics platform.
  - Python for scripting, data processing, and integration logic.
  - SQL for data transformation, complex querying, and performance tuning.
  - DBT for building modular, maintainable, and reusable transformation models.
  - Airflow / Cloud Composer for orchestration, dependency management, and job scheduling.

- Solid understanding of ETL/ELT pipeline development and cloud-based data architecture.

- Strong knowledge of data quality frameworks, validation methods, and governance best practices.
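
As a rough illustration of how this stack fits together, here is a minimal sketch of an Airflow (2.4+) DAG that stages data in BigQuery with the Python client and then runs dbt models. The DAG ID, dataset and table names, schedule, and dbt project path are all hypothetical, and the sketch assumes the google-cloud-bigquery library and a dbt CLI are available on the workers; it is not a prescribed implementation for this role.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def stage_raw_events() -> None:
    """Copy today's raw events into a staging table (dataset/table names are hypothetical)."""
    client = bigquery.Client()
    sql = """
        CREATE OR REPLACE TABLE staging.events AS
        SELECT * FROM raw.events
        WHERE DATE(event_ts) = CURRENT_DATE()
    """
    client.query(sql).result()  # block until the BigQuery job finishes


with DAG(
    dag_id="daily_elt",               # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    stage = PythonOperator(task_id="stage_raw_events", python_callable=stage_raw_events)

    # Run the dbt project that holds the modular transformation models.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/airflow/dbt",  # hypothetical path
    )

    stage >> transform
```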

Preferred Skills (Optional):

- Experience with Ascend.io or comparable ETL platforms such as Databricks, Dataflow, or Fivetran.

- Familiarity with data cataloging and governance tools like Collibra.

- Knowledge of CI/CD practices, Git-based workflows, and infrastructure automation tools.

- Exposure to event-driven or real-time streaming pipelines using tools like Pub/Sub or Kafka (see the sketch after this list).

- Strong problem-solving and analytical mindset, with the ability to think broadly, identify innovative solutions, and quickly learn new technologies, programming languages, and frameworks.

- Excellent communication skills, both written and verbal.

- Ability to work in a fast-paced and collaborative environment.

- Ability to provide on-the-ground troubleshooting and diagnosis of architecture and design challenges.

- Good experience with Agile methodologies such as Scrum and Kanban, and with managing IT backlogs.

- Be a go-to expert for data technologies and solutions.
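
For the streaming bullet above, a minimal sketch of a Pub/Sub consumer in Python might look like the following; the project and subscription IDs are hypothetical, and a real pipeline would add validation and load the payload onward (for example into a BigQuery staging table).

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "example-project"   # hypothetical
SUBSCRIPTION_ID = "events-sub"   # hypothetical


def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
    # In a real pipeline this would validate the payload and write it onward
    # before acknowledging the message.
    print(f"Received {message.data!r}")
    message.ack()


subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)
streaming_pull_future = subscriber.subscribe(subscription_path, callback=handle_message)

with subscriber:
    try:
        # Block for a bounded time so the sketch terminates;
        # a real consumer would run indefinitely.
        streaming_pull_future.result(timeout=60)
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()
```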

