Posted on: 19/08/2025
Job Description :
We are looking for a Data Automation Engineer to join our growing team. This role is ideal for a self-starter who thrives in a collaborative environment and is passionate about automation, scripting, and data operations. You will be responsible for developing and maintaining automation scripts, auditing database log tables, researching and developing new data pipeline solutions, and ensuring data consistency and reliability across systems.
Roles & Responsibilities :
Automation & Scripting :
- Design, write, and maintain Bash and Python scripts to automate data workflows and operational processes.
- Build robust, reusable automation frameworks that reduce manual effort and streamline deployments.
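As a flavour of the scripting work involved, the snippet below sketches a common automation pattern in Python: wrapping a workflow step in retry logic so transient failures do not require manual intervention. The names (`with_retries`, `extract_row_count`) are illustrative, not part of this role's actual codebase.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("workflow")

def with_retries(step, attempts=3, delay=0.1):
    """Run a workflow step, retrying on failure -- a typical reusable automation helper."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise
            time.sleep(delay)

def extract_row_count():
    # Placeholder for a real extraction step (e.g., querying a source table).
    return 42

print(with_retries(extract_row_count))
```

The same pattern translates directly to Bash with a `for` loop around the command and a check on its exit status.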
Database & Audit Table Management :
- Monitor and manage database audit/log tables for integrity, compliance, and traceability.
- Implement logging mechanisms for data pipeline transparency and error tracking.
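A minimal sketch of the audit-table idea, using SQLite for portability: each pipeline run writes a row to an audit table, and a query surfaces failures for triage. The table and column names here are assumptions for illustration only.

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE pipeline_audit (
        id INTEGER PRIMARY KEY,
        job_name TEXT NOT NULL,
        status TEXT NOT NULL,       -- 'SUCCESS' or 'ERROR'
        detail TEXT,
        logged_at TEXT NOT NULL     -- UTC timestamp for traceability
    )
""")

def log_event(job_name, status, detail=""):
    conn.execute(
        "INSERT INTO pipeline_audit (job_name, status, detail, logged_at) "
        "VALUES (?, ?, ?, ?)",
        (job_name, status, detail, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

log_event("daily_load", "SUCCESS")
log_event("daily_load", "ERROR", "source file missing")

# Audit check: surface recent failures for triage.
errors = conn.execute(
    "SELECT job_name, detail FROM pipeline_audit WHERE status = 'ERROR'"
).fetchall()
print(errors)
```

In production the same pattern is typically implemented with database triggers or CDC tooling rather than application-level inserts.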
SQL Development & Optimization :
- Write and optimize complex SQL queries for data extraction, transformation, and reporting.
- Perform database health checks and troubleshoot performance issues.
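One concrete form this optimization work takes is reading query plans. The sketch below (SQLite, for a self-contained demo; the `orders` schema is invented) shows how adding an index changes a filter from a full table scan to an indexed lookup.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, filtering on customer_id forces a full table scan
# (the plan detail mentions SCAN).
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7"
).fetchone()
print(plan)

# With an index, the planner switches to an indexed search
# (the plan detail names idx_orders_customer).
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7"
).fetchone()
print(plan)
```

The exact plan wording varies by SQLite version, but the scan-versus-index distinction is the signal to look for; the same workflow applies with `EXPLAIN ANALYZE` in PostgreSQL or Oracle's execution plans.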
Data Pipeline R&D :
- Conduct research and proof-of-concepts (PoCs) for new data pipeline technologies and frameworks.
- Evaluate and integrate new tools or platforms into existing workflows based on team needs.
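PoC evaluation often boils down to comparing candidates on the same sample data: verify they agree, then time them. A minimal harness, with invented `transform_*` functions standing in for real pipeline candidates:

```python
import timeit

sample = list(range(10_000))

def transform_loop(rows):
    # Candidate A: explicit loop.
    out = []
    for r in rows:
        out.append(r * 2)
    return out

def transform_comprehension(rows):
    # Candidate B: list comprehension.
    return [r * 2 for r in rows]

# Correctness first: both candidates must agree before timing matters.
assert transform_loop(sample) == transform_comprehension(sample)

for name, fn in [("loop", transform_loop), ("comprehension", transform_comprehension)]:
    elapsed = timeit.timeit(lambda: fn(sample), number=50)
    print(f"{name}: {elapsed:.4f}s")
```

The same agree-then-benchmark structure scales up to comparing, say, a custom ETL job against an Airflow/DBT implementation of the same transformation.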
Collaboration & Support :
- Work closely with Data Engineers, Analysts, and DevOps teams to support data delivery and infrastructure.
- Participate in code reviews, stand-ups, and sprint planning to ensure quality and velocity.
Fast Turnaround & Problem Solving :
- Own end-to-end task execution with minimal supervision.
- Resolve data issues quickly, often under tight deadlines, while maintaining high quality.
Required Skills & Qualifications :
- Strong proficiency in Bash scripting and Python for automation
- Solid SQL skills, with the ability to write and debug complex queries
- Experience managing and auditing DB log tables, history tables, or change data capture (CDC)
- Exposure to modern data pipeline tools (Airflow, DBT, custom ETL jobs, etc.) is a plus
- Familiarity with Linux environments, version control (e.g., Git), and CI/CD workflows
- Strong analytical mindset and the ability to conduct independent research
- Excellent communication skills and a collaborative team player attitude
- Demonstrated ability to manage multiple priorities and deliver fast, reliable results
- Experience with cloud platforms such as AWS, GCP, or Azure
Skills to be evaluated on :
- SQL, PL/SQL, Bash, ELT, DBT, Airflow, Python, DB Monitoring
Mandatory Skills :
- SQL, PL/SQL, Bash, ELT, DBT, Airflow, Python, DB Monitoring
Desirable Skills :
- SQL, PL/SQL, Bash, ELT, DBT, Airflow, Python, DB Monitoring
Posted in
Data Engineering
Functional Area
Data Engineering
Job Code
1531694