Posted on: 27/11/2025
Description :
We are seeking a highly skilled DataOps Engineer to join our growing platform team.
The successful candidate will play a critical role in delivering innovative and customer-centric data solutions, with a focus on hands-on execution and quick delivery of reports and insights.
We are in an exciting period of evolution and are looking for a DataOps Engineer to support the build of our new Data Platform, which comprises Data Lakes in AWS and Data Warehouses in Snowflake.
As a DataOps Engineer you will productionise data pipelines as well as data and analytics solutions.
You will evangelise and continuously improve Collinson's DataOps capabilities, and ensure that all data products and projects are designed, delivered and supported according to business requirements.
Key Responsibilities :
- Productionize and maintain data pipelines to ensure data quality and operationalization.
- Be at the centre of the build of the company's new data platform, which includes Data Lakes in AWS and Data Warehouses in Snowflake, built with cutting-edge tools and technology.
- Evangelize and continuously improve the company's DataOps capabilities.
- Ensure that all data products and projects are designed, delivered, and supported according to business requirements.
- Collaborate with cross-functional teams to understand and interpret business requirements and translate them into technical solutions.
- Implement and maintain processes for data quality, data validation, and data governance.
- Monitor and maintain the performance and reliability of data pipelines and data systems.
- Investigate and troubleshoot data issues and implement solutions to resolve them.
- Continuously improve the data operations processes and tools to ensure scalability and efficiency.
- Act as a subject matter expert and provide guidance to other teams on best practices for data operations.
Day-to-Day Activities will include :
- Monitoring and maintaining data quality: This includes performing regular checks on the data pipelines and ensuring that all data is accurate, consistent, and meets the required standards (a minimal illustrative example of such a check follows this list).
- Automating data operations: Implementing automation tools and processes to streamline data operations, reduce manual errors, and improve efficiency.
- Managing data security and privacy: Ensuring that all data is securely stored and processed, and that proper security and privacy measures are in place to protect sensitive data.
- Managing data storage and retrieval: Designing and implementing data storage solutions that are scalable, efficient, and optimized for quick retrieval.
- Optimizing data pipeline performance: Constantly monitoring and optimizing data pipeline performance to ensure that data is delivered to stakeholders in a timely and accurate manner.
- Collaborating with cross-functional teams: Working closely with stakeholders, such as data analysts and data scientists, to understand their data needs and to help them get the data they need to support their work.
- Staying current with new technologies: Continuously researching and staying current with new data technologies and trends, and recommending new tools and processes to improve data operations.
- Ensuring data compliance: Ensuring that all data operations comply with industry regulations, such as GDPR and HIPAA.
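By way of illustration only (this is not part of the role requirements), a minimal sketch of the kind of automated data-quality check described above might look as follows in Python. The file layout, the column names (customer_id, transaction_id, transaction_date) and the pandas dependency are assumptions made for the example, not details from this posting.

```python
# Minimal illustrative sketch of an automated data-quality check on a
# pipeline extract. All paths and column names below are hypothetical.
import sys

import pandas as pd


def check_extract(path: str) -> list:
    """Return a list of data-quality failures for one extract file."""
    failures = []
    df = pd.read_csv(path)

    # Volume check: an empty extract usually signals an upstream failure.
    if df.empty:
        return ["extract contains no rows"]

    # Completeness: key business columns must not contain nulls.
    for col in ("customer_id", "transaction_date"):
        if col in df.columns and df[col].isnull().any():
            failures.append(f"null values in required column '{col}'")

    # Uniqueness: the primary-key column must not contain duplicates.
    if "transaction_id" in df.columns and df["transaction_id"].duplicated().any():
        failures.append("duplicate transaction_id values detected")

    return failures


if __name__ == "__main__":
    problems = check_extract(sys.argv[1])
    for problem in problems:
        print(f"FAILED: {problem}")
    sys.exit(1 if problems else 0)
```

A scheduler or CI job could run a script like this against each new extract and fail the pipeline run whenever a check does not pass.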
Knowledge, skills and experience required :
Knowledge :
- Experience with AWS cloud services and Data Lakes, Snowflake, and other data warehousing technologies
- In-depth understanding of data pipeline architecture and data modelling practices
- Understanding of data governance and data security practices
- Knowledge of data-driven analytical and reporting activities
- Understanding of the impact of changes to business rules on data processes
- Strong knowledge of SQL and scripting languages (e.g. Python, Bash)
- Experience with Infrastructure as Code, including a working knowledge of Terraform
- Familiarity with CI/CD-driven data pipelines and infrastructure (an illustrative sketch of such a pipeline check follows this list)
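As a purely illustrative sketch of CI/CD-driven data platform work (assuming the snowflake-connector-python package and environment variables that this posting does not specify), a pipeline job might run a small smoke test like the following after Terraform has provisioned the warehouse.

```python
# Hypothetical post-deployment smoke test run by a CI/CD job once Terraform
# has applied the infrastructure. Connection details come from environment
# variables; the default warehouse name REPORTING_WH is an assumption.
import os

import snowflake.connector


def smoke_test() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "REPORTING_WH"),
    )
    try:
        cur = conn.cursor()
        # A trivial query proves authentication and warehouse access work.
        cur.execute("SELECT CURRENT_VERSION()")
        print("Snowflake reachable, version:", cur.fetchone()[0])
    finally:
        conn.close()


if __name__ == "__main__":
    smoke_test()
```

Wiring such a check into the CI/CD pipeline gives fast feedback that an infrastructure change has not broken connectivity before any data jobs run.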
Skills :
- Hands-on experience deploying, maintaining and monitoring data pipelines.
- Excellent problem-solving skills and ability to provide clear recommendations
- Innovative thinking and the ability to continuously improve data processes
- Strong communication and influencing skills, especially in regard to data solutions and outcomes
- Ability to manage and lead a small team of data engineers
Experience :
- Extensive experience leading cloud data platform transformations, particularly in AWS
- Proven track record of delivering large-scale data and analytical solutions in a cloud environment
- Experience in developing Data Warehouses
- Experience in cost-effective management of data pipelines
- Experience with Agile delivery approach using Scrum and Kanban methodologies
- Experience in supporting QA and user acceptance testing processes
Posted By: Britney Dias, Talent Acquisition Specialist - India at COLLINSON LOYALTY BENEFITS PRIVATE LIMITED
Last Active: 28 Nov 2025
Posted in: DevOps / SRE
Functional Area: DevOps / Cloud
Job Code: 1581681