
VML - Data Engineer - Google Cloud Platform

VML Enterprise Solutions
Category: Others
Experience: 5 - 7 Years
Rating: 3.8 (33+ reviews)

Posted on: 02/12/2025

Job Description

Description:

Who We Are:

At VML, we are a beacon of innovation and growth in an ever-evolving world.

Our heritage is built upon a century of combined expertise, where creativity meets technology, and diverse perspectives ignite inspiration.

With the merger of VMLY&R and Wunderman Thompson, we have forged a new path as a growth partner that is part creative agency, part consultancy, and part technology powerhouse.

Our global family now encompasses over 30,000 employees across 150+ offices in 64 markets, each contributing to a culture that values connection, belonging, and the power of differences.

Our expertise spans the entire customer journey, offering deep insights across communications, commerce, consultancy, CRM, CX, data, production, and technology.

We deliver end-to-end solutions that result in revolutionary work.

We are looking for a Data Engineer with experience working with GCP to join us on a permanent basis.

You can either join our growing Mumbai team or work remotely from anywhere in India.

Step into a pivotal Data Engineer role at VML, where you'll be instrumental in building and maintaining robust data pipelines and infrastructure across various critical workstreams, including media data warehousing, business operations, and marketing experimentation.

You'll leverage cutting-edge Google Cloud Platform (GCP) services to design scalable data solutions, implement API integrations, and contribute to event-driven architectures.

This is an excellent opportunity to apply strong technical problem-solving skills, advance your data engineering capabilities, and gain a foundational understanding of diverse data domains.

You'll contribute to creating centralized, reliable data sources that inform critical business decisions, working within a collaborative team environment to drive innovation and efficiency. Your work will directly shape how we use data to test hypotheses, optimize strategies, and achieve measurable outcomes.

What You'll Be Doing:

- Designing, developing, and deploying robust ETL/ELT data pipelines using Python, JavaScript, and SQL to integrate various third-party data sources and power critical business functions (see the illustrative sketch after this list).

- Automating the integration of diverse data sources at scale, ensuring data accuracy and timeliness for media data warehouses and other critical systems.

- Leveraging a comprehensive suite of Google Cloud Platform (GCP) services including BigQuery, Cloud Run, and Cloud Scheduler for efficient data processing, orchestration, and designing scalable data solutions.

- Performing data modeling and transformations using dbt and SQL to structure and refine data, supporting data-driven decision-making and ensuring data quality.

- Managing and version controlling pipeline code, configurations, and infrastructure changes using Git, ensuring collaborative development and deployment.

- Troubleshooting and optimizing existing data pipelines to enhance reliability, performance, and scalability.

- Applying infrastructure-as-code principles for provisioning and managing data resources and environments, potentially using tools like Terraform.

- Developing and integrating APIs for complex data ingestion, seamless data flow, and connectivity across various systems.

- Exploring and integrating streaming data technologies like Kafka or Pub/Sub for real-time data needs and contributing to event-driven architectures.

- Implementing and maintaining CI/CD pipelines to automate the testing, deployment, and monitoring of data solutions.

- Translating marketing experiment business briefs into actionable data solutions, including building and deploying audiences for marketing experiments.

- Collaborating closely with stakeholders to understand requirements, provide insights, and ensure data solutions meet business needs, applying an analytical background to deepen expertise in data analysis and experimentation.
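
To give a concrete flavour of the day-to-day work described above, the sketch below shows a minimal pipeline step in Python: fetching records from a third-party API, appending them to a BigQuery table, and running an in-warehouse SQL transform (the kind of model dbt would typically manage in practice). It is illustrative only; the project, table names, API URL, and response shape are hypothetical placeholders, not details of VML's actual stack.

    # Illustrative sketch only. All names below (project, tables, API URL,
    # response shape) are hypothetical placeholders.
    import requests
    from google.cloud import bigquery

    PROJECT_ID = "example-project"                    # hypothetical
    RAW_TABLE = f"{PROJECT_ID}.media.raw_spend"       # hypothetical
    MODEL_TABLE = f"{PROJECT_ID}.media.daily_spend"   # hypothetical

    def extract(api_url: str) -> list[dict]:
        # Pull one page of records from a (hypothetical) third-party API.
        resp = requests.get(api_url, timeout=30)
        resp.raise_for_status()
        return resp.json()["records"]  # assumed response shape

    def load(client: bigquery.Client, rows: list[dict]) -> None:
        # Append raw records to BigQuery, letting it autodetect the schema.
        job_config = bigquery.LoadJobConfig(
            autodetect=True,
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        )
        client.load_table_from_json(rows, RAW_TABLE, job_config=job_config).result()

    def transform(client: bigquery.Client) -> None:
        # Rebuild a simple daily aggregate inside the warehouse (ELT style).
        sql = f"""
            CREATE OR REPLACE TABLE `{MODEL_TABLE}` AS
            SELECT campaign_id, DATE(event_ts) AS day, SUM(spend) AS spend
            FROM `{RAW_TABLE}`
            GROUP BY campaign_id, day
        """
        client.query(sql).result()

    if __name__ == "__main__":
        client = bigquery.Client(project=PROJECT_ID)
        load(client, extract("https://api.example.com/v1/spend"))
        transform(client)

In production, a job like this would typically run as a container on Cloud Run, triggered on a schedule by Cloud Scheduler, with the code version-controlled in Git: exactly the orchestration pattern the list above describes.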

What We Want From You:

- Professional experience in data engineering or a similar role.

- Hands-on experience with Google Cloud Platform (GCP) services including BigQuery, Cloud Run, and Cloud Scheduler.

- Proficiency in SQL for data manipulation and analysis.

- Strong programming skills in Python.

- Experience with dbt.

- Understanding of data warehousing concepts, ETL/ELT processes, and data modeling.

- Experience with version control systems (e.g., Git).

- Familiarity with infrastructure-as-code principles.

- Experience with API development.

- Excellent problem-solving skills, attention to detail, and a proactive attitude.

- Ability to work independently and as part of a team in a fast-paced environment.

Desirable Attributes:

- Programming skills in JavaScript.

- Knowledge of streaming data processing concepts (e.g., Kafka, Pub/Sub) and experience with event-driven architecture (see the sketch after this list).

- Familiarity with CI/CD pipelines.

- Familiarity with Terraform.

- Relevant GCP certifications are a plus.

- An analytical background.

- Hands-on experience with additional GCP services such as Pub/Sub and Firestore.

- Familiarity with Postman.
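
For context on the streaming items above, the snippet below sketches the core Pub/Sub pattern: a producer publishes small messages to a topic, and downstream services subscribe and react, which is the building block of an event-driven architecture. The project, topic, payload, and attribute names are hypothetical, illustrative only.

    # Illustrative sketch only; project, topic, and payload are hypothetical.
    from google.cloud import pubsub_v1

    PROJECT_ID = "example-project"   # hypothetical
    TOPIC_ID = "audience-events"     # hypothetical

    def publish_event(payload: bytes, **attrs: str) -> str:
        # Publish one message to a Pub/Sub topic and return its message ID.
        publisher = pubsub_v1.PublisherClient()
        topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)
        future = publisher.publish(topic_path, data=payload, **attrs)
        return future.result()  # blocks until the broker acknowledges

    if __name__ == "__main__":
        msg_id = publish_event(b'{"user_id": "123", "variant": "B"}',
                               experiment="exp-42")
        print(f"published message {msg_id}")

A Kafka producer follows the same publish/subscribe shape, so experience with either technology transfers readily to the other.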

What We Can Offer You:

Alongside the opportunity to work with some of the most exciting brands around the world, we'll also prioritise your career development and help you grow your skills.

We'll empower you to make a difference, allow you to be yourself, and respect who you are.

