Posted on: 12/01/2026
Description:
- Consumers care first and foremost about having their time valued by brands.
- Brands need insights into their customer service operation to serve their consumers effectively.
- Such insights and analytics are delivered through various data products like in-app analytics dashboards and data-sharing integrations.
- The data platform team is responsible for designing, building, and maintaining the data infrastructure that enables such data and analytics products at scale.
- We build and manage data pipelines, databases, and other data structures to ensure that the data is reliable, accurate, and easily accessible.
- We also support internal stakeholders, such as business intelligence and machine learning teams, with data operations.
- This team manages the platform, which handles 2 million events per minute and processes over 1 terabyte of data daily.
About the role:
- Building maintainable data pipelines for both data ingestion and operational analytics, covering data collected from 2 billion devices and 900M monthly active users.
- Building customer-facing analytics products that deliver actionable insights and make anomalies easy to detect.
- Collaborating with data stakeholders to understand their data needs and participating in the analysis process.
- Writing design specifications and test, deployment, and scaling plans for the data pipelines.
- Mentoring people across the team and organization.
Experience & Requirements:
- 3+ years of experience building and running data pipelines that scale to terabytes of data.
- Proficiency in a high-level object-oriented programming language (Python or Java) is a must.
- Experience with cloud data platforms such as Snowflake and AWS (EMR/Athena) is a must.
- Experience building modern data lakehouse architectures using Snowflake, open table formats like Apache Iceberg/Hudi, and columnar formats like Parquet.
- Proficiency in data modeling, SQL query profiling, and data warehousing is a must.
- Experience with distributed data processing engines such as Apache Spark, Apache Flink, Dataflow/Apache Beam, etc.
- Knowledge of workflow orchestrators like Airflow, Dagster, etc. is a plus.
- Data visualization skills are a plus (Power BI, Metabase, Tableau, Hex, Sigma, etc.).
- Excellent verbal and written communication skills.
- Bachelor's degree in Computer Science (or equivalent).
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1600329