Posted on: 20/11/2025
Description:
Your Role Accountabilities:
- Gain an understanding of brand and enterprise data warehouse/reporting requirements that will inform the data architecture and modeling approach.
- Apply this understanding to construct appropriate ETL processes, database queries, and data/permission models that support data aggregation and reporting of actionable insights.
- Bring a passion for writing code that is efficient, organized, simple, and scalable, and that meets business requirements.
- Enthusiasm for learning and for finding opportunities to improve and adapt daily engineering activities is highly desired.
- Debug, troubleshoot, design, and implement solutions to complex technical issues.
- Deliver JIRA user stories end to end, meeting the team's quality expectations.
- Familiarity with BI tools and the ability to create semantic-layer models for business users.
- Participate in QA testing for data pipeline projects as well as for implementation changes to the suite of analytical tools.
- Monitor batch data loads to meet SLAs (a minimal sketch of such a check follows this list).
- Respond to and resolve production issues quickly.
- Thrive in a team-based environment and be flexible about working a second shift.
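As an illustration of the SLA monitoring called out above, here is a minimal, hypothetical Python sketch; the two-hour threshold, the sample timestamps, and the alerting message are assumptions for illustration, not details from this posting.

# Hypothetical sketch: check whether a batch data load finished within its SLA.
# The 2-hour SLA and the sample timestamps are illustrative assumptions.
from datetime import datetime, timedelta

SLA = timedelta(hours=2)  # assumed SLA window for a single batch load

def load_met_sla(started_at: datetime, finished_at: datetime | None) -> bool:
    """Return True if the load completed within the SLA window."""
    if finished_at is None:  # the load never completed
        return False
    return (finished_at - started_at) <= SLA

if __name__ == "__main__":
    start = datetime(2025, 11, 20, 1, 0)
    end = datetime(2025, 11, 20, 2, 30)
    if load_met_sla(start, end):
        print("Load met its SLA")
    else:
        print("SLA breach: escalate per the on-call process")  # placeholder for real alerting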
Qualifications & Experience:
- Bachelor's degree in computer science, information systems, or information technology.
- 5 to 8 years of experience in data engineering.
- Knowledge of data sets supporting the Home Entertainment, Games DVD/Digital, Content Sales, and Licensing businesses.
- Knowledge of SAP supply chain, APO, order management, trade spend, promotions, POS (point of sale), royalties, forecasting, and cash collections.
- Experience with SQL and Python, and with AWS (Amazon Web Services) Glue.
- Strong experience with MPP databases (Teradata and Snowflake).
- Experience with Snowpipe, tasks, streams, clustering, Time Travel, caching, and data sharing.
- Experience in conceptual, logical, and physical data modelling, with expertise in relational and dimensional data modelling.
- Experience with AWS cloud services such as Kinesis, Lambda, and IAM (Identity and Access Management) policies.
- Experience in SQL query tuning and cost optimization.
- Experienced in software delivery through continuous integration (for example, Git, Bitbucket, Jenkins).
- Experienced in one or more automation and scheduling tools (for example, Redwood, Airflow); a minimal Airflow sketch follows this list.
- Must be comfortable working in a Linux/Unix environment.
- Familiarity with AWS developer tools such as CodeDeploy and Data Pipeline.
- Experience with public/private API integration, web scraping, and data streaming architecture.
- Knowledge of Business Content Interchange (BCI).
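As a concrete, purely illustrative companion to the Airflow item above, the following minimal Python DAG sketch shows a daily batch load with an SLA attached; the dag_id, schedule, SLA, and load_to_snowflake placeholder are assumptions, not requirements from this posting.

# Hypothetical Airflow 2.x DAG: run a daily batch load and flag runs that exceed a 2-hour SLA.
# The dag_id, schedule, SLA, and load function body are illustrative assumptions.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_to_snowflake():
    """Placeholder for the real extract/load step (e.g. issuing COPY INTO through a Snowflake connection)."""
    print("loading daily batch...")

with DAG(
    dag_id="daily_batch_load",                  # assumed name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",                 # one run per day
    catchup=False,
    default_args={"sla": timedelta(hours=2)},   # surface runs that take longer than 2 hours
) as dag:
    PythonOperator(
        task_id="load_to_snowflake",
        python_callable=load_to_snowflake,
    )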
Not required but preferred experience:
- Public speaking and presentation skills.
- Experience with dbt.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1578159