Job Description

Description :

As a Data Engineer at Baazi Games, you will focus on delivering data-driven insights to various functional teams, enabling them to make strategic decisions that add value to the top or bottom line of the business.

What you will do :

- Design, build, and own all the components of a high-volume data hub.
- Build efficient data models using industry best practices and metadata for ad hoc and pre-built reporting.
- Interface with business customers, gathering requirements and delivering complete data solutions and reporting.
- Own the design, development, and maintenance of ongoing metrics, reports, analyses, and dashboards that drive key business decisions.
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Interface with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources (a minimal sketch follows this list).
- Own the functional and non-functional scaling of software systems in your area.
- Provide input and recommendations on technical issues to BI Engineers, Business and Data Analysts, and Data Scientists.
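
To make the ETL bullet above concrete, here is a minimal PySpark sketch of a daily extract-transform-load job. All paths, table names, and column names are hypothetical illustrations, not Baazi Games' actual pipeline.

```python
# Hypothetical sketch of a daily ETL job: paths, schemas, and column
# names are illustrative assumptions, not the actual pipeline.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_events_etl").getOrCreate()

# Extract: read raw JSON events landed by upstream producers.
raw = spark.read.json("s3://example-data-lake/raw/events/dt=2024-01-01/")

# Transform: keep well-formed rows and derive reporting columns.
clean = (
    raw.filter(F.col("user_id").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .withColumn("revenue", F.col("amount").cast("double"))
)

# Load: write partitioned Parquet for downstream reporting and ad hoc SQL.
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-data-lake/curated/events/"))
```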

What we are looking for :

- 4-7 years of experience in data engineering.
- Strong understanding of ETL concepts and experience building ETL pipelines over large-scale, complex datasets using distributed computing technologies.
- Strong data modelling skills with solid knowledge of industry standards such as dimensional modelling and star schemas (see the sketch after this list).
- Highly proficient in writing performant SQL against large data volumes.
- Experience designing and operating very large data lakes/data warehouses.
- Experience with scripting for automation (e.g., UNIX shell scripting, Python).
- Good to have: experience working on the AWS stack.
- Clear thinker with superb problem-solving skills, able to prioritize and stay focused on the biggest needle movers.
- Curious, self-motivated self-starter with a can-do attitude.
- Comfortable working in a fast-paced, dynamic environment.
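
As a concrete reference for the dimensional-modelling point above, the sketch below shows a hypothetical star schema (one fact table plus one dimension) and a typical aggregate query against it, written as Spark SQL from Python. All table and column names are assumptions for illustration.

```python
# Hypothetical star schema: a fact table of game rounds joined to a
# conformed player dimension. All names are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star_schema_demo").getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_player (
        player_key BIGINT,
        player_id  STRING,
        country    STRING,
        signup_dt  DATE
    ) USING parquet
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_game_round (
        player_key BIGINT,   -- FK to dim_player
        game_key   BIGINT,   -- FK to a dim_game table (omitted here)
        round_ts   TIMESTAMP,
        stake      DOUBLE,
        payout     DOUBLE
    ) USING parquet
""")

# A typical reporting query against the star schema: revenue by
# country and day, aggregated from the fact table.
daily_revenue = spark.sql("""
    SELECT d.country,
           CAST(f.round_ts AS DATE) AS round_date,
           SUM(f.stake - f.payout)  AS gross_revenue
    FROM fact_game_round f
    JOIN dim_player d ON f.player_key = d.player_key
    GROUP BY d.country, CAST(f.round_ts AS DATE)
""")
daily_revenue.show()
```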

Key technologies :

- Must have excellent knowledge of advanced SQL for working with large data sets.
- Must have knowledge of Apache Spark.
- Should be proficient in at least one of the following languages: Java, Scala, or Python.
- Must have experience working with Apache Airflow or Apache NiFi (a minimal Airflow DAG sketch follows this list).
- Should be comfortable with at least one MPP query engine such as Impala, Presto, or Athena.
- Good to have: experience with AWS technologies including Redshift, RDS, S3, EMR, Glue, and Athena.
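
To illustrate the orchestration side, a minimal Apache Airflow DAG (assuming Airflow 2.x) that chains stubbed extract/transform/load steps might look like the following; the DAG id, schedule, and callables are hypothetical.

```python
# Hypothetical Airflow DAG wiring extract/transform/load steps;
# the DAG id, schedule, and callables are illustrative stubs.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    """Pull raw events from source systems (stubbed)."""
    ...

def transform():
    """Clean and model the raw events (stubbed)."""
    ...

def load():
    """Publish curated tables for reporting (stubbed)."""
    ...

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run the steps strictly in sequence.
    t_extract >> t_transform >> t_load
```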

