Posted on: 09/10/2025
Responsibilities for Data Engineer:
- Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements using Python, SQL, AWS, and Snowflake.
- Identify, design, and implement internal process improvements: automating manual processes with Python, optimizing data delivery, re-designing infrastructure for greater scalability, and so on.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, AWS, and Snowflake technologies (a sketch of this kind of ETL step follows this list).
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Advanced working knowledge of SQL and experience with relational databases and query authoring, as well as working familiarity with a variety of databases.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
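As a rough illustration of the extraction, transformation, and loading work described above, the following is a minimal sketch in Python, assuming a CSV source in S3 and a Snowflake target; the bucket, key, table name, and connection parameters are hypothetical placeholders, not part of this posting.

```python
# Minimal ETL sketch: pull a raw CSV from S3, clean it with pandas,
# and load it into Snowflake. Bucket, key, table name, and credentials
# are hypothetical placeholders.
import boto3
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract: read a raw CSV object from S3
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="my-source-bucket", Key="raw/orders.csv")
df = pd.read_csv(obj["Body"])

# Transform: basic cleaning before loading
df = df.drop_duplicates()
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df = df.dropna(subset=["order_id", "order_date"])
df.columns = [c.upper() for c in df.columns]  # Snowflake identifiers are upper-case by default

# Load: write the cleaned frame into a Snowflake table
conn = snowflake.connector.connect(
    user="ETL_USER",
    password="********",
    account="my_account",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    write_pandas(conn, df, "ORDERS_CLEAN", auto_create_table=True)
finally:
    conn.close()
```

In practice a step like this would be parameterised and scheduled by an orchestrator rather than run as a one-off script.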
Desired Skillset:
- 3+ years of experience in a Python scripting and data-focused role, with a Bachelor's degree.
- Experience with data processing and cleaning libraries (e.g. Pandas, NumPy), with web scraping/web crawling for process automation, and with APIs and how they work (see the sketch after this list).
- Ability to debug failing code and find the solution; basic knowledge of SQL Server job activity monitoring and of Snowflake.
- Experience with relational SQL and NoSQL databases, including PostgreSQL and Cassandra.
- Experience with most or all of the following cloud services: AWS, Azure, Snowflake.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
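A minimal sketch of the kind of API-driven collection and Pandas/NumPy cleaning mentioned in the skill list above; the endpoint URL and column names are hypothetical placeholders.

```python
# Minimal sketch of API-driven collection plus Pandas/NumPy cleaning.
# The endpoint URL and column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import requests

# Pull raw records from a hypothetical JSON API
resp = requests.get("https://api.example.com/v1/customers", timeout=30)
resp.raise_for_status()
df = pd.DataFrame(resp.json())

# Clean: normalise types, handle missing values, derive a simple flag
df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")
df["lifetime_value"] = pd.to_numeric(df["lifetime_value"], errors="coerce").fillna(0.0)
df["is_active"] = np.where(df["last_login_days"] <= 30, 1, 0)
df = df.dropna(subset=["customer_id"]).drop_duplicates(subset=["customer_id"])

print(df.head())
```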
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1557750