Posted on: 05/11/2025
What will you be doing at Atrium?
As a Senior Data Engineering Consultant, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams.
You will support software developers, database architects, data analysts, and data scientists on data initiatives and ensure that an optimal data delivery architecture remains consistent across ongoing projects.
Responsibilities include:
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, AWS, and Big Data technologies
- Develop ETL processes to ensure timely delivery of required data to customers
- Implement data quality measures to ensure the accuracy, consistency, and integrity of data
- Design, implement, and maintain data models that can support the organization's data storage and analysis needs
- Deliver technical and functional specifications to support data governance and knowledge sharing
Requirements for this role:
- B.Tech degree in Computer Science or Software Engineering, or an equivalent combination of relevant work experience and education
- 3 to 6 years of experience delivering consulting services to medium and large enterprises
- Prior implementations must have included Data Warehousing or Big Data consulting for mid-to-large-sized organizations
- Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate those into a data architecture
- Strong experience with Snowflake and Data Warehouse architecture
- SnowPro Core certification is highly desired
- Hands-on experience with Python (pandas, DataFrames, functions)
- Hands-on experience with SQL (stored procedures, functions), including debugging, performance optimization, and database design
- Strong experience with Apache Airflow and API integrations
- Solid experience with at least one ETL tool (Informatica, Talend, SAP BODS, DataStage, Dell Boomi, MuleSoft, Fivetran, Matillion, etc.)
Nice to have:
- Experience with Docker, dbt, data replication tools (SLT, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or Big Data technologies
- Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
- Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions
- Strong presentation and communication skills
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1570229