Posted on: 27/01/2026
About Newpage Solutions:
Newpage Solutions is a global digital health innovation company helping people live longer, healthier lives.
We partner with life sciences organizations including pharmaceutical, biotech, and healthcare leaders to build transformative AI and data-driven technologies addressing real-world health challenges.
From strategy and research to UX design and agile development, we deliver and validate impactful solutions using lean, human-centered practices.
We are proud to be Great Place to Work certified for three consecutive years, hold a top Glassdoor rating, and were named among the "Top 50 Most Promising Healthcare Solution Providers" by CIO Review.
As a remote-first company, we foster creativity, continuous learning, and inclusivity, creating an environment where bold ideas thrive and make a measurable difference in people's lives.
Newpage looks for candidates who are invested in long-term impact.
Candidates with a pattern of frequent job changes may not align with the values we prioritize.
Your Mission:
As a Snowflake Data Engineer, your mission is to design and implement robust, scalable, and high-performing data pipelines that empower business teams with reliable, timely, and secure data.
You'll work across cloud and on-prem environments to integrate diverse data sources, optimize Snowflake performance, and enable advanced analytics and reporting.
Your expertise in SQL, Python/R, and modern ETL frameworks will help transform raw data into actionable insights, driving smarter decisions across the organization.
What You'll Do:
- Design and implement high-quality ETL/ELT data pipelines in Snowflake using Python, R, or SQL scripting.
- Develop and maintain data models, schemas, and warehouse structures optimized for performance and scalability.
- Integrate Snowflake with various cloud and on-prem sources via APIs, connectors, and data ingestion frameworks.
- Collaborate with Data Scientists, Analysts, and BI teams to provide reliable, timely data for reporting and analytics.
- Manage and optimize warehouse compute resources, query performance, and cost efficiency.
- Implement robust data governance and quality frameworks, including security, privacy, and compliance standards.
- Support and maintain CI/CD pipelines for data workflows (GitHub).
- Troubleshoot complex data issues and recommend process improvements to enhance data reliability.
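To give a flavor of the ETL/ELT work described above, here is a minimal, illustrative sketch: raw records are loaded first and then transformed inside the warehouse with SQL, as they would be in Snowflake. This is not Newpage's actual stack; sqlite3 stands in for Snowflake, and all table and column names are invented for the example.

```python
import sqlite3

# Raw source records: (order_date, sku, quantity, unit_price).
raw_orders = [
    ("2026-01-27", "SKU-1", 3, 9.99),
    ("2026-01-27", "SKU-2", 1, 24.50),
    ("2026-01-28", "SKU-1", 2, 9.99),
]

# "L" step: land the raw data in the warehouse untouched.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE raw_orders (order_date TEXT, sku TEXT, qty INTEGER, unit_price REAL)"
)
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?, ?)", raw_orders)

# "T" step done in SQL inside the warehouse: aggregate revenue per day.
conn.execute("""
    CREATE TABLE daily_revenue AS
    SELECT order_date, ROUND(SUM(qty * unit_price), 2) AS revenue
    FROM raw_orders
    GROUP BY order_date
""")

print(conn.execute("SELECT * FROM daily_revenue ORDER BY order_date").fetchall())
# [('2026-01-27', 54.47), ('2026-01-28', 19.98)]
```

In a real pipeline the transform step would typically be managed by an orchestrator such as Airflow or dbt, as mentioned below.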
What You Bring:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5-7 years of experience in Data Engineering.
- 3+ years of hands-on experience in Snowflake design, development, and optimization.
- Strong expertise in SQL, data modelling (3NF, star schema, data vault), and performance tuning.
- Proficiency in Python, R or SQL scripting for automation and data transformation.
- Experience with ETL/ELT orchestration tools (Airflow, dbt or equivalent).
- Familiarity with cloud platforms like AWS, Azure, or GCP, and their native data ecosystems.
- Understanding of data governance, access control (RBAC), and encryption in Snowflake.
- Strong analytical mindset with the ability to translate business requirements into technical solutions.
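The data modelling skills listed above can be sketched with a toy star schema: one fact table keyed to dimension tables, queried with a join-and-aggregate pattern typical of warehouse analytics. Again, sqlite3 stands in for Snowflake and every name here is hypothetical.

```python
import sqlite3

# Toy star schema: a sales fact table surrounded by two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, day TEXT);
    CREATE TABLE fact_sales (
        product_id INTEGER REFERENCES dim_product,
        date_id INTEGER REFERENCES dim_date,
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO dim_date VALUES (10, '2026-01-27');
    INSERT INTO fact_sales VALUES (1, 10, 100.0), (2, 10, 50.0), (1, 10, 25.0);
""")

# Typical star-schema query: aggregate the fact, sliced by a dimension attribute.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('Widget', 125.0), ('Gadget', 50.0)]
```

A star schema denormalizes dimensions for fast analytical joins, whereas 3NF keeps entities fully normalized; a data vault layers hubs, links, and satellites between the two.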
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1606618