Posted on: 29/01/2026
Key Responsibilities:
- Snowflake Instance Management: Set up, configure, and manage three to four new Snowflake data warehouse instances per quarter, tailored to support emerging products and analytics requirements.
- Segment Integration: Establish and maintain reliable connections between Segment and Snowflake, ensuring that data flow is accurate, timely, and consistent across all integrated products.
- Feature Exploration and Adoption: Proactively learn about new and existing Segment features; assess, test, and implement them to continuously improve the quality and utility of data being funneled into Snowflake.
- Collaboration with Data Engineers: Work closely with two data engineers based in Dallas, who support the Marketing team's Segment/Snowflake needs.
- Team Meetings and Communication: Attend one or two meetings each week, primarily scheduled in US time zones.
Primary Duties & Tasks:
- Coordinate with product managers and analytics leads during the design and rollout of new Snowflake instances, ensuring each instance meets product-specific and organizational requirements.
- Evaluate business needs and recommend optimal schema designs and data models in Snowflake, in partnership with data engineers and analytics stakeholders.
- Develop and maintain automated pipelines for ingesting, transforming, and validating data from Segment into Snowflake.
- Monitor data pipelines for latency, errors, and data quality issues, responding swiftly to resolve problems and minimize downtime.
- Document setup processes, data flows, and schema definitions to ensure ongoing maintainability and knowledge sharing across teams.
- Stay current on developments in Segment and Snowflake platforms, exploring pilot programs or beta features that could provide analytic advantages.
- Propose and implement new processes to streamline data onboarding for future products, reducing manual effort and increasing reliability.
- Participate in code reviews and contribute to the documentation and refinement of engineering standards related to data integration.
- Communicate findings, risks, and opportunities clearly to both technical and non-technical stakeholders.
Required Skills & Qualifications:
- Technical Expertise: Demonstrated experience setting up and managing Snowflake data warehouse environments, including user roles, security, and advanced data modeling.
- Segment Mastery: Familiarity with Segment (or similar CDP tools), including source and destination setup, event tracking, and feature utilization to ensure high data fidelity.
- Data Engineering Fundamentals: Proficiency in SQL, data transformation, and ETL pipeline management, along with an understanding of data quality best practices and troubleshooting methodologies.
- Collaboration & Communication: Proven ability to collaborate with remote engineers and cross-functional teams, and to communicate complex technical concepts with clarity and confidence.
- Documentation & Process: Strong skills in documenting procedures, data schemas, and integration standards for a technical audience.
- Project Management: Organized and detail-oriented, able to manage multiple parallel product launches and integration tasks efficiently.
Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or related field.
- Experience working in a fast-paced, product-centric environment and supporting marketing analytics teams.
- Understanding of privacy requirements (such as GDPR or CCPA) as they relate to data collection and warehousing.
- Familiarity with additional data pipeline tools (e.g., dbt, Airflow) and cloud environments (AWS, GCP, or Azure).
- Experience with business intelligence platforms and data visualization tools is a plus.
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1607431