hirist

Job Description

Role : Sr. Data Engineer.


Location : Bangalore Onsite.


Duration : Full-Time.


Timings : 1-10 PM IST (Cab; Food & other facilities provided).


Experience : 6-8 yrs.


Key Skills : Snowflake, SQL, Python, ETL, Matillion / Informatica.


Desired Skills : PowerBI, CRM (Salesforce), ERP (Oracle).


About the Role :


We are seeking an experienced and proactive Senior Data Engineer to drive the design, development, and optimization of our enterprise data infrastructure, with a core focus on Snowflake. You will lead complex data integration projects from CRM (preferably Salesforce), ERP systems (such as Oracle), and other enterprise sources. This role will play a critical part in shaping our data platform, ensuring scalability, performance, and compliance to support advanced analytics and AI initiatives.


Required Qualifications :


- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related technical field.

- 6+ years of hands-on experience as a Data Engineer, with a proven track record of delivering production-grade data solutions.

- Expertise in Snowflake : performance tuning, data modelling, security features, and best practices.

- Deep experience integrating complex systems such as CRM (Salesforce) and ERP platforms (Oracle, NetSuite, etc.).

- Advanced proficiency in SQL, including optimization, CTEs, window functions, and complex joins.

- Strong experience with Python for building and orchestrating data pipelines.

- Expertise with data pipeline tools or custom-built ETL frameworks.

- Solid experience in cloud ecosystems (AWS, Azure, GCP), including storage, compute, and serverless services.

- Strong understanding of data governance, metadata management, and data cataloguing solutions.

- Certifications in Snowflake, Salesforce Data Architecture, or cloud architecture (AWS, Azure, GCP).

- Experience working with real-time data ingestion tools (Kafka, Kinesis, Pub/Sub).

- Knowledge of Data Lakehouse architecture and experience with Delta Lake, Apache Iceberg, or similar technologies.

- Familiarity with MLOps or experience supporting Data Science initiatives is a plus.


Responsibilities :


- Architect, develop, and optimize highly scalable, reliable, and secure data pipelines and workflows on Snowflake.

- Lead and oversee the integration of data from CRM (Salesforce), ERP platforms, and other third-party systems into the enterprise data warehouse.

- Define best practices for data modelling, data quality, security, and operational efficiency.

- Collaborate with cross-functional teams (Product, Engineering, BI, Data Science) to understand business needs and deliver comprehensive data solutions.

- Mentor and guide junior and mid-level data engineers through code reviews, technical guidance, and architectural discussions.

- Evaluate and recommend modern data tools, frameworks, and patterns (e.g., ELT, Data Mesh, Data Vault modelling).

- Implement and manage orchestration tools and CI/CD pipelines for data engineering projects.

- Own monitoring, logging, and alerting strategies for data pipelines, ensuring high uptime and performance.

- Ensure compliance with data governance, privacy, and security standards (GDPR, HIPAA, SOX).

- Contribute to technical documentation, runbooks, and knowledge-sharing initiatives within the team.


Soft Skills :


- Strong leadership capabilities with a hands-on attitude.

- Excellent problem-solving and decision-making abilities.

- Strong written and verbal communication skills, with the ability to explain technical concepts to business stakeholders.

- Ability to balance speed and quality, pushing for sustainable engineering excellence.

- A mindset of continuous learning and innovation.

