hirist

Data Engineer - SQL/Python

hashxpert
Multiple Locations
6 - 8 Years

Posted on: 09/10/2025

Job Description




We are seeking a highly skilled and security-conscious System Security Engineer with deep expertise in designing, implementing, and securing scalable data pipelines on Google Cloud Platform (GCP). The ideal candidate will have hands-on experience with GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage, and will be responsible for ingesting and processing real-time data from connected vehicles and IoT devices. This role demands a strong understanding of ETL/ELT workflows, data governance, and cloud-native security practices.



Key Responsibilities :



Data Pipeline Engineering :



- Design and implement scalable, fault-tolerant data pipelines using GCP services including BigQuery, Dataflow, Pub/Sub, and Cloud Storage.



- Develop real-time ingestion frameworks to process data from IoT devices and connected vehicle systems.



- Optimize data flow performance and ensure high availability across distributed systems.
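
A minimal sketch of the kind of streaming ingestion pipeline described above, using Apache Beam on Dataflow to move Pub/Sub telemetry into BigQuery. The project, topic, bucket, and table names are placeholders, not real resources:

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def parse_vehicle_event(message: bytes) -> dict:
    """Decode one JSON telemetry message published by a connected vehicle."""
    event = json.loads(message.decode("utf-8"))
    return {
        "vehicle_id": event["vehicle_id"],
        "event_time": event["timestamp"],
        "speed_kmh": event.get("speed_kmh"),
    }


def run():
    options = PipelineOptions(
        runner="DataflowRunner",            # Dataflow executes the pipeline at scale
        project="example-project",          # placeholder project ID
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
    )
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/vehicle-telemetry")
            | "ParseJson" >> beam.Map(parse_vehicle_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:telemetry.vehicle_events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()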



ETL/ELT Development :



- Build and maintain robust ETL/ELT workflows using Python, SQL, Apache Beam, and Spark.



- Automate data transformation processes and integrate with downstream analytics platforms.



- Ensure modular, reusable, and testable code for data operations.
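
A rough illustration of an ELT-style transformation step: raw events are assumed to already sit in BigQuery (as in the earlier sketch), and a scheduled SQL job reshapes them into a downstream analytics table. Dataset and table names are hypothetical:

from google.cloud import bigquery


def build_daily_vehicle_summary(client: bigquery.Client) -> None:
    """Aggregate raw telemetry into a daily per-vehicle summary table."""
    sql = """
        SELECT
          vehicle_id,
          DATE(event_time) AS event_date,
          AVG(speed_kmh)   AS avg_speed_kmh,
          COUNT(*)         AS event_count
        FROM `example-project.telemetry.vehicle_events`
        GROUP BY vehicle_id, event_date
    """
    job_config = bigquery.QueryJobConfig(
        destination=bigquery.TableReference.from_string(
            "example-project.analytics.daily_vehicle_summary"),
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    client.query(sql, job_config=job_config).result()  # block until the job finishes


if __name__ == "__main__":
    build_daily_vehicle_summary(bigquery.Client(project="example-project"))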



Security & Governance :



- Implement security controls across data pipelines, including IAM policies, encryption, and secure API integrations.



- Monitor and enforce data quality, lineage, and governance standards in the data lake.



- Collaborate with InfoSec teams to align with compliance frameworks such as ISO 27001, NIST, and GDPR.
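
One small example of a pipeline-level security control of the kind listed above: granting a pipeline service account read-only access to a raw-data bucket via an IAM binding. The bucket and service account names are placeholders:

from google.cloud import storage


def grant_readonly_access(bucket_name: str, service_account: str) -> None:
    """Bind roles/storage.objectViewer to one service account on one bucket."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)

    # Request policy version 3 so conditional bindings are preserved if present.
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({
        "role": "roles/storage.objectViewer",
        "members": {f"serviceAccount:{service_account}"},
    })
    bucket.set_iam_policy(policy)


if __name__ == "__main__":
    grant_readonly_access(
        "example-raw-telemetry",
        "dataflow-pipeline@example-project.iam.gserviceaccount.com",
    )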



DevOps & CI/CD :


- Integrate data workflows with CI/CD pipelines using tools like Jenkins, GitHub Actions, or Cloud Build.



- Manage deployment automation and rollback strategies for data infrastructure.



- Maintain version control and environment consistency across dev, test, and production.
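
A sketch of how a data workflow can be gated in CI (Jenkins, GitHub Actions, or Cloud Build): a pytest-style unit test that runs a Beam transform on an in-memory collection, so no GCP resources are needed. The imported module and function are hypothetical, reusing the parse function from the earlier sketch:

import json

import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to

from ingestion_pipeline import parse_vehicle_event  # hypothetical module from the earlier sketch


def test_parse_vehicle_event():
    raw = json.dumps(
        {"vehicle_id": "veh-1", "timestamp": "2025-01-01T00:00:00Z", "speed_kmh": 42}
    ).encode("utf-8")

    expected = [
        {"vehicle_id": "veh-1", "event_time": "2025-01-01T00:00:00Z", "speed_kmh": 42}
    ]

    # Runs on the local direct runner, so it is cheap enough to gate every pull request.
    with TestPipeline() as pipeline:
        output = pipeline | beam.Create([raw]) | beam.Map(parse_vehicle_event)
        assert_that(output, equal_to(expected))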



Monitoring & Troubleshooting :



- Set up alerting and monitoring using Google Cloud Monitoring (formerly Stackdriver), Prometheus, or equivalent tools.



- Troubleshoot pipeline failures, latency issues, and data anomalies.



- Conduct root cause analysis and implement preventive measures.
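
A brief sketch of pipeline-side instrumentation using the Prometheus Python client, one of the "equivalent tools" mentioned above. Metric names and the processing stub are placeholders:

from prometheus_client import Counter, Histogram, start_http_server

RECORDS_FAILED = Counter(
    "pipeline_records_failed_total", "Records that failed parsing or validation"
)
RECORD_LATENCY = Histogram(
    "pipeline_record_latency_seconds", "Per-record processing latency"
)


def process_record(record: dict) -> None:
    """Process one record, recording latency and failures for alerting."""
    with RECORD_LATENCY.time():
        try:
            ...  # actual transformation logic would go here
        except Exception:
            RECORDS_FAILED.inc()
            raise


if __name__ == "__main__":
    start_http_server(9100)  # expose /metrics for Prometheus to scrape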



Required Skills & Qualifications :



- Bachelor's or Master's degree in Computer Science, Information Security, or a related field.



- 5+ years of experience in cloud-based data engineering and security.



- Strong proficiency in Python, SQL, and GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Storage).



- Experience with Apache Beam, Spark, and Airflow.



- Familiarity with API integration, CI/CD practices, and infrastructure-as-code (Terraform preferred).



- Knowledge of cloud security principles and data governance frameworks.



- Excellent problem-solving, communication, and documentation skills.



Preferred Certifications :



- Google Cloud Professional Data Engineer or Professional Cloud Security Engineer



- Certified Information Systems Security Professional (CISSP)



- ITIL Foundation or equivalent

