Posted on: 28/07/2025



Job Summary:
We are seeking an experienced and highly skilled Informatica Big Data Developer with a strong background in big data technologies, security architecture, and distributed systems. The ideal candidate will possess hands-on experience in managing large-scale data processing platforms, ensuring robust data governance, and implementing secure data pipelines in compliance with organizational and regulatory standards.
This role is ideal for someone who thrives in a fast-paced, collaborative environment and is passionate about integrating cutting-edge data solutions with resilient security protocols.
Key Responsibilities:
- Design, develop, and maintain scalable data integration pipelines using Informatica Big Data Management (BDM).
- Manage and optimize Hive, HQL, and HDFS usage, including advanced storage controller configurations and data partitioning strategies.
- Develop and schedule workflows using Apache Oozie and other orchestration tools.
- Work with both SQL and NoSQL databases to enable data ingestion, transformation, and storage across various environments.
- Design and implement shell scripts and automated workflows for data operations on Linux platforms.
- Build resilient data solutions using modern design patterns, implementing reusable and secure components.
- Collaborate with DevOps and Agile teams, using tools such as GitHub, Jenkins, Maven, and Ansible, for continuous integration and deployment pipelines.
- Implement robust security protocols for data movement, storage, and processing including encryption, access control, auditing, and data masking.
- Identify and mitigate security design gaps; work closely with security and compliance teams to ensure all data workflows adhere to internal and external regulatory requirements.
- Implement wrapper solutions around third-party or legacy components lacking native security controls to ensure compliance.
- Monitor distributed services for performance, availability, and security using enterprise-grade monitoring and alerting tools (e.g., Splunk, Prometheus, Grafana).
- Design and maintain data transfer mechanisms via CRON jobs, ETLs, and custom scripts using JDBC/ODBC.
- Apply strong knowledge of networking principles, including DNS, proxies, ACLs, and policy enforcement, to troubleshoot connectivity and data flow issues.
- Ensure secure logging practices, avoiding sensitive data exposure (e.g., no PII or card numbers in logs or memory).
- Participate in code reviews, documentation efforts, and knowledge sharing across the engineering team.
Required Skills & Experience:
- Minimum of 5 years of development and design experience in Informatica Big Data Management (BDM).
- Proficiency in big data ecosystem tools: Hive, HDFS, Oozie, HBase, Kafka, Spark, etc.
- Strong Linux skills including shell scripting, troubleshooting, and process management.
- Experience with CI/CD pipelines, especially in distributed data environments.
- Solid understanding of security architectures for enterprise data platforms.
- Familiarity with compliance frameworks such as GDPR, HIPAA, PCI-DSS, and related data handling standards.
- Expertise in data anonymization, encryption, and policy-based data control at scale.
- Demonstrated ability to work in agile teams, participate in stand-ups, and handle multiple concurrent priorities.
- Strong analytical and problem-solving skills, with a proactive and detail-oriented mindset.
Preferred Qualifications:
- Certifications in Informatica, Hadoop, or security frameworks (e.g., CISSP, CISM, or cloud security certs).
- Experience with cloud-based big data platforms (AWS EMR, Azure Data Lake, GCP BigQuery).
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Experience integrating with SIEM tools and log analysis frameworks.
- Background in banking, fintech, or other highly regulated industries.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1520743