
Job Description

Description :

Position : Staff, Data Engineer - Data Architecture.

Location : Spire T110, Hyderabad Knowledge City, Madhapur, Hyderabad, Telangana, India, 500081.

Employment Type : Full time.

Work Mode : Complete onsite (5 days/week).

Job Description :

Roles & Responsibilities :

- Shape and drive enterprise-wide data architecture strategy : Define and evolve the long-term technical vision for scalable, resilient data infrastructure across multiple business units and domains.

- Lead large-scale, cross-functional initiatives : Architect and guide the implementation of data platforms and pipelines that enable analytics, AI/ML, and BI at an organizational scale.

- Pioneer advanced and forward-looking solutions : Introduce novel approaches in real-time processing, hybrid/multi-cloud, and AI/ML integration to transform how data is processed and leveraged across the enterprise.

- Mentor and develop senior technical leaders : Influence Principal Engineers, Engineering Managers, and other Staff Engineers; create a culture of deep technical excellence and innovation.

- Establish cross-org technical standards : Define and enforce best practices for data modeling, pipeline architecture, governance, and compliance at scale.

- Solve the most complex, ambiguous challenges : Tackle systemic issues in data scalability, interoperability, and performance that impact multiple teams or the enterprise as a whole.

- Serve as a strategic advisor to executive leadership : Provide technical insights to senior executives on data strategy, emerging technologies, and long-term investments.

- Represent the organization as a thought leader : Speak at industry events/conferences, publish thought leadership, contribute to open source and standards bodies, and lead partnerships with external research or academic institutions.

Technical Skills :

- 15+ years of experience.

- Mastery of data architecture and distributed systems at enterprise scale : Deep experience with GCP.

- Advanced programming and infrastructure capabilities : Expertise in writing database queries and in Python or Java, along with infrastructure-as-code tools such as Terraform or Cloud Deployment Manager.

- Leadership in streaming and big data systems : Authority in tools such as BigQuery, Dataflow, Dataproc, and Pub/Sub for both batch and streaming workloads (a minimal pipeline sketch follows this list).

- Enterprise-grade governance and compliance expertise : Design and implement standards for data quality, lineage, security, privacy (e.g., GDPR, HIPAA), and auditability across the organization.

- Strategic integration with AI/ML ecosystems : Architect platforms that serve advanced analytics and AI workloads (Vertex AI, TFX, MLflow).

- Exceptional ability to influence across all levels : Communicate technical vision to engineers, influence strategic direction with executives, and drive alignment across diverse stakeholders.

- Recognized industry leader : Demonstrated track record through conference presentations, publications, open-source contributions, or standards development.
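
For context on the streaming and big data expectations above, here is a minimal, illustrative Apache Beam (Python) sketch of a streaming pipeline that reads events from Pub/Sub and appends them to BigQuery. It is not part of the role description, and the project, topic, table, and field names are hypothetical placeholders.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Decode a JSON Pub/Sub payload into a flat row for BigQuery.
    event = json.loads(message.decode("utf-8"))
    return {
        "event_id": event["event_id"],
        "event_ts": event["event_ts"],
        "payload": json.dumps(event),
    }


def run():
    # streaming=True marks the pipeline as unbounded (streaming) for the runner.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # hypothetical topic
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical table, assumed to exist
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()

Packaging a pipeline like this as a Dataflow Flex Template additionally involves a container image and a template spec file; the same code also runs in batch mode when the Pub/Sub source is swapped for a bounded one.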

Must Have Skills :

- Deep expertise in data architecture, distributed systems, and GCP.

- Python or Java; infrastructure-as-code (e.g., Terraform).

- Big data tools : BigQuery (expert level, with experience in performance tuning and UDFs), Dataflow, Dataproc, Pub/Sub (batch + streaming).

- Data governance, privacy, and compliance (e.g., GDPR, HIPAA).

- Expert-level data modeling and architecture, with experience in hybrid architectures.

- Expert-level SQL skills.

- Deep understanding of BigQuery, with experience in partitioning, clustering, and performance optimization (see the table-definition sketch after this list).

- Experience with Cloud Functions, Composer, and Cloud Run; able to write Dataflow Flex Templates.

- Thorough understanding of cloud architecture concepts.
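
To make the BigQuery partitioning and clustering expectation concrete, the sketch below uses the google-cloud-bigquery Python client to define a date-partitioned, clustered table. It is an illustrative example only; the project, dataset, and column names are hypothetical.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

schema = [
    bigquery.SchemaField("event_id", "STRING"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("event_ts", "TIMESTAMP"),
]

table = bigquery.Table("my-project.analytics.events", schema=schema)
# Daily partitioning on the event timestamp keeps scans bounded by date filters.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
# Clustering on customer_id co-locates rows that are commonly filtered or joined on it.
table.clustering_fields = ["customer_id"]

client.create_table(table)

The same layout can also be declared directly in SQL DDL with PARTITION BY and CLUSTER BY clauses.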

