Posted on: 04/12/2025
Position Overview:
We are seeking a hands-on Data Engineering Architect to design and build scalable data pipelines, implement cutting-edge Generative AI features, and architect robust data solutions. This role requires a technical leader who can translate business requirements into sophisticated data architectures while actively contributing to code development and system implementation.
Roles & Responsibilities:
Architecture & Design:
- Design and implement end-to-end data pipelines supporting batch and real-time processing
- Architect scalable data solutions using modern cloud-native patterns and microservices
- Develop comprehensive data strategies integrating traditional databases with cloud data platforms
- Lead technical decision-making for data platform evolution and technology stack optimization
Generative AI & Machine Learning:
- Build and deploy Generative AI features using AWS Bedrock foundation models
- Implement RAG (Retrieval-Augmented Generation) architectures with vector databases
- Design ML inference pipelines with proper monitoring, scaling, and cost optimization
- Integrate AI/ML capabilities into existing data workflows and applications
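To give a flavor of the RAG work described above, here is a minimal sketch of the retrieval step: rank documents by bag-of-words cosine similarity, then ground a prompt in the top hit. This is purely illustrative; the corpus, query, and function names are invented, and a production system in this role would use a vector database and an AWS Bedrock foundation model rather than this toy scorer.

```python
# Toy RAG retrieval sketch (illustrative only; not a production design).
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, corpus: list[str], k: int = 1) -> list[str]:
    """Return the k corpus documents most similar to the query."""
    qv = Counter(query.lower().split())
    scored = [(cosine(qv, Counter(doc.lower().split())), doc) for doc in corpus]
    return [doc for _, doc in sorted(scored, reverse=True)[:k]]

# Invented mini-corpus standing in for an indexed document store.
corpus = [
    "Snowflake stores warehouse data in micro-partitions",
    "Kafka topics carry event streams between services",
    "Airflow DAGs schedule batch pipeline runs",
]
context = retrieve("how are event streams moved between services", corpus)
# The retrieved context would then be injected into the model prompt.
prompt = f"Answer using this context:\n{context[0]}\n\nQuestion: ..."
```

In a real deployment the embedding, indexing, and generation steps would each be separate services with their own scaling and monitoring, per the pipeline bullets above.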
Hands-On Development:
- Write production-quality code in Java and Python for data processing and API development
- Develop data transformation logic, ETL/ELT processes, and data quality frameworks
- Implement event-driven architectures using messaging systems and stream processing
- Build and maintain data integration connectors and APIs
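As a small illustration of the "data transformation logic ... and data quality frameworks" bullet, the sketch below normalizes raw events and gates them through a quality check before loading. The record shape and rules are invented for illustration; a real pipeline here would read from a source system and write to a warehouse such as Snowflake.

```python
# Minimal ETL transform + data-quality gate (illustrative sketch only).
from datetime import datetime

def transform(record: dict) -> dict:
    """Normalize one raw event into the target schema."""
    return {
        "user_id": int(record["user_id"]),
        "event": record["event"].strip().lower(),
        "ts": datetime.fromisoformat(record["ts"]),
    }

def passes_quality(record: dict) -> bool:
    """Reject rows that would corrupt downstream analytics."""
    return record["user_id"] > 0 and record["event"] != ""

# Invented raw events; the second fails the quality gate.
raw = [
    {"user_id": "42", "event": " Login ", "ts": "2025-04-01T09:30:00"},
    {"user_id": "-1", "event": "click", "ts": "2025-04-01T09:31:00"},
]
clean = [r for r in (transform(x) for x in raw) if passes_quality(r)]
```

Keeping transform and quality rules as small pure functions like this makes them unit-testable, which is what lets a quality framework run in CI before data ever reaches the warehouse.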
Data Platform Management:
- Optimize data storage strategies across relational databases and cloud data warehouses
- Implement data governance, security, and compliance frameworks
- Monitor and optimize system performance, reliability, and cost efficiency
- Establish CI/CD pipelines for data infrastructure and applications
Required Technical Skills:
Programming Languages:
- Java: 5+ years of experience with Spring Framework, microservices, and enterprise applications
- Python: Strong proficiency in data processing libraries (Pandas, NumPy), API frameworks (FastAPI, Flask)
Cloud & AWS Services:
- AWS Bedrock: Experience with foundation models, model fine-tuning, and inference endpoints
- Core AWS Services: S3, EC2, Lambda, IAM, VPC, CloudFormation/CDK
- Messaging & Streaming: SQS, SNS, Kinesis, and Apache Kafka
- Search & Analytics: OpenSearch/Elasticsearch for full-text search and analytics
Database Technologies:
- Snowflake: Data warehouse design, performance optimization, and integration patterns
- MySQL: Advanced SQL, query optimization, replication, and high-availability configurations
- SQL Server: T-SQL, stored procedures, SSIS/ETL development, and performance tuning
Data Engineering Tools:
- Workflow orchestration (Apache Airflow, AWS Step Functions, or similar)
- Data processing frameworks (dbt, Apache Spark, Dask, or similar)
- Container technologies (Docker, Kubernetes, ECS/EKS)
- Version control and CI/CD (Git, Jenkins, GitLab CI, Bitbucket, etc.)
Education / Qualifications:
Experience & Background:
- 7+ years in data engineering, software architecture, or related technical roles
- 3+ years of hands-on experience with AWS services in production environments
- Experience with large-scale data processing (TB/PB scale datasets)
- Background in building real-time analytics or ML-powered applications
Domain Knowledge:
- Experience building reporting solutions that provide insights to end users
- Experience working with LLMs and building user-facing ML features
- Understanding of Quality Management Systems, data privacy regulations (GDPR, CCPA), and security best practices
- Experience with data mesh, data fabric, or modern data architecture patterns
- Knowledge of DevOps practices and infrastructure-as-code methodologies
- Familiarity with monitoring and observability tools (CloudWatch, Datadog, ELK stack)
Soft Skills:
- Strong analytical and problem-solving abilities with attention to detail
- Excellent communication skills with ability to explain complex technical concepts
- Experience mentoring junior developers and leading technical initiatives
- Collaborative mindset with cross-functional teams (product, data science, engineering)
- Proven ability to take proofs of concept (POCs) to production-grade solutions
Education Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related technical field
- Master's degree preferred; equivalent hands-on experience may substitute
- Relevant AWS certifications (Solutions Architect, Data Engineer) are a plus
What You'll Build:
- Scalable data pipelines processing millions of events daily
- GenAI-powered features enhancing user experiences and business insights
- Real-time analytics systems supporting critical business decisions
- Robust data infrastructure supporting multiple business units and use cases
Growth Opportunities:
- Lead architecture decisions for next-generation data platform
- Drive adoption of emerging AI/ML technologies and best practices
- Mentor and grow a team of talented data engineers
- Shape data strategy and technical roadmap for the organization
Posted in: Data Engineering
Functional Area: Technical / Solution Architect
Job Code: 1584439