Posted on: 19/01/2026
Description:
Work you'll do:
- Lead discussions with business and functional analysts to understand requirements and assess integration impacts on business architecture.
- Prepare technical architecture and design, clarify requirements, and resolve ambiguities.
- Develop solutions in line with established technical designs, integration standards, and quality processes.
- Create and enforce technical design and development guides, templates, and standards.
- Facilitate daily scrum meetings, manage deliverables, and prepare weekly status reports for leadership review.
- Conduct detailed deliverable reviews and provide technical guidance to team members.
- Collaborate with onsite clients, coordinators, analysts, and cross-functional teams.
- Design templates or scripts to automate routine development or operational tasks.
- Analyze legacy data sources and architect migration strategies to modern platforms (e.g., cloud databases, data lakes).
- Develop ETL (Extract, Transform, Load) pipelines to move and transform data efficiently.
- Ensure data integrity, quality, and security throughout the migration process.
- Design and implement scalable, cloud-native data solutions (e.g., AWS, Azure, GCP).
- Build and maintain data models, APIs, and microservices that support modernized applications.
- Collaborate with application developers to ensure seamless data access and performance.
- Evaluate existing data infrastructure, identifying technical debt and modernization opportunities.
- Document data flows, dependencies, and business logic embedded in legacy systems.
- Tune data pipelines and storage solutions for speed, reliability, and cost-effectiveness.
- Implement monitoring and alerting for data operations.
- Work closely with business analysts, application architects, and cloud engineers.
- Translate business requirements into technical data solutions.
- Provide guidance and best practices for data modernization.
- Develop, test, and refine prompts to elicit desired responses from AI models.
- Experiment with prompt structures, context windows, and input formatting to improve output quality.
- Test and compare different models and APIs to select the best fit for specific tasks.
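The ETL work described above can be pictured with a minimal sketch. The example below is purely illustrative, not part of the role's actual stack: it assumes a CSV source and a SQLite target, and the `customers` table and its fields are hypothetical.

```python
# Minimal extract-transform-load sketch: CSV source -> cleaned rows -> SQLite.
# All names (file, table, columns) are illustrative assumptions.
import csv
import sqlite3

def extract(path):
    """Read rows from a CSV source file as dictionaries."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Normalize values and drop rows missing an id (a basic quality gate)."""
    return [
        {"id": int(r["id"]), "name": r["name"].strip().title()}
        for r in rows
        if r.get("id")
    ]

def load(rows, conn):
    """Write transformed rows into the target table idempotently."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO customers (id, name) VALUES (:id, :name)", rows
    )
    conn.commit()
```

In a production migration the same extract/transform/load stages would typically run on an ETL platform such as those named below (Informatica, Talend, Apache NiFi) rather than hand-rolled scripts, with validation and monitoring wrapped around each stage.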
The team:
Engineering as a Service provides complete design, implementation, and technology operations, leveraging our core engineering expertise. We transform engineering teams, modernize technology, and deliver complex programs with a product engineering approach. Our flexible delivery models (traditional teams, pools, or pods) are tailored to each client's needs, offering engineering-led advisory, implementation, and operational capabilities to accelerate innovation.
Qualifications
Must-Have Skills / Project Experience / Certifications:
- 10+ years of industry experience, including at least 6 years in database solution development, architecture, and programming, with a focus on data migration or modernization projects.
- At least 4 years' experience leading technical teams in database migration, modernization, or cloud adoption projects.
- Demonstrated expertise in executing full data migration lifecycles, including requirements gathering, planning, schema conversion, data transformation, migration, and validation for large-scale systems.
- Hands-on experience with relational database systems (such as DB2, Oracle, MS SQL, PostgreSQL), including schema design and optimization.
- Proficiency in SQL, Python, and/or Scala for data engineering tasks.
- Experience with ETL tools (e.g., Informatica, Talend, Apache NiFi).
- Experience with scripting languages for data extraction, transformation, automation, and performance benchmarking (shell scripting, Windows PowerShell, PySpark).
- Hands-on experience with cloud platforms (AWS, Azure, GCP) and their data services.
- Familiarity with big data technologies (e.g., Spark, Hadoop) and data warehousing (e.g., Snowflake, Redshift).
- Proven track record in migrating legacy data systems to cloud or modern architectures.
- Understanding of data governance, security, and compliance in modernization contexts.
- Strong problem-solving and analytical abilities.
- Excellent communication and collaboration skills.
- Ability to work in agile, cross-functional teams.
- Commitment to staying current with advancements in generative AI, LLMs, and prompt engineering techniques.
- Relevant certifications (e.g., AWS Certified Data Analytics, Azure Data Engineer Associate).
Education:
BE/B.Tech/M.C.A./M.Sc (CS) degree or equivalent from an accredited university.
Location:
Bengaluru/Hyderabad
Posted in
Data Engineering
Functional Area
Big Data / Data Warehousing / ETL
Job Code
1602927