
Job Description

Description :

Full Stack Developer - AI & Data Applications

Role Overview :

The Full Stack Developer (AI & Data Applications) is a senior engineering role requiring 6+ years of experience to design and deliver highly scalable, modern, data-centric applications that power enterprise innovation.

This position demands strong expertise across the entire application stack, from building intuitive, performant user interfaces to deploying cloud-native microservices that integrate directly with AI/ML models and data pipelines.

The incumbent will play a critical role in translating complex data science models and analytical capabilities into production-ready, business-facing products.

Job Summary :

We are seeking a Full Stack Developer with strong expertise in modern web technologies, backend development, and cloud-native architectures to design and deliver scalable AI-driven and data-centric applications. The ideal candidate will be proficient in React, TypeScript, and Python/Node.js, with hands-on experience deploying and managing containerized applications on AWS (Lambda, ECS/EKS). This role requires close collaboration with AI engineers and data scientists to operationalize models and ensure enterprise-grade security and governance.

Key Responsibilities and Application Deliverables :

- Application Development : Design, develop, and maintain end-to-end full stack applications integrating complex data, AI, and analytics capabilities using modern frameworks and APIs.

- Frontend Engineering : Build intuitive, responsive, and performant user interfaces utilizing modern stacks, specifically React, TypeScript/JavaScript, and styling frameworks like Tailwind CSS (or functionally equivalent modern UI frameworks).

- Backend & API Development : Develop and optimize robust RESTful and GraphQL APIs using Python (FastAPI/Flask), Node.js, or Java, ensuring secure, high-performance integration with AI and data systems.

- Cloud & Infrastructure (DevOps) : Deploy and manage cloud-native applications on AWS, leveraging core services such as Lambda, ECS, EKS, API Gateway, S3, and CloudFront. Implement DevOps best practices for CI/CD, monitoring (Observability), and auto-scaling.

- Data and AI Integration : Work closely with data engineers and data scientists to seamlessly integrate backend services with data pipelines, data warehouses, and AI/ML model serving endpoints (e.g., via SageMaker, Bedrock, or internal Model APIs); an illustrative sketch of this kind of integration follows this list.

- Security & Governance : Implement authentication (AuthN), authorization (AuthZ), and data security measures (e.g., encryption, access controls) strictly aligned with enterprise standards and compliance requirements.

- Collaboration & Delivery : Partner effectively with AI engineers, data scientists, and product managers to successfully translate ambiguous business and technical requirements into structured, scalable, and production-ready applications.

- Continuous Improvement : Adopt and enforce best practices in coding standards, unit/integration testing, observability, and performance tuning across the full stack.
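
To give a concrete, purely illustrative picture of the backend-to-model integration described above, the sketch below assumes a FastAPI service that forwards requests to a SageMaker model endpoint via boto3 behind a simple bearer-token check. The endpoint name, payload shape, and auth handling are assumptions for illustration only, not details of any actual system used in this role.

# Illustrative sketch only: a FastAPI route forwarding a request to a
# hypothetical SageMaker model endpoint. Endpoint name, payload shape,
# and auth scheme are assumptions, not part of an actual codebase.
import json

import boto3
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer
from pydantic import BaseModel

app = FastAPI()
bearer = HTTPBearer()
sagemaker = boto3.client("sagemaker-runtime")

MODEL_ENDPOINT = "demo-model-endpoint"  # hypothetical endpoint name


class ScoreRequest(BaseModel):
    features: list[float]


def require_token(credentials: HTTPAuthorizationCredentials = Depends(bearer)) -> str:
    # Placeholder AuthN check; a real service would validate the token
    # against the enterprise identity provider.
    if not credentials.credentials:
        raise HTTPException(status_code=401, detail="Missing bearer token")
    return credentials.credentials


@app.post("/score")
def score(req: ScoreRequest, _token: str = Depends(require_token)) -> dict:
    # Forward the feature vector to the model endpoint and return its prediction.
    response = sagemaker.invoke_endpoint(
        EndpointName=MODEL_ENDPOINT,
        ContentType="application/json",
        Body=json.dumps({"features": req.features}),
    )
    return json.loads(response["Body"].read())

In practice, a service like this would typically sit behind API Gateway or an ECS/EKS ingress, with tokens validated against the enterprise identity provider and the model endpoint resolved from configuration rather than hard-coded.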

Mandatory Skills & Qualifications :

- Education : Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

- Experience : 6+ years of proven experience in full stack application development.

- Core Proficiency : Strong proficiency in React, TypeScript/JavaScript, and proven expertise in at least one modern backend language : Python or Node.js.

- Cloud Infrastructure : Hands-on experience with AWS Cloud, utilizing services like Lambda, S3, API Gateway, ECS/EKS.

- Containerization & CI/CD : Practical experience with containerization (Docker, Kubernetes) and implementing automated CI/CD pipelines (e.g., GitLab CI, Jenkins, GitHub Actions).

- AI/Data Familiarity : Familiarity with AI/ML integration principles, model APIs, and modern data architectures (Data Lakes, Warehouses).

- Soft Skills : Excellent problem-solving, communication, and collaboration skills.

Preferred Skills :

- Direct experience operationalizing ML models via AWS SageMaker or Bedrock.

- Expertise in GraphQL development (Apollo or similar).

- Knowledge of database technologies such as PostgreSQL, DynamoDB, or Snowflake.

- Familiarity with observability tools like Prometheus, Grafana, or Datadog.

