hirist

Job Description


About Trademo

At Trademo, we are transforming global trade and supply chains by leveraging cutting-edge AI technology to provide businesses with unparalleled visibility, compliance, and intelligence solutions.

Our AI-driven platform simplifies the complexities of international trade, helping companies mitigate risks, enhance efficiency, and make data-driven decisions with confidence.

Our AI-Enhanced Solutions:

- Trademo Intel: AI-powered trade intelligence to uncover market trends and competitive insights.

- Trademo Sanctions Screener: AI-driven screening against 650+ global sanctions and PEP lists.

- Trademo Global Trade Compliance: Real-time regulatory and tariff data for 140+ countries, with AI workflows for HS/ECN classification, controls determination, and licensing.

- Trademo Map: AI-powered global supply chain mapping and screening, detecting risks such as forced labor (UFLPA) and sanctions exposure in deep-tier networks.

- Trademo TradeScreen: AI-powered platform for trade transaction digitization, financial crime screening, and compliance.

Key Responsibilities:

- Oversee and contribute to the design and development of scalable systems for ETL/ELT, batch processing, and real-time streaming.

- Work closely with cloud/DevOps teams to optimize AWS infrastructure, deployment pipelines, and cost efficiency.

- Build robust pipelines for data ingestion, transformation, modeling, quality checks, and lineage tracking.

- Lead, mentor, and grow a team of data engineers; drive a culture of ownership, technical excellence, and collaboration.

- Act as the technical backbone of the data engineering team, actively contributing to architecture, design, and complex engineering tasks.

- Solve high-impact technical problems related to scalability, reliability, and data quality.

- Participate in writing/reviewing code when needed, especially for critical components or new system foundations.

- Contribute to long-term data platform vision and drive continuous improvement.

Desired Profile:

- Bachelor's degree in Computer Science, Engineering, Information Technology, or an equivalent field is required.

- 9-12 years of overall experience in software/data engineering.

- At least 2 years of experience in technical leadership or engineering management roles.

- Strong expertise in data engineering, distributed systems, or backend platform development.

- Familiarity with Spark, Apache Kafka/MSK, Airflow, Kubernetes.

- Exposure to ML pipelines, analytics engineering, or data governance frameworks.

