Posted on: 13/12/2025
Description:
Requirements:
- 8+ years as a hands-on Solutions Architect and/or Data Engineer, designing and implementing data solutions.
- Experience as a team lead and/or mentoring other engineers.
- Ability to take end-to-end technical solutions into production and to help ensure performance, security, scalability, and robust data integration.
- Programming expertise in Java, Python, and/or Scala.
- Experience with core cloud data platforms, including Snowflake, AWS, Azure, Databricks, and GCP, and the ability to write, debug, and optimize SQL queries.
- Client-facing written and verbal communication skills and experience.
- Ability to create and deliver detailed presentations.
- Ability to produce detailed solution documentation (e.g., POCs, roadmaps, sequence diagrams, class hierarchies, logical system views).
- 4-year Bachelor's degree in Computer Science or a related field.
Experience with any of the following is preferred:
- Production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks.
- Cloud and distributed data storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems.
- Data integration technologies: Spark, Kafka, event streaming, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or other data integration technologies.
- Experience with multiple data sources (e.g., queues, relational databases, files, search, APIs).
- Complete software development lifecycle experience, including design, documentation, implementation, testing, and deployment.
- Automated data transformation and data curation: dbt, Spark, Spark Streaming, automated pipelines.
- Workflow management and orchestration: Airflow, AWS Managed Airflow, Luigi, NiFi.
Posted in: Data Engineering
Functional Area: Technical / Solution Architect
Job Code: 1589391