Job Overview
A leading financial institution is undergoing a technology transformation and is seeking a Data Analytics Engineer to support the development and migration of data pipelines to a modern on-premises data lakehouse, with potential future cloud migration.
Key Responsibilities:
Develop end-to-end data pipelines for structured and unstructured data
Implement data ingestion, transformation, and integration using Spark, SQL, Hadoop, and Kafka
Collaborate with business analysts and stakeholders to refine data requirements
Contribute to data architecture and design decisions
Ensure compliance with security, governance, and data retention policies
Stay updated on emerging data technologies
Support proof-of-concept initiatives and innovation within the team
Requirements:
5+ years' experience with modern data technologies
Strong Python development skills
Hands-on experience with data lake/lakehouse architectures
Proficiency with Hadoop, Spark, Hive, Kafka, the ELK stack, and Cloudera
Expertise in data modelling, design patterns, and scalable applications
Preferred:
Experience in banking/financial sector (financial instruments, trade life cycle, risk management)
Familiarity with Apache Iceberg and related lakehouse technologies
Experience with CI/CD pipelines, automated testing, and Agile methodologies
Exposure to cloud-based solutions (e.g., Databricks, Snowflake, Azure Synapse)
Knowledge of API development, microservices, and .NET/C#