Mid Fullstack Developer
Leeds - hybrid
Salary: £40k - £60k D.O.E

Fruition are working with a reputable tech-first business that is embarking on a large-scale transformation in a completely greenfield environment. This client is looking to bring in a Mid-level Fullstack Developer to drive the technical direction of a greenfield initiative, designing and building scalable solutions.

What will I be doing?
- Turning concepts and requirements into bespoke software solutions.
- Working closely with the agile teams to meet software deliverables.
- Supporting continuous integration to enhance deployment.
- Building and deploying CI/CD pipelines.
- Working closely with the senior development team.
- Writing high-performance, scalable code.
- Working in an AWS environment.

Key requirements:
- End-to-end software development experience.
- Strong communication skills and stakeholder engagement experience.
- Modern tech stack experience - JavaScript, TypeScript, Python, Java, Go.
- Strong experience of Infrastructure as Code - Terraform or Helm.
- Cloud knowledge - AWS, Azure or GCP.
- Strong experience of platform engineering, working with CI/CD pipelines.
- Strong experience in microservices and REST APIs.

If this role sounds of interest, please apply and someone will be in touch regarding the role. We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation, or age.
07/03/2025
Full time
Senior Fullstack Developer
Leeds - hybrid
Salary: £60k - £80k D.O.E

Fruition are working with a reputable tech-first business that is embarking on a large-scale transformation in a completely greenfield environment. This client is looking to bring in a Senior Fullstack Developer to drive the technical direction of a greenfield initiative, designing and building scalable solutions.

What will I be doing?
- Turning concepts and requirements into bespoke software solutions.
- Working closely with the agile teams to meet software deliverables.
- Supporting continuous integration to enhance deployment.
- Building and deploying CI/CD pipelines.
- Coaching and mentoring more junior developers.
- Writing high-performance, scalable code.
- Performing code reviews.
- Working in an AWS environment.

Key requirements:
- End-to-end software development experience.
- Strong communication skills and stakeholder engagement experience.
- Modern tech stack experience - JavaScript, TypeScript, Python, Java, Go.
- Strong experience of Infrastructure as Code - Terraform or Helm.
- Cloud knowledge - AWS, Azure or GCP.
- Strong experience of platform engineering, working with CI/CD pipelines.
- Strong experience in microservices and REST APIs.

If this role sounds of interest, please apply and someone will be in touch regarding the role. We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation, or age.
07/03/2025
Full time
Knowledge Engineer, Fully Remote, £80,000 - £100,000 per annum

My client, a leading AI solutions company, is seeking a Mid-Senior Python Backend Engineer with a passion for knowledge graphs and semantic web technologies. In this role, you will own the full back-end development for an RDF-intensive platform - designing and optimising systems around triple stores (AWS Neptune), real-time data processing and validation with SHACL, and advanced query capabilities. You will integrate AI-driven SPARQL generation models (LLMs/NLP) to enable intelligent querying of the knowledge graph. Working in a cross-functional squad of 3-8 team members using a Lean Kanban approach, you'll collaborate closely with product, data scientists, and DevOps to deliver high-quality features in a fast-paced, agile environment.

Key Responsibilities:
- Design and Develop Knowledge Graph Backends: Build robust back-end services to manage RDF data in triple stores (AWS Neptune) and vector embeddings in Milvus. Ensure real-time processing of graph data, including on-the-fly validation with SHACL to maintain data integrity.
- SPARQL Query Implementation & AI Integration: Create efficient SPARQL queries and endpoints for data retrieval. Integrate NLP/AI models (e.g. Hugging Face transformers, OpenAI APIs, LlamaIndex AgentFlow) to translate natural language into SPARQL queries, enabling AI-driven query generation and semantic search.
- API & Microservices Development: Develop and maintain RESTful APIs and GraphQL endpoints (using FastAPI or Flask) to expose knowledge graph data and services. Follow microservices architecture best practices to ensure components are modular, scalable, and easy to maintain.
- Database & State Management: Manage data storage solutions including PostgreSQL (for application/session state) and caching layers as needed. Use SQLAlchemy or a similar ORM for efficient database interactions and maintain data consistency between the relational and graph data stores.
- Performance Optimisation & Scalability: Optimise SPARQL queries, data indexing (including vector indices in Milvus), and service architecture for low-latency, real-time responses. Ensure the system scales to handle growing knowledge graph data and high query volumes.
- DevOps and Deployment: Collaborate with DevOps to containerise and deploy services using Docker and Kubernetes. Implement CI/CD pipelines for automated testing and deployment. Monitor services on cloud platforms (AWS/Azure) for reliability, and participate in performance tuning and troubleshooting as needed.
- Team Collaboration: Work closely within a small, cross-functional squad (engineers, QA, product, data scientists) to plan and deliver features. Participate in Lean Kanban rituals (e.g. stand-ups, continuous flow planning) to ensure steady progress. Mentor junior developers when necessary and uphold best practices in code quality, testing, and documentation.

Required Skills and Experience:
- Programming Languages: Strong proficiency in Python (back-end development focus). Solid experience writing and optimising SPARQL queries for RDF data.
- Knowledge Graph & Semantic Web: Hands-on experience with RDF and triple stores - ideally AWS Neptune or similar graph databases. Familiarity with RDF schemas/ontologies and concepts like triples, graphs, and URIs.
- SHACL & Data Validation: Experience using SHACL (Shapes Constraint Language) or similar tools for real-time data validation in knowledge graphs. Ability to define and enforce data schemas/constraints to ensure data quality.
- Vector Stores: Practical knowledge of vector databases such as Milvus (or alternatives like FAISS, Pinecone) for storing and querying embeddings. Understanding of how to integrate vector similarity search with knowledge graph data for enhanced query results.
- Frameworks & Libraries: Proficiency with libraries like RDFLib for handling RDF data in Python and pySHACL for running SHACL validations. Experience with SQLAlchemy (or other ORMs) for PostgreSQL. Familiarity with LlamaIndex (AgentFlow) or similar frameworks for connecting language models to data sources.
- API Development: Proven experience building back-end RESTful APIs (FastAPI, Flask or similar) and/or GraphQL APIs. Knowledge of designing API contracts, versioning, and authentication/authorisation mechanisms.
- Microservices & Architecture: Understanding of microservices architecture and patterns. Ability to design decoupled services and work with message queues or event streams if needed for real-time processing.
- AI/ML Integration: Experience integrating NLP/LLM models (Hugging Face transformers, OpenAI, etc.) into applications. In particular, comfort with leveraging AI to generate or optimise queries (e.g. natural language to SPARQL translation) and working with frameworks like LlamaIndex to bridge AI and the knowledge graph.
- Databases: Strong SQL skills and experience with PostgreSQL (for transactional data or session state). Ability to write efficient queries and design relational schemas that complement the knowledge graph. Basic understanding of how relational data can link to graph data.
- Cloud & DevOps: Experience deploying applications on AWS or Azure. Proficiency with Docker for containerisation and Kubernetes for orchestration. Experience setting up CI/CD pipelines (GitHub Actions, Jenkins, or similar) to automate testing and deployment. Familiarity with cloud services (AWS Neptune, S3, networking, monitoring tools, etc.) is a plus.
- Agile Collaboration: Comfortable working in an Agile/Lean Kanban software development process. Strong collaboration and communication skills to function effectively in a remote or hybrid work environment. Ability to take ownership of tasks and drive them to completion with minimal supervision, while also engaging with the team for feedback and knowledge sharing.
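For context on the SHACL validation workflow described above, the sketch below is a minimal, hypothetical example using RDFLib and pySHACL (both named in the requirements). The namespace, triples and shape are invented purely for illustration; in the role itself the data graph would come from a triple store such as Neptune rather than being built in memory.

```python
from rdflib import Graph, Literal, Namespace, RDF
from pyshacl import validate

EX = Namespace("http://example.org/")  # hypothetical namespace for the demo

# Build a tiny RDF data graph in memory (deliberately containing an invalid value).
data = Graph()
data.add((EX.alice, RDF.type, EX.Person))
data.add((EX.alice, EX.age, Literal("not-a-number")))

# A SHACL shape requiring ex:age to be an integer.
shapes = Graph().parse(data="""
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ex: <http://example.org/> .

ex:PersonShape a sh:NodeShape ;
    sh:targetClass ex:Person ;
    sh:property [ sh:path ex:age ; sh:datatype xsd:integer ] .
""", format="turtle")

conforms, _, report_text = validate(data, shacl_graph=shapes)
print(conforms)      # False for this example
print(report_text)   # human-readable validation report
```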
07/03/2025
Full time
Data Engineering Manager required by our financial services client in Manhattan, New York City.
(Keywords: Architecture, Architect, Solutions, Java, Python, Automation, Data Lake, Datalake, Data Mesh, CI/CD, Big Data, AWS, SQL, Oracle, Kafka, Apache Iceberg, Hudi, Finance, Trading, Financial Services, Banking, Remote Working, Governance, Management, Regulation)

You MUST have the following:
- Good experience as a hands-on Data Engineering Manager/Architect/Technical Lead
- Excellent design and architecture ability for systems involving large amounts of data
- Advanced Java
- Amazon Web Services (AWS) or GCP
- CI/CD pipelines
- TDD
- Enterprise-scale SQL or Oracle
- Terraform, Kubernetes, Docker

The following is DESIRABLE, not essential:
- Experience delivering projects in data management, governance and regulation
- Python
- An understanding of data mesh architecture
- Kafka, Iceberg, Hudi

Role: You will be hired to be the technical lead and co-manager of a new team that is being assembled to build a new data management platform on AWS. The greenfield project will include the automation of data catalogue population and the implementation of data governance policies. You will be the lead engineer/manager/solutions architect in a team that has a senior developer, a mid-level developer and a business lead. You and the business lead will share responsibility for the team. He will be responsible for the interpretation of data regulation, the building of road maps and strategy, and the creation of policies. You will own the design, architecture and technical delivery of this strategy and his data policies. Over the course of the next year, you will hire more developers into the team as the workload grows. The technology is Java on AWS with some Python. You will be very hands-on and, as part of a small team, you will also be involved in DevOps and testing. You will be confident with CI/CD pipelines, IaC and containerization. You will also be comfortable with enterprise-scale SQL and/or Oracle databases. As the data environment moves from an AWS-based data lake to a data mesh architecture, any understanding of data mesh would also be highly desirable. You will also contribute to the two other teams in the data engineering space within the company - the data platform team, which operates a Hudi-based data lake, and the team working with Iceberg and Kafka to create the new data mesh architecture - but the data governance programme will be your priority.

Hours are 8.30am - 5.30pm.
Salary: $220k - $260k + 25% Bonus + $25k Share Options
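The platform itself is described as Java on AWS, but as a rough, purely illustrative flavour of the data catalogue automation mentioned above, here is a small, hypothetical Python/boto3 sketch that registers a table in the AWS Glue Data Catalog; the database name, table name and S3 location are invented.

```python
import boto3

# Hypothetical example: register a dataset in the Glue Data Catalog.
glue = boto3.client("glue", region_name="us-east-1")

glue.create_database(DatabaseInput={"Name": "governance_demo_db"})  # invented name

glue.create_table(
    DatabaseName="governance_demo_db",
    TableInput={
        "Name": "trades",  # invented table name
        "TableType": "EXTERNAL_TABLE",
        "StorageDescriptor": {
            "Columns": [
                {"Name": "trade_id", "Type": "bigint"},
                {"Name": "trade_date", "Type": "date"},
                {"Name": "notional", "Type": "double"},
            ],
            "Location": "s3://example-data-lake/trades/",  # invented location
            "InputFormat": "org.apache.hadoop.mapred.TextInputFormat",
            "OutputFormat": "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat",
            "SerdeInfo": {
                "SerializationLibrary": "org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe"
            },
        },
    },
)
print("table registered in the Glue Data Catalog")
```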
07/03/2025
Full time
Lead Data Engineer required by our financial services client in Manhattan, New York City.
(Keywords: Architecture, Architect, Solutions, Java, Python, Automation, Data Lake, Datalake, Data Mesh, CI/CD, Big Data, AWS, SQL, Oracle, Kafka, Apache Iceberg, Hudi, Finance, Trading, Financial Services, Banking, Remote Working, Governance, Management, Regulation)

You MUST have the following:
- Good experience as a Lead Data Engineer/Data Engineering Solutions Architect
- Excellent design and architecture ability for systems involving large amounts of data
- Advanced Java
- Amazon Web Services (AWS) or GCP
- CI/CD pipelines
- TDD
- Enterprise-scale SQL or Oracle
- Terraform, Kubernetes, Docker

The following is DESIRABLE, not essential:
- Experience delivering projects in data management, governance and regulation
- Python
- An understanding of data mesh architecture
- Kafka, Iceberg, Hudi

Role: You will be hired to be the technical lead of a new team that is being assembled to build a new data management platform on AWS. The greenfield project will include the automation of data catalogue population and the implementation of data governance policies. You will be the solutions architect in a team that has a senior developer, a mid-level developer and a business lead. You and the business lead will share responsibility for the team. He will be responsible for the interpretation of data regulation, the building of road maps and strategy, and the creation of policies. You will own the design, architecture and technical delivery of this strategy and his data policies. Over the course of the next year, you will hire more developers into the team as the workload grows. The technology is Java on AWS with some Python. You will be very hands-on and, as part of a small team, you will also be involved in DevOps and testing. You will be confident with CI/CD pipelines, IaC and containerization. You will also be comfortable with enterprise-scale SQL and/or Oracle databases. As the data environment moves from an AWS-based data lake to a data mesh architecture, any understanding of data mesh would also be highly desirable. You will also contribute to the two other teams in the data engineering space within the company - the data platform team, which operates a Hudi-based data lake, and the team working with Iceberg and Kafka to create the new data mesh architecture - but the data governance programme will be your priority.

Hours are 8.30am - 5.30pm.
Salary: $190k - $220k + 25% Bonus + $25k Share Options
07/03/2025
Full time
Data Engineering Solutions Architect required by our financial services client in Manhattan, New York City.
(Keywords: Architecture, Architect, Solutions, Java, Python, Automation, Data Lake, Datalake, Data Mesh, CI/CD, Big Data, AWS, SQL, Oracle, Kafka, Apache Iceberg, Hudi, Finance, Trading, Financial Services, Banking, Remote Working, Governance, Management, Regulation)

You MUST have the following:
- Good experience as a Java Solutions Architect
- Excellent design and architecture ability for systems involving large amounts of data
- Advanced Java
- Amazon Web Services (AWS) or GCP
- CI/CD pipelines
- TDD
- Enterprise-scale SQL or Oracle
- Terraform, Kubernetes, Docker

The following is DESIRABLE, not essential:
- Experience delivering projects in data management, governance and regulation
- Python
- An understanding of data mesh architecture
- Kafka, Iceberg, Hudi

Role: You will be hired to be the technical lead of a new team that is being assembled to build a new data management platform on AWS. The greenfield project will include the automation of data catalogue population and the implementation of data governance policies. You will be the solutions architect in a team that has a senior developer, a mid-level developer and a business lead. You and the business lead will share responsibility for the team. He will be responsible for the interpretation of data regulation, the building of road maps and strategy, and the creation of policies. You will own the design, architecture and technical delivery of this strategy and his data policies. Over the course of the next year, you will hire more developers into the team as the workload grows. The technology is Java on AWS with some Python. You will be very hands-on and, as part of a small team, you will also be involved in DevOps and testing. You will be confident with CI/CD pipelines, IaC and containerization. You will also be comfortable with enterprise-scale SQL and/or Oracle databases. As the data environment moves from an AWS-based data lake to a data mesh architecture, any understanding of data mesh would also be highly desirable. You will also contribute to the two other teams in the data engineering space within the company - the data platform team, which operates a Hudi-based data lake, and the team working with Iceberg and Kafka to create the new data mesh architecture - but the data governance programme will be your priority.

Hours are 8.30am - 5.30pm.
Salary: $220k - $260k + 25% Bonus + $25k Share Options
07/03/2025
Full time
Java Solutions Architect required by our financial services client in Manhattan, New York City.
(Keywords: Architecture, Architect, Solutions, Java, Python, Automation, Data Lake, Datalake, Data Mesh, CI/CD, Big Data, AWS, SQL, Finance, Trading, Financial Services, Banking, Remote Working, Governance, Management, Regulation)

You MUST have the following:
- Good experience as a Java Solutions Architect
- Excellent design and architecture ability for systems involving large amounts of data
- Advanced Java
- Amazon Web Services (AWS)
- CI/CD pipelines
- TDD
- Enterprise-scale SQL or Oracle
- Terraform, Kubernetes, Docker

The following is DESIRABLE, not essential:
- Experience delivering projects in data management, governance and regulation
- Neo4J
- Python
- An understanding of data mesh architecture

Role: You will be hired to be the technical lead of a new team that is being assembled to build a new data management platform on AWS. The greenfield project will include the automation of data catalogue population and the implementation of data governance policies. You will be the solutions architect in a team that has a senior developer, a mid-level developer and a business lead. You and the business lead will share responsibility for the team. He will be responsible for the interpretation of data regulation, the building of road maps and strategy, and the creation of policies. You will own the design, architecture and technical delivery of this strategy and his data policies. Over the course of the next year, you will hire more developers into the team as the workload grows. The technology is Java on AWS with some Python. You will be very hands-on and, as part of a small team, you will also be involved in DevOps and testing. You will be confident with CI/CD pipelines, IaC and containerization. You will also be comfortable with enterprise-scale SQL and/or Oracle databases. As the data environment moves from an AWS-based data lake to a data mesh architecture, any understanding of data mesh would also be highly desirable.

Hours are 8.30am - 5.30pm.
Salary: $180k - $220k + 25% Bonus + $25k Share Options
07/03/2025
Full time
We are looking for a Python AWS Developer to join us as we build our next-generation strategic book of record (VBOR). You will be part of a newly formed team, working closely with US teams to build the next-generation platforms, gain a valuable understanding of the business, and contribute towards achieving significant efficiencies and new capabilities that result in higher returns for our clients. You will participate in the configuration, implementation, and integration of both new and existing technologies and tools. You will work with the latest technologies within a microservice architecture on our public cloud (AWS), utilizing a development stack that includes Python.

Core Responsibilities
1. Provides senior-level system analysis, design, development, and implementation of applications and databases, including third-party product integration.
2. Translates technical specifications into code for complex projects, writes programs, develops code, tests artifacts, and produces reports, ensuring automation support.
3. Elevates code to development, test, and production environments on schedule, provides production support, and submits change control requests with documentation, including peer reviews.
4. Understands software development methodology and architecture standards, trains and mentors less experienced staff, and resolves escalated issues.
5. Participates in design, code, and test inspections throughout the life cycle, explains technical considerations at meetings, and performs systems analysis activities.
6. Understands client business functions and technology needs, with a broad knowledge of the client's technologies, tools, and applications.
7. Interfaces with cross-functional team members and communicates system issues at the appropriate technical level for each audience.
8. Works with business-facing IT teams to deliver new solutions, reviews functional specifications, translates them into program specifications, liaises with end users for acceptance testing, and provides 3rd-line support.
9. Builds thought leadership and expertise in best-practice solution design and implementation.
10. Manages time effectively across multiple projects with competing business demands and priorities.

Primary Skills:
* Experience with microservices and service-oriented architecture. Familiarity with public cloud computing technologies (preferably Amazon Web Services) and proficiency in programming languages such as Python.
* Knowledge of SimCorp's Dimension, IBOR, and Data solution offerings (preferred).
* Agile development experience (using tools like Jira) and a solid understanding of the full software development life cycle.
* Middle Office/financial industry experience (preferably on the buy side and/or sell side).
* Experience in developing applications for a global user base.
* Involvement in transformation projects, demonstrating the ability to drive and manage significant changes within an organization.
* Understanding of Okta Single Sign-On (SAML 2.0).

Languages: Python (libraries: Boto3 & Moto).

AWS technologies
* Lambda (Python 3.7 and above)
* S3 buckets
* Step Functions
* Kinesis and/or Kafka (or similar non-AWS technologies, like Apache Flink, or AWS Glue)
* SNS/SQS
* GraphQL
* RDS or Aurora (Athena is a nice-to-have)
* Redshift
* IAM Roles
* CloudWatch/CloudTrail (or similar observability tools)
* CloudFormation
* API Gateway

Qualifications
* Minimum of five years of related work experience, with at least two years of development experience.
* Undergraduate degree or equivalent combination of training and experience.

We are committed to offering an inclusive recruitment experience. If you require accommodations because of a disability or health condition, please email. This position is being sourced through our Outsourcing service line.
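Since the stack above calls out Boto3 with Moto for testing, here is a minimal, hypothetical sketch of that pattern: a small S3 helper of the kind a Lambda might use, exercised against Moto's mocked AWS backend. The bucket and key names are invented, and the sketch assumes moto >= 5 (which exposes the mock_aws decorator).

```python
import json

import boto3
from moto import mock_aws  # moto >= 5; older versions used per-service decorators


def save_record(bucket: str, key: str, record: dict) -> None:
    """Persist a record to S3 as JSON (illustrative helper, not production code)."""
    s3 = boto3.client("s3")
    s3.put_object(Bucket=bucket, Key=key, Body=json.dumps(record).encode("utf-8"))


@mock_aws
def test_save_record():
    # Moto intercepts the boto3 calls, so no real AWS resources are touched.
    s3 = boto3.client("s3", region_name="us-east-1")
    s3.create_bucket(Bucket="vbor-demo-bucket")  # hypothetical bucket name

    save_record("vbor-demo-bucket", "trades/0001.json", {"id": 1, "qty": 100})

    body = s3.get_object(Bucket="vbor-demo-bucket", Key="trades/0001.json")["Body"].read()
    assert json.loads(body) == {"id": 1, "qty": 100}


if __name__ == "__main__":
    test_save_record()
    print("mocked S3 round-trip OK")
```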
06/03/2025
Full time
Quant C++ Developer, Equities Technology, Trading Systems

Our flagship hedge fund client is looking for a Quantitative C++ Developer with experience working on low-latency execution systems. Please let me know if you might be interested.

Requirements:
- 7+ years of C++ development experience, including multi-threading.
- Experience working for buy-side companies required.
- Strong low-latency understanding and experience.
- Strong experience and understanding of Linux systems.
- Python experience preferred.

Full time, 5 days in office; the client is London based. Please reply ASAP with your CV if interested.

Scope AT acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying to this job you are sending us your CV, which may contain personal information. Please refer to our Privacy Notice to understand how we process this information. In short, in order to supply you with work-finding services, we will hold and process your personal data, and only with your express permission will we share this personal data with a client (or a third party working on behalf of the client) by email or by upload to the client's/third party's vendor management system. By giving us permission to send your CV to a client, you give us permission to share the personal data that would be necessary to consider your application, interview you (phone/video/face to face) and, if successful, hire you. By applying for this job you accept the Terms and Conditions, Data Protection Policy, Privacy Notice and Disclaimers, which can be found on our website.
06/03/2025
Full time
Senior Python Developer - Contract, 6-12 months, Dublin

My client, a leading global firm, is in urgent need of a talented and experienced Senior Python Developer to join them on a contract basis for 6 months, with a view to extensions. You will write reusable, testable, and efficient Python code. You will be expected to design and develop ETL pipelines according to business requirements. You will design and implement robust database solutions using both SQL and NoSQL databases; familiarity with MongoDB is a huge plus. You will use Grafana or other UI tools for data visualisation and to create intuitive UI dashboards for end-user interactions. You will collaborate with the team to plan, design and develop the next phases of the project, potentially involving system migration and integration of additional database systems. You will also ensure consistent use of Git for version control and contribute to the enhancement of development processes.

Successful candidates will have:
- Python programming experience.
- Database skills - SQL/NoSQL databases.
- UI/UX skills for web interfaces in general, ideally with Grafana experience.
- Experience developing an analytics platform - some knowledge of building dashboards and visualisation for analytics would be highly desirable.
- Familiarity with a Linux environment.
- An understanding of version control tools, specifically Git.
- Experience of the full SDLC or Agile methodologies.
- DevOps experience - CI/CD tools like Horizon, Bitbucket, Ansible etc.
- Experience of implementation in Cloud/Containerisation would be a real plus.

If this role sounds of interest, drop me a CV so that we can speak in more detail.
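To give a flavour of the SQL-to-NoSQL ETL work the role describes, here is a minimal, hypothetical sketch in Python that reads rows from a relational source (SQLite stands in for the real database) and loads them into MongoDB via pymongo. All table, database and collection names are invented, and a local MongoDB instance is assumed.

```python
import sqlite3

from pymongo import MongoClient


def extract(conn: sqlite3.Connection) -> list[dict]:
    """Extract rows from a hypothetical 'readings' table as dictionaries."""
    conn.row_factory = sqlite3.Row
    rows = conn.execute("SELECT sensor_id, ts, value FROM readings").fetchall()
    return [dict(row) for row in rows]


def transform(rows: list[dict]) -> list[dict]:
    """Apply a simple, illustrative transformation: drop nulls and round values."""
    return [
        {**row, "value": round(row["value"], 2)}
        for row in rows
        if row["value"] is not None
    ]


def load(docs: list[dict], mongo_uri: str = "mongodb://localhost:27017") -> int:
    """Load documents into a hypothetical analytics collection."""
    client = MongoClient(mongo_uri)
    result = client["analytics"]["readings"].insert_many(docs)
    return len(result.inserted_ids)


if __name__ == "__main__":
    conn = sqlite3.connect("source.db")  # hypothetical source database
    loaded = load(transform(extract(conn)))
    print(f"loaded {loaded} documents")
```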
06/03/2025
Project-based
Quantitative Developer - Energy Trading - London

TurleyWay are partnering with a global energy trading firm in London who are looking to appoint a Quantitative Developer who has worked as part of a trading desk pod structure. They operate a hybrid model with 3 days in the office. You will work on-desk directly with traders to build systematic and quantitative solutions to provide an edge to their current environment.

Your profile:
- MSc or PhD in Finance, Physics or a STEM subject with a strong quantitative focus
- Hands-on coding with Python (Flask), NumPy/pandas, and front-end experience with Angular/React.js
- Git, CI/CD pipelines; familiarity with Docker/Kubernetes
- Experience working in a trading environment delivering quantitative solutions
- Strong coding, model development and data management skills
- Experience with data-intensive modelling relating to time-series data and vendor data
- Experience with cloud-based systems, preferably Azure

Your responsibilities will include:
- Partnering with the trading teams to translate their requirements into technology solutions, tools, models and analytical libraries to support analysis and trading decisions
- Building and developing models and data pipelines for the trading team
- Refactoring code and developing/maintaining trading infrastructure
- Working with trading teams to gather requirements for analytics, integrating both real-time and historical market data
- Migrating legacy analytical applications to containerised processes (Docker/Kubernetes)
- Dashboarding and data visualisation of analytics with Plotly

If you possess the relevant skills and experience, please apply today to schedule a confidential discussion. We look forward to hearing from you.
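As a flavour of the time-series work described above, here is a minimal, hypothetical pandas sketch: resampling intraday "vendor" prices to hourly bars and adding a rolling mean, the sort of transformation that might feed a Plotly dashboard. The data is synthetic and the column names are invented.

```python
import numpy as np
import pandas as pd

# Synthetic one-minute price series for a single trading day (stand-in for vendor data).
idx = pd.date_range("2025-03-06 08:00", "2025-03-06 17:00", freq="1min")
prices = pd.Series(
    100 + np.cumsum(np.random.default_rng(0).normal(0, 0.05, len(idx))),
    index=idx,
    name="price",
)

# Resample to hourly OHLC bars and compute a 4-hour rolling mean of the close.
hourly = prices.resample("1h").ohlc()
hourly["close_ma_4h"] = hourly["close"].rolling(window=4).mean()

print(hourly.head())
```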
06/03/2025
Full time
Senior Java Developer Salary starting at: $150k + bonus Location: Chicago, IL or Dallas, TX Hybrid: 3 days onsite, 2 days remote *We are unable to provide sponsorship for this role* Qualifications 7+ years of Front End/User Experience development; Memory Model, Runtime Environment, Concurrency and Multithreading experience 5+ years of experience in JavaScript 3+ years of experience in React application development 5+ years of hands-on HTML5/CSS3 Experience with Kafka Experience with popular JavaScript frameworks such as React, NodeJS, Vue, Angular 2.0 Experience of working with WebSockets, HTTP 1.1 and HTTP/2 Experience with RESTful APIs and JSON-RPC AWS cloud experience Responsibilities Applies expert knowledge of Java, Python, JavaScript, NodeJS, Angular 2.0 or ReactJS and middleware technologies in independently designing and developing key services with a focus on continuous integration and delivery Integrates disparate data from REST and WebSocket services within a cohesive user interface Participates in innovative design and proofs of concept with emerging technologies and solutions Embraces industry best practices such as continuous integration, continuous deployment, automated testing, TDD etc. Follows agreed-upon SDLC procedures to ensure that all information system products and services meet both explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements, audit requirements and security rules, and that external-facing reporting is properly represented Writes unit and integration tests based on chosen DevOps frameworks
06/03/2025
Full time
A Senior Python developer is required by this well recognised charity working on the development of ground-breaking solutions. You'll be part of a newly-built development function and lead on software design and engineering, managing multiple projects and mentoring less-experienced developers. Ideally you can demonstrate the following - Python/Django development HTML/CSS, modern Front End frameworks JavaScript - React/Typescript/Node.js MySQL/PostgreSQL Git (GitLab) AWS/Cloud Docker, CI/CD pipelines Agile Unit testing, QA support Managing 3rd party upgrades *Please note that this role is hybrid with two days per week in their London office, please only apply if you are comfortable with this.*
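Only as an illustrative sketch of the Python/Django and unit-testing items listed above - assuming an already configured Django project, with a hypothetical 'projects' app and Project model rather than anything from the charity's codebase - a minimal JSON view and its test might look like this.

```python
# Illustrative Django sketch: a small JSON view and a unit test for it.
# Assumes a configured Django project; app, model and field names are hypothetical.
import json

from django.http import JsonResponse
from django.test import RequestFactory, TestCase

from projects.models import Project  # hypothetical app/model with 'name' and 'is_active' fields


def active_projects(request):
    """Return the names of active projects as JSON."""
    names = list(Project.objects.filter(is_active=True).values_list("name", flat=True))
    return JsonResponse({"projects": names})


class ActiveProjectsTests(TestCase):
    def test_only_active_projects_are_returned(self):
        Project.objects.create(name="Helpline", is_active=True)
        Project.objects.create(name="Archive", is_active=False)
        # RequestFactory lets us call the view directly without wiring up URLconf.
        response = active_projects(RequestFactory().get("/api/projects/active/"))
        self.assertEqual(response.status_code, 200)
        self.assertEqual(json.loads(response.content), {"projects": ["Helpline"]})
```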
05/03/2025
Project-based
My client is a global IT consultancy working with a large corporation in the oil and gas industry. This role is a 6 month contract (potential to extend) based in London (2 days per week). Rate - £675pd inside IR35. You will lead a team of developers working closely with the Electronic Market Making (EMM)/Trading Business to build the next generation electronic trading system. You must have direct experience and skills to be considered for this position. You will be a Technology leader - a strong people leader who can manage an incredibly talented team of expert C++ software engineers; a passionate developer and hands-on coder, designing and developing the core components of the high-performance trading stack; and an operational excellence driver, ensuring platform stability for maximum uptime of trading systems across markets. Expertise expectations: Expertise and deep proficiency in C/C++ programming Track record of significant contribution to high-performance and sophisticated Algorithmic/Electronic/Real Time Trading Systems at Hedge Funds, Proprietary Traders, specialist liquidity providers or large financial institutions Excellent communication skills with the ability to drive the technical agenda, lead a team and influence business stakeholders Requirements: Minimum 5 years of experience contributing to Algorithmic/Electronic/Real Time Trading Systems. Deep expertise in C/C++ programming, systems design, architecture, distributed systems, DSA, performance and latency optimisation. Excellent domain knowledge and experience working on Linux platforms. Excellent academic track record in Computer Science, Engineering or equivalent. Specific software skills: Strong expertise in C++ development, with a deep understanding of object-oriented programming, data structures and algorithms Experience with version control systems (eg, Git), build systems, and continuous integration/continuous deployment (CI/CD) pipelines Knowledge of other programming languages (eg, Python, Java) and software development tools is a plus Ability to translate business needs into functional code Performing PR reviews on other developers' code Clearly able to demonstrate and report on progress in delivering code Experience: Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field Proven experience as a Technical Lead or similar role in software engineering Typical years of experience: >5-10 years in Software Development & >5 years of Technical Leadership Experience with software development methodologies, such as Agile or Scrum Excellent problem-solving skills and the ability to think critically and creatively Strong communication and interpersonal skills, with the ability to collaborate effectively with diverse teams Unique skills & requirements of the posting: Lead and mentor a team of software engineers, fostering a collaborative and innovative environment Provide technical guidance and expertise in C++ development, ensuring best practices and high standards are maintained Drive the design, development, and implementation of complex software solutions Experience in building robust enterprise software systems Flexible and pragmatic leader & team player Excellent communicator Open learning mindset
05/03/2025
Project-based
Lead Data Engineer (Architecture Architect Solutions Java Python Automation Data Lake Datalake Data Mesh CI/CD Big Data AWS SQL Oracle Java Kafka Apache Iceberg Hoodie Finance Trading Financial Services Banking Remote Working Governance Management Regulation) required by our financial services client in Manhattan, New York City. You MUST have the following: Good experience as a Lead Data Engineer/Data Engineering Solutions Architect Excellent design and architecture ability for systems involving large amounts of data Advanced Java Amazon Web Services (AWS) or GCP CI/CD pipelines TDD Enterprise-scale SQL or Oracle Terraform, Kubernetes, Docker The following is DESIRABLE, not essential: Experience delivering projects in data management, governance and regulation Python An understanding of data mesh architecture Kafka, Iceberg, Hoodie Role: Lead Data Engineer (Architecture Architect Solutions Java Python Automation Data Lake Datalake Data Mesh CI/CD Big Data AWS SQL Oracle Java Kafka Apache Iceberg Hoodie Finance Trading Financial Services Banking Remote Working Governance Management Regulation) required by our financial services client in Manhattan, New York City. You will be hired to be the technical lead of a new team that is being assembled to build a new data management platform on AWS. The greenfield project will include the automation of data catalogue population and the implementation of data governance policies. You will be the solutions architect in a team that has a senior developer, a mid-level developer and a business lead. You and the business lead will share responsibility for the team. He will be responsible for the interpretation of data regulation, the building of road maps and strategy, and the creation of policies. You will do the design, architecture and technical delivery of this strategy and his data policies. Over the course of the next year, you will hire more developers into the team as the workload grows. The technology is Java on AWS with some Python. You will be very hands-on and, as part of a small team, you will also be involved in DevOps and testing. You will be confident with CI/CD pipelines, IaC and containerization. You will also be comfortable with enterprise-scale SQL and/or Oracle databases. As the data environment moves from an AWS-based data lake to a data mesh architecture, any understanding of data mesh would also be highly desirable. You will also contribute to the two other teams in the data engineering space within the company - the data platform team, which operates a Hoodie-based data lake, and the team working with Iceberg and Kafka to create the new data mesh architecture - but the data governance programme will be your priority. Hours are 8.30am - 5.30pm. Hybrid working is 2 days/week in the office. Comp: $260k - $340k + 401k
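As a loosely illustrative sketch of the "automation of data catalogue population" described above - assuming the AWS Glue Data Catalog, which the posting does not actually name, and using placeholder database, table and S3 locations - registering a Parquet table via boto3 might look like the following.

```python
# Illustrative sketch: register a Parquet table in the AWS Glue Data Catalog.
# Glue is an assumption; the posting only says "data catalogue" on AWS.
# Database, table and S3 locations below are placeholders, and AWS credentials
# are assumed to be configured in the environment.
import boto3

glue = boto3.client("glue", region_name="us-east-1")


def register_table(database: str, table: str, s3_location: str, columns: list[dict]) -> None:
    """Create a Glue table pointing at Parquet data in S3 (fails if it already exists)."""
    glue.create_table(
        DatabaseName=database,
        TableInput={
            "Name": table,
            "TableType": "EXTERNAL_TABLE",
            "StorageDescriptor": {
                "Columns": columns,
                "Location": s3_location,
                "InputFormat": "org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat",
                "OutputFormat": "org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat",
                "SerdeInfo": {
                    "SerializationLibrary": "org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe"
                },
            },
        },
    )


if __name__ == "__main__":
    register_table(
        database="governance_demo",
        table="trades",
        s3_location="s3://example-bucket/trades/",
        columns=[{"Name": "trade_id", "Type": "string"}, {"Name": "notional", "Type": "double"}],
    )
```

In a real catalogue-population job this call would typically be driven by crawled or inferred schemas rather than hard-coded columns.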
04/03/2025
Full time
Data Engineering Solutions Architect (Architecture Architect Solutions Java Python Automation Data Lake Datalake Data Mesh CI/CD Big Data AWS SQL Oracle Java Kafka Apache Iceberg Hoodie Finance Trading Financial Services Banking Remote Working Governance Management Regulation) required by our financial services client in Manhattan, New York City. You MUST have the following: Good experience as a Java Solutions Architect Excellent design and architecture ability for systems involving large amounts of data Advanced Java Amazon Web Services (AWS) or GCP CI/CD pipelines TDD Enterprise-scale SQL or Oracle Terraform, Kubernetes, Docker The following is DESIRABLE, not essential: Experience delivering projects in data management, governance and regulation Python An understanding of data mesh architecture Kafka, Iceberg, Hoodie Role: Data Engineering Solutions Architect (Architecture Architect Solutions Java Python Automation Data Lake Datalake Data Mesh CI/CD Big Data AWS SQL Oracle Java Kafka Apache Iceberg Hoodie Finance Trading Financial Services Banking Remote Working Governance Management Regulation) required by our financial services client in Manhattan, New York City. You will be hired to be the technical lead of a new team that is being assembled to build a new data management platform on AWS. The greenfield project will include the automation of data catalogue population and the implementation of data governance policies. You will be the solutions architect in a team that has a senior developer, a mid-level developer and a business lead. You and the business lead will share responsibility for the team. He will be responsible for the interpretation of data regulation, the building of road maps and strategy, and the creation of policies. You will do the design, architecture and technical delivery of this strategy and his data policies. Over the course of the next year, you will hire more developers into the team as the workload grows. The technology is Java on AWS with some Python. You will be very hands-on and, as part of a small team, you will also be involved in DevOps and testing. You will be confident with CI/CD pipelines, IaC and containerization. You will also be comfortable with enterprise-scale SQL and/or Oracle databases. As the data environment moves from an AWS-based data lake to a data mesh architecture, any understanding of data mesh would also be highly desirable. You will also contribute to the two other teams in the data engineering space within the company - the data platform team, which operates a Hoodie-based data lake, and the team working with Iceberg and Kafka to create the new data mesh architecture - but the data governance programme will be your priority. Hours are 8.30am - 5.30pm. Hybrid working is 2 days/week in the office. Comp: $320k - $420k + 401k
04/03/2025
Full time
Data Engineering Manager (Architecture Architect Solutions Java Python Automation Data Lake Datalake Data Mesh CI/CD Big Data AWS SQL Oracle Java Kafka Apache Iceberg Hoodie Finance Trading Financial Services Banking Remote Working Governance Management Regulation) required by our financial services client in Manhattan, New York City. You MUST have the following: Good experience as a hands-on Data Engineering Manager/Architect/Technical Lead Excellent design and architecture ability for systems involving large amounts of data Advanced Java Amazon Web Services (AWS) or GCP CI/CD pipelines TDD Enterprise-scale SQL or Oracle Terraform, Kubernetes, Docker The following is DESIRABLE, not essential: Experience delivering projects in data management, governance and regulation Python An understanding of data mesh architecture Kafka, Iceberg, Hoodie Role: Data Engineering Manager (Architecture Architect Solutions Java Python Automation Data Lake Datalake Data Mesh CI/CD Big Data AWS SQL Oracle Java Kafka Apache Iceberg Hoodie Finance Trading Financial Services Banking Remote Working Governance Management Regulation) required by our financial services client in Manhattan, New York City. You will be hired to be the technical lead and co-manager of a new team that is being assembled to build a new data management platform on AWS. The greenfield project will include the automation of data catalogue population and the implementation of data governance policies. You will be the lead engineer/manager/solutions architect in a team that has a senior developer, a mid-level developer and a business lead. You and the business lead will share responsibility for the team. He will be responsible for the interpretation of data regulation, the building of road maps and strategy, and the creation of policies. You will do the design, architecture and technical delivery of this strategy and his data policies. Over the course of the next year, you will hire more developers into the team as the workload grows. The technology is Java on AWS with some Python. You will be very hands-on and, as part of a small team, you will also be involved in DevOps and testing. You will be confident with CI/CD pipelines, IaC and containerization. You will also be comfortable with enterprise-scale SQL and/or Oracle databases. As the data environment moves from an AWS-based data lake to a data mesh architecture, any understanding of data mesh would also be highly desirable. You will also contribute to the two other teams in the data engineering space within the company - the data platform team, which operates a Hoodie-based data lake, and the team working with Iceberg and Kafka to create the new data mesh architecture - but the data governance programme will be your priority. Hours are 8.30am - 5.30pm. Hybrid working is 2 days/week in the office. Comp: $320k - $420k + 401k
04/03/2025
Full time
Request Technology - Craig Johnson
New York, New York
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Global Firm is currently seeking a Senior HR SaaS Technology Engineer. The candidate will have a deep understanding of the HR technology landscape and familiarity with managing SaaS-based applications. This role will be instrumental in driving solution selection, implementation, configuration, and ongoing support of our global HR solutions while ensuring they align with our strategic objectives and deliver maximum value to the organization. This position requires a blend of technical proficiency supporting HR systems and a functional understanding of HR business processes to deliver innovative solutions that meet the business needs of our organization. This position is also responsible for troubleshooting application/system issues, performing application maintenance and upgrade processes, coordinating installs, developing and maintaining integrations and documentation, and providing overall general support of HR applications and data. Responsibilities: Lead the evaluation and implementation of third-party cloud solutions and/or features based on business needs. Gather requirements, then design, develop, test and implement high-quality, user-friendly solutions to meet the Firm's goals and strategic objectives. Take responsibility for migration of employee data and other relevant information from legacy systems into new HR technology systems. Design, develop and implement well-structured RESTful APIs to enable seamless integrations. Coordinate and manage HR technology projects, ensuring they are delivered on time, within budget and to the desired quality standards. Provide Level 3 operational or systematic support and maintenance for the Firm's HR applications, resolving issues promptly and efficiently. Identify opportunities to streamline HR processes using technology and implement improvements to enhance efficiency and productivity. Collaborate with business users in designing and implementing analytics to help them interpret their data to make informed business decisions. Adhere to the Firm's Security and Governance requirements across all administered applications. Adhere to the Firm's IT Service Delivery standards and Change Control processes. Qualifications: Bachelor's degree is required, preferably in Computer Science, Mathematics or equivalent. 10+ years' experience of application support/configuration in HR technology solutions such as Learning Management, CLE management, Recruiting and development systems. 2+ years with Cloud technologies (Azure preferred). The ideal candidate must have good judgment, problem-solving, oral, written, and interpersonal communication skills, as well as the ability to work in a fast-paced environment and build positive working relationships. In addition, candidates must be self-motivated, organized, and able to multi-task and effectively prioritize competing demands. Experience with SQL, including stored procedures, functions and triggers, is required. Experience with programming languages is required (e.g. Java, .NET, Python, Shell Scripts). In-depth knowledge of SSO standards and protocols (e.g. SAML, OAuth) and RESTful API design patterns. Proactive in escalating issues and pulling in support from other technical experts as required. Ability to train and guide junior software developers/analysts. Embrace a nimble mindset and adapt quickly to changing requirements and goals in a fast-paced, dynamic environment.
Must be a self-starter and able to work independently with little direction/supervision. Preferred Skills: Developing with Intapp Integration Builder; high-level understanding of Active Directory Domain Services; experience with large-scale ERP systems such as PeopleSoft, Oracle HCM, Workday; developing integrations with MuleSoft.
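For illustration of the RESTful integration work this role describes - a sketch only, against a hypothetical endpoint, token and payload rather than any real HR vendor's API - a small Python client that pushes new-joiner records to a downstream system might look like this.

```python
# Illustrative sketch of a SaaS integration client: pushes new-joiner records
# to a downstream HR system over a REST API. The endpoint, bearer token and
# payload shape are hypothetical, not any specific vendor's API.
import requests


class HrIntegrationClient:
    def __init__(self, base_url: str, token: str):
        self.base_url = base_url.rstrip("/")
        self.session = requests.Session()
        self.session.headers.update({"Authorization": f"Bearer {token}"})

    def push_new_joiner(self, employee: dict) -> dict:
        """POST a new-joiner record and return the created resource."""
        resp = self.session.post(f"{self.base_url}/v1/employees", json=employee, timeout=30)
        resp.raise_for_status()  # surface 4xx/5xx responses as exceptions
        return resp.json()


if __name__ == "__main__":
    client = HrIntegrationClient("https://hr.example.com/api", token="demo-token")
    created = client.push_new_joiner(
        {"firstName": "Ada", "lastName": "Lovelace", "startDate": "2025-04-01"}
    )
    print(created)
```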
03/03/2025
Full time