*Hybrid, 3 days onsite, 2 days remote*
*We are unable to sponsor, as this is a permanent full-time role*

A prestigious company is looking for a Director, Software Engineering - QRM. This director will manage six people and will help develop software applications and solutions for the quantitative risk management (QRM) platform. Hands-on experience with Java, DevOps, CI/CD, AWS, containers, Terraform, etc. is required.

Responsibilities:
- Develop and maintain software and environments used to implement and test systems for pricing, margin risk and stress testing of financial products and derivatives.
- Configure and manage resources in the local and AWS cloud environments and deploy QRM's software on these resources.
- Develop CI/CD pipelines.
- Configure, execute, and monitor execution pipelines for model testing, backtesting and monitoring.
- Contribute to development of QRM's databases and ETLs.
- Integrate model prototypes, the model library and model testing tools using industry best practices and innovations.
- Create unit and integration tests; build and enhance test automation tools.
- Participate in code reviews and demo accomplishments.
- Write technical documentation and user manuals.
- Provide production support and perform troubleshooting.
- Provide hands-on technical leadership and active coordination of tasks and priorities.
- Provide guidance and support for the team, and reporting for management.

Qualifications:
- Master's degree or equivalent in a computational or numerical field such as computer science, information systems, mathematics or physics.
- 10+ years of experience as a software developer, with exposure to cloud or high-performance computing.
- Strong programming skills: able to read and write code in a programming language (e.g., Java, C++, Python) in a collaborative software development setting. The role requires advanced coding, database and environment manipulation skills.
- Proficiency in Java (preferred) or another object-oriented language, including effective application of design patterns and best coding practices.
- DevOps experience, with a good command of CI/CD processes and tools (e.g., Git, GitHub, Gradle, Jenkins, Docker, Helm, Harness).
- Experience with containerized deployment in cloud environments.
- Experience with cloud technology (AWS preferred), infrastructure as code (e.g., Terraform), and managing and orchestrating containerized workloads (e.g., Kubernetes).
- Experience with logging, profiling, monitoring and telemetry (e.g., Splunk, OpenTelemetry).
- Good command of database technology and query languages (SQL), non-relational databases and other big data technology, including efficient storage and serialization protocols (e.g., Parquet, Avro, Protocol Buffers).
- Experience with automated quality assurance frameworks (e.g., JUnit, TestNG, PyTest).
- Experience with high-performance and distributed computing.
- Experience with productivity tools such as Jira, Confluence and MS Office.
10/05/2024
Full time
GCP Data Streaming Engineer - 3 months - Inside IR35 - Hybrid

Are you passionate about leveraging data to drive transformational change? Here's your chance to make a meaningful impact as a GCP Data Streaming Engineer with a renowned global consultancy. Join a market-leading consultancy for an exhilarating 3-month contract (with prospects for extension) where you'll play a pivotal role in shaping the future of data streaming solutions. As our GCP Data Streaming Engineer, you'll be at the forefront of harnessing the power of Google Cloud Platform (GCP) to design, develop, and implement cutting-edge data streaming solutions, leveraging your expertise in GCP technologies including Pub/Sub, Dataflow, and BigQuery.

Key Responsibilities:
- Develop scalable solutions: lead the creation of scalable and dependable data streaming solutions on GCP using Apache Kafka and associated technologies.
- Optimize Kafka setup: customize Kafka brokers, topics, partitions, and replication to guarantee the highest performance and reliability of data streams.
- Configure Kafka connectors: apply your expertise to set up Kafka connectors for batch processing, managing both source and sink connectors to seamlessly integrate data.
- Python and Apache Beam proficiency: utilize Python and Apache Beam to craft tailored data processing logic and transformations within pipelines, enabling swift and effective data analysis.
- Ensure security: implement SSL/TLS encryption, SASL authentication, and ACL-based authorization to fortify Kafka clusters and communication channels, ensuring data integrity and privacy.

What You Will Ideally Bring:
- Hands-on Kafka configuration: proven expertise in configuring Kafka connectors for batch processing, optimizing their number for improved performance.
- Python and Dataflow/Apache Beam proficiency: skilled in Python and Dataflow/Apache Beam, adept at developing custom data processing logic within pipelines.
- Streaming data management: demonstrated ability in handling streaming data, ensuring timely processing and real-time analysis, employing techniques such as windowing and buffering.
- Secured cloud environment experience: experienced in deploying Kafka in secure cloud environments, implementing SSL/TLS encryption, SASL authentication, and ACL-based authorization.
- Kafka configuration and governance: proficient in configuring Kafka brokers, managing security, and enforcing schema governance to ensure reliability, scalability, and compliance.

Contract Details:
- Duration: 3 months
- Day rate: up to £550 per day (all inclusive)
- Location: Cardiff/Remote
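The Kafka security responsibilities above (SSL/TLS encryption plus SASL authentication) can be sketched as a client configuration. This is a minimal illustration, not taken from the posting: the broker address, credential names and file paths are placeholders, and the property names follow the common Kafka client conventions (Java client / librdkafka-based clients).

```python
# Illustrative Kafka producer security settings of the kind the role
# describes: SASL_SSL combines a TLS-encrypted channel with SASL
# authentication. All concrete values below are placeholders.
producer_config = {
    "bootstrap.servers": "broker1.example.com:9093",  # TLS listener port
    "security.protocol": "SASL_SSL",        # encrypt traffic AND authenticate
    "sasl.mechanism": "SCRAM-SHA-512",      # one common SASL mechanism
    "sasl.username": "stream-producer",
    "sasl.password": "<secret-from-vault>", # never hard-code in practice
    "ssl.ca.location": "/etc/kafka/ca.pem", # CA used to verify the brokers
}

def is_encrypted_and_authenticated(cfg: dict) -> bool:
    """Sanity check that a client config both encrypts and authenticates."""
    return cfg.get("security.protocol") == "SASL_SSL" and "sasl.mechanism" in cfg
```

Authorization is the third leg the posting mentions: ACLs are applied broker-side (per principal, per topic), not in the client config above.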
10/05/2024
Project-based
Business Development Manager - French-Speaking

We have teamed up with one of the biggest IT distributors in the UK, which is looking for a French-speaking sales professional to join their growing team.

Responsibilities:
- Generate qualified leads for our clients by confidently using SPIN and other selling techniques.
- Develop specific and extensive client and product knowledge for each campaign to ensure client needs are met.
- Identify new business opportunities and quick-win situations, and nurture the database.
- French-speaking.
10/05/2024
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent full-time role*
*Position is bonus eligible*

A prestigious financial institution is currently seeking a Director of Risk Management Software Engineering. The candidate will be responsible for functions within Quantitative Risk Management (QRM), developing and maintaining risk models for margin, clearing fund and stress testing, with a focus on developing and maintaining risk model software in production, and the environments and infrastructure used in model implementation and testing.

Responsibilities:
- Collaborate with other developers, quantitative analysts, business users, and data & technology staff to expand QRM's technical capabilities for model development, back-testing and monitoring.
- Develop and maintain software and environments used to implement and test systems for pricing, margin risk and stress testing of financial products and derivatives.
- Configure and manage resources in the local and AWS cloud environments and deploy QRM's software on these resources.
- Develop CI/CD pipelines.
- Configure, execute, and monitor execution pipelines for model testing, back-testing and monitoring.
- Contribute to development of QRM's databases and ETLs.
- Integrate model prototypes, the model library and model testing tools using industry best practices and innovations.
- Create unit and integration tests; build and enhance test automation tools.
- Participate in code reviews and demo accomplishments.
- Write technical documentation and user manuals.
- Provide production support and perform troubleshooting.
- Provide hands-on technical leadership and active coordination of tasks and priorities.
- Provide guidance and support for the team, and reporting for management.

Qualifications:
- Strong programming skills: able to read and write code in a programming language (e.g., Java, C++, Python) in a collaborative software development setting. The role requires advanced coding, database and environment manipulation skills.
- Track record of complex production implementations and a demonstrated ability in developing and maintaining enterprise-level software, including in cloud environments.
- Proficiency in technical and/or scientific documentation (e.g., white papers, user guides).
- Strong problem-solving skills: able to accurately identify a problem's source, severity, and impact to determine possible solutions and needed resources.
- Experience with Agile/Scrum or another rapid development framework.
- Financial products knowledge is a plus: understanding of markets and financial derivatives in equities, interest rate, and commodity products.
- Background in financial mathematics is a plus: derivatives pricing models, stochastic calculus, statistics and probability theory, linear algebra.
- Master's degree or equivalent in a computational or numerical field such as computer science, information systems, mathematics or physics.
- 10+ years of experience as a software developer, with exposure to cloud or high-performance computing.

Technical Skills:
- Proficiency in Java (preferred) or another object-oriented language is required, including effective application of design patterns and best coding practices.
- DevOps experience, with a good command of CI/CD processes and tools (e.g., Git, GitHub, Gradle, Jenkins, Docker, Helm, Harness).
- Experience with containerized deployment in cloud environments.
- Experience with cloud technology (AWS preferred), infrastructure as code (e.g., Terraform), and managing and orchestrating containerized workloads (e.g., Kubernetes).
- Experience with logging, profiling, monitoring and telemetry (e.g., Splunk, OpenTelemetry).
- Good command of database technology and query languages (SQL), non-relational databases and other big data technology, including efficient storage and serialization protocols (e.g., Parquet, Avro, Protocol Buffers).
- Experience with automated quality assurance frameworks (e.g., JUnit, TestNG, PyTest).
- Experience with high-performance and distributed computing.
- Experience with productivity tools such as Jira, Confluence and MS Office.
- Experience with scripting languages such as Python is a plus.
- Experience with numerical libraries and/or scientific computing is a plus.
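As an illustration of the automated QA frameworks the posting names (PyTest here, since Python is listed as a plus), a minimal test module might look like the sketch below. The margin() function is a hypothetical stand-in for a risk-model function, not QRM's methodology; the formula and numbers are purely illustrative.

```python
# Minimal PyTest-style unit tests. margin() is a hypothetical toy function
# invented for this example; it is NOT a real margin methodology.

def margin(position: float, volatility: float, coverage: float = 0.99) -> float:
    """Toy margin requirement: absolute exposure scaled by volatility and coverage."""
    if volatility < 0:
        raise ValueError("volatility must be non-negative")
    return abs(position) * volatility * coverage

def test_margin_scales_with_position_size():
    # Doubling the position exactly doubles the requirement.
    assert margin(2_000_000, 0.15) == 2 * margin(1_000_000, 0.15)

def test_margin_is_symmetric_for_long_and_short():
    assert margin(-500_000, 0.2) == margin(500_000, 0.2)

def test_margin_rejects_negative_volatility():
    try:
        margin(1_000_000, -0.1)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for negative volatility")
```

PyTest discovers `test_*` functions automatically (e.g., `pytest test_margin.py`); the same checks translate directly to JUnit or TestNG assertions in Java.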
09/05/2024
Full time
We are a global IT recruitment specialist providing support to clients across the UK, Europe and Australia. We have an excellent job opportunity for you.

Role: Debt Manager, Tallyman platforms
Duration: through to the end of 2024
Location: Knutsford - hybrid position - 2 days in office

Mandatory Skills:
- Proven experience in running large data migrations for complex business services, with both big-bang and phased approaches.
- Strong background in presenting migration approaches, getting buy-in and executing migration plans.
- Experience in managing and communicating with a variety of business, operations and technical stakeholders.
- Demonstrates a high level of personal responsibility, pragmatism and autonomy, planning own work to meet given objectives within a defined framework.
- Excellent leadership and communication skills; ability to navigate internal hierarchies and agendas and deliver what is required.
- Ability to work effectively under tight deadlines.

Desired Skills:
- Experience with the Debt Manager and Tallyman platforms.

Roles and Responsibilities:
This role will lead and manage the data migration delivery for the BFA Cards portfolio, working closely with the business, change and engineering teams. The successful candidate will collaborate with cross-functional teams to define migration strategies, engage all key stakeholders and manage all migration events.
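The contrast the role draws between a big-bang cutover and a phased approach comes down to how the portfolio is partitioned: a phased plan splits accounts into ordered migration waves that can be executed, verified and rolled back independently. A minimal sketch with made-up account IDs and an arbitrary wave size (this is not a Debt Manager or Tallyman API):

```python
# Illustrative phased-migration planner: split a portfolio into fixed-size,
# order-preserving waves. A big-bang approach is the degenerate case where
# wave_size >= len(account_ids), i.e., a single wave.
def plan_waves(account_ids: list[str], wave_size: int) -> list[list[str]]:
    """Return the portfolio partitioned into consecutive migration waves."""
    if wave_size <= 0:
        raise ValueError("wave_size must be positive")
    return [account_ids[i:i + wave_size]
            for i in range(0, len(account_ids), wave_size)]
```

In practice waves would be cut by business segment or risk profile rather than fixed counts, but the shape of the plan, and the per-wave checkpoint it enables, is the same.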
09/05/2024
Project-based
Our client is seeking a Senior Consultant in Data Engineering with extensive experience in MDM. This is a one-year FTE, hybrid role in London, UK.

Experience details:
- Must have 12+ years' experience in architecting data & analytics platforms.
- Minimum 5+ years' experience in banking MDM implementation, with at least 2 implementations.
- Must have 5+ years in data governance solutions.
- Must have a strong understanding of banking regulations and their applicability to data & analytics platforms.
- Must have 8+ years' experience with relational databases, NoSQL databases and/or big data technologies (e.g., Oracle, SQL Server, Postgres, Spark, Hadoop, other open source).
- Must have experience in data security solutions (identity and access management, and data security access management).
- Must have 3+ years' experience of DevOps (CI/CD).
- Certifications: MDM certified.
- Must have experience with the SDLC (Agile/Waterfall).
- Drive the architecture of a project, including authoring functional and design specifications, scalability, testing, quality data flow, and interfaces.
- Ability to lead and manage a team and interact with end-user clients.
- Has worked in an onsite/offshore model.
- Demonstrated excellent communication, presentation and problem-solving skills.
- Experience in project governance and enterprise customer management.

Role details:
- Design customer/party MDM solutions.
- Understanding of market-leading MDM platforms, with a comparative view of capabilities, offerings, limitations and accuracy.
- Understanding of out-of-the-box AI/ML solutions of COTS products and their limitations; design to address MDM limitations.
- Set up Customer 360.
- Set up a single global customer ID for historic customers where multiple customer IDs were generated per line of business, due to the siloed operations of the retail and wholesale businesses.
- Design integrated ecosystems (CRM, KYC, screening, third party) with customer MDM/Customer 360.
- Define integration patterns of surrounding systems with MDM.
- Understanding of customer screening and KYC requirements from a banking perspective.
- Conduct MVP/POC.
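The "single global customer ID" task described above amounts to grouping duplicate line-of-business records under one golden identifier. Below is a minimal sketch with an invented match rule and made-up record fields; real MDM platforms use configurable, often fuzzy, match rules and survivorship logic, not a simple exact key.

```python
from collections import defaultdict

# Hypothetical duplicate customer records created by siloed Retail (RET)
# and Wholesale (WHL) lines of business. Fields and IDs are invented.
records = [
    {"lob_id": "RET-001", "name": "ACME Ltd ", "reg_no": "GB123"},
    {"lob_id": "WHL-907", "name": "acme ltd",  "reg_no": "GB123"},
    {"lob_id": "RET-442", "name": "Beta Plc",  "reg_no": "GB999"},
]

def match_key(rec: dict) -> tuple:
    """Deterministic match rule: normalized name plus registration number."""
    return (rec["name"].strip().lower(), rec["reg_no"])

def assign_golden_ids(recs: list[dict]) -> dict[str, str]:
    """Map each LOB-specific ID to a single global (golden) customer ID."""
    groups: dict[tuple, list[dict]] = defaultdict(list)
    for rec in recs:
        groups[match_key(rec)].append(rec)
    mapping: dict[str, str] = {}
    for n, group in enumerate(groups.values(), start=1):
        golden = f"CUST-{n:06d}"          # one golden ID per matched group
        for rec in group:
            mapping[rec["lob_id"]] = golden
    return mapping
```

Here the two ACME records collapse to one golden ID while Beta keeps its own, which is exactly the Customer 360 outcome the role asks for, in miniature.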
09/05/2024
Full time
French-Speaking Data Cloud Full Stack Solution Architect/Paris hybrid, 3 days per week onsite/8 months/start ASAP

Role & Responsibilities:
In the context of a large data transformation initiative covering the complete set of our data capabilities (data architecture and engineering, data modelling, storage for data & analytics, data visualisation, data science, data integration, metadata management, data storage and warehousing), support the future Data Foundation platform technical architecture activities, including:
- Provide technical guidance and establish best practices for Snowflake account setup and configuration.
- Manage infrastructure as code and maximise automation.
- Manage enhancements and deployments to support a fully federated platform.
- Act as subject matter expert (SME) for all Snowflake-related questions on the project.
- Own platform-specific Snowflake documentation (decisions, best practices, features).
- Communicate and demonstrate new features.
- Design the cloud environment from a holistic point of view, ensuring it meets all functional and non-functional requirements.
- Carry out deployment, maintenance, monitoring, and management tasks.
- Oversee cloud security for the account.
- Complete the integration of new applications into the cloud environment.

Education:
* Higher education completed; a degree in Computer Science is required.

Experience:
* 5 to 10 years' experience.
* Experience in putting in place data platforms in a cloud environment.

Skills:
- Fluent in French & English (must).
- Deep Snowflake expertise.
- Platform architecture.
- DBA experience; cloud database administration.
- AWS architecture; cloud networking specialist.
- Excellence in communication, coordination & collaboration, and stakeholder & risk management, with drive & leadership.
- Open-minded and accepting of challenges.
- Highly motivated, adaptable and flexible; willing to integrate into an existing environment and an existing project team.
08/05/2024
Project-based
Subject: Cloud Consultant/Architect - On-Site - Gloucestershire/Bristol - £65 to £95K - AWS - IaaS - PaaS - Kubernetes - Automation Job Title: Cloud Technical Consultant/Architect Location: Gloucestershire/Bristol Salary: £65 - £95K Per Annum Benefits: Bonus, flexible working hours, career opportunities, private medical, excellent pension, and social benefits Active DV Clearance is highly desirable. Please note that candidates will need to be eligible to undergo DV Clearance. The Client: Curo are collaborating with a global edge-to-cloud company advancing the way people live and work. They help companies connect, protect, analyse, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. The Candidate: This is a fantastic opportunity for someone who has big ambitions and an outstanding ability to create strong relationships - or for a dynamic & seasoned Technologist who is looking for new & exciting opportunities to make a difference. Your focus will be to provide clients with the optimal consultative service and experience, resulting in business outcomes that meeting core client values and business requirements. If you are looking for challenges in a fast paced, thriving, international work environment, then we definitely want to hear from you. The Role: This is a brand new opportunity for a bright, driven, customer focussed professional to join our clients Cloud Delivery' team, and work alongside our Enterprise Cloud specialists to drive forward the design, deployment & operations of Cloud Infrastructure, Automation and Containerisation projects for the end-client. The delivery team help deliver valued clients the most effective Cloud solution to suit the organisational requirements of dynamic and fast-paced business. 
They support them to exploit maximum business benefit from Cloud solutions, leveraging best-in-class internal and Partner technologies to create relevant and engaging experiences.

Duties: Support the design and development of new capabilities: preparing solution options, investigating technology, designing and running proofs of concept, providing assessments, advice and solution options, and producing high-level and low-level design documentation. Provide Cloud engineering capability to leverage Public Cloud platforms using automated build processes deployed via Infrastructure as Code. Provide technical challenge and assurance throughout the development and delivery of work. Develop re-usable common solutions and patterns to reduce development lead times, improve commonality and lower Total Cost of Ownership. Work independently and/or within a team using a DevOps way of working.

Required Technical Skills & Experience: Experienced in Cloud-native technologies in AWS. Experienced in deploying IaaS/PaaS in Multi-Cloud environments. Experienced in Cloud and Infrastructure Engineering, building and testing new capabilities, and supporting the development of new solutions and common templates. Experienced in acting as a bridge from the infrastructure through to user-facing systems.

Desirable Technical Skills & Experience: Experienced with Kubernetes and containers. Experienced in the use of automation tools, eg Terraform, Ansible, Foreman, Puppet and Python. Experienced with different flavours of Linux platforms and services.

To apply for this Cloud Consultant/Architect permanent job, please click the button below and submit your latest CV. Curo Services endeavours to respond to all applications; however, this may not always be possible during periods of high volume. Thank you for your patience.
Curo Services is a trading name of Curo Resourcing Ltd and acts as an Employment Business for contract and temporary recruitment as well as an Employment Agency in relation to permanent vacancies.
06/05/2024
Full time