Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent full-time role* *Position is bonus eligible*

Prestigious enterprise financial institution is currently seeking an Atlassian SaaS Platform Administrator. The candidate will coach others through education in platform development best practices. The Software Engineer must have a background in SaaS/low-code implementation and system administration, and be able to help others contribute to platform improvements.

Responsibilities:
- Provide technical leadership for planning, designing, installing, testing, and implementing solutions.
- Provide subject matter expertise on the SDLC platforms we maintain (Confluence, Jira, SpiraTest).
- Implement Atlassian plugins and support integration with other enterprise software.
- Ensure best-practice adherence for core system development, security, tuning, and performance.
- Support the Knowledge Management (KM) program strategy, transformation, and technical implementation.
- Create knowledge documentation related to requirements and solution design.
- Facilitate knowledge transfer sessions for administration and self-service.
- Develop a train-the-trainer model for support and administration.
- Take overall ownership of solution implementation, working with the team to ensure quality solutions that provide delightful experiences.
- Ensure systems and process compliance with regulatory and organizational requirements.
- Stay on top of industry trends and best practices, and propose process and tool changes to take advantage of new developments in the industry.

Qualifications:
[Required] 5+ years of working experience in IT.
[Required] 3+ years of experience implementing Atlassian products.
[Required] Experience with RESTful APIs, JSON, and XML.
[Required] Experience with Agile/Scrum or DevOps methodologies.
[Preferred] Experience working in financial services or another regulated environment.
[Preferred] Experience working with SharePoint.
[Preferred] Experience with SQL, Python, PowerShell, or other scripting languages.
[Preferred] Experience with system and data architecture.
[Preferred] Experience with or knowledge of SDLC pipeline tools such as Git, Jenkins, SonarQube, or similar.

Technical Skills:
- Problem-solving skills and a solution-oriented attitude.
- A complete understanding of the system development life cycle.
- Highly motivated to learn new things and motivated by challenges.
- Uses analysis and critical thinking to determine and assess user needs, then creates software to meet the requirements.
- Proactive attitude toward automating processes as much as possible.
- Ability to understand the strategic goals of the platforms we support and evaluate customer requests in that context.
- Provides clear instructions to the project team, clearly explains how the software works to the customer, and is available to answer any questions that arise, using exceptional communication skills.
- Works well with others on the team using effective interpersonal skills.
- Efficiently identifies and resolves issues that arise during the design, testing, and maintenance processes.

Experience:
- Bachelor's degree in a STEM field preferred; 4 years of additional related work experience may be substituted for the degree.
- 3-5 years of experience in SaaS platform implementation and/or system administration.
- 3+ years of hands-on experience developing and maintaining cloud platform technologies.
- Certifications in Atlassian products are preferred.
- Low-code/COTS implementation certifications are desired.
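The REST/JSON requirement above is the day-to-day core of Atlassian platform administration. As a hedged illustration only, this sketch builds the standard request body for creating an issue via the Jira Cloud REST API; the project key, summary, and issue type here are placeholder values, not anything from this posting.

```python
import json

def build_issue_payload(project_key, summary, issue_type="Task"):
    """Build the JSON body for POST /rest/api/3/issue (Jira Cloud).

    Only the three mandatory fields are included; real requests
    usually add description, labels, and custom fields.
    """
    return {
        "fields": {
            "project": {"key": project_key},
            "summary": summary,
            "issuetype": {"name": issue_type},
        }
    }

# Placeholder values -- "PROJ" is a hypothetical project key.
payload = build_issue_payload("PROJ", "Automate Confluence space cleanup")
print(json.dumps(payload, indent=2))
```

The same payload shape works for bulk-creation and automation scripts, which is where the scripting-language preference above typically comes into play.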
08/07/2024
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent full-time role* *Position is bonus eligible*

Prestigious financial institution is currently seeking a Principal Financial IT Infrastructure Architect. The candidate will be part of a small Innovation team of architects that collaborates with development teams, Solutions Architects, vendors, and other stakeholders to define and drive the architectural vision, implementation, and continuous improvement of solutions running on the core real-time data streaming and compute infrastructure platforms, such as Kafka, Flink, and Kubernetes, in a hybrid environment.

Responsibilities:
- Collaborate with cross-functional teams to design, create, and review software application architectures specifically tailored for streaming use cases.
- Ensure fault tolerance, scalability, and low-latency processing in streaming applications.
- Collaborate with DevOps teams to define deployment strategies and manage scalability.
- Drive optimization of streaming application performance by fine-tuning configurations, monitoring resource utilization, and identifying bottlenecks.
- Drive implementation of best practices for efficient data serialization, compression, and network communication.
- Create and maintain architecture documentation, including system diagrams, data flows, and component interactions.
- Maintain vendor relationships and participate in escalation sessions and postmortems.
- Evaluate and recommend tools and frameworks that enhance the performance and reliability of our streaming systems.
- Stay informed about industry trends related to Kafka, Flink, and Kubernetes.

Qualifications:
[Required] Effective communication skills to collaborate with technical stakeholders and evangelize best practices.
[Required] Advanced problem-solving skills and a logical approach to solving problems.
[Required] Ability to execute spikes and provide code samples demonstrating best practices when developing solutions on Kafka and Flink.
[Required] Experience with DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines.

Technical Skills:
- Expert-level knowledge of Kafka.
- Expert-level knowledge of Flink.
- In-depth knowledge of on-premises networking as well as hybrid connectivity to AWS and/or Azure.
- Knowledge of Infrastructure as a Service (IaaS), Platform as a Service (PaaS), compute, storage, database, network, content distribution, security/IAM, microservices, management, and serverless services.
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Azure Resource Manager.
- Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes.

Education and/or Experience:
[Preferred] Bachelor's or Master's degree in an engineering discipline.
[Required] 10+ years of experience architecting mission-critical cloud and on-prem real-time data streaming and event-driven architectures.
[Required] 10+ years of experience with Java.
[Required] 5+ years of specific Kafka and Flink experience.
[Preferred] 5+ years of Kubernetes experience.

Certificates or Licenses:
[Preferred] Confluent Certified Developer for Apache Kafka.
[Preferred] AWS certifications (e.g. Solutions Architect Associate).
[Preferred] Certified Kubernetes Application Developer.
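The role above asks candidates to produce spikes and code samples on Kafka and Flink. As a dependency-free sketch of the underlying concept (plain Python, not Flink's actual API), this shows the tumbling-window aggregation pattern such spikes typically demonstrate; the 60-second window and the trade-tick events are illustrative assumptions.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size_s=60):
    """Group (epoch_seconds, key) events into fixed, non-overlapping
    windows and count occurrences per key -- the same semantics as a
    Flink tumbling event-time window count, minus watermarks and
    late-data handling.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_size_s)  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Illustrative trade-tick events: (epoch seconds, symbol).
events = [(0, "AAPL"), (10, "AAPL"), (59, "MSFT"), (60, "AAPL"), (125, "MSFT")]
result = tumbling_window_counts(events)
print(result)  # {0: {'AAPL': 2, 'MSFT': 1}, 60: {'AAPL': 1}, 120: {'MSFT': 1}}
```

In a real Flink spike the window assignment, watermarking, and state management are handled by the framework; the point of a sketch like this is to make the window semantics explicit before committing to configuration choices.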
08/07/2024
Full time
Our award-winning client is looking for an Azure DevOps Engineer to join them on a permanent full-time basis. You will be joining their existing DevOps engineers to implement, maintain, and document the Azure-hosted environments so that these continue to provide robust, scalable, and cost-effective hosting solutions.

Key skills & experience:
- 2+ years' implementing and maintaining DevOps processes
- Docker containers (Linux and Windows)
- Container orchestration using Kubernetes
- Continuous deployments
- Microservices architecture
- Centralized logging and health checks
- Azure Pipelines
- Azure SQL databases and elastic pools
- Azure SQL MI
- IIS
- SSO
- CosmosDB
- Azure Git or equivalent source control
- Azure Service Bus or other message bus solution, e.g. RabbitMQ
- PowerShell
- MS SQL experience
- Nginx
- Azure Load Testing
- Azure Application Insights
- Azure Kubernetes Service
- Platform tuning experience

Beneficial skills:
- Bicep
- Cloudflare
- ARM Templates
- Familiarity with Octopus Deploy
- Knowledge of C# .NET
- Prometheus/Grafana dashboards
- Seq, Loki, or other application logging software
- VMs

Excellent benefits:
- Annual performance-based bonus incentives
- Full private health insurance with no excess through their healthcare partner
- Group Life Insurance and Income Protection
- BUPA Dental Insurance
- 23 days' holiday, rising to 26 days with years of service, plus all UK Bank Holidays
- Employer pension contributions up to 10%
- AIG LifeWorks employee assistance programme (EAP): 24/7 support for mental, financial, physical, and emotional wellbeing
- Work-life balance: flexible working and work from home

Due to the volume of applications received for positions, it will not be possible to respond to all applications, and only applicants who are considered suitable for interview will be contacted. Proactive Appointments Limited operates as an employment agency and employment business and is an equal opportunities organisation. We take our obligations to protect your personal data very seriously. Any information provided to us will be processed as detailed in our Privacy Notice, a copy of which can be found on our website.
08/07/2024
Full time
Principal Java Data Engineer (Architecture Programmer Developer Java Python Software Engineer Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Apache Iceberg PySpark MWAA Arrow DBT gRPC protobuf Snowflake TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London.

You MUST have the following:
- Advanced ability as a Java Solutions Architect/Principal Engineer/Engineering Manager
- Leadership experience: you must have led small teams on the delivery of projects
- AWS (EC2, ECS, EKS, Glue)
- Python
- SQL
- Spark
- MWAA/Airflow
- Agile

The following is DESIRABLE, not essential:
- Iceberg
- DBT
- Trading, Front Office finance

Role: You will join a number of teams that are responsible for the core engineering of a large amount of financial trading data. The data is currently ingested into, and stored in, an AWS data lake, which is being migrated to a data mesh architecture. You will lead a team of 4-5 engineers, in a very hands-on role, that will contribute towards this migration, working with AWS Glue, Athena, Python, Java, Iceberg, DBT, Arrow, and Dremio. 20-30% of the role will be spent on mentoring team members, architectural reviews, code reviews, implementing best practices, reporting to senior management, and contributing towards technical strategy. They have a very flexible hybrid working setup of 1-2 days/month in the office.

Salary: £125k-£155k + 15% Guaranteed Bonus + 10% Pension
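The Glue/Athena/Iceberg stack named in this role builds on partitioned object-store layouts. As a small stdlib-only sketch (the bucket, prefix, and partition columns here are hypothetical, not the client's), this shows the Hive-style `key=value` path convention that Athena and Glue use for partition pruning.

```python
from datetime import date

def partition_path(prefix, trade_date, desk):
    """Build a Hive-style partitioned object-store key: each partition
    column becomes a key=value path segment, so query engines such as
    Athena can skip whole prefixes that fall outside a WHERE clause."""
    return (
        f"{prefix}/"
        f"year={trade_date.year}/month={trade_date.month:02d}/day={trade_date.day:02d}/"
        f"desk={desk}/part-0000.parquet"
    )

# Hypothetical bucket/prefix and desk name, for illustration only.
key = partition_path("s3://trading-lake/fills", date(2024, 7, 8), "fx")
print(key)
# s3://trading-lake/fills/year=2024/month=07/day=08/desk=fx/part-0000.parquet
```

Table formats like Iceberg layer metadata on top of layouts like this so that partitioning can evolve without rewriting paths, which is part of why it appears in the migration described above.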
08/07/2024
Full time
Principal Java Data Engineer (Architecture Programmer Developer Java Python Software Engineer Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Apache Iceberg PySpark MWAA Arrow DBT gRPC protobuf Snowflake TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London.

You MUST have the following:
- Advanced ability as a Java Solutions Architect/Principal Engineer/Engineering Manager
- Leadership experience: you must have led small teams on the delivery of projects
- AWS (EC2, ECS, EKS, Glue)
- Python
- SQL
- Spark
- MWAA/Airflow
- Agile

The following is DESIRABLE, not essential:
- Iceberg
- DBT
- Trading, Front Office finance

Role: You will join a number of teams that are responsible for the core engineering of a large amount of financial trading data. The data is currently ingested into, and stored in, an AWS data lake, which is being migrated to a data mesh architecture. You will lead a team of 4-5 engineers, in a very hands-on role, that will contribute towards this migration, working with AWS Glue, Athena, Python, Java, Iceberg, DBT, Arrow, and Dremio. 20-30% of the role will be spent on mentoring team members, architectural reviews, code reviews, implementing best practices, reporting to senior management, and contributing towards technical strategy. They have a very flexible hybrid working setup of 1-2 days/month in the office.

Salary: £90k-£125k + 15% Guaranteed Bonus + 10% Pension
08/07/2024
Full time
Principal Python Data Engineer (Architecture Programmer Developer Java Python Software Engineer Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Apache Iceberg PySpark MWAA Arrow DBT gRPC protobuf Snowflake TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London.

You MUST have the following:
- Advanced ability as a Python Solutions Architect/Principal Engineer/Engineering Manager
- Leadership experience: you must have led small teams on the delivery of projects
- AWS (EC2, ECS, EKS, Glue)
- Java
- SQL
- Spark
- MWAA/Airflow
- Agile

The following is DESIRABLE, not essential:
- Iceberg
- DBT
- Trading, Front Office finance

Role: You will join a number of teams that are responsible for the core engineering of a large amount of financial trading data. The data is currently ingested into, and stored in, an AWS data lake, which is being migrated to a data mesh architecture. You will lead a team of 4-5 engineers, in a very hands-on role, that will contribute towards this migration, working with AWS Glue, Athena, Python, Java, Iceberg, DBT, Arrow, and Dremio. 20-30% of the role will be spent on mentoring team members, architectural reviews, code reviews, implementing best practices, reporting to senior management, and contributing towards technical strategy. They have a very flexible hybrid working setup of 1-2 days/month in the office.

Salary: £125k-£155k + 15% Guaranteed Bonus + 10% Pension
08/07/2024
Full time
Principal Python Data Engineer (Architecture Programmer Developer Java Python Software Engineer Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Apache Iceburg Iceberg PySpark MWAA Arrow DBT gRPC protobuf Snowflake TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London. You MUST have the following: Advanced ability as a Python Solutions Architect/Principal Engineer/Engineering Manager Leadership experience: you must have led small teams on the delivery of projects AWS (EC2, ECS, EKS, Glue) Java SQL Spark MWAA/Airflow Agile The following is DESIRABLE, not essential: Iceberg DBT Trading, Front Office finance Role: Principal Python Data Engineer (Architecture Programmer Developer Java Python Software Engineer Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Apache Iceburg Iceberg PySpark MWAA Arrow DBT gRPC protobuf Snowflake TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London. You will join a number of teams that are responsible for the core engineering of a large amount of financial trading data. The data is currently ingested into, and stored in, an AWS data lake. This is being migrated to a data mesh architecture though. You will lead a team of 4-5 engineers, in a very hands-on role, that will contribute towards this migration, working with AWS Glue, Athena, Python, Java, Iceberg, DBT, Arrow and Dremio. 
20-30% of the role will be spent mentoring members of the team, conducting architectural and code reviews, implementing best practices, reporting to senior management and contributing to technical strategy. They have a very flexible hybrid working setup of 1-2 days/month in the office. Salary: £90k - £125k + 15% Guaranteed Bonus + 10% Pension
08/07/2024
Full time
Senior CRM Developer - Dynamics, C# .NET

Our client, a leading financial services provider based in Reigate, is currently seeking a Senior CRM Developer to join their team. You will be developing, supporting and maintaining their software solutions in line with the company's architectural vision.

Key Requirements:
- Expert in Microsoft Dynamics CRM (including 365).
- Proficient in C#, VB.NET, the .NET framework, and Azure.
- Skilled in DevOps and Continuous Integration/Delivery using tools such as TFS, Azure DevOps, MSBuild, and Release Management.
- Experienced with various Microsoft applications and platforms.
- Utilises Test Driven Development and Behaviour Driven Development methodologies.
- Strong understanding of development patterns and best practices.

This is a permanent role paying £75,000-£85,000 per annum + bonus & benefits, and offers hybrid working (3 days per week, which can be split between the client's Reigate and London offices). Successful applicants will be contacted within 24 hours of applying. The processing and use of your personal data by us are in accordance with our Privacy Notice, which can be found on our website. William Alexander's Diversity & Inclusion Policy actively promotes the principles of equality, diversity, and inclusion in all its dealings with employees, workers, job applicants, clients, customers, suppliers, contractors, and the public. We believe that an inclusive work culture, where people of different backgrounds are valued equally, will ensure better outcomes for us all. We approach recruitment for our clients with the same perspective and qualities.
08/07/2024
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent Full time role*
*Position is bonus eligible*

Prestigious Financial Institution is currently seeking a Java Software Engineer. The candidate will support and work collaboratively with business analysts, team leads and the development team, contributing to the development of scalable and resilient hybrid and Cloud-based data solutions supporting critical financial market clearing and risk activities, and will collaborate with other developers, architects and product owners to support the enterprise's transformation into a data-driven organization. The Application Developer will be a team player and work well with business, technical and non-technical professionals in a project environment.

Responsibilities:
- Support the application development of Real Time and batch applications for business requirements in the agreed architecture framework and Agile environment
- Thoroughly analyzes requirements; develops, tests, and documents software quality to ensure proper implementation
- Follows agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements and audit requirements, that security rules are upheld, and that external-facing reporting is properly represented
- Performs application and project risk analysis and recommends quality improvements
- Assists Production Support by providing advice on system functionality and fixes as required
- Communicates all time delays or defects in the software immediately, in a clear and concise manner, to appropriate team members and management
- Experience with resolving security vulnerabilities

Qualifications: The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions.
- [Required] 3+ years of experience in building high speed, Real Time and batch solutions
- [Required] 3+ years of experience in Java
- [Preferred] Experience with high speed distributed computing frameworks like Flink, Apache Spark, Kafka Streams, etc.
- [Preferred] Experience with distributed message brokers like Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc.
- [Preferred] Experience with cloud technologies and migrations; experience preferred with AWS foundational services like VPCs, Security Groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, IAM, etc.
- [Preferred] Experience developing and delivering technical solutions using public cloud service providers like Amazon and Google
- [Required] Experience writing unit and integration tests with testing frameworks like JUnit and Citrus
- [Required] Experience working with various types of databases, like relational and NoSQL
- [Required] Experience working with Git
- [Preferred] Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, CI/CD pipelines, etc.
- [Preferred] Familiarity with monitoring-related tools and frameworks like Splunk, ElasticSearch, Prometheus and AppDynamics
- [Required] Hands-on experience with Java version 8 onwards, Spring, Spring Boot and REST APIs

Technical Skills:
- [Required] Java-based software development experience, including a deep understanding of Java fundamentals like data structures, concurrency and multithreading
- [Required] Experience in object-oriented design and software design patterns

Education and/or Experience:
- [Required] BS degree in Computer Science or a similar technical field required
03/07/2024
Full time
*Hybrid, 3 days onsite, 2 days remote*
*We are unable to sponsor as this is a permanent Full time role*

A prestigious financial company is looking for a Java Back End Developer. This developer will need experience with Java, Real Time environments, Spring, Spring Boot, multithreading, etc. Any experience with Kafka and DevOps tools is a plus.

Responsibilities:
- Support the application development of Real Time and batch applications for business requirements in the agreed architecture framework and Agile environment
- Thoroughly analyzes requirements; develops, tests, and documents software quality to ensure proper implementation
- Follows agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements and audit requirements, that security rules are upheld, and that external-facing reporting is properly represented
- Performs application and project risk analysis and recommends quality improvements
- Assists Production Support by providing advice on system functionality and fixes as required
- Communicates all time delays or defects in the software immediately, in a clear and concise manner, to appropriate team members and management
- Experience with resolving security vulnerabilities

Qualifications:
- Java-based software development experience, including a deep understanding of Java fundamentals like data structures, concurrency and multithreading
- Experience in object-oriented design and software design patterns
- BS degree in Computer Science or a similar technical field required
- 3+ years of experience in building high speed, Real Time and batch solutions
- 3+ years of experience in Java
- Experience with high speed distributed computing frameworks like Flink, Apache Spark, Kafka Streams, etc.
- Experience with distributed message brokers like Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc.
- Experience with cloud technologies and migrations; experience preferred with AWS foundational services like VPCs, Security Groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, IAM, etc.
- Experience developing and delivering technical solutions using public cloud service providers like Amazon and Google
- Experience writing unit and integration tests with testing frameworks like JUnit and Citrus
- Experience working with various types of databases, like relational and NoSQL
- Experience working with Git
- Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, CI/CD pipelines, etc.
- Hands-on experience with Java version 8 onwards, Spring, Spring Boot and REST APIs
03/07/2024
Full time
ASSOCIATE PRINCIPAL, SOFTWARE ENGINEERING (JAVA)
SALARY: $160k - $170k plus 15% bonus
LOCATION: Chicago, IL (Hybrid: 3 days onsite, 2 days remote)
NO SPONSORSHIP

Looking for a candidate with 5+ years of Back End Java development (version 8 or above); financial experience is a big plus. Must have experience with event-driven systems and cloud-based AWS data solutions; DevOps experience (Terraform, Ansible, Jenkins) is a big plus, as are the Java memory model, data structures, concurrency and multithreading, strong testing, and frameworks such as Flink, Apache Spark and Kafka Streams.

Screening questions:
- Re: Java, do you understand multithreading? What is your level of experience with Spring?
- Re: Kafka, can you answer basic user/developer questions?
- Re: Flink, do you have any experience?
- Do you have any skills or understanding of Big O notation?

This role supports and works collaboratively with business analysts, team leads and the development team: a contributor in developing scalable and resilient hybrid and Cloud-based data solutions supporting critical financial market clearing and risk activities, collaborating with other developers, architects and product owners to support the enterprise's transformation into a data-driven organization. The Specialist, Application Developer will be a team player and work well with business, technical and non-technical professionals in a project environment.

Primary Duties and Responsibilities: To perform this job successfully, an individual must be able to perform each primary duty satisfactorily.
- Support the application development of big data applications for business requirements in the agreed architecture framework and Agile environment
- Thoroughly analyzes requirements; develops, tests, and documents software quality to ensure proper implementation
- Follows agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements and audit requirements, that security rules are upheld, and that external-facing reporting is properly represented
- Performs application and project risk analysis and recommends quality improvements
- Assists Production Support by providing advice on system functionality and fixes as required
- Communicates all time delays or defects in the software immediately, in a clear and concise manner, to appropriate team members and management
- Experience with resolving security vulnerabilities

Qualifications: The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions.
- 5+ years of experience in building high speed, data-centric solutions
- 5+ years of experience in Java
- Experience with high speed distributed computing frameworks like Flink, Apache Spark, Kafka Streams, etc.
- Experience with distributed message brokers like Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc.
- Experience with cloud technologies and migrations
- Experience developing and delivering technical solutions using public cloud service providers like Amazon and Google
- Experience writing unit and integration tests with testing frameworks like JUnit and Citrus
- Experience following Git workflows
- Working knowledge of DevOps tools like Terraform, Ansible, Jenkins, Kubernetes, Helm and CI/CD pipelines
- Familiarity with monitoring-related tools and frameworks like Splunk, ElasticSearch, Prometheus and AppDynamics

Technical Skills:
- Java-based software development experience, including multithreading
- Fluent in object-oriented design
- Strong testing experience
- Experience working with two or more of the following: Unix/Linux environments, event-driven systems, transaction processing systems, distributed and parallel systems, large software system development, security software development, public-cloud platforms
- Hands-on experience with Java version 8 onwards, Spring, Spring Boot, Microservices and REST APIs
02/07/2024
Full time
Job Title: Product Architect
Job Location: London/Leeds/Edinburgh
Job Type: Perm

About the FCA: The FCA regulates the conduct of 50,000 firms in the UK to ensure our financial markets are honest, fair and competitive. We do this to make sure markets work well for individuals, businesses and the economy as a whole.

The team/department: The Data & Analytics Product Group (DAPG) sits within the Regulatory Systems Department of the Data, Technology & Innovation (DTI) Division. The DAPG supports the delivery of the FCA's Digital and Data Strategies, optimising the FCA's performance as a digitally led regulator. The Analyse & Insight (A&I) product team (within DAPG) specifically supports the FCA's Data Science and Advanced Analytics functions and the FCA's Data Science strategy.

What you will get from the role:
- Stimulating, innovative and experimental work to solve the biggest challenges facing financial regulation, and the opportunity to make a tangible impact on the organisation
- Exposure to new ideas, and the opportunity to increase your knowledge and understanding of new technologies in financial regulation
- Exposure to senior industry leaders and an opportunity to work with international regulators
- Working on key strategic initiatives supporting the FCA's Data Science and AI Strategy

The skills and experience you will have

Minimum: We are a signatory to the Government's Disability Confident scheme. This means that we will offer an interview to disabled candidates entering under the scheme, should they meet the minimum criteria for a role.
- Experience with Amazon Web Services and AI/Data Science-related components, e.g. AWS SageMaker, AWS Cognitive Services
- Exposure to Cloud Technologies
- Experience of Microservices architecture design and implementation

Essential:
- Experience with Jenkins, Git, Chef & Linux
- Hands-on development background, ready to deliver as required
- Security Cleared or eligible for Security Clearance
- Strong stakeholder management
- Designing resilient and scalable systems
- Automated development using declarative CI/CD pipelines
- Secure-by-design architecture
- Approach and implementation in Data Science and AI industry trends

If you are someone who is seeking that next challenge, and you have the experience and skills required, then please send me your CV. Our Recruitment Delivery Team are committed to offering an inclusive recruitment experience to all candidates. If you require any accommodations or adjustments as a result of disability, impairment, or health condition, please do not hesitate to let me know.
02/07/2024
Full time
*We are unable to sponsor for this permanent Full time role*
*Position is bonus eligible*

Prestigious Enterprise Financial Institution is currently seeking an Atlassian SaaS Platform Administrator. The candidate will coach others on platform development best practices. The Software Engineer must have a background in SaaS/Low-Code implementation and system administration, and be able to help others contribute to platform improvements.

Responsibilities:
- Provide technical leadership for planning, designing, installing, testing, and implementing solutions.
- Provide subject matter expertise on the SDLC platforms we maintain (Confluence, Jira, SpiraTest).
- Implement Atlassian plugins and support integration with other enterprise software.
- Ensure best practice adherence for core system development, security, tuning and performance.
- Support the Knowledge Management (KM) program strategy, transformation, and technical implementation.
- Create knowledge documentation related to requirements and solution design.
- Facilitate knowledge transfer sessions for administration and self-service.
- Develop a train-the-trainer model for support and administration.
- Take overall ownership of solution implementation, working with the team to ensure quality solutions that provide delightful experiences.
- Ensure systems and process compliance with regulatory and organizational requirements.
- Stay on top of industry trends and best practices, and propose process and tool changes to take advantage of new developments in the industry.

Qualifications:
- [Required] 5+ years of working experience in IT.
- [Required] 3+ years of experience in implementing Atlassian products.
- [Required] Experience with RESTful APIs, JSON, and XML.
- [Required] Experience with Agile/Scrum or DevOps methodologies.
- [Preferred] Experience working in Financial Services or another regulated environment.
- [Preferred] Experience working with SharePoint.
[Preferred] Experience with SQL, Python, PowerShell, or other Scripting languages [Preferred] Experience with System and Data Architecture [Preferred] Experience or knowledge of SDLC pipeline tools such as Git, Jenkins, SonarQube or similar tools Technical Skills: Problem-solving skills and solution-oriented attitude. Requires a complete understanding of the system development life cycle. Highly motivated to learn new things and motivated by challenges. Using analysis and critical thinking skills to determine and assess the needs of the user and then create software to meet the requirements. Proactive attitude in automating processes as much as possible. Ability to understand the strategic goals of the platforms we support and evaluate customer requests in that context. Providing clear instructions to the project team, clearly explaining how the software works to the customer and being available to answer any questions that may arise using exceptional communication skills. Working well with others on the team using effective interpersonal skills. Being able to efficiently identify and resolve issues that arise during the design, testing and maintenance processes using problem-solving skills. Experience: Bachelors degree in a STEM field preferred, 4 years of additional related work experience may be substituted for degree. 3-5 years of experience of SaaS platform implementation and/or system administration. 3+ years of hands-on experience developing and maintaining cloud platform technologies. Certifications in Atlassian products are preferred. Low code/COTS implementation certifications are desired.
01/07/2024
Full time
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Institution is currently seeking a Principal Financial IT Infrastructure Architect. Candidate will be part of a small Innovation team of Architects that will collaborate with development teams, Solutions Architects, vendors, and other stakeholders to define and drive the architectural vision, implementation, and continuous improvement of solutions running on the core Real Time data streaming and compute infrastructure platforms such as Kafka, Flink, and Kubernetes (K8s) in a hybrid environment. Responsibilities: Collaborate with cross-functional teams to design, create, and review software application architectures specifically tailored for streaming use cases. Ensure fault tolerance, scalability, and low-latency processing in streaming applications. Collaborate with DevOps teams to define deployment strategies and manage scalability. Drive optimization of streaming application performance by fine-tuning configurations, monitoring resource utilization, and identifying bottlenecks. Drive implementation of best practices for efficient data serialization, compression, and network communication. Create and maintain architecture documentation, including system diagrams, data flow, and component interactions. Maintain vendor relationships and participate in escalation sessions and postmortems. Evaluate and recommend tools and frameworks that enhance the performance and reliability of our streaming systems. Stay informed about industry trends related to Kafka, Flink, and Kubernetes. Qualifications: [Required] Strong communication skills to collaborate effectively and evangelize best practices with technical stakeholders. [Required] Advanced problem-solving skills and a logical approach to solving problems. [Required] Ability to execute spikes and provide code samples demonstrating best practices when developing solutions on Kafka and Flink. 
[Required] Experience with DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines. Technical Skills: Expert-level knowledge of Kafka. Expert-level knowledge of Flink. In-depth knowledge of on-premises networking as well as hybrid connectivity to AWS and/or Azure. Knowledge of Infrastructure as a Service (IaaS), Platform as a Service (PaaS), compute, storage, database, network, content distribution, security/IAM, microservices, management, and serverless services. Knowledge of Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Azure Resource Manager. Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes. Education and/or Experience: [Preferred] Bachelor's or Master's degree in an engineering discipline. [Required] 10+ years of experience architecting mission-critical cloud and on-prem Real Time data streaming and event-driven architectures. [Required] 10+ years of experience with Java. [Required] 5+ years of specific Kafka and Flink experience. [Preferred] 5+ years of Kubernetes experience. Certificates or Licenses: [Preferred] Confluent Certified Developer for Apache Kafka. [Preferred] AWS certifications (e.g. Solutions Architect Associate). [Preferred] Certified Kubernetes Application Developer.
21/06/2024
Full time