Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent full-time role* *Position is bonus eligible*

Prestigious financial company is currently seeking an AWS DevOps Software Engineer. The candidate will provide subject matter expertise for ongoing support of applications deployed to non-production AWS environments and for supporting third-party applications, identifying root causes and automating solutions in support of development. The candidate will have a deep understanding of DevOps practices, leadership skills, and expertise in a range of tools and technologies. You will be working in a fast-paced, dynamic environment, using cutting-edge tools and cloud technologies, and will manage day-to-day activities when called upon.

Responsibilities:
- Design, develop, release, and support cloud-native applications running in containers (Kubernetes and Docker) within AWS.
- DevOps strategy: develop and implement DevOps strategies and best practices to enhance development, testing, and deployment processes.
- Possess in-depth knowledge and hands-on experience with DevOps tools and technologies, including but not limited to GitHub, Jenkins, Terraform, Ansible, Kafka, AWS, and Apigee.
- Support the lower environments for incident and problem management; resolve complex support issues in non-production environments.
- Create procedural and troubleshooting documentation for cloud-native applications.
- Write complex automation scripts using common automation tools and languages such as YAML, JSON, Bash, Groovy, Ansible, Terraform, and Python.
- Perform other duties as assigned.

Qualifications:
- Excellent problem-solving skills.
- Ability to work independently and under minimal supervision.
- Ability to work with management to prioritize tasks.
- Strong confidence in one's abilities and knowledge.
- Ability to work well in crisis situations.
- Flexibility to be on call from 5 PM to 7 AM for 3 months per year.
- Good written and oral communication skills.
Technical Skills:
- Expertise in Kubernetes and Docker, including best practices.
- Expertise in cloud containerization: design, development, and troubleshooting.
- Strong programming or scripting skills in YAML, Helm charts, JSON, Bash, Groovy, Ansible, Terraform, Python, or Java.
- Advanced level in networking technologies.
- CI/CD tools such as Artifactory, Jenkins, Git, and SonarQube.
- Experience with cloud-based systems such as AWS, Azure, or Google Cloud, including expertise in IaC and CaC (Ansible, Terraform).
- Experience with Kafka infrastructure and processes.
- Understanding of software development methodologies and Agile practices.
- Excellent analytical and problem-solving skills, with the ability to troubleshoot and identify the root cause of issues.
- Good verbal and written communication skills, with the ability to collaborate effectively with cross-functional teams.
- Familiarity with monitoring and logging tools such as the ELK stack and Splunk.
- Familiarity with technologies used to support microservices.
- Minimum 7 years' experience working in a distributed multi-platform environment.
- Minimum 3 years' experience working with Kubernetes.
- Minimum 3 years' experience in scripting or programming.
- Bachelor's degree in a related area.
- Cloud certification a plus.
16/05/2024
Full time
Role responsibilities:
- Interacting with project roles as required to gain an understanding of the business environment, technical context, and organisational strategic direction.
- Advising our customer on the latest technologies and methodologies, designing and implementing innovative approaches to their problems using automation.
- Understanding security policies and implementing solutions to satisfy security requirements.
- Designing and implementing solutions which are highly available and scalable.

What you will bring to the team:
- Enthusiasm for collaboration and excellent communication skills (written and verbal).
- An interest in keeping up with emerging tools, techniques, and technologies.
- Effective time management and organisational skills.
- A flexible and Agile way of working within a fast-paced and ever-changing environment.
- Attention to detail with a pragmatic and enthusiastic attitude to work.

Desirable Skills and Technologies:
- Experience and knowledge of AWS/Azure and Azure Virtual Desktop.
- Experience with configuration management tools, e.g. Ansible (preferred), Puppet, Chef.
- Familiarity with (or the ability to learn easily) the following languages: Python, Bash scripting, React, Go.
- Experience with deploying, configuring, and managing cloud architecture and technologies in AWS environments.
- Experience with web application services such as NGINX, Apache, JBoss.
- Knowledge of OpenShift containerisation, RHEL 6/7/8, Docker, and Kubernetes.
- Experience with monitoring systems, e.g. ELK, Nagios, New Relic, DataDog, Splunk.
- Working knowledge of digital delivery processes and methodologies.
- Knowledge of the Atlassian toolset.
- Knowledge of JavaScript.
- Understanding of Front End technologies such as HTML5 and CSS3.
- Understanding of the nature of asynchronous programming, its quirks and workarounds.
- Understanding of database schemas and query languages.
- Knowledge of infrastructure as code and CI/CD pipelines, e.g. Jenkins, Terraform, Bitbucket, Git repositories, Concourse, TeamCity.
- An understanding of how to deploy and configure AWS components to adhere to tight security requirements.
- Awareness of security identity, access management, and authentication using products such as ADFS, SSL/TLS certificates, OIDC, OAuth2, Keycloak, or Red Hat SSO.
15/05/2024
Full time
Data DevOps Engineer - DevOps, Big Data - Permanent - Gloucestershire

Location: Gloucestershire/Bristol (full-time onsite)
Salary: £65K - £95K per annum, negotiable DOE
Benefits: Flexible working hours, career opportunities, private medical, excellent pension, and social benefits

Active DV Clearance is highly desirable. Please note that candidates will need to be eligible to undergo DV Clearance.

The Client: Curo are collaborating with a global edge-to-cloud company advancing the way people live and work. They help companies connect, protect, analyse, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world.

The Candidate: We are looking for a bright, driven, customer-focussed professional to join our client's Hybrid Cloud Delivery team and work alongside Enterprise Data Engineering Consultants to accelerate and drive data engineering opportunities. This is a fantastic opportunity for a dynamic individual with big ambitions who is an established technologist with both outstanding technical ability and a consultative mindset. It would suit an open-minded, personable self-starter who relishes the fluidity and collaborative nature of consultancy.

The Role: This role sits in our client's Advisory and Professional Services delivery team, who provide thought leadership, industry know-how, and technical excellence on consultative engagements, helping customers to reap maximum business benefit from their technical investments and leveraging best-in-class vendor and partner technologies to create relevant, effective, business-valued technical solutions. The Data DevOps Engineer role is all about the detailed development and implementation of scalable, clustered Big Data solutions, with a specific focus on automated dynamic scaling and self-healing systems.
Duties:
- Participating in the full life cycle of data solution development, from requirements engineering through to continuous optimisation engineering and all the typical activities in between.
- Providing technical thought leadership and advisory on technologies and processes at the core of the data domain, as well as data-domain-adjacent technologies.
- Engaging and collaborating with both internal and external teams, as a confident participant as well as a leader.
- Assisting with solution improvement activities driven either by the project or the service.

Essential Requirements:
- Excellent knowledge of Linux operating system administration and implementation.
- Broad understanding of the containerisation domain and adjacent technologies/services, such as Docker, OpenShift, Kubernetes, etc.
- Infrastructure as Code and CI/CD paradigms and systems, such as Ansible, Terraform, Jenkins, Bamboo, Concourse, etc.
- Monitoring utilising products such as Prometheus, Grafana, ELK, Filebeat, etc.
- Observability - SRE.
- Big Data solutions (ecosystems) and technologies such as Apache Spark and the Hadoop ecosystem.
- Edge technologies, e.g. NGINX, HAProxy, etc.
- Excellent knowledge of YAML or similar languages.

Desirable Requirements:
- JupyterHub awareness.
- MinIO or similar S3 storage technology.
- Trino/Presto.
- RabbitMQ or other common queue technology, e.g. ActiveMQ.
- NiFi.
- Rego.
- Familiarity with code development and Shell Scripting in Python, Bash, etc.

To apply for this Data DevOps Engineer permanent job, please click the button below and submit your latest CV. Curo Services endeavours to respond to all applications; however, this may not always be possible during periods of high volume. Thank you for your patience. Curo Services is a trading name of Curo Resourcing Ltd and acts as an Employment Business for contract and temporary recruitment as well as an Employment Agency in relation to permanent vacancies.
13/05/2024
Full time