Are you an experienced Senior Java software engineer with a passion for clean code? Have you been involved in rearchitecting large monolithic applications into microservices? Are you interested in joining a team responsible for API development that enables developers to build their own microservices? If so, this could be the perfect Senior Java Software Engineer job for you!

You will join a specialist team based in Zurich that takes responsibility for API development. The team has developed an integration engine that enables developers to build their own microservices, and it leads the firm's digitalisation efforts. It follows a strict Agile approach to development, with high test coverage and Cucumber for BDD.

The core competencies are strong Java development skills, good knowledge of the Spring framework (Boot, Integration), microservices, and Kafka for streaming. Given the nature of the work, there is containerisation using Docker & Kubernetes. On the front end the team uses React, although Angular experience is also acceptable, provided you are happy to work with React.

As part of this team you will get the best of both worlds: you will work for a large, well-funded firm, but operate without much of the classic red tape associated with a large environment. The team communicates in German, so German fluency is mandatory.

The initial contract will run for 6 months, but the expectation is that this will be a long-term engagement with multiple extensions, depending on performance. If you are interested in finding out more about this Senior Java Software Engineer job, please send your CV to (see below) or alternatively you can call me.
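For candidates gauging fit: the Cucumber-style BDD workflow the ad mentions boils down to expressing behaviour as Given/When/Then steps. Below is a toy sketch of that pattern, written in Python for brevity rather than the team's Cucumber-JVM stack; the scenario and all names are illustrative, not from the posting.

```python
# Toy illustration of the Given/When/Then structure that Cucumber-style
# BDD uses. Not Cucumber-JVM; scenario and names are made up.

class OrderScenario:
    """A hypothetical scenario: adding an item to a cart via a microservice."""

    def given_an_empty_cart(self):
        self.cart = []
        return self

    def when_an_item_is_added(self, item):
        self.cart.append(item)
        return self

    def then_the_cart_contains(self, item):
        # the assertion step: the behaviour the scenario specifies
        assert item in self.cart, f"expected {item!r} in cart"
        return True

result = (OrderScenario()
          .given_an_empty_cart()
          .when_an_item_is_added("widget")
          .then_the_cart_contains("widget"))
```

In Cucumber proper, the Given/When/Then lines live in a plain-text feature file and are bound to step definitions; the fluent chain above just mimics that shape in one place.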
09/11/2024
Project-based
*Hybrid, 3 days onsite, 2 days remote*

A prestigious company is looking for an Associate Principal, Application/Cloud Engineering. This role is focused on engineering and maintaining lab environments in public cloud and data centers using IaC techniques. Candidates will need experience with DevOps tools such as Terraform, Ansible, Jenkins, Kubernetes, and AWS, as well as experience developing tools and automating tasks using languages such as Python, PowerShell, and Bash.

Responsibilities:
- Engineer and maintain lab environments in public cloud and data centers using Infrastructure as Code techniques
- Collaborate with Engineering, Architecture, and Cloud Platform Engineering teams to evaluate, document, and demonstrate proofs of concept for company infrastructure, applications, and services that impact the Technology Roadmap
- Document technology design decisions and conduct technology assessments as part of a centralized Demand Management process within IT
- Apply your expertise in compute, storage, database, serverless, monitoring, microservices, and event management to pilot new and innovative solutions to business problems
- Find opportunities to improve existing infrastructure architecture for performance, support, scalability, reliability, and security
- Incorporate security best practices, Identity and Access Management, and encryption mechanisms for data protection
- Develop automation scripts and processes to streamline routine tasks such as scaling, patching, backup, and recovery
- Create and maintain operational documentation, runbooks, and Standard Operating Procedures (SOPs) for the lab environments that will be used to validate assumptions within high-level solution designs

Qualifications:
- Bachelor's or master's degree in computer science (or a related field) or equivalent experience
- 7+ years of experience as a System or Cloud Engineer with hands-on implementation, security, and standards experience within a hybrid technology environment
- 3+ years of experience contributing to the architecture of cloud and on-prem solutions
- Ability to develop tools and automate tasks using scripting languages such as Python, PowerShell, Bash, Perl, or Ruby
- Experience with DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines
- Experience with distributed message brokers such as Kafka, RabbitMQ, ActiveMQ, or Amazon Kinesis
- In-depth knowledge of on-premises, cloud, and hybrid networking concepts
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Azure Resource Manager
- Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes
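The "automation scripts for routine tasks such as scaling, patching, backup, and recovery" line is the day-to-day core of this role. As a hedged sketch of that kind of work, here is a tiny backup-rotation helper in Python (one of the scripting languages the ad names); the snapshot naming scheme and retention policy are illustrative assumptions, not from the posting.

```python
# Hedged sketch of a routine automation task: decide which backup
# snapshots fall outside a keep-the-newest-N retention window.
# Snapshot names and policy are made up for illustration.

def backups_to_delete(snapshots, keep=3):
    """Given snapshot names with sortable timestamps, return those
    outside the retention window, oldest first."""
    ordered = sorted(snapshots)  # ISO-style dates sort lexically
    return ordered[:-keep] if keep else ordered

snaps = ["db-2024-11-05", "db-2024-11-06", "db-2024-11-07", "db-2024-11-08"]
# keep the 3 newest; only the oldest snapshot is flagged for deletion
stale = backups_to_delete(snaps, keep=3)
```

In practice the same decision logic would sit behind a cloud SDK or CLI call that lists and deletes real snapshots; separating the "which ones" decision from the "delete them" action keeps the script testable.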
08/11/2024
Full time
Role: Full Stack Engineer - .NET, JavaScript
Level: Junior & mid-level
Location: Cheshire
Flexibility going forward: 2 days per week in office
Salary: £35,000 - £55,000

Cheshire-based, international, market-dominant company undergoing digital transformation. This internationally successful business continues to go from strength to strength through acquisitions and development of technology. It is an ethical employer where people and customers are put first, and a leading brand in animal (pet and agriculture) pharma and food. Through continued growth and acquisitions, the technology now needs an overhaul as part of the digital transformation. They take a data-focused approach and value collaboration in their developers.

As a full stack developer you will work within an Agile team to improve the technology infrastructure and the internal platforms used to manage data, track orders, track sales, and support internal operations.

Why would you like to work for this company as a Full Stack Developer?
- A newly created team that will change the way technology is viewed and how it influences the business moving forward.
- A supportive, round-table environment where contributing to product ideas and development best practice is more than encouraged; knowledge sharing and progression are huge here.
- A digitally led organisation that strives to improve its platforms to suit internal and client needs, promoting a culture of collaboration where everyone is encouraged to share and listen to ideas.

What are they looking for in a Software Developer?
- Strong commercial skills in broad software engineering practices: OOP, platform building, solution design, web app design, and RESTful experience
- Experience working full stack, with a focus on C# .NET and JavaScript
- Use of data tooling: SQL, NoSQL, database platforming, etc.
- Good grasp of testing techniques, including unit testing
- Basic front end JavaScript experience with HTML and CSS
- Experience with NoSQL databases and Kafka messaging will make you stand out

If you're keen to learn more about this opportunity, please apply today or reach out to Joe O'Sullivan at Burns Sheehan. Burns Sheehan Ltd will consider applications based only on skills and ability and will not discriminate on any grounds.
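The ad singles out a "good grasp of testing techniques, unit testing". For junior applicants, that usually means being able to show the arrange-act-assert pattern; a minimal sketch follows (in Python for brevity; on this C#/.NET stack the same shape would appear in xUnit or NUnit, and the function under test is invented for the example).

```python
# Minimal arrange-act-assert unit test. The function under test
# (order_total) is a made-up example, not from the posting.

def order_total(prices, discount=0.0):
    """Sum line prices and apply a fractional discount."""
    return round(sum(prices) * (1.0 - discount), 2)

def test_order_total_applies_discount():
    # arrange: a known input
    prices = [10.0, 5.0]
    # act: run the unit under test
    total = order_total(prices, discount=0.1)
    # assert: compare against the expected result
    assert total == 13.5

test_order_total_applies_discount()
```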
08/11/2024
Full time
We're seeking a Senior Software Engineer with expert knowledge in systems development and a strong command of the entire software development life cycle, from design to deployment. As a core part of our team, you'll lead development, architecture, and information security engineering. You will join an exciting team based in Luxembourg and will be required to be on-site 2-3 days a week for the first few months. This is a hands-on role where you'll set the standard for best practices, programming tools, and techniques, while delivering innovative and reliable solutions across cloud and on-premises environments.

Key Responsibilities:
- Architect and Develop: Lead the design, deployment, and management of web, application server, middleware, and web infrastructure components (RedHat JBoss A-MQ, Redis, Kafka, Matomo) targeting cloud deployments using OpenShift, IaaS, and PaaS models.
- Project Leadership: Manage projects autonomously, coordinating a small team as needed and reviewing deliverables to ensure high standards.
- Innovate and Evaluate: Assess new infrastructure solutions and cutting-edge technologies, delivering critical applications that drive business value.
- Document and Secure: Create robust documentation around CFS architecture and security, especially for customer-facing web applications, ensuring compliance with relevant processes and standards.
- Collaborate Across Teams: Interface effectively with development, project management, infrastructure, and information security teams, as well as third-party vendors.
- Support Development: Develop and maintain Java/JavaEE integration components, including security modules and automation frameworks, while keeping thorough documentation and test suites.
- Production Implementation: Participate in approximately four major production implementations annually (including scheduled Saturday work).

Required Skills and Experience:
- Education: Master's degree (or equivalent) in Computer Science or a related field.
- Web Application Infrastructure: Proficient in configuring, deploying, and supporting web infrastructure (Apache httpd web server, Java application servers) on Linux.
- Security Expertise: Solid understanding of TLS (PKI), certificate/key deployment, and application security design for web-facing applications.
- Cloud and DevOps: Hands-on experience with RedHat OpenShift, Docker, Kubernetes, ArgoCD, Helm charts, Git, and cloud APIs (Google Cloud Platform preferred).
- Additional Skills: Strong technical writing skills for documenting architecture and security frameworks.
- Technologies: Proficient with IntelliJ/Eclipse IDEs, Apache Maven, Single Sign-On (OpenID Connect preferred), and JavaEE services and APIs.

Nice-to-Have Skills: Experience with ActiveMQ, Kafka, Ansible, Jenkins, and RedHat EAP is a plus.

If this is of interest, please apply: (see below)
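On the "TLS (PKI), certificate/key deployment" requirement: one routine chore in such a role is checking that a certificate's validity window covers the deployment date before rolling it out. A hedged sketch of just that check, in plain Python with made-up dates:

```python
from datetime import date

# Toy validity-window check of the kind done before deploying a TLS
# certificate. All dates below are illustrative, not from the posting.

def cert_is_valid(not_before, not_after, today):
    """True if `today` falls inside the certificate's validity window."""
    return not_before <= today <= not_after

# a certificate valid through calendar year 2024, checked on 8 Nov 2024
ok = cert_is_valid(date(2024, 1, 1), date(2025, 1, 1), date(2024, 11, 8))
```

In real deployments the `notBefore`/`notAfter` fields would be read from the certificate itself (e.g. via openssl or a TLS library) rather than hard-coded; the comparison logic is the same.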
08/11/2024
Project-based
*We are unable to sponsor for this permanent full-time role* *Position is bonus eligible*

A prestigious financial institution is currently seeking a Senior Java Software Engineer. The candidate will support and work collaboratively with business analysts, team leads, and the development team, contributing to scalable and resilient hybrid and cloud-based data solutions that support critical financial market clearing and risk activities. You will collaborate with other developers, architects, and product owners to support the enterprise's transformation into a data-driven organization. The Application Developer will be a team player who works well with business, technical, and non-technical professionals in a project environment.

Responsibilities:
- Support the development of Real Time and batch applications for business requirements within the agreed architecture framework and an Agile environment
- Thoroughly analyze requirements, then develop, test, and document software to ensure proper implementation
- Follow agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements, audit requirements, and security rules, and that external-facing reporting is properly represented
- Perform application and project risk analysis and recommend quality improvements
- Assist Production Support by providing advice on system functionality and fixes as required
- Communicate all time delays or defects in the software clearly, concisely, and immediately to appropriate team members and management
- Experience with resolving security vulnerabilities

Qualifications: The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions.
- [Required] 3+ years of experience building high-speed Real Time and batch solutions
- [Required] 3+ years of experience in Java
- [Preferred] Experience with high-speed distributed computing frameworks like Flink, Apache Spark, Kafka Streams, etc.
- [Preferred] Experience with distributed message brokers like Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc.
- [Preferred] Experience with cloud technologies and migrations, preferably with AWS foundational services like VPCs, security groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, and IAM
- [Preferred] Experience developing and delivering technical solutions using public cloud service providers like Amazon and Google
- [Required] Experience writing unit and integration tests with testing frameworks like JUnit and Citrus
- [Required] Experience working with various types of databases (relational, NoSQL)
- [Required] Experience working with Git
- [Preferred] Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines
- [Preferred] Familiarity with monitoring tools and frameworks like Splunk, ElasticSearch, Prometheus, and AppDynamics
- [Required] Hands-on experience with Java version 8 onwards, Spring, Spring Boot, and REST APIs

Technical Skills:
- [Required] Java-based software development experience, including deep understanding of Java fundamentals like data structures, concurrency, and multithreading
- [Required] Experience in object-oriented design and software design patterns

Education and/or Experience:
- [Required] BS degree in Computer Science or a similar technical field
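The "concurrency and multithreading" fundamental this posting requires is often probed via the producer/consumer pattern: one thread feeds work through a thread-safe queue, another drains it, and a sentinel signals completion. A toy sketch follows (in Python for brevity; in Java the same shape uses a BlockingQueue, and the doubling step stands in for real processing):

```python
import queue
import threading

# Toy producer/consumer sketch of the concurrency fundamentals the
# posting lists. Python here for brevity; the role itself is Java.

def produce(q, items):
    for item in items:
        q.put(item)
    q.put(None)  # sentinel: tells the consumer there is no more work

def consume(q, results):
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item * 2)  # stand-in for real processing

q = queue.Queue()   # thread-safe FIFO queue
results = []
producer = threading.Thread(target=produce, args=(q, [1, 2, 3]))
consumer = threading.Thread(target=consume, args=(q, results))
producer.start(); consumer.start()
producer.join(); consumer.join()
```

With a single producer and single consumer over a FIFO queue, processing order matches production order; scaling to multiple consumers needs one sentinel per consumer.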
08/11/2024
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this 6+ Month Contract role* Prestigious Financial Institution Firm is currently seeking a Metadata Data Lineage Analyst. Candidate will develop Metadata and Data Lineage Solutions for multiple data sources across On-Prem and Cloud environments including but not limited to Kafka, Protocol Buffers, REDIS, APIs, Databases, Flat Files, JSON, ETL/BI Tools and other Data Platform technologies etc. Responsibilities: Work with Technical SMEs/developers understanding the applications/systems design, development and create data flow diagrams/data mappings. Create Source to Target mapping documents reverse engineering the application Java code/BI tools/SQL queries for the identified data flows. Develop custom metadata connectors/scanners using programming tools to automate the metadata extraction from disparate data sources. Develop solutions developing programs to automate metadata extraction and data flow/data lineage/Source to target mapping documents for complex applications/systems/BI tools. Working on metadata management, administration, support and ingest data management assets using extension mappings, custom data assets, metadata bridges, connectors, third party metadata bridges ensuring data lineage/source to target data mapping. Create, develop, configure and execute end to end business and technical data lineage across disparate sources in accordance with the Data Governance Standards, Policies and Procedures. Design and build data capabilities like data quality, metadata, data catalog and data dictionary. Qualifications: 6 or more years of data analysis experience with robust understanding of metadata, data flows and mappings. Ability to understand the Java Code base; read and/or write code using a programming language (eg, Java, Python, etc.). Proficient with SQL and experience working with Git and experience with data analysis using Python/Pyspark. 
Hands-on experience with Java 8 onwards, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams. Experience working with various types of databases (relational, NoSQL, object-based). Ability to review application code to ensure it meets functional requirements and architectural and data standards. Proficiency in writing technical documentation for Java-based applications that process data in real time and batch. Ability to develop and implement data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to databases, flat files, JSON, APIs, Kafka, etc. Experience working with Protobuf, APIs, and Kafka as data sources is preferred. Experience with draw.io or other tools for creating architecture or data flow diagrams. Ability to multitask and meet aggressive deadlines efficiently and effectively. Experience in object-oriented design and software design patterns.
06/11/2024
Project-based
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent full-time role* *Position is bonus eligible* A prestigious financial institution is currently seeking a Senior Java Software Engineer. The candidate will support and work collaboratively with business analysts, team leads, and the development team; contribute to developing scalable and resilient hybrid and cloud-based data solutions supporting critical financial market clearing and risk activities; and collaborate with other developers, architects, and product owners to support the enterprise's transformation into a data-driven organization. The Application Developer will be a team player and work well with business, technical, and non-technical professionals in a project environment. Responsibilities: Support the development of real-time and batch applications for business requirements in the agreed architecture framework and an Agile environment. Thoroughly analyze requirements; develop, test, and document software to ensure proper implementation. Follow agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements, and audit requirements, that security rules are upheld, and that external-facing reporting is properly represented. Perform application and project risk analysis and recommend quality improvements. Assist Production Support by providing advice on system functionality and fixes as required. Communicate all time delays or defects in the software clearly, concisely, and immediately to the appropriate team members and management. Experience with resolving security vulnerabilities. Qualifications: The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions.
[Required] 3+ years of experience in building high-speed, real-time and batch solutions. [Required] 3+ years of experience in Java. [Preferred] Experience with high-speed distributed computing frameworks like Flink, Apache Spark, Kafka Streams, etc. [Preferred] Experience with distributed message brokers like Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc. [Preferred] Experience with cloud technologies and migrations; experience preferred with AWS foundational services like VPCs, security groups, EC2, RDS, S3 ACLs, KMS, AWS CLI, and IAM. [Preferred] Experience developing and delivering technical solutions using public cloud service providers like Amazon and Google. [Required] Experience writing unit and integration tests with testing frameworks like JUnit and Citrus. [Required] Experience working with various types of databases (relational, NoSQL). [Required] Experience working with Git. [Preferred] Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines. [Preferred] Familiarity with monitoring tools and frameworks like Splunk, Elasticsearch, Prometheus, and AppDynamics. [Required] Hands-on experience with Java 8 onwards, Spring, Spring Boot, and REST APIs. Technical Skills: [Required] Java-based software development experience, including a deep understanding of Java fundamentals like data structures, concurrency, and multithreading. [Required] Experience in object-oriented design and software design patterns. Education and/or Experience: [Required] BS degree in Computer Science or a similar technical field required.
06/11/2024
Full time
NO SPONSORSHIP Director, Cloud Engineering (AWS) Prescreen questions must be answered for all submittals: How strong are you on hybrid Kubernetes multi-cluster environments? Do you have experience setting up a network mesh or service mesh? Do you have experience working on Confluent Kafka? How wide is your Kubernetes and Kafka footprint? Have you worked on tuning Kubernetes and Kafka systems? The role is all about Kubernetes, containerization, and automation. The candidate needs to be able to incorporate applications into the cloud and into Kubernetes, and should come from a highly regulated background. Kafka, Terraform, and Ansible experience is a big plus. They will be managing over 8 people, need to have come up technically, and should know what they are doing. Hybrid: 3 days onsite and 2 days remote. Looking for a candidate with deep AWS cloud experience, preferably out of a financial services company. You will need 5 years of experience leading engineering teams. Focus areas: design and development, database architectures, IP networking, security, cloud operations, performance tuning, and Linux, with an emphasis on platform services and core services; a functional understanding of security and controls and their implementation within the cloud; cloud networking (firewalls, VPCs, secure gateways); and container management (Kubernetes). BS degree required. Qualifications: The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions. Required: 5+ years of demonstrated experience leading engineering teams. 10+ years of progressive experience in software engineering with an understanding of large-scale computing solutions (primarily AWS), including software design and development, database architectures, IP networking, security, cloud operations, and performance tuning. Technical Skills: Basic understanding of operating systems, including Linux.
Deep understanding of AWS services, with a strong focus on platform services and core services. Functional understanding of security, compliance, and controls and their implementation within the cloud. Functional understanding of cloud networking principles, including firewalls, VPCs, secure gateways, and more. Functional understanding of containers, container management, and container orchestration, including Kubernetes and its different components. Basic understanding of good delivery practices and continuous integration and improvement. Agile/Lean background for projects and project delivery. Strong understanding of automation, automation principles, and the different aspects of pipelines and automation delivery. Deep understanding of advanced cloud strategy and delivery. Deep understanding of advanced security segmentation and controls. Advanced automation principles and modern cloud design. Education and/or Experience: Bachelor's degree and six years of relevant experience, or a Master's degree and three years of relevant experience. Certificates or Licenses: AWS Solutions Architect Associate (required).
06/11/2024
Full time
*Hybrid, 3 days onsite, 2 days remote* A prestigious company is looking for a Java Developer - Metadata Lineage Analyst. This is a Java Developer position focused on data analysis, metadata, data flows, data mappings, and data lineage solutions. The analyst will not be doing application programming, but will develop custom metadata connectors/scanners using Java, Python, etc., and will need hands-on experience with Java 8 onwards, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams. Responsibilities: Develop metadata and data lineage solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies. Work with technical SMEs/developers to understand application/system design and development, and create data flow diagrams/data mappings. Create source-to-target mapping documents by reverse engineering the application Java code, BI tools, and SQL queries for the identified data flows. Develop custom metadata connectors/scanners using programming tools to automate metadata extraction from disparate data sources. Develop programs to automate metadata extraction and the production of data flow/data lineage/source-to-target mapping documents for complex applications, systems, and BI tools. Work on metadata management, administration, and support, and ingest data management assets using extension mappings, custom data assets, metadata bridges, connectors, and third-party metadata bridges, ensuring data lineage/source-to-target data mapping. Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources in accordance with data governance standards, policies, and procedures. Design and build data capabilities like data quality, metadata, data catalog, and data dictionary.
Qualifications: 6 or more years of data analysis experience with a robust understanding of metadata, data flows, and mappings. Ability to understand a Java code base; read and/or write code using a programming language (e.g., Java, Python). Proficient with SQL; experience working with Git and with data analysis using Python/PySpark. Hands-on experience with Java 8 onwards, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams. Experience working with various types of databases (relational, NoSQL, object-based). Ability to review application code to ensure it meets functional requirements and architectural and data standards. Proficiency in writing technical documentation for Java-based applications that process data in real time and batch. Ability to develop and implement data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to databases, flat files, JSON, APIs, Kafka, etc. Experience working with Protobuf, APIs, and Kafka as data sources is preferred. Experience with draw.io or other tools for creating architecture or data flow diagrams. Ability to multitask and meet aggressive deadlines efficiently and effectively. Experience in object-oriented design and software design patterns.
06/11/2024
Project-based
Metadata Solutions Developer Rate: Open Location: Chicago, IL Hybrid: 3 days onsite, 2 days remote *We are unable to provide sponsorship for this role* Qualifications: 6 or more years of data analysis experience with a robust understanding of metadata, data flows, and mappings. Ability to understand a Java code base; read and/or write code using a programming language (e.g., Java, Python). Proficient with SQL; experience working with Git and with data analysis using Python/PySpark. Hands-on experience with Java 8 onwards, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams. Experience working with various types of databases (relational, NoSQL, object-based). Ability to review application code to ensure it meets functional requirements and architectural and data standards. Proficiency in writing technical documentation for Java-based applications that process data in real time and batch. Ability to develop and implement data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to databases, flat files, JSON, APIs, Kafka, etc. Experience working with Protobuf, APIs, and Kafka as data sources is preferred. Experience with draw.io or other tools for creating architecture or data flow diagrams. Ability to multitask and meet aggressive deadlines efficiently and effectively. Experience in object-oriented design and software design patterns. Responsibilities: Develop metadata and data lineage solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies. Work with technical SMEs/developers to understand application/system design and development, and create data flow diagrams/data mappings.
Create source-to-target mapping documents by reverse engineering the application Java code, BI tools, and SQL queries for the identified data flows. Develop custom metadata connectors/scanners using programming tools to automate metadata extraction from disparate data sources. Develop programs to automate metadata extraction and the production of data flow/data lineage/source-to-target mapping documents for complex applications, systems, and BI tools. Work on metadata management, administration, and support, and ingest data management assets using extension mappings, custom data assets, metadata bridges, connectors, and third-party metadata bridges, ensuring data lineage/source-to-target data mapping. Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources in accordance with data governance standards, policies, and procedures. Design and build data capabilities like data quality, metadata, data catalog, and data dictionary.
06/11/2024
Project-based
Senior Backend Java Developer Salary: $140k-$150k + bonus Location: Chicago, IL Hybrid: 3 days onsite, 2 days remote *We are unable to provide sponsorship for this role* Qualifications: Bachelor's degree. 7-10+ years of related experience. Must have 5 years' tenure in current or previous role. Experience in Java 11+ (including the internal workings of Java) is required. Experience with app development in Golang. Experience developing software using object-oriented design, advanced patterns (like AOP), and multithreading is required. Experience with distributed message brokers like Kafka, IBM MQ, Amazon Kinesis, etc. is desirable. Experience with cloud technologies and migrations; experience preferred with AWS foundational services like VPCs, security groups, EC2, RDS, S3 ACLs, KMS, AWS CLI, and IAM. Must be able to write good-quality code with 80% or higher unit and integration test coverage. Experience with testing frameworks like JUnit and Citrus is desirable. Experience working with various types of databases (relational, NoSQL, object-based, graph). Experience following Git workflows is required. Familiarity with DevOps tools: Terraform, Ansible, Jenkins, Kubernetes, Docker, Helm, and CI/CD pipelines. Responsibilities: Actively participate in the design of highly performing, scalable, secure, reliable, and cost-optimized solutions. Primary responsibility is application design and development of next-gen clearing applications for business requirements in the agreed architecture framework and an Agile environment. Thoroughly analyze requirements; develop, test, and document software to ensure proper implementation. Follow agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements, and audit requirements, that security rules are upheld, and that external-facing reporting is properly represented.
Participate in code reviews based on high engineering standards. Write unit and integration tests based on chosen test frameworks. Assist Production Support by providing advice on system functionality and fixes as required.
06/11/2024
Full time
*Hybrid: 3 days onsite, 2 days remote*
*We are unable to sponsor as this is a permanent full-time role*
*NO CONTRACTORS OR CONSULTANTS*
A prestigious company is looking for an Associate Principal, Backend Java Developer. The company needs someone with 7-10 years of experience focused on Back End Java development: Java 11+, Kafka, Golang, multi-threading, AWS, etc. The developer will be working in a real-time, highly regulated financial environment.
Responsibilities:
- Actively participate in the design of highly performing, scalable, secure, reliable, and cost-optimized solutions
- Primary responsibility is the design and development of next-gen clearing applications for business requirements within the agreed architecture framework and an Agile environment
- Thoroughly analyze requirements, then develop, test, and document software to ensure proper implementation
- Follow agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements, and audit requirements, that security rules are upheld, and that external-facing reporting is properly represented
- Participate in code reviews based on high engineering standards
- Write unit and integration tests based on the chosen test frameworks
- Assist Production Support by providing advice on system functionality and fixes as required
Qualifications:
- BS degree in Computer Science or a similar technical field required; Master's preferred
- 7-10 years of experience building large-scale, compute- and event-driven solutions
- Experience with Java 11+ (including the internal workings of Java) is required
- Experience with application development in Golang
- Experience developing software using object-oriented design, advanced patterns (such as AOP), and multi-threading is required
- Experience with distributed message brokers such as Kafka, IBM MQ, or Amazon Kinesis is desirable
- Experience with cloud technologies and migrations; AWS foundational services (VPCs, security groups, EC2, RDS, S3 ACLs, KMS, AWS CLI, IAM) preferred
- Must be able to write good-quality code with 80% or higher unit and integration test coverage
- Experience with testing frameworks such as JUnit and Citrus is desirable
- Experience working with various types of databases: relational, NoSQL, object-based, graph
- Experience following Git workflows is required
- Familiarity with DevOps tools (e.g., Terraform, Ansible, Jenkins, Kubernetes, Docker, Helm, and CI/CD pipelines) is a plus
- Experience with performance optimization, profiling, and memory management
05/11/2024
Full time
*Hybrid: 3 days onsite, 2 days remote*
A prestigious company is looking for a Java Developer - Metadata Lineage Analyst. This is a Java Developer position focused on data analysis, metadata data flows, data mappings, and data lineage solutions. This analyst will not be doing application programming, but will develop custom metadata connectors/scanners using Java, Python, etc. Hands-on experience with Java 8 onwards, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams is required.
Responsibilities:
- Develop metadata and data lineage solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies
- Work with technical SMEs/developers to understand application/system design and development, and create data flow diagrams/data mappings
- Create source-to-target mapping documents by reverse engineering the application Java code, BI tools, and SQL queries for the identified data flows
- Develop custom metadata connectors/scanners using programming tools to automate metadata extraction from disparate data sources
- Develop programs to automate metadata extraction and data flow/data lineage/source-to-target mapping documents for complex applications, systems, and BI tools
- Work on metadata management, administration, and support; ingest data management assets using extension mappings, custom data assets, metadata bridges, connectors, and third-party metadata bridges, ensuring data lineage/source-to-target data mapping
- Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources in accordance with the Data Governance standards, policies, and procedures
- Design and build data capabilities such as data quality, metadata, data catalog, and data dictionary
Qualifications:
- 6 or more years of data analysis experience with a robust understanding of metadata, data flows, and mappings
- Ability to understand a Java code base; read and/or write code using a programming language (e.g., Java, Python)
- Proficiency with SQL, experience working with Git, and experience with data analysis using Python/PySpark
- Hands-on experience with Java 8 onwards, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams
- Experience working with various types of databases: relational, NoSQL, object-based
- Ability to review application development code to ensure it meets functional requirements and architectural and data standards
- Proficiency in writing technical documentation for Java-based applications that process data in real time and batch
- Ability to develop and implement data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to databases, flat files, JSON, APIs, Kafka, etc.
- Experience working with Protobuf, APIs, and Kafka as data sources is preferred
- Experience with draw.io or other tools for creating architecture or data flow diagrams
- Ability to multitask and meet aggressive deadlines efficiently and effectively
- Experience in object-oriented design and software design patterns
05/11/2024
Project-based
Senior Engineer, Cloud/Infrastructure Security
Salary: Open + bonus
Location: Chicago, IL
Hybrid: 3 days onsite, 2 days remote
*We are unable to provide sponsorship for this role*
Qualifications:
- Bachelor's degree in Computer Science or a related field
- 7+ years of experience as a System or Cloud Engineer with hands-on implementation, security, and standards experience within a hybrid technology environment
- 3+ years of experience contributing to the architecture of cloud and on-prem solutions
- Ability to develop tools and automate tasks using scripting languages such as Python, PowerShell, Bash, Perl, or Ruby
- In-depth knowledge of on-premises, cloud, and hybrid networking concepts
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Azure Resource Manager
- Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes
Preferred:
- Experience with DevOps tools, e.g., Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines
- Experience with distributed message brokers: Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc.
- Familiarity with security standards such as the NIST CSF
- Related certifications
Responsibilities:
- Engineer and maintain lab environments in the public cloud and data centers using Infrastructure as Code techniques
- Collaborate with Engineering, Architecture, and Cloud Platform Engineering teams to evaluate, document, and demonstrate proofs of concept for company infrastructure, applications, and services that impact the Technology Roadmap
- Document technology design decisions and conduct technology assessments as part of a centralized Demand Management process within IT
- Apply your expertise in compute, storage, database, serverless, monitoring, microservices, and event management to pilot new and innovative solutions to business problems
- Find opportunities to improve the existing infrastructure architecture for performance, support, scalability, reliability, and security
- Incorporate security best practices, Identity and Access Management, and encryption mechanisms for data protection
- Develop automation scripts and processes to streamline routine tasks such as scaling, patching, backup, and recovery
05/11/2024
Full time
Senior Backend Engineer (Go)
Remote, UK
6-Month Contract
An incredible opportunity for an experienced Senior Backend Engineer with advanced Go skills to join a prestigious tech client on a contract basis. This company is renowned for its engineering excellence, and they're looking for a Senior Backend Engineer who can take their distributed systems to the next level. You'll work on mission-critical, data-intensive applications that push the boundaries of technology, taking ownership of the full software engineering life cycle, including design, development, and implementation. Reporting to an Engineering Manager, you will leverage modern technologies like AWS, Kubernetes, Docker, and Kafka, while architecting and implementing microservices-based solutions in collaboration with cross-functional teams. You will also be responsible for optimising system performance, reliability, and scalability, and for participating in code reviews, design discussions, and knowledge sharing.
Senior Backend Engineer (Go) - Key Requirements:
- Significant professional experience in software development, with a strong focus on Back End systems
- Proficiency in Go/Golang and proven expertise in AWS, Kubernetes, and Docker
- Experience with end-to-end software engineering, including system design and architecture
- Hands-on experience working on complex, data-intensive applications
- A product-focused mindset and familiarity with working in technology-driven organisations or start-ups
- Hands-on skills in Kafka, Cassandra, gRPC, and microservices architecture are beneficial, as is experience contributing to open-source projects
If you're a passionate Senior Backend Engineer seeking a challenging and rewarding contract role with a reputable tech company, apply now! Our client is looking to onboard the right talent as soon as possible.
We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation, or age.
05/11/2024
Project-based
*We are unable to sponsor for this permanent full-time role*
*Position is bonus eligible*
A prestigious financial company is currently seeking a Cloud Automation and Tools Software Engineer with strong Python/PowerShell automation experience. The candidate will be part of a small Innovation team of engineers collaborating with stakeholders, partner teams, and Solutions Architects to research and engineer emerging technologies as part of a comprehensive, requirements-driven solution design. The candidate will develop technology engineering requirements, work on proof-of-concept and laboratory testing efforts using modern approaches to process and automation, and build, deploy, document, and manage lab environments within on-prem and cloud data centers to be used for proofs of concept and rapid prototyping. In this engineering role, you will use your technology background to evaluate emerging technologies and help OTSI leadership make informed decisions on changes to the Technology Roadmap.
Responsibilities:
- Engineer and maintain lab environments in the public cloud and the data centers using Infrastructure as Code techniques
- Collaborate with Engineering, Architecture, and Cloud Platform Engineering teams to evaluate, document, and demonstrate proofs of concept for infrastructure, applications, and services that impact the Technology Roadmap
- Document technology design decisions and conduct technology assessments as part of a centralized Demand Management process within IT
- Apply your expertise in compute, storage, database, serverless, monitoring, microservices, and event management to pilot new and innovative solutions to business problems
- Find opportunities to improve the existing infrastructure architecture for performance, support, scalability, reliability, and security
- Incorporate security best practices, Identity and Access Management, and encryption mechanisms for data protection
- Develop automation scripts and processes to streamline routine tasks such as scaling, patching, backup, and recovery
- Create and maintain operational documentation, runbooks, and Standard Operating Procedures (SOPs) for the lab environments that will be used to validate assumptions within high-level solution designs
Qualifications:
- Ability to think strategically and map architectural decisions/recommendations to business needs
- Advanced problem-solving skills and a logical approach to solving problems
- [Required] Ability to develop tools and automate tasks using scripting languages such as Python, PowerShell, Bash, Perl, or Ruby
- [Preferred] Experience with DevOps tools, e.g., Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines
- [Preferred] Experience with distributed message brokers: Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc.
Technical Skills:
- In-depth knowledge of on-premises, cloud, and hybrid networking concepts
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Azure Resource Manager
- Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes
- [Preferred] Familiarity with security standards such as the NIST CSF
Education and/or Experience:
- [Preferred] Bachelor's or Master's degree in Computer Science or a related field, or equivalent experience
- [Required] 7+ years of experience as a System or Cloud Engineer with hands-on implementation, security, and standards experience within a hybrid technology environment
- [Required] 3+ years of experience contributing to the architecture of cloud and on-prem solutions
Certificates or Licenses:
- [Preferred] Cloud computing certification such as AWS Solutions Architect Associate, Azure Administrator, or similar
- [Desired] Technical security certifications such as AWS Certified Security, Microsoft Azure Security Engineer, or similar
- [Desired] CCNA, Network+, or other relevant networking certifications
04/11/2024
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Company is currently seeking a Cloud Automation and Tools Software Engineer with strong Python/PowerShell automation experience. Candidate will be part of a small Innovation team of Engineers that will collaborate with stakeholders, partner teams, and Solutions Architects to research and engineer emerging technologies as part of a comprehensive requirements-driven solution design. Candidate will be developing technology engineering requirements and working on Proof-of-Concept and laboratory testing efforts using modern approaches to process and automation. Candidate will build/deploy/document/manage Lab environments within On-Prem/Cloud Datacenters to be used for Proof-of-Concepts and rapid prototyping. In this engineering role, you will use your technology background to evaluate emerging technologies and help OTSI Leadership make informed decisions on changes to the Technology Roadmap. 
Responsibilities: Engineer and maintain Lab environments in Public Cloud and the Data Centers using Infrastructure as Code techniques Collaborate with Engineering, Architecture and Cloud Platform Engineering teams to evaluate, document, and demonstrate Proof of Concepts for infrastructure, application and services that impact the Technology Roadmap Document Technology design decisions and conduct Technology assessments as part of a centralized Demand Management process within IT Apply your expertise in compute, storage, database, server-less, monitoring, microservices, and event management to pilot new/innovative solutions to business problems Find opportunities to improve existing infrastructure architecture to improve performance, support, scalability, reliability, and security Incorporate security best practices, Identity and Access Management, and encryption mechanisms for data protection Develop automation scripts and processes to streamline routine tasks such as scaling, patching, backup, and recovery Create and maintain operational documentation, runbooks, and Standard Operating Procedures (SOPs) for the Lab environments that will be used to validate assumptions within high level Solution Designs Qualifications: Ability to think strategically and map architectural decisions/recommendations to business needs Advanced problem-solving skills and logical approach to solving problems [Required] Ability to develop tools and automate tasks using Scripting languages such as Python, PowerShell, Bash, PERL, Ruby, etc [Preferred] Experience with DevOps tools, eg Terraform, Ansible, Jenkins, Kubernetes, Helm and CI/CD pipeline etc. [Preferred] Experience with distributed message brokers Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc. 
Technical Skills: In depth knowledge of on-premises, cloud and hybrid networking concepts Knowledge of Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Azure Resource Manager Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes [Preferred] Familiarity with security standards such as the NIST CSF Education and/or Experience: [Preferred] Bachelor's or master's degree in computer science related degree or equivalent experience [Required] 7+ years of experience as a System or Cloud Engineer with hands on implementation, security, and standards experience within a hybrid technology environment [Required] 3+ years of experience contributing to the architecture of Cloud and On-Prem Solutions Certificates or Licenses: [Preferred] Cloud computing certification such as AWS Solutions Architect Associate, Azure Administrator or something similar [Desired] Technical Security Certifications such as AWS Certified Security, Microsoft Azure Security Engineer or something similar [Desired] CCNA, Network+ or other relevant Networking certifications
04/11/2024
Full time
*We are unable to sponsor this permanent, full-time role* *Position is bonus eligible* Prestigious Financial Company is currently seeking a Cloud Automation and Tools Software Engineer with strong Python/PowerShell automation experience. The candidate will be part of a small Innovation team of engineers that collaborates with stakeholders, partner teams, and Solutions Architects to research and engineer emerging technologies as part of a comprehensive, requirements-driven solution design. The candidate will develop technology engineering requirements, work on Proof-of-Concept and laboratory testing efforts using modern approaches to process and automation, and build, deploy, document, and manage Lab environments within on-prem/cloud data centers for Proof-of-Concepts and rapid prototyping. In this engineering role, you will use your technology background to evaluate emerging technologies and help OTSI Leadership make informed decisions on changes to the Technology Roadmap.
Responsibilities:
- Engineer and maintain Lab environments in the public cloud and the data centers using Infrastructure as Code techniques
- Collaborate with Engineering, Architecture, and Cloud Platform Engineering teams to evaluate, document, and demonstrate Proof of Concepts for infrastructure, applications, and services that impact the Technology Roadmap
- Document technology design decisions and conduct technology assessments as part of a centralized Demand Management process within IT
- Apply your expertise in compute, storage, databases, serverless, monitoring, microservices, and event management to pilot new and innovative solutions to business problems
- Find opportunities to improve existing infrastructure architecture, enhancing performance, support, scalability, reliability, and security
- Incorporate security best practices, Identity and Access Management, and encryption mechanisms for data protection
- Develop automation scripts and processes to streamline routine tasks such as scaling, patching, backup, and recovery
- Create and maintain operational documentation, runbooks, and Standard Operating Procedures (SOPs) for the Lab environments that will be used to validate assumptions within high-level Solution Designs

Qualifications:
- Ability to think strategically and map architectural decisions/recommendations to business needs
- Advanced problem-solving skills and a logical approach to solving problems
- [Required] Ability to develop tools and automate tasks using scripting languages such as Python, PowerShell, Bash, Perl, Ruby, etc.
- [Preferred] Experience with DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines
- [Preferred] Experience with distributed message brokers such as Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc.
Technical Skills:
- In-depth knowledge of on-premises, cloud, and hybrid networking concepts
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Azure Resource Manager
- Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes
- [Preferred] Familiarity with security standards such as the NIST CSF

Education and/or Experience:
- [Preferred] Bachelor's or master's degree in Computer Science or a related field, or equivalent experience
- [Required] 7+ years of experience as a System or Cloud Engineer with hands-on implementation, security, and standards experience within a hybrid technology environment
- [Required] 3+ years of experience contributing to the architecture of cloud and on-prem solutions

Certificates or Licenses:
- [Preferred] Cloud computing certification such as AWS Solutions Architect Associate, Azure Administrator, or similar
- [Desired] Technical security certification such as AWS Certified Security, Microsoft Azure Security Engineer, or similar
- [Desired] CCNA, Network+, or other relevant networking certifications