NO SPONSORSHIP
Associate Principal, Software Engineering - Automating Risk Models (Quantitative Risk Management Area)
Chicago - on site 3 days a week
Salary: $185K - $195K + bonus

Looking for a hardcore developer who works within quantitative risk management and can develop applications and solutions for the QRM team. You will not build models; you will automate them. You will need to come from a financial institution, trading company, exchange, etc., and must have CI/CD pipelines, Infrastructure as Code, Kubernetes, Terraform, etc. Java, Python, or C++ preferred.

Responsibilities: Configure and manage resources in local and AWS cloud environments and deploy QRM's software on these resources. Develop CI/CD pipelines. Contribute to development of QRM's databases and ETLs. Integrate model prototypes, the model library, and model testing tools using industry best practices and innovations. Create unit and integration tests; build and enhance test automation tools. Participate in code reviews and demo accomplishments. Write technical documentation and user manuals. Provide production support and perform troubleshooting.

Qualifications: Strong programming skills: able to read and/or write code in a programming language (e.g., Java, C++, Python) in a collaborative software development setting. The role requires advanced coding, database, and environment-manipulation skills in a cloud environment. Financial products knowledge is a plus: understanding of markets and financial derivatives in equities, interest rate, and commodity products. Background in financial mathematics is a plus: derivatives pricing models, stochastic calculus, statistics and probability theory, linear algebra.

Technical Skills: Proficiency in Java (preferred) or another object-oriented language is required, including effective application of design patterns and best coding practices.
DevOps experience, with a good command of CI/CD processes and tools (e.g., Git, GitHub, Gradle, Jenkins, Docker, Helm, Harness). Experience with containerized deployment in cloud environments. Experienced with cloud technology (AWS preferred), infrastructure as code (e.g., Terraform), and managing and orchestrating containerized workloads (e.g., Kubernetes).

Education and/or Experience: Master's degree or equivalent in a computational or numerical field such as computer science, information systems, mathematics, or physics. 7+ years of experience as a software developer with exposure to cloud or high-performance computing.
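Since the role automates models rather than building them, much of the work amounts to wrapping model prototypes behind stable interfaces that pipelines can run and validate. A minimal Java sketch of that idea, under assumed names (ModelHarness and its validation rule are hypothetical, not part of any QRM codebase):

```java
import java.util.function.DoubleUnaryOperator;

// Hypothetical sketch: wrap a quant model prototype behind a stable interface
// so CI/CD pipelines can run and validate it automatically.
public class ModelHarness {
    private final DoubleUnaryOperator model; // the model itself is supplied by the QRM team

    public ModelHarness(DoubleUnaryOperator model) {
        this.model = model;
    }

    // Run the model and reject non-finite output before it reaches downstream ETLs.
    public double run(double input) {
        double result = model.applyAsDouble(input);
        if (!Double.isFinite(result)) {
            throw new IllegalStateException("model produced non-finite output for input " + input);
        }
        return result;
    }

    public static void main(String[] args) {
        // Stand-in model: a trivial continuous discount factor e^(-r).
        ModelHarness harness = new ModelHarness(r -> Math.exp(-r));
        System.out.println(harness.run(0.05));
    }
}
```

The same harness can back the unit and integration tests the posting mentions, since the pipeline only ever calls `run`.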
25/06/2024
Full time
ASSOCIATE PRINCIPAL, SOFTWARE ENGINEERING (JAVA)
SALARY: $160K - $170K plus 15% bonus
LOCATION: Chicago, IL - hybrid, 3 days onsite and 2 days remote
NO SPONSORSHIP

Looking for a candidate with 5+ years of Back End Java development (version 8 or above); financial experience a big plus. Must have event-driven systems experience and cloud-based AWS data solutions; any DevOps experience (Terraform, Ansible, Jenkins) is a plus. Big pluses: the Java memory model, data structures, concurrency and multithreading, strong testing skills, and Flink, Apache Spark, Kafka Streams, etc. Screening topics: Java multithreading; level of experience with Spring; ability to answer basic Kafka user/developer questions; any Flink experience; understanding of Big-O notation.

This role supports and works collaboratively with business analysts, team leads, and the development team. A contributor in developing scalable and resilient hybrid and cloud-based data solutions supporting critical financial market clearing and risk activities; collaborate with other developers, architects, and product owners to support enterprise transformation into a data-driven organization. The Specialist, Application Developer will be a team player and work well with business, technical, and non-technical professionals in a project environment.

Primary Duties and Responsibilities: To perform this job successfully, an individual must be able to perform each primary duty satisfactorily.
Support the application development of big data applications for business requirements in the agreed architecture framework and Agile environment. Thoroughly analyzes requirements, develops, tests, and documents software to ensure proper implementation. Follows agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements, audit requirements, and security rules, and that external-facing reporting is properly represented. Performs application and project risk analysis and recommends quality improvements. Assists Production Support by providing advice on system functionality and fixes as required. Communicates clearly and concisely, reporting all time delays or defects in the software immediately to appropriate team members and management. Experience with resolving security vulnerabilities.

Qualifications: The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions. 5+ years of experience in building high-speed, data-centric solutions. 5+ years of experience in Java. Experience with high-speed distributed computing frameworks like Flink, Apache Spark, Kafka Streams, etc. Experience with distributed message brokers like Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc. Experience with cloud technologies and migrations.
Experience developing and delivering technical solutions using public cloud service providers like Amazon and Google. Experience writing unit and integration tests with testing frameworks like JUnit and Citrus. Experience following Git workflows. Working knowledge of DevOps tools like Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines. Familiarity with monitoring tools and frameworks like Splunk, Elasticsearch, Prometheus, and AppDynamics.

Technical Skills: Java-based software development experience, including multithreading. Fluent in object-oriented design. Strong testing experience. Experience working with two or more of the following: Unix/Linux environments, event-driven systems, transaction processing systems, distributed and parallel systems, large software system development, security software development, public-cloud platforms. Hands-on experience with Java version 8 onwards, Spring, Spring Boot, Microservices, REST APIs.
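The screening questions above center on multithreading and the Java memory model. A minimal sketch of the core pattern they probe: partitioning work across an ExecutorService and joining partial results through Futures, with no shared mutable state (class and method names are illustrative, not from any real codebase):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Illustrative concurrency sketch: split an array into chunks, sum each chunk
// on its own thread, then combine the partial sums on the calling thread.
public class ParallelSum {
    public static long sum(long[] values, int parts) {
        ExecutorService pool = Executors.newFixedThreadPool(parts);
        try {
            int chunk = (values.length + parts - 1) / parts; // ceiling division
            List<Future<Long>> futures = new ArrayList<>();
            for (int i = 0; i < values.length; i += chunk) {
                final int from = i;
                final int to = Math.min(i + chunk, values.length);
                futures.add(pool.submit(() -> {
                    long s = 0;
                    for (int j = from; j < to; j++) s += values[j];
                    return s; // each task returns a partial sum; no locks needed
                }));
            }
            long total = 0;
            for (Future<Long> f : futures) total += f.get(); // get() also propagates task failures
            return total;
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        long[] xs = new long[100];
        for (int i = 0; i < xs.length; i++) xs[i] = i + 1;
        System.out.println(sum(xs, 4)); // 5050
    }
}
```

Because tasks only read their own slice and return a value, the happens-before guarantees of `submit`/`get` make the result correct without any explicit synchronization.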
25/06/2024
Full time
*Hybrid, 3 days onsite, 2 days remote* *We are unable to sponsor as this is a permanent Full time role*

A prestigious company is looking for an Associate Principal, Database Administrator. This DBA will focus on production support, performance, backup, and DBMS administration. The company needs someone with 7+ years of experience with DB2 LUW on Red Hat and PostgreSQL. This DBA is also expected to have heavy experience coding and reviewing SQL, and some experience with other scripting languages such as basic Java, Linux shell, Perl, etc.

Responsibilities: Assists with the design, implementation, and maintenance of databases. Manages database performance and disk usage. Provides support in database access methods. Provides consultation support in database analysis, modelling, coding, and production problem resolution. Develops maintenance, backup, and recovery procedures and documentation. Participates in Disaster Recovery drills. Provides primary on-call support for production problems. Understands and supports corporate data standards. Recommends and assists with new DBMS and operational standards. Participates in testing and evaluation of new software and software release upgrades. Maintains metadata repositories.

Qualifications: Bachelor's degree (or equivalent) in Computer Science, Engineering, Mathematics, or Business. 3+ years' experience developing and maintaining complex applications that make extensive use of a supported database technology, or 3+ years' experience as an associate DBA. Experienced in two or more programming languages and two or more scripting languages. Practiced at Entity/Relationship or Object modelling and translation to physical database designs. Proficient in DML, DDL, and database utilities for at least two DBMS technologies. Proficient in all access methods of a DBMS as well as the underlying operating system access methods. Understanding of all software subsystems (DBMS, TP Managers, etc.)
for one environment.

Technical Skills: 7+ years' experience with PostgreSQL. 7+ years' experience with DB2 LUW, preferably on Red Hat Linux. Proficient with coding and review of SQL, stored procedures, and triggers. 1+ years' Terraform, Ansible, Jenkins, and CI/CD skills. 1+ years' EDB Postgres and EDB Postgres Distributed experience. Basic Java, Perl, and Linux shell script skills. 1+ years' experience with SQL Server. 1+ years' experience with DB2 in a z/OS environment. 1+ years' experience with MySQL/MariaDB. Experience with BMC tools for DB2 (Change/Catalog Manager, MainView, Log Master).
25/06/2024
Full time
*Hybrid, 3 days onsite, 2 days remote* *We are unable to sponsor as this is a permanent Full time role*

A prestigious company is looking for an Associate Principal, Java Software Engineering. This engineer will focus on Back End Java development and must have experience with event-driven architecture, AWS data solutions, Kafka, multithreading, etc.

Responsibilities: Support the application development of big data applications for business requirements in the agreed architecture framework and Agile environment. Thoroughly analyzes requirements, develops, tests, and documents software to ensure proper implementation. Follows agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements, audit requirements, and security rules, and that external-facing reporting is properly represented. Performs application and project risk analysis and recommends quality improvements. Assists Production Support by providing advice on system functionality and fixes as required.

Qualifications: BS degree in Computer Science or a similar technical field required. 5+ years of experience in building high-speed, data-centric solutions. Java-based software development experience, including a deep understanding of Java fundamentals like the memory model, data structures, concurrency, and multithreading. Fluent in object-oriented design, industry best practices, software patterns, and architecture principles. Strong testing experience, including developing test plans, automated test cases, and working with test frameworks. Experience working with two or more of the following: Unix/Linux environments, event-driven systems, transaction processing systems, distributed and parallel systems, large software system development, security software development, public-cloud platforms. Hands-on experience with Java version 8 onwards, Spring, Spring Boot,
Microservices, REST APIs. Experience with high-speed distributed computing frameworks like Flink, Apache Spark, Kafka Streams, etc. Experience with distributed message brokers like Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc. Experience with cloud technologies and migrations. Experience preferred with AWS foundational services like VPCs, security groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, and IAM. Experience developing and delivering technical solutions using public cloud service providers like Amazon and Google. Experience writing unit and integration tests with testing frameworks like JUnit and Citrus. Experience working with various types of databases: relational, NoSQL, object-based, graph. Experience following Git workflows. Working knowledge of DevOps tools like Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines.
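The Kafka Streams work mentioned above boils down to stateful per-key aggregation over an event stream. A broker-free sketch of that pattern in plain Java, the kind of thing Kafka Streams expresses as `groupByKey().reduce(Long::sum)` (class and method names are illustrative only; a real job would consume from Kafka topics rather than direct method calls):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of per-key stream aggregation: consume keyed events and
// maintain a materialized running total per key, like a Kafka Streams KTable.
public class KeyedAggregator {
    private final Map<String, Long> totals = new HashMap<>();

    // Handle one event; merge folds the new amount into the running total.
    public void onEvent(String key, long amount) {
        totals.merge(key, amount, Long::sum);
    }

    // Query the materialized state, as interactive queries do in Kafka Streams.
    public long total(String key) {
        return totals.getOrDefault(key, 0L);
    }

    public static void main(String[] args) {
        KeyedAggregator agg = new KeyedAggregator();
        agg.onEvent("acct-1", 100);
        agg.onEvent("acct-2", 50);
        agg.onEvent("acct-1", 25);
        System.out.println(agg.total("acct-1")); // 125
    }
}
```

In a real deployment the framework would also handle partitioning, fault-tolerant state stores, and rebalancing; the sketch shows only the aggregation logic itself.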
25/06/2024
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible*

Prestigious Financial Institution is currently seeking a Principal Financial IT Infrastructure Architect. The candidate will be part of a small Innovation team of architects that collaborates with development teams, Solutions Architects, vendors, and other stakeholders to define and drive the architectural vision, implementation, and continuous improvement of solutions running on the core real-time data streaming and compute infrastructure platforms such as Kafka, Flink, and Kubernetes in a hybrid environment.

Responsibilities: Collaborate with cross-functional teams to design, create, and review software application architectures specifically tailored for streaming use cases. Ensure fault tolerance, scalability, and low-latency processing in streaming applications. Collaborate with DevOps teams to define deployment strategies and manage scalability. Drive optimization of streaming application performance by fine-tuning configurations, monitoring resource utilization, and identifying bottlenecks. Drive implementation of best practices for efficient data serialization, compression, and network communication. Create and maintain architecture documentation, including system diagrams, data flow, and component interactions. Maintain vendor relationships and participate in escalation sessions and postmortems. Evaluate and recommend tools and frameworks that enhance the performance and reliability of our streaming systems. Stay informed about industry trends related to Kafka, Flink, and Kubernetes.

Qualifications: [Required] Effective communication skills to collaborate and evangelize best practices with technical stakeholders. [Required] Advanced problem-solving skills and a logical approach to solving problems. [Required] Ability to execute spikes and provide code samples demonstrating best practices when developing solutions on Kafka and Flink.
[Required] Experience with DevOps tools, e.g., Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines.

Technical Skills: Expert-level knowledge of Kafka. Expert-level knowledge of Flink. In-depth knowledge of on-premises networking as well as hybrid connectivity to AWS and/or Azure. Knowledge of Infrastructure as a Service (IaaS), Platform as a Service (PaaS), compute, storage, database, network, content distribution, security/IAM, microservices, management, and serverless services. Knowledge of Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Azure Resource Manager. Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes.

Education and/or Experience: [Preferred] Bachelor's or Master's degree in an engineering discipline. [Required] 10+ years of experience architecting mission-critical cloud and on-prem real-time data streaming and event-driven architectures. [Required] 10+ years of experience with Java. [Required] 5+ years of specific Kafka and Flink experience. [Preferred] 5+ years of Kubernetes experience.

Certificates or Licenses: [Preferred] Confluent Certified Developer for Apache Kafka. [Preferred] AWS certifications (e.g., Solutions Architect Associate). [Preferred] Certified Kubernetes Application Developer.
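The low-latency streaming aggregation this role architects can be sketched framework-free: a tumbling window simply buckets events by their aligned window start, which is what Flink's TumblingEventTimeWindows does under the hood (the class below and its API are illustrative, not Flink's):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of tumbling-window counting: each event falls into
// exactly one fixed-size, non-overlapping window keyed by its start time.
public class TumblingWindowCounter {
    private final long windowMillis;
    private final Map<Long, Long> counts = new HashMap<>(); // window start -> event count

    public TumblingWindowCounter(long windowMillis) {
        this.windowMillis = windowMillis;
    }

    // Assign the event to its window by rounding the timestamp down.
    public void onEvent(long eventTimeMillis) {
        long windowStart = eventTimeMillis - (eventTimeMillis % windowMillis);
        counts.merge(windowStart, 1L, Long::sum);
    }

    // Count of events in the window containing the given instant.
    public long countAt(long timeMillis) {
        return counts.getOrDefault(timeMillis - (timeMillis % windowMillis), 0L);
    }

    public static void main(String[] args) {
        TumblingWindowCounter c = new TumblingWindowCounter(1_000); // 1-second windows
        c.onEvent(100);
        c.onEvent(900);
        c.onEvent(1_100);
        System.out.println(c.countAt(0));     // 2 events in [0, 1000)
        System.out.println(c.countAt(1_000)); // 1 event in [1000, 2000)
    }
}
```

A production Flink job would add watermarks for late events, checkpointed state, and parallel keyed partitions; the bucketing arithmetic, however, is the same.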
21/06/2024
Full time
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Institution is currently seeking a Principal Financial IT Infrastructure Architect. Candidate will be part of a small Innovation team of Architects that will collaborate with development teams, Solutions Architects, vendors, and other stakeholders to define and drive architectural vision, implementation and continuous improvement of solutions running on the core Real Time data streaming and compute infrastructure platforms such Kafka, Flink and K8s in a Hybrid Environment. Responsibilities: Collaborate with cross-functional teams to design, create and review software application architectures specifically tailored for streaming use cases. Ensure fault tolerance, scalability, and low-latency processing in streaming applications. Collaborate with DevOps teams to define deployment strategies and manage scalability. Drive optimization of streaming application performance by fine-tuning configurations, monitoring resource utilization, and identifying bottlenecks. Drive Implementation of best practices for efficient data serialization, compression, and network communication. Create and maintain architecture documentation, including system diagrams, data flow, and component interactions. Maintain vendor relationships and participate in escalation sessions and postmortems Evaluate and recommend tools and frameworks that enhance the performance and reliability of our streaming systems. Stay informed about industry trends related to Kafka, Flink, and Kubernetes. Qualifications: [Required] Effective communication skills to effectively collaborate and evangelize best practices with technical stakeholders. [Required] Advanced problem-solving skills and logical approach to solving problems [Required] Ability to execute spikes and provide code samples demonstrating best practices when developing solutions on Kafka and Flink. 
[Required] Experience with DevOps tools, eg Terraform, Ansible, Jenkins, Kubernetes, Helm and CI/CD pipeline etc. Technical Skills: Expert level knowledge of Kafka Expert level knowledge of Flink In depth knowledge of on-premises networking as well as the hybrid connectivity to AWS and/or Azure Knowledge of Infrastructure as a Service (IaaS), Platform as a Service (PaaS), compute, storage, database, network, content distribution, security/IAM, microservices, management, and serverless services Knowledge of Infrastructure as Code (IaC) such as Terraform, CloudFormation, or Azure Resource Manager Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes Education and/or Experience: [Preferred] Bachelor's or Master's degree in an engineering discipline [Required] 10+ years of experience architecting of mission critical Cloud and On-Prem Real Time data streaming and event-driven architectures [Required] 10+ years of experience with Java [Required] 5+ years of specific Kafka and Flink experience [Preferred] 5+ years of Kubernetes experience Certificates or Licenses: [Preferred] Confluent Certified Developer for Apache Kafka [Preferred] AWS certifications (eg Solutions Architect Associate) [Preferred] Certified Kubernetes Application Developer
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Institution is currently seeking a Principal Financial IT Infrastructure Architect. Candidate will be part of a small Innovation team of Architects that will collaborate with development teams, Solutions Architects, vendors, and other stakeholders to define and drive architectural vision, implementation and continuous improvement of solutions running on the core Real Time data streaming and compute infrastructure platforms such Kafka, Flink and K8s in a Hybrid Environment. Responsibilities: Collaborate with cross-functional teams to design, create and review software application architectures specifically tailored for streaming use cases. Ensure fault tolerance, scalability, and low-latency processing in streaming applications. Collaborate with DevOps teams to define deployment strategies and manage scalability. Drive optimization of streaming application performance by fine-tuning configurations, monitoring resource utilization, and identifying bottlenecks. Drive Implementation of best practices for efficient data serialization, compression, and network communication. Create and maintain architecture documentation, including system diagrams, data flow, and component interactions. Maintain vendor relationships and participate in escalation sessions and postmortems Evaluate and recommend tools and frameworks that enhance the performance and reliability of our streaming systems. Stay informed about industry trends related to Kafka, Flink, and Kubernetes. Qualifications: [Required] Effective communication skills to effectively collaborate and evangelize best practices with technical stakeholders. [Required] Advanced problem-solving skills and logical approach to solving problems [Required] Ability to execute spikes and provide code samples demonstrating best practices when developing solutions on Kafka and Flink. 
[Required] Experience with DevOps tools, e.g., Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines. Technical Skills: Expert-level knowledge of Kafka. Expert-level knowledge of Flink. In-depth knowledge of on-premises networking as well as hybrid connectivity to AWS and/or Azure. Knowledge of Infrastructure as a Service (IaaS), Platform as a Service (PaaS), compute, storage, database, network, content distribution, security/IAM, microservices, management, and serverless services. Knowledge of Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Azure Resource Manager. Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes. Education and/or Experience: [Preferred] Bachelor's or Master's degree in an engineering discipline. [Required] 10+ years of experience architecting mission-critical cloud and on-prem Real Time data streaming and event-driven architectures. [Required] 10+ years of experience with Java. [Required] 5+ years of specific Kafka and Flink experience. [Preferred] 5+ years of Kubernetes experience. Certificates or Licenses: [Preferred] Confluent Certified Developer for Apache Kafka. [Preferred] AWS certifications (e.g., Solutions Architect Associate). [Preferred] Certified Kubernetes Application Developer.
21/06/2024
Full time
Associate Principal, Software Programming - Quantitative Risk Management Area - Associate Principal, Software Engineering - Automating Risk Models. On site 3 days a week. Salary - $185-$195K + bonus. Looking for a hardcore developer who works within quantitative risk management and can develop applications and solutions for the QRM team. You will not build models; you will automate them. You will need to come from a financial institution, trading company, exchange, etc. You will develop hardcore applications. You will need experience with CI/CD pipelines, Infrastructure as Code, Kubernetes, Terraform, etc. Java, Python, or C++ preferred. Configure and manage resources in local and AWS cloud environments and deploy QRM's software on these resources. Develop CI/CD pipelines. Contribute to development of QRM's databases and ETLs. Integrate model prototypes, the model library, and model testing tools using best industry practices and innovations. Create unit and integration tests; build and enhance test automation tools. Participate in code reviews and demo accomplishments. Write technical documentation and user manuals. Provide production support and perform troubleshooting. Strong programming skills: able to read and/or write code using a programming language (e.g., Java, C++, Python) in a collaborative software development setting. The role requires advanced coding, database, and cloud environment manipulation skills. Financial products knowledge is a plus: understanding of markets and financial derivatives in equities, interest rate, and commodity products. Background in financial mathematics is a plus: derivatives pricing models, stochastic calculus, statistics and probability theory, linear algebra. Technical Skills: Proficiency in Java (preferred) or another object-oriented language is required, including effective application of design patterns and best coding practices.
DevOps experience, with a good command of CI/CD processes and tools (e.g., Git, GitHub, Gradle, Jenkins, Docker, Helm, Harness). Experience with containerized deployment in cloud environments. Experienced with cloud technology (AWS preferred), infrastructure-as-code (e.g., Terraform), and managing and orchestrating containerized workloads (e.g., Kubernetes). Education and/or Experience: Master's degree or equivalent in a computational or numerical field such as computer science, information systems, mathematics, or physics. 7+ years of experience as a software developer with exposure to cloud or high-performance computing.
20/06/2024
Full time