Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent full-time role* *Position is bonus eligible*

A prestigious financial company is currently seeking a Senior Linux DevOps Engineer. The candidate will be responsible for the design and support of core platform engineering automation. This role will drive the strategy for infrastructure automation and be charged with improving application adoption, reducing overall operational support, and increasing end-user usability of our platform services. The candidate will provide the team leadership required to support a large, complex L3 Linux-based computing environment and an increasing transition to Linux infrastructure in AWS, assist in driving an infrastructure-as-code mentality throughout the organization, and demonstrate a passion for automation concepts and tools.

Responsibilities:
- Provide advanced system administration, operational support, and problem resolution for a large, complex Linux computing environment, including both virtualized and physical servers.
- Create and patch AMIs, perform pull requests, and write automation code using tools such as Ansible and Terraform.

Qualifications:
- Hands-on experience with Terraform, Kubernetes, Jenkins, Kafka, GitHub, and configuration management tools such as Ansible.
- Relevant experience with configuration and implementation of IaaS, infrastructure as code, AWS, Azure, etc.
- Extensive knowledge of Linux operating systems, Linux shells and standard utilities, and common Linux security tools at the L3 level.
- In-depth system administration knowledge and skills for Red Hat Linux.

Technical Skills:
- Kubernetes Experience - Strong knowledge of Kubernetes deployment frameworks/platforms, including Helm, Docker, Rancher, OpenShift, and EKS.
- Linux Experience - Advanced system administration, operational support, and problem resolution for a large, complex Linux computing environment, including both virtualized and physical servers; creating and patching AMIs, performing pull requests, and writing automation code using tools such as Ansible and Terraform.
- Cloud Experience - Strong knowledge of secure cloud infrastructure design and components such as servers, operating systems, networks, IAM, and storage. Cloud certifications, specifically AWS certifications, are preferred.
- Infra Automation - Expert knowledge of the core automation development toolchain, including Terraform, Ansible, Jenkins, Git, and Harness.
- CICD Experience - Mastery of CI/CD best practices in a large organization (GitOps/DevOps, secure builds, secure code promotion, deployments with Harness/Argo, automated testing of applications and infrastructure, integration of policy frameworks, cost optimization, SLSA best practices).
- Resilient Design - Experience architecting, implementing, and maintaining highly available, mission-critical environments for 24/7 availability.
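As a concrete illustration of the "create and patch AMIs" automation this role describes, the sketch below implements a date-stamped AMI naming scheme and a simple retention policy in Python. The base name and retention count are hypothetical, and the cloud API calls that would actually bake or deregister images are deliberately omitted; this is a minimal sketch of the idea, not the employer's actual tooling.

```python
from datetime import date


def ami_name(base: str, build_date: date) -> str:
    """Date-stamp an AMI name so each patch-and-bake run is traceable."""
    return f"{base}-{build_date:%Y%m%d}"


def amis_to_retire(names: list[str], keep: int = 3) -> list[str]:
    """Return the AMIs that fall outside the retention window.

    Assumes all names share the date-stamped suffix produced by ami_name,
    so lexicographic order matches chronological order.
    """
    return sorted(names)[:-keep] if len(names) > keep else []


if __name__ == "__main__":
    names = [ami_name("rhel9-base", date(2024, 3, d)) for d in (1, 2, 3, 4, 5)]
    print(amis_to_retire(names))  # the two oldest images
```

In a real pipeline the retired names would feed a deregistration step, while the freshly baked image gets the new date-stamped name.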
03/05/2024
Full time
*Hybrid: 3 days onsite, 2 days remote* *We are unable to sponsor, as this is a permanent full-time role*

A prestigious company is looking for a Linux Engineer. This engineer will focus on design, support, engineering, and automation for the Linux operating system, and will need hands-on experience with Terraform, Kubernetes, Jenkins, Ansible, AWS, Docker, CI/CD, DevOps, etc.

Responsibilities/Qualifications:
- Bachelor's degree, preferably in a technical discipline (Computer Science, Mathematics, etc.), or an equivalent combination of education and experience.
- 8+ years' experience in IT systems installation, operations, administration, and maintenance of cloud systems/virtualized servers.
- Hands-on experience with Terraform, Kubernetes, Jenkins, Kafka, GitHub, and configuration management tools such as Ansible.
- Relevant experience with configuration and implementation of IaaS, infrastructure as code, AWS, Azure, etc.
- Extensive knowledge of Linux operating systems, Linux shells and standard utilities, and common Linux security tools at the L3 level; in-depth system administration knowledge and skills for Red Hat Linux.
- Strong knowledge of Kubernetes deployment frameworks/platforms, including Helm, Docker, Rancher, OpenShift, and EKS.
- Advanced system administration, operational support, and problem resolution for a large, complex Linux computing environment, including both virtualized and physical servers.
- Creating and patching AMIs, performing pull requests, and writing automation code using tools such as Ansible and Terraform.
- Strong knowledge of secure cloud infrastructure design and components such as servers, operating systems, networks, IAM, and storage; cloud certifications, specifically AWS certifications, preferred.
- Expert knowledge of the core automation development toolchain, including Terraform, Ansible, Jenkins, Git, and Harness.
- Mastery of CI/CD best practices in a large organization (GitOps/DevOps, secure builds, secure code promotion, deployments with Harness/Argo, automated testing of applications and infrastructure, integration of policy frameworks, cost optimization, SLSA best practices).
- Experience architecting, implementing, and maintaining highly available, mission-critical environments for 24/7 availability.
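To make the Kubernetes requirement concrete: Kubernetes accepts JSON manifests as well as YAML, so a minimal Deployment can be sketched with nothing but the Python standard library. The application name, image, and replica count below are hypothetical placeholders.

```python
import json


def deployment_manifest(name: str, image: str, replicas: int = 2) -> dict:
    """Build a minimal Kubernetes Deployment manifest as a plain dict.

    Because Kubernetes accepts JSON, json.dumps(...) of this dict can be
    piped straight to `kubectl apply -f -`.
    """
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            # The selector must match the pod template's labels.
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }


if __name__ == "__main__":
    print(json.dumps(deployment_manifest("web", "registry.example.com/web:1.0"),
                     indent=2))
```

In practice a Helm chart or Terraform module would template these fields rather than hand-built dicts, but the structure being generated is the same.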
03/05/2024
Full time
Role: Scala Developer
Location: Osterley, UK
Duration: 6 months (with possible extension)
Hybrid work option: Yes (2-3 days from the office; if requested later, the candidate should be flexible to work full time from the office)
Years of experience required: 5+

Job details:
- Real-time data processing and RESTful microservices in Scala (Typelevel stack, Kafka, Cassandra, Kubernetes, GCP, AWS). Good working knowledge of Akka HTTP and Akka Streams is required to support existing services.
- Looking into how our personalisation services can evolve with machine learning.
- Having the freedom to self-organise as part of a cross-functional agile team.
- Refining the team's processes to continuously integrate and work towards a deliverable application.
- Championing best practices such as pair programming and TDD in order to develop clean, resilient code that performs at serious scale.
- Coaching and providing feedback to fellow developers.
- Growing our engineering culture, which is focussed on DevOps and GitOps principles.

How will you be doing this?
- Work in a motivated team, empowered to meet ambitious goals.
- Collaborate on technical choices, architecture, tools, and processes.
- Review code and give feedback to ensure that the highest standards are maintained.
- Actively improve overall software quality.
03/05/2024
Location: Hybrid (options in Newcastle, Edinburgh, Glasgow, Leeds, Manchester, Birmingham, London)

Job Summary: Join Scrumconnect, a catalyst for digital transformation that has impacted over 50 million citizens, saved taxpayers over £25 million, and launched 64 services in the past 24 months. As a Senior Scala Developer, you will lead our development initiatives, champion a culture of innovation and collaboration, and help define the future of digital public services.

Key Responsibilities:
- Lead Development Projects: Spearhead the development of high-quality applications using Scala and the Play Framework, focusing on scalability and performance.
- Software Architecture: Direct software design and architectural discussions to craft robust and scalable solutions, integrating modern design patterns.
- Mentorship: Mentor junior developers, fostering an environment of learning and growth by promoting best practices in coding and process.
- Collaborative Engagement: Work closely with cross-functional teams to ensure seamless delivery of comprehensive software solutions.
- Continuous Improvement: Drive technical excellence, clean-code principles, Test-Driven Development (TDD), and Behaviour-Driven Development (BDD) within the team.
- Innovation: Play a key role in the evolution of our Agile processes and strive for a high level of automation in continuous integration and delivery.

Required Qualifications:
- Minimum of 7 years of development experience, including at least 3 years specifically in Scala, with a strong portfolio of projects.
- Proficiency in functional programming, modern software design patterns, asynchronous programming, and multithreading.
- Skilled in the tech stack: Scala, Play Framework, NoSQL (MongoDB), ScalaTest/ScalaMock, WireMock, Jenkins, Kibana, Grafana, Git, IntelliJ.

Preferred Qualifications:
- Familiarity with Apache Kafka and RESTful web services.
- Strong understanding of architectural and integration design patterns.
- Commitment to Agile best practices.

Commitment: Permanent

About Scrumconnect: At Scrumconnect, we are dedicated to pioneering advancements in technology that serve public needs and lead to significant societal benefits. Our team comprises innovators and thinkers committed to redefining the possibilities of digital solutions in public services.

What We Offer:
- A competitive salary alongside performance-based incentives.
- Comprehensive health and wellness benefits.
- Opportunities for professional growth through continuous learning and development.
- Flexible working arrangements to support a healthy work-life balance.
- A vibrant, inclusive work environment where your contributions make a visible impact.

Join Us: If you are looking to make a significant impact and lead the development of critical digital services, we encourage you to apply. Please submit your CV and a cover letter detailing your relevant experience and why you are excited about this opportunity at Scrumconnect.
03/05/2024
Full time
NO SPONSORSHIP
Principal, Data Architecture
SALARY: $195k - $200k plus 27% bonus
LOCATION: Chicago, IL (hybrid: 3 days in office, 2 days remote)

Looking for a candidate who does data architecture and design: data lake and data warehouse solutions; schema design for relational and non-relational data; messaging; design of data science and data analytics solutions; Kafka and Protocol Buffers; SQL and NoSQL; Tableau and Power BI; Presto/Trino; data lake file formats (Avro, Parquet, ORC); infrastructure technologies; ServiceNow or similar. 10 years as a senior data architect, data engineer, or DBA lead; logical and conceptual data models; data modelling standards; data taxonomy; data governance.

Qualifications: The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions.
- Experience in the design of data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience in the design of data science and data analytics solutions
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, and ORC
- [Required] Experience in extracting and developing technical requirements from business goals and needs
- [Required] Experience in solution integration and operability
- [Required] Experience working with infrastructure technologies and teams
- [Required] Proficiency in using Microsoft Office products (Word, Excel, PowerPoint, Visio)
- [Required] Experience using ServiceNow or similar

Education and/or Experience: 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role.
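To make the relational schema-design requirement concrete, here is a minimal star-schema sketch using Python's built-in sqlite3 module: one dimension table, one fact table keyed to it, and the fact-to-dimension join that a typical analytic query performs. The table and column names are hypothetical stand-ins for a real warehouse design.

```python
import sqlite3

# Minimal star schema: one fact table referencing one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (
        product_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL
    );
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER NOT NULL REFERENCES dim_product(product_id),
        quantity   INTEGER NOT NULL,
        amount     REAL    NOT NULL
    );
""")
con.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "widget"), (2, "gadget")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 3, 30.0), (2, 1, 1, 10.0), (3, 2, 2, 50.0)])

# Typical analytic query: revenue per product via a fact-to-dimension join.
rows = con.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 50.0), ('widget', 40.0)]
```

The same fact/dimension split carries over to cloud warehouses and federated engines such as Presto/Trino; only the storage layer (e.g., Parquet files in a data lake) changes.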
02/05/2024
Full time
Performance Testing - CI/CD - Open Source Tools, UC4
C2C
LOCATION: CHICAGO - HYBRID, 3 DAYS ONSITE
Long Term Contract

Looking for a candidate to do performance testing using open-source tools such as JMeter and Gatling, with Perl and solid Python scripting. The role involves creating modules that multiply transactional data across the multiple platforms that store data in a financial environment, plus Java and cloud automation work, including converting Java code to Python. Roughly 20% of the role is SDET/QA automation testing using CI/CD concepts. Performance testing with open-source tools like JMeter and Gatling; Perl scripting, PowerShell scripting, solid Python scripting, and Java.

EXPERIENCE REQUIRED:
- Python scripting: familiarity with creating modules that multiply transactional data and other data multiplier strategies that will be used in test cycles of the Real Time Clearing System
- SDET automation testing skills / QA automation engineering
- Experience with performance engineering concepts and methodologies, as well as cloud technologies and migrations using a public cloud vendor, preferably with cloud foundational services such as AWS VPCs
- Solid utility building with Python, Perl, and PowerShell
- Test automation using CI/CD concepts

Languages & Technologies: Java, Kafka, Docker, Kubernetes, DB2, CyberArk, Harness, JIRA, Jenkins, Splunk, Confluence, Git, JSON, API Testing, Cucumber, Selenium, Terraform, Ansible, Veracode, Virtualan, UC4, Change Data Capture, AWS/Google/Azure Cloud, OpenAPI/Swagger, SOAP Web Services (JAX-WS), RESTful Web Services (JAX-RS), Apache CXF, Spring Core, Spring WS, Spring Transaction, Spring Integration, JDBC, Shell Scripting, XML, JavaScript, SQL, Python, JMeter, Gatling, Perl, PowerShell, SignalFx, AppDynamics.

Software Tools and Utilities: Jenkins, Kubernetes, Enterprise Architect (EA), Enterprise Manager-UM, SQL Developer, JConsole, Visual Studio, JMeter, Bitbucket, Git, CVS, SVN, PuTTY, Microsoft Visio, TOAD, SourceTree, JIRA, Confluence, Sonar, Bamboo, Splunk, Automic (UC4), Apache Kafka, LogicMonitor, BMC MainView, and real-time and historical monitoring tools on-prem and in the cloud. Web server/app server/container experience. Database technologies: DB2, PostgreSQL. Operating systems experience. Methodologies: Agile, Iterative, and Waterfall.
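The "data multiplier" idea named in this posting (modules that clone transactional data to inflate load for performance-test cycles) can be sketched in Python roughly as follows. This is a hedged illustration, not the actual Real Time Clearing System tooling; the record fields, ID suffixing, and timestamp-stagger rule are all invented for the example:

```python
import copy
from datetime import datetime, timedelta

def multiply_transactions(base_records, factor, id_field="txn_id", ts_field="ts",
                          stagger=timedelta(milliseconds=5)):
    """Clone each base record `factor` times, giving every clone a unique ID
    and a slightly staggered timestamp so downstream systems treat each
    clone as a distinct event."""
    out = []
    for n in range(factor):
        for rec in base_records:
            clone = copy.deepcopy(rec)
            clone[id_field] = f"{rec[id_field]}-{n:04d}"   # keep clones unique
            clone[ts_field] = rec[ts_field] + n * stagger  # avoid identical timestamps
            out.append(clone)
    return out

# Hypothetical seed transaction for the demo.
seed = [{"txn_id": "T1", "ts": datetime(2024, 5, 1, 9, 30), "qty": 100}]
load = multiply_transactions(seed, factor=3)
```

In practice the multiplied records would be serialized and pushed through the system under test (e.g. onto a Kafka topic) by a JMeter or Gatling scenario.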
01/05/2024
Project-based
*Hybrid: 3 days onsite, 2 days remote*

A prestigious company is looking for an Associate Principal, Appian Development. This role will focus on design, development, testing, and implementation of Appian.

Responsibilities:
- Applies expert knowledge of Java, Python, JavaScript, Node.js, Angular 2.0, or ReactJS and middleware technologies in independently designing and developing key services, with a focus on continuous integration and delivery
- Participates in code reviews, proactively identifying and mitigating potential issues and defects, and assisting with continuous improvement
- Integrates disparate data from REST and WebSocket services within a cohesive user interface
- Participates in innovative design and proofs of concept with emerging technologies and solutions
- Embraces industry best practices such as continuous integration, continuous deployment, automated testing, and TDD
- Follows agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements, audit requirements, and security rules, and that external-facing reporting is properly represented
- Writes unit and integration tests based on chosen DevOps frameworks

Qualifications:
- BS degree in Computer Science or a similar technical field (required)
- Appian Certified Developer (required)
- Blue Prism Certified Associate Developer or higher (preferred)
- 5+ years of front-end / user-experience development (required)
- 5+ years of JavaScript experience (required)
- 3+ years of experience automating workflows inside Appian and in conjunction with integration to other tools (required)
- 3+ years of experience in React application development (required)
- 3+ years of hands-on HTML5/CSS3 experience (required)
- Experience with Java and/or Python (required)
- Experience with popular JavaScript frameworks such as React, Node.js, Vue, Angular 2.0 (required)
- Experience working with WebSockets, HTTP/1.1, and HTTP/2 (required)
- Experience with RESTful APIs and JSON RPC (required)
- Ability to write clean, bug-free code that is easy to understand and easily maintainable (required)
- Experience with BDD methodologies and automated acceptance testing (required)
- 5+ years of hands-on experience in Java, including a good understanding of Java fundamentals such as the memory model, runtime environment, concurrency, and multithreading (required)
- 3+ years of past or current experience as a technical lead on a large-scale cloud-native project (platform: Unix/Linux; systems: event-driven, transaction processing, high-performance computing), including developing or architecting core libraries or frameworks used by the platform to support fundamental services such as storage, alert notifications, and security (required)
- Appian process modeling, smart services, rules, Tempo event services, database, and web services (required)
- Experience with cloud technologies and migrations using a public cloud vendor, preferably with cloud foundational services such as AWS VPCs, security groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, and IAM (required)
- Experience with distributed message brokers such as Kafka (required)
- Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, and Apache Flink (required)
- Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines (required)
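Of the integration requirements in this posting, JSON RPC is concrete enough to sketch. A minimal JSON-RPC 2.0 dispatcher in pure Python might look like the following; the registered method table is hypothetical and stands in for real Appian-facing services:

```python
import json

def make_dispatcher(methods):
    """Return a callable that maps a JSON-RPC 2.0 request string to a
    JSON-RPC 2.0 response string, per the spec's error codes."""
    def dispatch(raw):
        req = json.loads(raw)
        resp = {"jsonrpc": "2.0", "id": req.get("id")}
        fn = methods.get(req.get("method"))
        if fn is None:
            # -32601 is the spec's "Method not found" code.
            resp["error"] = {"code": -32601, "message": "Method not found"}
        else:
            try:
                resp["result"] = fn(*req.get("params", []))
            except Exception as exc:
                # -32603 is the spec's "Internal error" code.
                resp["error"] = {"code": -32603, "message": str(exc)}
        return json.dumps(resp)
    return dispatch

# Hypothetical method table for the example.
rpc = make_dispatcher({"add": lambda a, b: a + b})
```

A real deployment would sit this behind an HTTP or WebSocket endpoint and validate the `jsonrpc` version field before dispatching.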
01/05/2024
Full time
*Hybrid: 3 days onsite, 2 days remote*
*We are unable to sponsor, as this is a permanent full-time role*

A prestigious company is looking for a Principal, Data Architecture. This principal will focus on data architecture and design to support all IT departments throughout the company, including design of data lakes, data warehouses, data messaging, data modeling, and data science solutions. The company wants someone with 10+ years of data architect, data engineer, or DBA work experience.

Responsibilities:
- Design the data architecture of the organization to support its data-driven vision
- Create the design and blueprint of the data capabilities for the organization within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Responsible for requirements-based analysis and selection of data tools
- Responsible for setting up and enforcing data modeling standards
- Responsible for creating logical and conceptual data models
- Ensure that data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor platform environment
- Design and develop infrastructure and solution documentation and blueprints
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations

Qualifications:
- 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead
- Bachelor's degree or higher in a technical field
- Experience in design of data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience in design of data science and data analytics solutions
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, ORC
- Experience in solution integration and operability
- Experience working with Infrastructure Technologies and Teams
- Experience using ServiceNow or similar
01/05/2024
Full time
Principal, Data Architect
Salary: Open + Bonus
Location: Chicago, IL or Dallas, TX (hybrid: 3 days onsite, 2 days remote)

Qualifications:
- 10+ years of related experience leading to a Senior Architect level
- Experience in design of data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience in design of data science and data analytics solutions
- Expertise with the following: Kafka and Protocol Buffers; SQL and NoSQL databases; BI tools (Tableau, Power BI, etc.); federated query tools such as Presto/Trino; data lake file formats such as Avro, Parquet, ORC

Responsibilities:
- Design the data architecture of the organization to support its data-driven vision
- Create the design and blueprint of the data capabilities for the organization within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Responsible for requirements-based analysis and selection of data tools
- Responsible for setting up and enforcing data modeling standards
- Responsible for creating logical and conceptual data models
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Drive short- and long-term architecture strategy for the overall IT project portfolio for key business segments
- Responsible for comprehensive infrastructure designs including all aspects of IT
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor platform environment
- Design and develop infrastructure and solution documentation and blueprints
- Develop specifications for solution integrations
01/05/2024
Full time
*Hybrid, 3 days onsite, 2 days remote* *We are unable to sponsor as this is a permanent Full time role* A prestigious company is looking for a Principal, Data Architecture. This principal will focus on data architecture and design to support all IT departments throughout the company. These responsibilities include design of data lakes, data warehouses, data messaging, data modelling, data science, etc. The company wants someone with 10+ years of data architect/engineering/DBA work experience. Responsibilities: Design the data architecture of organization to support data driven vision Create design and blueprint of the data capabilities for the organization within the data framework Analyze structural requirements for new solutions and applications Optimize new and current database systems Responsible for requirements-based analysis and selection of data tools Responsible for setting up and enforcement of Data Modeling standards Responsible for creating logical and conceptual data models Ensure that data architecture principles are adhered to across the enterprise Assist in building data taxonomy and aligning it with business processes Work with Data Governance, IT, and data stewards on design of the strategic solution to data quality issues Communicates and validates program architecture with infrastructure, project management and technology services teams. Conducts end to end technical plan design. Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor platform environment. Design and develop infrastructure and solution documentation and blueprints. Develops specifications for solutions integrations. Maintain documentation library on standard procedures and approved solution configurations. 
Qualifications: 10+ years of progressive experience leading to a Senior-level Data Architect, Data Engineer, DBA, consultant, technical lead Bachelor's degree or higher in a technical field Experience in design of data lake/warehouse solutions, preferably in the cloud Experience in schema design for relational and non-relational data and messaging protocols Experience in design of data science and data analytics solutions Experience with Kafka and Protocol Buffers Expertise in both SQL and No SQL databases Expertise with BI tools (Tableau, Power BI etc) Expertise with federated query tools such as Presto/Trino Experience with data lake file formats such as Avro, Parquet, ORC Experience in solution integration and operability. Experience working with Infrastructure Technologies and Teams. Experience using Service-now or similar
01/05/2024
Full time
*Hybrid, 3 days onsite, 2 days remote* *We are unable to sponsor as this is a permanent Full time role* A prestigious company is looking for a Principal, Data Architecture. This principal will focus on data architecture and design to support all IT departments throughout the company. These responsibilities include design of data lakes, data warehouses, data messaging, data modelling, data science, etc. The company wants someone with 10+ years of data architect/engineering/DBA work experience. Responsibilities: Design the data architecture of organization to support data driven vision Create design and blueprint of the data capabilities for the organization within the data framework Analyze structural requirements for new solutions and applications Optimize new and current database systems Responsible for requirements-based analysis and selection of data tools Responsible for setting up and enforcement of Data Modeling standards Responsible for creating logical and conceptual data models Ensure that data architecture principles are adhered to across the enterprise Assist in building data taxonomy and aligning it with business processes Work with Data Governance, IT, and data stewards on design of the strategic solution to data quality issues Communicates and validates program architecture with infrastructure, project management and technology services teams. Conducts end to end technical plan design. Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor platform environment. Design and develop infrastructure and solution documentation and blueprints. Develops specifications for solutions integrations. Maintain documentation library on standard procedures and approved solution configurations. 
Qualifications:
- 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- Bachelor's degree or higher in a technical field
- Experience designing data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience designing data science and data analytics solutions
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, and ORC
- Experience in solution integration and operability
- Experience working with infrastructure technologies and teams
- Experience using ServiceNow or similar
Principal, Data Architect
Salary: Open + Bonus
Location: Chicago, IL or Dallas, TX
Hybrid: 3 days onsite, 2 days remote
Qualifications:
- 10+ years of related experience leading to a senior architect level
- Experience designing data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience designing data science and data analytics solutions
- Expertise with the following: Kafka and Protocol Buffers; SQL and NoSQL databases; BI tools (Tableau, Power BI, etc.); federated query tools such as Presto/Trino; data lake file formats such as Avro, Parquet, and ORC
Responsibilities:
- Design the data architecture of the organization to support its data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Perform requirements-based analysis and selection of data tools
- Set up and enforce data modeling standards
- Create logical and conceptual data models
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Drive short- and long-term architecture strategy for the overall IT project portfolio for key business segments
- Own comprehensive infrastructure designs covering all aspects of IT
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor platform environment
- Design and develop infrastructure and solution documentation and blueprints
- Develop specifications for solution integrations
01/05/2024
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Company is currently seeking a Principal Data Architect with Kafka and data lakes experience. The candidate will be responsible for data architecture and design to support all IT areas of the business. This role will establish standards, coordinate solution design with subject matter owners, document and design solutions, and ensure strategic goals are met at the operational level. The role will be a primary contributor to the Joint Technology Strategy and will oversee and conduct evaluations of technology and process during proofs of concept/value.
Responsibilities:
- Design the data architecture of the organization to support its data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Perform requirements-based analysis and selection of data tools
- Set up and enforce data modeling standards
- Create logical and conceptual data models
- Ensure that data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Advise the team on IT technology standards, requirements, methodologies, and processes
- Drive short- and long-term architecture strategy for the overall IT project portfolio for key business segments
- Own comprehensive infrastructure designs covering all aspects of IT
- Participate in proofs of concept to help define technology direction and enable business strategy
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor platform environment
- Design and develop infrastructure and solution documentation and blueprints
- Perform impact analysis and design modifications of existing systems to support new solutions and integrations
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations
- Communicate and coordinate between IT, Application Development, Operations, and Management
- Use traditional and Agile project/product approaches to execute projects and achieve business outcomes
- Drive business results through process and informal leadership
Qualifications:
- Experience designing data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience designing data science and data analytics solutions
- [Required] Ability to prioritize critical versus non-critical issues and communicate effectively to management
- [Required] Proven ability to contribute consistently and positively in a dynamic, fast-paced, and highly regulated environment
- [Required] Proven ability to facilitate project alignment between business and technical teams
- [Required] Demonstrated ability to dig beyond the surface to uncover root causes and offer solutions that deliver effective and efficient outcomes
- [Required] Experience operating in a collaborative environment to solve cross-functional problems
- [Required] Self-directed and detail-oriented
- [Required] Highly effective organization and planning skills
Technical Skills:
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, and ORC
- [Required] Experience extracting and developing technical requirements from business goals and needs
- [Required] Experience in solution integration and operability
- [Required] Experience working with infrastructure technologies and teams
- [Required] Proficiency with Microsoft Office products (Word, Excel, PowerPoint, Visio)
- [Required] Experience using ServiceNow or similar
- [Required] 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- [Preferred] Bachelor's degree or higher in a technical field
- [Preferred] Process improvement certifications such as Lean/Six Sigma
- [Preferred] IT service or process management certifications such as ITIL or ITAM
01/05/2024
Full time
Senior .NET Developer - Perm £70k - Sports Gaming Simulation
One of the largest sports gaming and gambling groups is searching for a Senior .NET Developer to join their simulation engineering team. As a senior developer you will join a strong .NET team combining maths, sports, and computer science to create simulations of sports games. The goal is to produce fast, accurate, and efficient sports simulations for a downstream cloud-based microservices architecture. In this role you will have the opportunity to learn an array of subjects, from probabilities, matrix manipulation, and statistics to in-depth sports game rules/data, advanced algorithms, latency management, and much more!
Salary: £70,000
Location: London, 2 days per week onsite
Requirements:
- 5-7 years of experience with C# and .NET (version 5+)
- Experience with MySQL
- Experience with and knowledge of SOLID principles
- Experience with messaging queues such as Kafka or Redis
- Experience with AWS
- An MSc in Software Engineering or Computer Science
- A passion for or interest in sports is desirable
If you're interested in joining a giant gambling group that is expanding its game simulation expertise with modern technology and high coding standards, please get in touch with an up-to-date CV to get a conversation rolling. Carbon60, Lorien & SRG - The Impellam Group STEM Portfolio are acting as an Employment Business in relation to this vacancy.
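The simulation work this role describes can be sketched in miniature. The team's production code is C#/.NET, but the idea translates directly; the Poisson goal model and the scoring rates below are illustrative assumptions for this sketch, not the company's actual model:

```python
import math
import random

def simulate_match(home_rate, away_rate, n=100_000, seed=42):
    """Estimate P(home win), P(draw), P(away win) by simulating n matches
    where each side's goal count is drawn from a Poisson distribution."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's algorithm: count uniform draws until their
        # running product falls below e^-lam.
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    home = draw = away = 0
    for _ in range(n):
        h, a = poisson(home_rate), poisson(away_rate)
        if h > a:
            home += 1
        elif h == a:
            draw += 1
        else:
            away += 1
    return home / n, draw / n, away / n

# Illustrative scoring rates only; a real model would fit these from data.
p_home, p_draw, p_away = simulate_match(1.6, 1.1)
```

A fixed seed makes runs reproducible, which matters when comparing simulated odds across code changes.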
01/05/2024
Full time
Contract - UC4 Automation Engineer
Rate: Open
Location: Chicago, IL
Hybrid: 3 days onsite, 2 days remote
Qualifications:
- Python scripting
- SDET automation testing skills/QA automation engineering
- Experience with performance engineering concepts and methodologies, as well as cloud technologies and migrations with a public cloud vendor, preferably using foundational cloud services such as AWS VPCs
- Solid utility building with Python, Perl, and PowerShell
- Test automation using CI/CD concepts
Languages & Technologies: Java, Kafka, Docker, Kubernetes, DB2, CyberArk, Harness, JIRA, Jenkins, Splunk, Confluence, Git, JSON, API testing, Cucumber, Selenium, Terraform, Ansible, Veracode, Virtualan, UC4, Change Data Capture, AWS/Google/Azure Cloud, OpenAPI/Swagger, SOAP web services (JAX-WS), RESTful web services (JAX-RS), Apache CXF, Spring Core, Spring WS, Spring Transaction, Spring Integration, JDBC, shell scripting, XML, JavaScript, SQL, Python, JMeter, Gatling, Perl, PowerShell, SignalFx, AppDynamics
Software Tools and Utilities: Jenkins, Kubernetes, Enterprise Architect (EA), Enterprise Manager-UM, SQL Developer, JConsole, Visual Studio, JMeter, Bitbucket, Git, CVS, SVN, PuTTY, Microsoft Visio, TOAD, SourceTree, JIRA, Confluence, Sonar, Bamboo, Splunk, Automic (UC4), Apache Kafka, LogicMonitor, BMC MainView, plus Real Time and historical monitoring tools on-prem and in the cloud; web server/app server/container experience
Database Technologies: DB2, PostgreSQL
Responsibilities:
- Performance testing with open-source tools like JMeter and Gatling
- Perl scripting, PowerShell scripting, solid Python scripting, and Java
- Set up parallel testing environments used to compare existing system business processes and data against a new cloud-based system/platform; the goal is to ensure the new system produces correct results and performs as expected before it can become the official system of record
- Take raw data, mask it, and create algorithms and solutions that increase the data load feeding our new clearing system without duplicates or other data issues that would cause it to be rejected
- Assist in the setup and maintenance of cloud-based performance and functional test environments in AWS, and define the steps to automate the process for continuous testing and iteration of cycles
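The masking responsibility described above can be sketched with salted hashing, one common approach to deterministic pseudonymisation: the same input value always maps to the same token, so masked records stay join-consistent across files. The field names and salt below are hypothetical, not the actual clearing-system schema:

```python
import hashlib

def mask_rows(rows, sensitive_fields, salt="test-env"):
    """Pseudonymise sensitive columns deterministically; non-sensitive
    columns pass through untouched."""
    for row in rows:
        masked = dict(row)
        for field in sensitive_fields:
            if masked.get(field):
                digest = hashlib.sha256(
                    (salt + masked[field]).encode("utf-8")
                ).hexdigest()[:12]
                masked[field] = f"{field}-{digest}"
        yield masked

# Hypothetical trade records; "account_id" stands in for any sensitive column.
raw = [
    {"account_id": "ACC123", "side": "BUY", "qty": "100"},
    {"account_id": "ACC123", "side": "SELL", "qty": "40"},
]
masked = list(mask_rows(raw, ["account_id"]))
```

Because both rows share an account, both masked rows receive the same token, so referential integrity survives masking and the load is not rejected for orphaned keys.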
30/04/2024
Project-based
ASSOCIATE PRINCIPAL, APPIAN SOFTWARE ENGINEERING
SALARY: $140k - $145k - $152k plus 15% bonus
LOCATION: Chicago, IL; hybrid 3 days onsite, 2 days remote
Looking for someone to design, develop, test, and implement Appian software. You will need 5 years of Front End/user experience development; JavaScript; experience automating workflows inside Appian; AWS; Unix/Linux; Java; Python; Node.js; Angular 2.0 or React; and middleware technologies. Working knowledge of DevOps tooling (Terraform, Ansible, Jenkins, Kubernetes, Helm) and CI/CD pipelines. A degree and Appian certified developer status are required.
- Contribute to design, technical direction, and architecture, collaborating with various teams to build fit-for-purpose solutions
- Apply expert knowledge of Java, Python, JavaScript, NodeJS, Angular 2.0 or ReactJS, and middleware technologies to independently design and develop key services with a focus on continuous integration and delivery
- Participate in code reviews, proactively identifying and mitigating potential issues and defects, and assist with continuous improvement
- Drive continuous improvement efforts by identifying and championing practical means of reducing time to market while maintaining high quality
Qualifications:
- 5+ years of Front End/user experience development (required)
- 5+ years of JavaScript experience (required)
- 3+ years of experience automating workflows inside Appian and integrating with other tools (required)
- 3+ years of experience in React application development (required)
- 3+ years of hands-on HTML5/CSS3 experience (required)
- Experience with Java and/or Python (required)
- Experience with popular JavaScript frameworks such as React, Node JS, Vue, Angular 2.0 (required)
- Experience working with WebSockets, HTTP/1.1, and HTTP/2 (required)
- Experience with RESTful APIs and JSON-RPC (required)
- Ability to write clean, bug-free code that is easy to understand and maintain (required)
- Experience with BDD methodologies and automated acceptance testing (required)
Technical Skills:
- 5+ years of hands-on Java experience, including a good understanding of Java fundamentals such as the memory model, runtime environment, concurrency, and multithreading (required)
- 3+ years (past or current) working as a technical lead on a large-scale cloud-native project (platform: Unix/Linux; system types: event-driven/transaction processing/high performance computing), including developing/architecting core libraries or frameworks used by the platform to support fundamental services such as storage, alert notifications, and security (required)
- Appian process modeling, smart services, rules and Tempo event services, database, and web services (required)
- Experience with cloud technologies and migrations with a public cloud vendor, preferably using foundational cloud services such as AWS VPCs, security groups, EC2, RDS, S3 ACLs, KMS, AWS CLI, and IAM (required)
- Experience with distributed message brokers using Kafka (required)
- Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, and Apache Flink (required)
- Experience working with various types of databases: relational, NoSQL, object-based, graph (required)
- Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines (required)
- Familiarity with monitoring tools and frameworks like Splunk, ElasticSearch, Prometheus, AppDynamics (required)
Education and/or Experience:
- BS degree in Computer Science or a similar technical field
- Appian certified developer
30/04/2024
Full time