Full-stack Software Engineer with strong Front End development capabilities in JavaScript, React, TypeScript, Node.js, CI/CD, and modern JavaScript testing frameworks. Cloud-native experience (AWS is ideal), ideally combined with Java back-end knowledge and pair/team programming experience, to build and evolve the micro front end for a market-leading cleantech company building a software product at the forefront of renewable energy asset/battery performance management. The role can be based hybrid in Central Cambridge or fully remote in the UK.

Essential skills as the Full Stack Software Engineer (Front End focus):
- Strong Front End skills, including JavaScript/TypeScript, Node.js and HTML5 + CSS3, with experience in modern frameworks such as React
- Sound experience and understanding of practices promoting Continuous Delivery, such as TDD and CI with trunk-based development, along with experience using the tools that enable them, such as modern JavaScript test frameworks and CI/CD pipelines
- An understanding of isolated UI component deployment to support rapid deployment and testing, to help evolve the micro front end product
- Solid understanding of cloud-native development and experience with AWS or another major cloud platform
- The desire to work collaboratively, through pair and team programming as well as asynchronously within the team
- Comfort tackling a wide range of problems as a T-shaped software engineer

Desirable skills:
- Exposure to and experience in full-stack development, including different toolsets and languages; ideally Java and the Spring ecosystem, as these are used extensively, but experience in a wide range of technology is also welcome
- Interest and experience in Domain-Driven Design and Behaviour-Driven Design
- Experience using synchronous APIs, messaging and event-driven architectures, and related tooling; Kafka is the backbone of the event-driven system
- Docker and containerised deployments
- Configuration management, deployment management and related tools

This role will be part of a small team developing the Front End product for customer-centric renewables performance management, with a strong focus on enablement across stream-aligned product teams through the development of a Front End framework and tooling, backed up by collaboration and coaching to ensure a coherent cross-domain product. This is a great chance to focus on the evolution of the Front End product to support automated testing and continuous delivery of multiple features to multiple geographies, and to flex your technical skills by delivering high-quality, robust code. It is a great chance to join a cleantech company with a strong software engineering culture, building a product at the forefront of renewable energy asset/battery performance management. Opus Resourcing acts as an employment agency for permanent employment.
09/05/2024
Full time
*We are unable to sponsor for this permanent full-time role* *Position is bonus eligible*

Prestigious financial company is currently seeking a Principal Data Architect with Kafka and data lake experience. The candidate will be responsible for data architecture and design to support all IT areas of the business. This role will establish standards, coordinate solution design with subject matter owners, document and design solutions, and ensure strategic goals are met at the operational level. The role will be a primary contributor to the Joint Technology Strategy and will oversee and conduct evaluations of technology and process during proofs of concept/value.

Responsibilities:
- Design the data architecture of the organization to support a data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Conduct requirements-based analysis and selection of data tools
- Set up and enforce data modeling standards
- Create logical and conceptual data models
- Ensure that data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Advise teams on IT technology standards, requirements, methodologies, and processes
- Drive short- and long-term architecture strategy for the overall IT project portfolio for key business segments
- Produce comprehensive infrastructure designs covering all aspects of IT
- Participate in proofs of concept to help define technology direction and enable business strategy
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor platform environment
- Design and develop infrastructure and solution documentation and blueprints
- Perform impact analysis and design modifications of existing systems to support new solutions and integrations
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations
- Communicate and coordinate between IT, Application Development, Operations, and Management
- Use traditional and Agile project/product approaches to execute projects and achieve business outcomes
- Drive business results through process and informal leadership

Qualifications:
- Experience in the design of data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience in the design of data science and data analytics solutions
- [Required] Ability to prioritize critical versus non-critical issues and communicate effectively to management
- [Required] Proven ability to contribute consistently and positively in a dynamic, fast-paced, and highly regulated environment
- [Required] Proven ability to facilitate project alignment between business and technical teams
- [Required] Demonstrated ability to dig beyond the surface to uncover root causes and offer solutions that deliver effective and efficient outcomes
- [Required] Experience operating in a collaborative environment to solve cross-functional problems
- [Required] Self-directed and detail-oriented
- [Required] Highly effective organization and planning skills

Technical skills:
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, ORC
- [Required] Experience in extracting and developing technical requirements from business goals and needs
- [Required] Experience in solution integration and operability
- [Required] Experience working with infrastructure technologies and teams
- [Required] Proficiency with Microsoft Office products (Word, Excel, PowerPoint, Visio)
- [Required] Experience using ServiceNow or similar
- [Required] 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- [Preferred] Bachelor's degree or higher in a technical field
- [Preferred] Process improvement certifications such as Lean/Six Sigma
- [Preferred] IT service or process management certifications such as ITIL or ITAM
08/05/2024
Full time
NO SPONSORSHIP
Principal, Data Architecture
SALARY: $195k - $200k plus 27% bonus
LOCATION: Chicago, IL (hybrid: 3 days in office, 2 days remote)

Looking for a candidate with hands-on data architecture and design experience: data lakes and data warehouse solutions; schema design for relational and non-relational data and messaging; design of data science and data analytics solutions; Kafka and Protocol Buffers; SQL and NoSQL; Tableau and Power BI; Presto/Trino; data lake formats such as Avro, Parquet, and ORC; infrastructure technologies; ServiceNow or similar. 10 years as a senior data architect, data engineer, DBA, or lead, covering logical and conceptual data models, data modelling standards, data taxonomy, and data governance.

Qualifications: The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions.
- Experience in the design of data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience in the design of data science and data analytics solutions
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, ORC
- [Required] Experience in extracting and developing technical requirements from business goals and needs
- [Required] Experience in solution integration and operability
- [Required] Experience working with infrastructure technologies and teams
- [Required] Proficiency with Microsoft Office products (Word, Excel, PowerPoint, Visio)
- [Required] Experience using ServiceNow or similar

Education and/or Experience: 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
08/05/2024
Full time
Performance Testing - CI/CD - Open Source Tools, UC4
C2C
LOCATION: CHICAGO - HYBRID, 3 DAYS ONSITE
Long-term contract

Looking for a candidate to do performance testing using open-source tools such as JMeter and Gatling, with Perl and solid Python scripting. Familiar with creating modules that multiply transaction data across multiple platforms and store data in a financial environment; Java and cloud automation, including reviewing Java code and converting it to Python. Approximately 20% SDET/QA automation testing using CI/CD concepts. Performance testing with open-source tools like JMeter and Gatling; Perl scripting, PowerShell scripting, solid Python scripting, and Java.

EXPERIENCE REQUIRED:
- Python scripting: familiarity with creating modules that multiply transactional data, and other data-multiplier strategies used in test cycles of the Real Time Clearing System
- SDET automation testing skills/QA automation engineering
- Experience with performance engineering concepts and methodologies, as well as cloud technologies and migrations with a public cloud vendor, preferably using cloud foundational services such as AWS VPCs
- Solid utility building with Python, Perl and PowerShell
- Test automation using CI/CD concepts

Languages & Technologies: Java, Kafka, Docker, Kubernetes, DB2, CyberArk, Harness, JIRA, Jenkins, Splunk, Confluence, Git, JSON, API Testing, Cucumber, Selenium, Terraform, Ansible, Veracode, Virtualan, UC4, Change Data Capture, AWS/Google/Azure Cloud, OpenAPI/Swagger, SOAP Web Service (JAX-WS), RESTful Web Service (JAX-RS), Apache CXF, Spring Core, Spring WS, Spring Transaction, Spring Integration, JDBC, Shell Scripting, XML, JavaScript, SQL, Python, JMeter, Gatling, Perl, PowerShell, SignalFx, AppDynamics.

Software tools and utilities: Jenkins, Kubernetes, Enterprise Architect (EA), Enterprise Manager-UM, SQL Developer, JConsole, Visual Studio, JMeter, Bitbucket, Git, CVS, SVN, PuTTY, Microsoft Visio, TOAD, SourceTree, JIRA, Confluence, Sonar, Bamboo, Splunk, Automic (UC4), Apache Kafka, LogicMonitor, BMC MainView, and real-time and historical monitoring tools on-prem and in the cloud.

Web servers/app servers/containers experience. Database technologies: DB2, PostgreSQL. Operating systems experience. Methodologies: Agile, Iterative & Waterfall.
08/05/2024
Project-based
Platform Engineer vacancy requiring profound API and streaming platforms knowledge for our Zurich-based client in the financial sector.

Your tasks:
- Designing, developing, and maintaining high-performance APIs and streaming solutions to support our platform's functionality and scalability requirements
- Collaborating with product managers, software engineers, and other stakeholders to define API specifications, integration requirements, and streaming protocols
- Implementing best practices for API design, including versioning, authentication, authorization, and documentation, to ensure developer-friendly interfaces
- Architecting and optimizing microservices-based systems to enable efficient data streaming, real-time processing, and event-driven architectures
- Troubleshooting and debugging complex issues related to API integrations, data streaming, and platform performance, and implementing effective solutions

Your experience/knowledge:
- Proficiency in programming languages such as Java, Python, or Go, and experience with API frameworks
- Experience with streaming technologies like Kafka, Apigee, Apache Flink, or Spark Streaming, and real-time data processing frameworks
- Strong understanding of microservices architecture, containerization, and cloud computing platforms
- Solid understanding of API security best practices, OAuth, JWT, and API gateway technologies

Language skills: English - fluent in written and spoken

Your soft skills:
- Excellent problem-solving skills, attention to detail, and the ability to thrive in a fast-paced, collaborative environment
- Effective communication skills and the ability to articulate technical concepts to non-technical stakeholders

Location: Zurich, Switzerland
Sector: Financial
Start: ASAP
Duration: 12MM+
Ref. Nr.: BH21638

Take the next step and send us your resume along with a daytime phone number where we can reach you. Due to Swiss work permit restrictions, we can only consider applications from Swiss nationals, EU citizens, and current work-permit holders for Switzerland. Ukrainian refugees are warmly welcomed; we will support you all the way. We welcome applications from individuals of all genders, age groups, sexual orientations, personal expressions, ethnic backgrounds, and religious beliefs. Therefore, there is no requirement to provide gender information or a photo in your application. As per client requirements, we need information about your marital status, nationality, date of birth, and a valid Swiss work permit. For applicants with disabilities, we are happy to explore potential solutions with our end client.
08/05/2024
Project-based
Contract - UC4 Automation Engineer
Rate: Open
Location: Chicago, IL (hybrid: 3 days on-site, 2 days remote)

Qualifications:
- Python scripting
- SDET automation testing skills/QA automation engineering
- Experience with performance engineering concepts and methodologies, as well as cloud technologies and migrations with a public cloud vendor, preferably using cloud foundational services such as AWS VPCs
- Solid utility building with Python, Perl and PowerShell
- Test automation using CI/CD concepts

Languages & Technologies: Java, Kafka, Docker, Kubernetes, DB2, CyberArk, Harness, JIRA, Jenkins, Splunk, Confluence, Git, JSON, API Testing, Cucumber, Selenium, Terraform, Ansible, Veracode, Virtualan, UC4, Change Data Capture, AWS/Google/Azure Cloud, OpenAPI/Swagger, SOAP Web Service (JAX-WS), RESTful Web Service (JAX-RS), Apache CXF, Spring Core, Spring WS, Spring Transaction, Spring Integration, JDBC, Shell Scripting, XML, JavaScript, SQL, Python, JMeter, Gatling, Perl, PowerShell, SignalFx, AppDynamics.

Software tools and utilities: Jenkins, Kubernetes, Enterprise Architect (EA), Enterprise Manager-UM, SQL Developer, JConsole, Visual Studio, JMeter, Bitbucket, Git, CVS, SVN, PuTTY, Microsoft Visio, TOAD, SourceTree, JIRA, Confluence, Sonar, Bamboo, Splunk, Automic (UC4), Apache Kafka, LogicMonitor, BMC MainView, and real-time and historical monitoring tools on-prem and in the cloud. Web servers/app servers/containers experience. Database technologies: DB2, PostgreSQL.

Responsibilities:
- Performance testing with open-source tools like JMeter and Gatling; Perl scripting, PowerShell scripting, solid Python scripting, and Java
- Setting up parallel testing environments used to compare existing system business processes and data against a new cloud-based system/platform; the goal is to ensure the new system produces correct results and performs as expected before it can become the official system of record
- Taking raw data, masking it, and creating algorithms and solutions that increase the data load feeding into our new Clearing System, with no duplicates or other data issues that would cause it to be rejected
- Assisting in the setup and maintenance of cloud-based performance and functional test environments in the cloud (AWS) and defining the steps to automate the process for continuous testing and iterations of cycles
07/05/2024
Project-based
Contract - UC4 Automation Engineer. Rate: Open. Location: Chicago, IL. Hybrid: 3 days on-site, 2 days remote.
Qualifications: Python scripting; SDET/QA automation engineering skills; experience with performance engineering concepts and methodologies, as well as cloud technologies and migrations with a public cloud vendor, preferably using foundational services such as AWS VPCs; solid utility building with Python, Perl and PowerShell; test automation using CI/CD concepts.
Languages & Technologies: Java, Kafka, Docker, Kubernetes, DB2, CyberArk, Harness, JIRA, Jenkins, Splunk, Confluence, Git, JSON, API Testing, Cucumber, Selenium, Terraform, Ansible, Veracode, Virtualan, UC4, Change Data Capture, AWS/Google/Azure Cloud, Open API/Swagger, SOAP Web Services (JAX-WS), RESTful Web Services (JAX-RS), Apache CXF, Spring Core, Spring WS, Spring Transaction, Spring Integration, JDBC, Shell Scripting, XML, JavaScript, SQL, Python, JMeter, Gatling, Perl, PowerShell, SignalFx, AppDynamics.
Software tools and utilities: Jenkins, Kubernetes, Enterprise Architect (EA), Enterprise Manager-UM, SQL Developer, JConsole, Visual Studio, JMeter, Bitbucket, Git, CVS, SVN, PuTTY, Microsoft Visio, TOAD, SourceTree, JIRA, Confluence, Sonar, Bamboo, Splunk, Automic (UC4), Apache Kafka, LogicMonitor, BMC MainView, and Real Time and historical monitoring tools on-prem and in the cloud. Web server/application server/container experience. Database technologies: DB2, PostgreSQL.
Responsibilities: Performance testing with open-source tools such as JMeter and Gatling. Perl, PowerShell and solid Python scripting, plus Java. Setting up parallel testing environments used to compare the business processes and data of the existing system against a new cloud-based system/platform; the goal is to ensure that the new system produces correct results and performs as expected before it can become the official system of record. The ability to take raw data, mask it, and create algorithms and solutions that increase the data load feeding our new Clearing System without duplicates or other data issues that would cause records to be rejected. Assist in the setup and maintenance of cloud-based performance and functional test environments in the Cloud (AWS), and define the steps to automate the process for continuous testing and iterative cycles.
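The masking-and-deduplication task described above can be sketched in Python. This is illustrative only: the listing does not specify the Clearing System's fields or masking rules, so the field names, salt, and salted-hash approach below are assumptions, showing one common pattern of deterministic pseudonymization plus duplicate filtering before the load.

```python
import hashlib

# Assumption: a per-environment secret salt, rotated outside of source control.
SALT = b"rotate-me-per-environment"

def mask_value(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

def prepare_records(records, sensitive_fields=("account_id", "ssn")):
    """Mask sensitive fields and drop exact duplicates, preserving order."""
    seen, cleaned = set(), []
    for rec in records:
        masked = {k: mask_value(v) if k in sensitive_fields else v
                  for k, v in rec.items()}
        key = tuple(sorted(masked.items()))
        if key not in seen:  # duplicates would cause rejection downstream
            seen.add(key)
            cleaned.append(masked)
    return cleaned
```

Because the masking is deterministic, the same input always maps to the same token, which preserves join keys across repeated test loads while keeping raw values out of the test environment.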
Role: DevOps Engineer. Salary: Up to £50,000 per annum, dependent on experience. Location: Hybrid/Romsey. SC clearance is required for this role. We are looking for an experienced DevOps Engineer with around 2-3 years' experience in software development. You will oversee code releases and deployments and support operational systems. Skills and experience: Active SC clearance; experience with cloud technologies, e.g. AWS or Azure; programming language experience, e.g. Java, Python, Node.js or SQL; data technologies experience, e.g. PostgreSQL, MongoDB, Kafka or Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (see below). CBSbutler is acting as an employment agency for this role.
07/05/2024
Full time
Role: DevOps Engineer. Salary: Up to £50,000 per annum, dependent on experience. Location: Hybrid/Woking. SC clearance is required for this role. We are looking for an experienced DevOps Engineer with around 2-3 years' experience in software development. You will oversee code releases and deployments and support operational systems. Skills and experience: Active SC clearance; experience with cloud technologies, e.g. AWS or Azure; programming language experience, e.g. Java, Python, Node.js or SQL; data technologies experience, e.g. PostgreSQL, MongoDB, Kafka or Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (see below). CBSbutler is acting as an employment agency for this role.
07/05/2024
Full time
ASSOCIATE PRINCIPAL, APPIAN SOFTWARE ENGINEERING. SALARY: $140k - $145k - $152k plus 15% bonus. LOCATION: Chicago, IL. Hybrid: 3 days onsite, 2 days remote.
Looking for someone to design, develop, test, and implement Appian software. You will need 5+ years of Front End/user experience development; JavaScript; experience automating workflows inside Appian; AWS; Unix/Linux; Java, Python, Node.js, Angular 2.0 or React JS; and middleware technologies, plus working knowledge of DevOps tooling: Terraform, Ansible, Jenkins, Kubernetes, Helm and CI/CD pipelines. A degree and Appian Certified Developer certification are required.
Contribute to design, technical direction and architecture, including collaborating with various teams to build fit-for-purpose solutions. Applies expert knowledge of Java, Python, JavaScript, Node.js, Angular 2.0 or React JS and middleware technologies in independently designing and developing key services, with a focus on continuous integration and delivery. Participates in code reviews, proactively identifying and mitigating potential issues and defects, as well as assisting with continuous improvement. Drives continuous improvement efforts by identifying and championing practical means of reducing time to market while maintaining high quality.
Qualifications: 5+ years of Front End/user experience development (required); 5+ years of JavaScript experience (required); 3+ years of experience automating workflows inside Appian and integrating it with other tools (required); 3+ years of experience in React application development (required); 3+ years of hands-on HTML5/CSS3 experience (required); experience with Java and/or Python (required); experience with popular JavaScript frameworks such as React, Node.js, Vue, Angular 2.0 (required); experience working with WebSockets, HTTP/1.1 and HTTP/2 (required); experience with RESTful APIs and JSON-RPC (required); ability to write clean, bug-free code that is easy to understand and easily maintainable (required); experience with BDD methodologies and automated acceptance testing (required).
Technical Skills: 5+ years of hands-on experience in Java, including a good understanding of Java fundamentals such as the memory model, runtime environment, concurrency and multithreading (required). 3+ years as Technical Lead on a large-scale cloud-native project (platform: Unix/Linux; system types: event-driven/transaction processing/high-performance computing), including developing/architecting core libraries or frameworks used by the platform to support fundamental services such as storage, alert notifications and security (required). Appian Process Modeling, Smart Services, Rules and Tempo event services, database, and web services (required). Experience with cloud technologies and migrations with a public cloud vendor, preferably using foundational services such as AWS VPCs, Security Groups, EC2, RDS, S3 ACLs, KMS, AWS CLI and IAM (required). Experience with distributed message brokers using Kafka (required). Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams and Apache Flink (required). Experience working with various types of databases: relational, NoSQL, object-based, graph (required). Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm and CI/CD pipelines (required). Familiarity with monitoring tools and frameworks such as Splunk, Elasticsearch, Prometheus and AppDynamics (required).
Education and/or Experience: BS degree in Computer Science or a similar technical field; Appian Certified Developer.
06/05/2024
Full time
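The JSON-RPC requirement in the Appian listing refers to the JSON-RPC 2.0 request/response protocol. A minimal Python dispatcher sketch; the method name and handler below are invented for illustration, while the error codes (-32601, -32603) are defined by the JSON-RPC 2.0 specification:

```python
import json

# Invented example method registry: a real service would register its own.
METHODS = {
    "workflow.start": lambda params: {"workflowId": params["name"] + "-001"},
}

def handle(raw_request: str) -> str:
    """Dispatch one JSON-RPC 2.0 request and return the JSON response."""
    req = json.loads(raw_request)
    resp = {"jsonrpc": "2.0", "id": req.get("id")}
    method = METHODS.get(req.get("method"))
    if method is None:
        # -32601 is the spec-defined "Method not found" error code
        resp["error"] = {"code": -32601, "message": "Method not found"}
    else:
        try:
            resp["result"] = method(req.get("params", {}))
        except Exception as exc:
            # -32603 is the spec-defined "Internal error" code
            resp["error"] = {"code": -32603, "message": str(exc)}
    return json.dumps(resp)
```

Unlike REST, which maps operations onto HTTP verbs and URLs, JSON-RPC carries the method name and parameters in the payload itself, so every call can travel over a single endpoint or a WebSocket.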
*Hybrid, 3 days onsite, 2 days remote* *We are unable to sponsor as this is a permanent Full time role* A prestigious company is looking for a Linux Engineer. This engineer will focus on design, support, engineering, and automation for the Linux operating system, and will need hands-on experience with Terraform, Kubernetes, Jenkins, Ansible, AWS, Docker, CI/CD, DevOps, etc.
Responsibilities/Qualifications: Bachelor's degree, preferably in a technical discipline (Computer Science, Mathematics, etc.), or an equivalent combination of education and experience, required. 8+ years' experience in IT systems installation, operations, administration, and maintenance of cloud systems/virtualized servers. Hands-on experience with Terraform, Kubernetes, Jenkins, Kafka, GitHub, and configuration management tools such as Ansible. Relevant experience with configuration and implementation of IaaS and Infrastructure as Code on AWS, Azure, etc. Extensive knowledge of Linux operating systems, Linux shells and standard utilities, and common Linux security tools at L3 level. In-depth system administration knowledge and skills for Red Hat Linux. Kubernetes experience: strong knowledge of Kubernetes deployment frameworks/platforms, including Helm, Docker, Rancher, OpenShift and EKS. Provide advanced system administration, operational support and problem resolution for a large, complex Linux computing environment, including both virtualized and physical servers. Create and patch AMIs, raise pull requests, and write automation code using tools such as Ansible and Terraform. Strong knowledge of secure cloud infrastructure design and components, such as servers, operating systems, networks, IAM, and storage. Cloud certifications, specifically AWS, preferred. Expert knowledge of the core automation development toolchain, including Terraform, Ansible, Jenkins, Git and Harness. Mastery of CI/CD best practices in a large organization (GitOps/DevOps, secure builds, secure code promotion, deployments (Harness/Argo), automated testing (app and infra), integration of policy frameworks, cost optimization, SLSA best practices). Experience architecting, implementing and maintaining highly available, mission-critical environments for 24/7 availability.
03/05/2024
Full time
Role: DevOps Engineer. Salary: Up to £50,000 per annum, dependent on experience. Location: Hybrid/Woking. SC clearance is required for this role. We are looking for an experienced DevOps Engineer with around 2-3 years' experience in software development. You will oversee code releases and deployments and support operational systems. Skills and experience: Active SC clearance; experience with cloud technologies, e.g. AWS or Azure; programming language experience, e.g. Java, Python, Node.js or SQL; data technologies experience, e.g. PostgreSQL, MongoDB, Kafka or Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (see below). CBSbutler is acting as an employment agency for this role.
02/05/2024
Full time
Role: DevOps Engineer. Salary: Up to £50,000 per annum, dependent on experience. Location: Hybrid/Romsey. SC clearance is required for this role. We are looking for an experienced DevOps Engineer with around 2-3 years' experience in software development. You will oversee code releases and deployments and support operational systems. Skills and experience: Active SC clearance; experience with cloud technologies, e.g. AWS or Azure; programming language experience, e.g. Java, Python, Node.js or SQL; data technologies experience, e.g. PostgreSQL, MongoDB, Kafka or Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (see below). CBSbutler is acting as an employment agency for this role.
02/05/2024
Full time
*Hybrid, 3 days onsite, 2 days remote* *We are unable to sponsor as this is a permanent Full time role* A prestigious company is looking for a Principal, Data Architecture. This principal will focus on data architecture and design to support all IT departments throughout the company, including the design of data lakes, data warehouses, data messaging, data modeling, data science, etc. The company wants someone with 10+ years of data architect/engineering/DBA work experience.
Responsibilities: Design the data architecture of the organization to support its data-driven vision. Create the design and blueprint of the organization's data capabilities within the data framework. Analyze structural requirements for new solutions and applications. Optimize new and current database systems. Responsible for requirements-based analysis and selection of data tools. Responsible for setting up and enforcing data modeling standards. Responsible for creating logical and conceptual data models. Ensure that data architecture principles are adhered to across the enterprise. Assist in building a data taxonomy and aligning it with business processes. Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues. Communicate and validate program architecture with infrastructure, project management and technology services teams. Conduct end-to-end technical plan design. Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor environment. Design and develop infrastructure and solution documentation and blueprints. Develop specifications for solution integrations. Maintain a documentation library of standard procedures and approved solution configurations.
Qualifications: 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role. Bachelor's degree or higher in a technical field. Experience designing data lake/warehouse solutions, preferably in the cloud. Experience in schema design for relational and non-relational data and messaging protocols. Experience designing data science and data analytics solutions. Experience with Kafka and Protocol Buffers. Expertise in both SQL and NoSQL databases. Expertise with BI tools (Tableau, Power BI, etc.). Expertise with federated query tools such as Presto/Trino. Experience with data lake file formats such as Avro, Parquet and ORC. Experience in solution integration and operability. Experience working with infrastructure technologies and teams. Experience using ServiceNow or similar.
01/05/2024
Full time
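The Kafka and Protocol Buffers requirement above implies schema-disciplined messaging: producers validate events against a declared schema before publishing. A pure-Python sketch of the idea follows; in practice a schema registry and compiled protobuf classes would enforce this, and the event schema and field names below are invented for illustration.

```python
# Invented example schema: field name -> required Python type.
TRADE_EVENT_V1 = {
    "trade_id": str,
    "symbol": str,
    "quantity": int,
    "price": float,
}

def validate_event(event: dict, schema: dict) -> list:
    """Return a list of schema violations (empty means the event is valid)."""
    errors = []
    for field, expected in schema.items():
        if field not in event:
            errors.append(f"missing field: {field}")
        elif not isinstance(event[field], expected):
            errors.append(f"{field}: expected {expected.__name__}")
    for field in event:
        if field not in schema:
            errors.append(f"unknown field: {field}")
    return errors
```

Rejecting malformed events at the producer keeps bad data out of the topic entirely, which is far cheaper than reconciling it across every downstream consumer.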
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Company is currently seeking a Principal Data Architect with Kafka and data lake experience. The candidate will be responsible for data architecture and design to support all IT areas of the business. This role will establish standards, coordinate solution design with subject matter owners, document and design solutions, and ensure strategic goals are met at the operational level. The role will be a primary contributor to the Joint Technology Strategy and will oversee and conduct evaluations of technology and process during proofs of concept/value.
Responsibilities:
- Design the data architecture of the organization to support its data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Conduct requirements-based analysis and selection of data tools
- Set up and enforce data modeling standards
- Create logical and conceptual data models
- Ensure that data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on designing strategic solutions to data quality issues
- Advise the team on IT technology standards, requirements, methodologies, and processes
- Drive short- and long-term architecture strategy for the overall IT project portfolio for key business segments
- Take responsibility for comprehensive infrastructure designs covering all aspects of IT
- Participate in proofs of concept to help define technology direction and enable business strategy
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor environment
- Design and develop infrastructure and solution documentation and blueprints
- Perform impact analysis and design modifications of existing systems to support new solutions and integrations
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations
- Communicate and coordinate between IT, Application Development, Operations, and Management
- Use traditional and Agile project/product approaches to execute projects and achieve business outcomes
- Drive business results through process and informal leadership
Qualifications:
- Experience designing data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience designing data science and data analytics solutions
- [Required] Ability to prioritize critical versus non-critical issues and communicate effectively with management
- [Required] Proven ability to contribute consistently and positively in a dynamic, fast-paced, and highly regulated environment
- [Required] Proven ability to facilitate project alignment between business and technical teams
- [Required] Demonstrated ability to dig beyond the surface to uncover root causes and offer solutions that deliver effective and efficient outcomes
- [Required] Experience operating in a collaborative environment to solve cross-functional problems
- [Required] Self-directed and detail-oriented
- [Required] Highly effective organization and planning skills
Technical Skills:
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, and ORC
- [Required] Experience extracting and developing technical requirements from business goals and needs
- [Required] Experience in solution integration and operability
- [Required] Experience working with infrastructure technologies and teams
- [Required] Proficiency with Microsoft Office products (Word, Excel, PowerPoint, Visio)
- [Required] Experience using ServiceNow or similar
- [Required] 10+ years of progressive experience leading to a Senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- [Preferred] Bachelor's degree or higher in a technical field
- [Preferred] Process improvement certifications such as Lean/Six Sigma
- [Preferred] IT service or process management certifications such as ITIL or ITAM
01/05/2024
Full time