DevOps Lead Marshall Wolfe is proud to be supporting a technology company in the search for an experienced and highly motivated DevOps Lead who lives and breathes making a modern microservices architecture 'sing' for our products and our customers. This is an exciting opportunity to join a fast-growing company that designs, builds and operates products and platforms on a global scale, liaising with both clients and in-house teams internationally. What we are really looking for is someone with: A minimum of 7 years of experience in infrastructure engineering with a DevOps approach. A smart, humble, and empathetic engineer who loves creating and managing great DevOps environments. Proven experience designing, building and supporting Back End systems in production, with a solid grasp of good software engineering practices such as code reviews, a deep focus on quality, and documentation. The ability to architect and scale AWS infrastructure, managing EC2, EKS, S3, RDS, ELB and ECS. Experience deploying and managing Kafka for real-time data processing. Proficiency in scripting, and in Git and Git workflows. Experience implementing and maintaining CI/CD pipelines and repository management using GitLab. Development lead experience. And it would be useful if you brought some experience with: PostgreSQL, Go, GitLab, Kubernetes, Cloudflare, HashiCorp. Fully remote working is also available. Competitive salary.
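The Kafka responsibility above typically means consuming events and aggregating them in near real time. As a minimal, library-free sketch of that kind of aggregation (the event shape and 60-second window here are invented for illustration, not taken from the advert):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed windows and count per key.

    This is the sort of aggregation a Kafka consumer loop might apply to
    each batch of records; timestamps are epoch seconds.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Snap each event to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "login"), (30, "login"), (61, "logout"), (75, "login")]
print(tumbling_window_counts(events))
# {(0, 'login'): 2, (60, 'logout'): 1, (60, 'login'): 1}
```

In production the same logic would sit behind a Kafka consumer (or a stream-processing framework), but the windowing arithmetic is the same.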
20/05/2024
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent full-time role* *Position is bonus eligible* Prestigious Financial Company is currently seeking an AWS DevOps Software Engineer. The candidate will provide subject matter expertise for ongoing support of applications deployed to non-production AWS environments and support 3rd party applications, identifying root causes and automating solutions in support of development. The candidate will have a deep understanding of DevOps practices, leadership skills, and expertise in various tools and technologies. You will be working in a fast-paced, dynamic environment, using cutting-edge tools and cloud technologies, and managing day-to-day activities when called upon. Responsibilities: Design, develop, release, and support cloud-native applications running on containers (Kubernetes and Docker) within AWS. DevOps strategy: develop and implement DevOps strategies and best practices to enhance development, testing, and deployment processes. Possess in-depth knowledge and hands-on experience with DevOps tools and technologies, including but not limited to GitHub, Jenkins, Terraform, Ansible, Kafka, AWS, and Apigee. Support the lower environments for incident and problem management. Resolve complex support issues in non-production environments. Create procedural and troubleshooting documentation related to cloud-native applications. Write complex automation scripts using common automation tools such as YAML, JSON, Bash, Groovy, Ansible, Terraform and Python. Perform other duties as assigned. Qualifications: Excellent problem-solving skills. Ability to work independently. Ability to work with management to prioritize tasks. Strong confidence in abilities and knowledge. Ability to work well in crisis situations. Ability to work under minimal supervision. Flexibility to be on call from 5 PM to 7 AM for 3 months per year. Good written and oral communication skills.
Technical Skills: Expertise in Kubernetes and Docker, including best practices. Expertise in cloud containerization: design, develop and troubleshoot. Strong programming or scripting skills in YAML, Helm charts, JSON, Bash, Groovy, Ansible, Terraform, Python or Java. Advanced knowledge of networking technologies. CI/CD tools such as Artifactory, Jenkins, Git and SonarQube. Experience with cloud-based systems such as AWS, Azure, or Google Cloud, including expertise in IaC and CaC (Ansible, Terraform). Experience with Kafka infrastructure and processes. Understanding of software development methodologies and Agile practices. Excellent analytical and problem-solving skills, with the ability to troubleshoot and identify the root cause of issues. Good verbal and written communication skills, with the ability to collaborate effectively with cross-functional teams. Familiarity with monitoring and logging tools such as the ELK stack and Splunk. Familiarity with technologies used to support microservices. Minimum 7 years' experience working in a distributed multi-platform environment. Minimum 3 years' experience working with Kubernetes. Minimum 3 years' experience working on scripting or programming. Bachelor's degree in a related area. Cloud certification a plus.
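The "complex automation scripts" duty above usually means guardrail tooling around JSON/YAML deployment inputs. A minimal sketch of that pattern in Python, using an invented manifest shape (the field names `app`, `image`, `replicas` are hypothetical, not from the advert):

```python
import json

def validate_manifest(text):
    """Check a (hypothetical) deployment manifest for required fields.

    Returns a list of human-readable problems; an empty list means the
    manifest is acceptable - the kind of pre-deploy check a CI pipeline
    stage might run before applying the manifest.
    """
    try:
        manifest = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    problems = []
    for field in ("app", "image", "replicas"):
        if field not in manifest:
            problems.append(f"missing required field: {field}")
    if not isinstance(manifest.get("replicas", 1), int):
        problems.append("replicas must be an integer")
    return problems

good = '{"app": "orders", "image": "orders:1.2.3", "replicas": 3}'
print(validate_manifest(good))  # []
print(validate_manifest('{"app": "orders"}'))
# ['missing required field: image', 'missing required field: replicas']
```

A real pipeline would wire a script like this into a Jenkins or GitLab stage that fails the build on a non-empty result.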
16/05/2024
Full time
Platform Engineer vacancy requiring profound API and streaming platforms knowledge for our Zurich-based client in the financial sector. Your tasks: Designing, developing, and maintaining high-performance APIs and streaming solutions to support our platform's functionality and scalability requirements. Collaborating with product managers, software engineers, and other stakeholders to define API specifications, integration requirements, and streaming protocols. Implementing best practices for API design, including versioning, authentication, authorization, and documentation to ensure developer-friendly interfaces. Architecting and optimizing microservices-based systems to enable efficient data streaming, real-time processing, and event-driven architectures. Troubleshooting and debugging complex issues related to API integrations, data streaming, and platform performance, and implementing effective solutions. Your experience/knowledge: Proficiency in programming languages such as Java, Python, or Go, and experience with API frameworks. Experience with streaming technologies like Kafka, Apigee, Apache Flink, or Spark Streaming, and real-time data processing frameworks. Strong understanding of microservices architecture, containerization and cloud computing platforms. Solid understanding of API security best practices, OAuth, JWT, and API gateway technologies. Language skills: English - fluent in written and spoken. Your soft skills: Excellent problem-solving skills, attention to detail, and the ability to thrive in a fast-paced, collaborative environment. Effective communication skills and the ability to articulate technical concepts to non-technical stakeholders. Location: Zurich, Switzerland. Sector: Financial. Start: ASAP. Duration: 12MM+. Ref. Nr.: BH21638. Take the next step and send us your resume along with a daytime phone number where we can reach you.
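The JWT requirement above refers to tokens structured as three base64url segments (header.payload.signature). A standard-library sketch of decoding the claims segment; note that a real API gateway must also verify the signature against the issuer's key, which this deliberately skips, and the claim values below are invented:

```python
import base64
import json

def decode_jwt_claims(token):
    """Decode the payload segment of a JWT without verifying the signature.

    Shown only to illustrate the token structure; never trust unverified
    claims in production.
    """
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Build a toy token to demonstrate (signature segment left empty on purpose).
claims = {"sub": "user-42", "scope": "orders:read"}
body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip("=")
token = "eyJhbGciOiJIUzI1NiJ9." + body + "."
print(decode_jwt_claims(token))  # {'sub': 'user-42', 'scope': 'orders:read'}
```

In practice a library such as PyJWT (with the issuer's public key) would handle both decoding and signature verification.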
Due to Swiss work permit restrictions, we can only consider applications from Swiss nationals, EU citizens, and current work-permit holders for Switzerland. Ukrainian refugees are warmly welcomed; we will support you all the way. We welcome applications from individuals of all genders, age groups, sexual orientations, personal expressions, ethnic backgrounds, and religious beliefs. Therefore, there is no requirement to provide gender information or a photo in your application. As per client requirements, we need information about your marital status, nationality, date of birth, and a valid Swiss work permit. For applicants with disabilities, we are happy to explore potential solutions with our end client.
16/05/2024
Project-based
Role: DevOps Engineer. Salary: Up to £50,000 per annum dependent on experience. Location: Hybrid/Woking. SC clearance is required for this role. We are looking for an experienced DevOps Engineer with around 2-3 years' experience in software development. You will be overseeing code releases and deployments, and supporting operational systems. Skills and experience: Active SC clearance. Experience with cloud technologies, e.g. AWS or Azure. Programming language experience, e.g. Java, Python, Node.js or SQL. Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (see below). CBSbutler is acting as an employment agency for this role.
16/05/2024
Full time
Role: DevOps Engineer. Salary: Up to £50,000 per annum dependent on experience. Location: Hybrid/Romsey. SC clearance is required for this role. We are looking for an experienced DevOps Engineer with around 2-3 years' experience in software development. You will be overseeing code releases and deployments, and supporting operational systems. Skills and experience: Active SC clearance. Experience with cloud technologies, e.g. AWS or Azure. Programming language experience, e.g. Java, Python, Node.js or SQL. Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (see below). CBSbutler is acting as an employment agency for this role.
16/05/2024
Full time
NO SPONSORSHIP. Principal, Data Architecture. SALARY: $195k - $200k plus 27% bonus. LOCATION: Chicago, IL. Hybrid: 3 days in office and 2 days remote. Looking for a candidate who does data architecture and design: data lake and data warehouse solutions; schema design for relational and non-relational data and messaging; design of data science and data analytics solutions; Kafka and Protocol Buffers; SQL and NoSQL; Tableau and Power BI; Presto/Trino; data lake file formats (Avro, Parquet, ORC); infrastructure technologies; ServiceNow or similar. 10 years as a senior data architect, data engineer or DBA, leading logical and conceptual data models, data modelling standards, data taxonomy and data governance. Qualifications: The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions. Experience in design of data lake/warehouse solutions, preferably in the cloud. Experience in schema design for relational and non-relational data and messaging protocols. Experience in design of data science and data analytics solutions. Experience with Kafka and Protocol Buffers. Expertise in both SQL and NoSQL databases. Expertise with BI tools (Tableau, Power BI, etc). Expertise with federated query tools such as Presto/Trino. Experience with data lake file formats such as Avro, Parquet and ORC. [Required] Experience in extracting and developing technical requirements from business goals and needs. [Required] Experience in solution integration and operability. [Required] Experience working with infrastructure technologies and teams. [Required] Proficiency in using Microsoft Office products (Word, Excel, PowerPoint, Visio). [Required] Experience using ServiceNow or similar. Education and/or Experience: 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant or technical lead.
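Schema design for non-relational data, as in the Avro and Protocol Buffers requirements above, amounts to enforcing a record contract at write time. A toy, library-free illustration of that idea; the trade-record schema here is invented, and a real pipeline would use Avro or Protobuf with generated bindings:

```python
def check_record(schema, record):
    """Validate a record against a minimal Avro-style schema.

    schema maps field name -> expected Python type; missing fields,
    type mismatches, and unexpected fields are all reported.
    """
    errors = []
    for name, expected in schema.items():
        if name not in record:
            errors.append(f"missing field {name}")
        elif not isinstance(record[name], expected):
            errors.append(f"field {name} should be {expected.__name__}")
    for name in record:
        if name not in schema:
            errors.append(f"unexpected field {name}")
    return errors

trade_schema = {"trade_id": str, "quantity": int, "price": float}
print(check_record(trade_schema, {"trade_id": "T1", "quantity": 100, "price": 9.5}))  # []
print(check_record(trade_schema, {"trade_id": "T1", "quantity": "100"}))
# ['field quantity should be int', 'missing field price']
```

The value of a real schema registry on top of this is evolution rules: new fields can be added with defaults so old readers keep working.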
15/05/2024
Full time
Scala Data Engineer (Cloudera, Hadoop and CI/CD) - Banking Client - Brussels. Duration: 1 year freelance contract. Rate: €500 - €800 per day. Hybrid working. INSIDE OF IR35. You will join the AIR (Analytics, Insight and Reporting) tribe in the GDC division. You will join our dedicated in-house team of data specialists using a pragmatic best-tool-for-the-job approach to optimise our hybrid infrastructure. With a strong focus on DataOps and MLOps, we firmly believe in robust and production-ready solutions being an essential part of our work. The result? Our team provides an ecosystem of data-driven products to internal and external consumers. Job requirements - The Skills: You are strongly motivated to become an excellent Data Engineer. You have a background in IT development and are at least fluent in one programming language, preferably Scala. You have a strong passion for data-driven technologies and interesting design challenges. You have a knack for picking up new technologies and frameworks. You ideally have knowledge of English; knowledge of other European languages is a plus. Nice to haves: You know that Cloudera, Hadoop and CI/CD aren't popular video games. You have heard of tools like Apache Spark, Impala and/or Kafka. You have first experience with cloud platforms like Azure. We are looking for colleagues with the same passion for data as well as personal and professional development, so don't worry if you don't tick all the boxes. Please do send across the most up-to-date copy of your CV to (see below).
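The Spark processing model mentioned above boils down to a map/reduce pipeline over distributed collections. As a hedged, pure-Python imitation of the classic word count to show that shape (a real job would use Spark's Scala or PySpark APIs, where the same steps are `flatMap`, `map`, and `reduceByKey`):

```python
from functools import reduce
from itertools import chain

def word_count(lines):
    """Pure-Python imitation of the classic Spark word count:
    flatMap(split) -> map(word -> (word, 1)) -> reduceByKey(+)."""
    # flatMap + map: one (word, 1) pair per word across all lines.
    pairs = ((word, 1) for word in chain.from_iterable(l.split() for l in lines))

    # reduceByKey: fold the pairs into per-word totals.
    def merge(acc, pair):
        word, n = pair
        acc[word] = acc.get(word, 0) + n
        return acc

    return reduce(merge, pairs, {})

print(word_count(["to be or", "not to be"]))
# {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

Spark distributes exactly this computation across partitions; the per-key merge function is what makes the reduce step parallelisable.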
15/05/2024
Project-based
We are looking for an API Tech Lead to work with one of our leading clients. Essential: API design and best practices with a Microsoft stack background. Event-driven architecture. API testing and deployment practices. Tools used for API design, development and testing. Strong Azure experience with API Management. REST APIs, OpenAPI specification, API strategies, governance, messaging technologies (eg Azure Topics/Service Bus, Kafka), microservices. Microsoft stack background and related technology to build APIs. RESTful web services. JMS messaging. JSON, XML. Ability to lead discussion regarding API platform standards. Ability to lead discussion regarding event-driven architecture standards. Microservices architecture. Experience of developing component designs and specifications. Experience: Proven experience in leading software engineering teams, preferably in API and EDA development and management. Strong technical background in API and EDA architecture, cloud technologies (Azure preferred), messaging technologies, and digital transformation. Excellent stakeholder management skills, with the ability to align various interests and requirements. In-depth knowledge of the SDLC, API strategies, and governance practices. Experience in financial services preferred but not essential.
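The event-driven architecture named above follows one shape regardless of the broker: publishers emit to a topic, and every subscriber of that topic receives the event. A tiny in-process sketch of that pattern (in-memory only; real systems such as Azure Service Bus topics or Kafka add durability, ordering, and delivery guarantees, and the `order.created` topic here is hypothetical):

```python
from collections import defaultdict

class EventBus:
    """Tiny in-process publish/subscribe bus illustrating the
    topic-fan-out shape behind Service Bus topics or Kafka."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Fan the event out to every handler registered for the topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
audit = []
bus.subscribe("order.created", lambda e: audit.append(("audit", e["id"])))
bus.subscribe("order.created", lambda e: audit.append(("email", e["id"])))
bus.publish("order.created", {"id": 7})
print(audit)  # [('audit', 7), ('email', 7)]
```

The design point the sketch makes is decoupling: the publisher of `order.created` knows nothing about the audit or email consumers, which is what lets teams add subscribers without changing producers.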
15/05/2024
Full time
Contract - UC4 Automation Engineer
Rate: Open
Location: Chicago, IL
Hybrid: 3 days on-site, 2 days remote

Qualifications:
- Python scripting
- SDET automation testing skills/QA automation engineering
- Experience with performance engineering concepts and methodologies, as well as cloud technologies and migrations on a public cloud vendor, preferably using foundational cloud services such as AWS VPCs
- Solid utility building with Python, Perl, and PowerShell
- Test automation using CI/CD concepts

Languages & Technologies: Java, Kafka, Docker, Kubernetes, DB2, CyberArk, Harness, JIRA, Jenkins, Splunk, Confluence, Git, JSON, API testing, Cucumber, Selenium, Terraform, Ansible, Veracode, Virtualan, UC4, Change Data Capture, AWS/Google/Azure Cloud, OpenAPI/Swagger, SOAP web services (JAX-WS), RESTful web services (JAX-RS), Apache CXF, Spring Core, Spring WS, Spring Transaction, Spring Integration, JDBC, shell scripting, XML, JavaScript, SQL, Python, JMeter, Gatling, Perl, PowerShell, SignalFx, AppDynamics.

Software tools and utilities: Jenkins, Kubernetes, Enterprise Architect (EA), Enterprise Manager-UM, SQL Developer, JConsole, Visual Studio, JMeter, Bitbucket, Git, CVS, SVN, PuTTY, Microsoft Visio, TOAD, SourceTree, JIRA, Confluence, Sonar, Bamboo, Splunk, Automic (UC4), Apache Kafka, LogicMonitor, BMC MainView, and real-time and historical monitoring tools on-prem and in the cloud.

Web server/application server/container experience.
Database technologies: DB2, PostgreSQL

Responsibilities:
- Performance testing with open-source tools such as JMeter and Gatling
- Perl scripting, PowerShell scripting, solid Python scripting, and Java
- Setting up parallel testing environments used to compare existing business processes and data against a new cloud-based system/platform; the goal is to ensure the new system produces correct results and performs as expected before it can become the official system of record
- Taking raw data, masking it, and creating algorithms and solutions that increase the data load feeding our new clearing system, without duplicates or other data issues that would cause records to be rejected
- Assisting in the setup and maintenance of cloud-based performance and functional test environments in AWS, and defining the steps to automate the process for continuous testing and iterative cycles
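The masking responsibility described above can be sketched with deterministic pseudonymization: the same input always maps to the same token, so masked test data stays join-consistent and does not introduce accidental duplicates. This is a hedged sketch, not the client's actual pipeline; the field names, salt, and record shape are hypothetical:

```python
import hashlib


def mask_value(value: str, salt: str = "test-env-salt") -> str:
    """Deterministically pseudonymize a sensitive field by hashing it
    with a fixed salt; identical inputs yield identical tokens."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return digest[:12]  # shortened token, long enough to avoid collisions in test data


def mask_records(records, sensitive_fields=("account_id", "ssn")):
    """Return copies of the input records with sensitive fields masked
    and all other fields left untouched."""
    return [
        {k: (mask_value(v) if k in sensitive_fields else v) for k, v in rec.items()}
        for rec in records
    ]


# Hypothetical clearing-system records.
rows = [{"account_id": "A-1001", "amount": 250.0}]
masked_rows = mask_records(rows)
```

Because the mapping is deterministic, reruns of a load test regenerate the same masked keys, which is what lets old-system and new-system results be compared row for row.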
14/05/2024
Project-based
ASSOCIATE PRINCIPAL, APPIAN SOFTWARE ENGINEERING
SALARY: $140k - $145k - $152k plus 15% bonus
LOCATION: Chicago, IL
Hybrid: 3 days onsite, 2 days remote

Looking for someone to design, develop, test, and implement Appian software. You will need 5 years of front-end user experience development; JavaScript; automating workflows inside Appian; AWS; Unix/Linux; Java; Python; Node.js; Angular 2.0 or ReactJS; and middleware technologies. Working knowledge of DevOps tooling: Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines. A degree and Appian Certified Developer certification are required.

Responsibilities:
- Contribute to design, technical direction, and architecture, collaborating with various teams to build fit-for-purpose solutions
- Apply expert knowledge of Java, Python, JavaScript, NodeJS, Angular 2.0 or ReactJS, and middleware technologies to independently design and develop key services, with a focus on continuous integration and delivery
- Participate in code reviews, proactively identifying and mitigating potential issues and defects, and assist with continuous improvement
- Drive continuous improvement by identifying and championing practical ways to reduce time to market while maintaining high quality

Qualifications:
- 5+ years of front-end/user experience development (required)
- 5+ years of JavaScript experience (required)
- 3+ years of experience automating workflows inside Appian and integrating it with other tools (required)
- 3+ years of React application development experience (required)
- 3+ years of hands-on HTML5/CSS3 experience (required)
- Experience with Java and/or Python (required)
- Experience with popular JavaScript frameworks such as React, Node.js, Vue, Angular 2.0 (required)
- Experience working with WebSockets, HTTP/1.1, and HTTP/2 (required)
- Experience with RESTful APIs and JSON-RPC (required)
- Ability to write clean, bug-free code that is easy to understand and maintain (required)
- Experience with BDD methodologies and automated acceptance testing (required)

Technical Skills:
- 5+ years of hands-on Java experience, including a good understanding of fundamentals such as the memory model, Runtime environment, concurrency, and multithreading (required)
- Past or current experience of 3+ years as Technical Lead on a large-scale cloud-native project (platform: Unix/Linux; system types: event-driven, transaction processing, high-performance computing), including developing/architecting core libraries or frameworks used by the platform for fundamental services such as storage, alert notifications, and security (required)
- Appian process modeling, smart services, rules and Tempo event services, database, and web services (required)
- Experience with cloud technologies and migrations on a public cloud vendor, preferably using foundational services such as AWS VPCs, security groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, and IAM (required)
- Experience with distributed message brokers such as Kafka (required)
- Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, and Apache Flink (required)
- Experience working with various database types: relational, NoSQL, object-based, graph (required)
- Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines (required)
- Familiarity with monitoring tools and frameworks such as Splunk, Elasticsearch, Prometheus, AppDynamics (required)

Education and/or Experience:
- BS degree in Computer Science or a similar technical field
- Appian Certified Developer
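The JSON-RPC requirement above refers to a small, well-defined envelope format. As a hedged sketch (the method name and params are hypothetical, not from the posting), a JSON-RPC 2.0 request is just a JSON object with a version tag, method, params, and a client-chosen id:

```python
import json
from itertools import count

# Monotonically increasing request ids, as the spec requires a unique id
# per request so responses can be correlated with their requests.
_ids = count(1)


def make_jsonrpc_request(method: str, params: dict) -> str:
    """Build a JSON-RPC 2.0 request envelope: the "jsonrpc" version tag,
    the method to invoke, its params, and an id for response matching."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params,
        "id": next(_ids),
    })


# Hypothetical method and params, just to show the envelope shape.
payload = make_jsonrpc_request("workflow.start", {"name": "onboarding"})
```

Unlike REST, the transport and URL carry no semantics here; everything the server needs is inside the envelope, which is why JSON-RPC pairs naturally with WebSockets as well as HTTP.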
13/05/2024
Full time
Role: DevOps Engineer
Salary: Up to £50,000 per annum, dependent on experience
Location: Hybrid/Romsey
SC clearance is required for this role.

We are looking for an experienced DevOps Engineer with around 2-3 years of experience in software development. You will oversee code releases and deployments and support operational systems.

Skills and experience:
- Active SC clearance
- Experience with cloud technologies, e.g. AWS or Azure
- Programming language experience, e.g. Java, Python, Node.js, or SQL
- Data technology experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop

If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (see below). CBSbutler is acting as an employment agency for this role.
13/05/2024
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent Full time role*
*Position is bonus eligible*

Prestigious financial company is currently seeking a Principal Data Architect with Kafka and data lake experience. The candidate will be responsible for data architecture and design supporting all IT areas of the business. This role will establish standards, coordinate solution design with subject matter owners, document and design solutions, and ensure strategic goals are met at the operational level. The role will be a primary contributor to the Joint Technology Strategy and will oversee and conduct evaluations of technology and process during proofs of concept/value.

Responsibilities:
- Design the organization's data architecture to support its data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Perform requirements-based analysis and selection of data tools
- Set up and enforce data modeling standards
- Create logical and conceptual data models
- Ensure data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Advise teams on IT technology standards, requirements, methodologies, and processes
- Drive short- and long-term architecture strategy for the overall IT project portfolio for key business segments
- Own comprehensive infrastructure designs covering all aspects of IT
- Participate in proofs of concept to help define technology direction and enable business strategy
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor environment
- Design and develop infrastructure and solution documentation and blueprints
- Perform impact analysis and design modifications of existing systems to support new solutions and integrations
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations
- Communicate and coordinate between IT, Application Development, Operations, and Management
- Use traditional and Agile project/product approaches to execute projects and achieve business outcomes
- Drive business results through process and informal leadership

Qualifications:
- Experience designing data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience designing data science and data analytics solutions
- [Required] Ability to prioritize critical versus non-critical issues and communicate effectively to management
- [Required] Proven ability to contribute consistently and positively in a dynamic, fast-paced, and highly regulated environment
- [Required] Proven ability to facilitate project alignment between business and technical teams
- [Required] Demonstrated ability to dig beyond the surface to uncover root causes and offer solutions that deliver effective and efficient outcomes
- [Required] Experience operating in a collaborative environment to solve cross-functional problems
- [Required] Self-directed and detail-oriented
- [Required] Highly effective organization and planning skills

Technical Skills:
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, and ORC
- [Required] Experience extracting and developing technical requirements from business goals and needs
- [Required] Experience in solution integration and operability
- [Required] Experience working with infrastructure technologies and teams
- [Required] Proficiency with Microsoft Office products (Word, Excel, PowerPoint, Visio)
- [Required] Experience using ServiceNow or similar
- [Required] 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- [Preferred] Bachelor's degree or higher in a technical field
- [Preferred] Process improvement certifications such as Lean/Six Sigma
- [Preferred] IT service or process management certifications such as ITIL or ITAM
09/05/2024
Full time
*We are unable to sponsor for this permanent Full time role*
*Position is bonus eligible*

Prestigious financial company is currently seeking a Principal Data Architect with Kafka and data lake experience. The candidate will be responsible for data architecture and design supporting all IT areas of the business. This role will establish standards, coordinate solution design with subject matter owners, document and design solutions, and ensure strategic goals are met at the operational level. The role will be a primary contributor to the Joint Technology Strategy and will oversee and conduct evaluations of technology and process during proofs of concept/value.

Responsibilities:
- Design the organization's data architecture to support its data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Perform requirements-based analysis and selection of data tools
- Set up and enforce data modeling standards
- Create logical and conceptual data models
- Ensure data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Advise teams on IT technology standards, requirements, methodologies, and processes
- Drive short- and long-term architecture strategy for the overall IT project portfolio for key business segments
- Own comprehensive infrastructure designs covering all aspects of IT
- Participate in proofs of concept to help define technology direction and enable business strategy
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor environment
- Design and develop infrastructure and solution documentation and blueprints
- Perform impact analysis and design modifications of existing systems to support new solutions and integrations
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations
- Communicate and coordinate between IT, Application Development, Operations, and Management
- Use traditional and Agile project/product approaches to execute projects and achieve business outcomes
- Drive business results through process and informal leadership

Qualifications:
- Experience designing data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience designing data science and data analytics solutions
- [Required] Ability to prioritize critical versus non-critical issues and communicate effectively to management
- [Required] Proven ability to contribute consistently and positively in a dynamic, fast-paced, and highly regulated environment
- [Required] Proven ability to facilitate project alignment between business and technical teams
- [Required] Demonstrated ability to dig beyond the surface to uncover root causes and offer solutions that deliver effective and efficient outcomes
- [Required] Experience operating in a collaborative environment to solve cross-functional problems
- [Required] Self-directed and detail-oriented
- [Required] Highly effective organization and planning skills

Technical Skills:
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, and ORC
- [Required] Experience extracting and developing technical requirements from business goals and needs
- [Required] Experience in solution integration and operability
- [Required] Experience working with infrastructure technologies and teams
- [Required] Proficiency with Microsoft Office products (Word, Excel, PowerPoint, Visio)
- [Required] Experience using ServiceNow or similar
- [Required] 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- [Preferred] Bachelor's degree or higher in a technical field
- [Preferred] Process improvement certifications such as Lean/Six Sigma
- [Preferred] IT service or process management certifications such as ITIL or ITAM
08/05/2024
Full time
*Hybrid, 3 days onsite, 2 days remote* *We are unable to sponsor, as this is a permanent full-time role* A prestigious company is looking for a Principal, Data Architecture. This principal will focus on data architecture and design in support of all IT departments throughout the company. Responsibilities include the design of data lakes, data warehouses, data messaging, data modelling, data science, and more. The company wants someone with 10+ years of data architect/engineer/DBA work experience.

Responsibilities:
- Design the data architecture of the organization to support its data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Perform requirements-based analysis and selection of data tools
- Establish and enforce data modeling standards
- Create logical and conceptual data models
- Ensure that data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor environment
- Design and develop infrastructure and solution documentation and blueprints
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations
Qualifications:
- 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- Bachelor's degree or higher in a technical field
- Experience in the design of data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience in the design of data science and data analytics solutions
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, and ORC
- Experience in solution integration and operability
- Experience working with infrastructure technologies and teams
- Experience using ServiceNow or similar
01/05/2024
Full time