Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent full-time role* *Position is bonus eligible*

Prestigious financial institution is currently seeking a Principal Java Risk Management Software Engineer. The candidate will develop and maintain risk models for margin, clearing fund, and stress testing, with a focus on maintaining risk model software in production as well as the environments and infrastructure used in model implementation and testing. The candidate will collaborate with other developers, quantitative analysts, business users, and data & technology staff to expand the technical capabilities for model development, back-testing, and monitoring.

Responsibilities:
- Develop and maintain software and environments used to implement and test systems for pricing, margin risk, and stress testing of financial products and derivatives.
- Configure and manage resources in local and AWS cloud environments and deploy QRM's software on these resources.
- Develop CI/CD pipelines.
- Configure, execute, and monitor execution pipelines for model testing, back-testing, and monitoring.
- Contribute to the development of QRM's databases and ETLs.
- Integrate model prototypes, the model library, and model testing tools using industry best practices and innovations.
- Create unit and integration tests; build and enhance test automation tools.
- Participate in code reviews and demo accomplishments.
- Write technical documentation and user manuals.
- Provide production support and perform troubleshooting.

Qualifications:
- Strong programming skills: able to read and write code in a programming language (e.g., Java, C++, Python) in a collaborative software development setting. The role requires advanced coding, database, and environment manipulation skills.
- Track record of complex production implementations and a demonstrated ability to develop and maintain enterprise-level software, including in cloud environments.
- Proficiency in technical and/or scientific documentation (e.g., white papers, user guides).
- Strong problem-solving skills: able to accurately identify a problem's source, severity, and impact in order to determine possible solutions and needed resources.
- Experience with Agile/Scrum or another rapid development framework.
- Financial products knowledge is a plus: understanding of markets and financial derivatives in equities, interest rate, and commodity products.
- Background in financial mathematics is a plus: derivatives pricing models, stochastic calculus, statistics and probability theory, linear algebra.
- Master's degree or equivalent in a computational or numerical field such as computer science, information systems, mathematics, or physics.
- 10+ years of experience as a software developer with exposure to cloud or high-performance computing.

Technical Skills:
- Proficiency in Java (preferred) or another object-oriented language is required, including effective application of design patterns and best coding practices.
- DevOps experience, with a good command of CI/CD processes and tools (e.g., Git, GitHub, Gradle, Jenkins, Docker, Helm, Harness).
- Experience with containerized deployment in cloud environments.
- Experience with cloud technology (AWS preferred), infrastructure-as-code (e.g., Terraform), and managing and orchestrating containerized workloads (e.g., Kubernetes).
- Experience with logging, profiling, monitoring, and telemetry (e.g., Splunk, OpenTelemetry).
- Good command of database technology and query languages (SQL), non-relational databases, and other big data technology, including efficient storage and serialization protocols (e.g., Parquet, Avro, Protocol Buffers).
- Experience with automated quality assurance frameworks (e.g., JUnit, TestNG, PyTest).
- Experience with high-performance and distributed computing.
- Experience with productivity tools such as Jira, Confluence, and MS Office.
- Experience with scripting languages such as Python is a plus.
- Experience with numerical libraries and/or scientific computing is a plus.
09/05/2024
Full time
Associate Principal, Software Engineering - Quantitative Risk Management Area - Automating Risk Models
On site 3 days a week
Salary: $185K-$195K + bonus

Looking for a hands-on developer to work within the quantitative risk management area and develop applications and solutions for the QRM team. You will not build models; you will automate them. You will need to come from a financial institution, trading company, exchange, etc. You will need experience with CI/CD pipelines, infrastructure as code, Kubernetes, Terraform, etc., preferably with Java, Python, or C++.

Responsibilities:
- Configure and manage resources in local and AWS cloud environments and deploy QRM's software on these resources.
- Develop CI/CD pipelines.
- Contribute to the development of QRM's databases and ETLs.
- Integrate model prototypes, the model library, and model testing tools using industry best practices and innovations.
- Create unit and integration tests; build and enhance test automation tools.
- Participate in code reviews and demo accomplishments.
- Write technical documentation and user manuals.
- Provide production support and perform troubleshooting.

Qualifications:
- Strong programming skills: able to read and write code in a programming language (e.g., Java, C++, Python) in a collaborative software development setting. The role requires advanced coding, database, and environment manipulation skills, including in cloud environments.
- Financial products knowledge is a plus: understanding of markets and financial derivatives in equities, interest rate, and commodity products.
- Background in financial mathematics is a plus: derivatives pricing models, stochastic calculus, statistics and probability theory, linear algebra.

Technical Skills:
- Proficiency in Java (preferred) or another object-oriented language is required, including effective application of design patterns and best coding practices.
- DevOps experience, with a good command of CI/CD processes and tools (e.g., Git, GitHub, Gradle, Jenkins, Docker, Helm, Harness).
- Experience with containerized deployment in cloud environments.
- Experience with cloud technology (AWS preferred), infrastructure-as-code (e.g., Terraform), and managing and orchestrating containerized workloads (e.g., Kubernetes).

Education and/or Experience:
- Master's degree or equivalent in a computational or numerical field such as computer science, information systems, mathematics, or physics.
- 7+ years of experience as a software developer with exposure to cloud or high-performance computing.
09/05/2024
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent full-time role* *Position is bonus eligible*

Prestigious financial company is currently seeking a Principal Data Architect with Kafka and data lake experience. The candidate will be responsible for data architecture and design to support all IT areas of the business. This role will establish standards, coordinate solution design with subject matter owners, document and design solutions, and ensure strategic goals are met at the operational level. The role will be a primary contributor to the Joint Technology Strategy and will oversee and conduct evaluations of technology and process during proofs of concept/value.

Responsibilities:
- Design the data architecture of the organization to support its data-driven vision.
- Create the design and blueprint of the organization's data capabilities within the data framework.
- Analyze structural requirements for new solutions and applications.
- Optimize new and current database systems.
- Perform requirements-based analysis and selection of data tools.
- Set up and enforce data modeling standards.
- Create logical and conceptual data models.
- Ensure that data architecture principles are adhered to across the enterprise.
- Assist in building a data taxonomy and aligning it with business processes.
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues.
- Advise teams on IT technology standards, requirements, methodologies, and processes.
- Drive short- and long-term architecture strategy for the overall IT project portfolio for key business segments.
- Produce comprehensive infrastructure designs covering all aspects of IT.
- Participate in proofs of concept to help define technology direction and enable business strategy.
- Communicate and validate program architecture with infrastructure, project management, and technology services teams.
- Conduct end-to-end technical plan design.
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor environment.
- Design and develop infrastructure and solution documentation and blueprints.
- Perform impact analysis and design modifications of existing systems to support new solutions and integrations.
- Develop specifications for solution integrations.
- Maintain a documentation library of standard procedures and approved solution configurations.
- Communicate and coordinate between IT, Application Development, Operations, and Management.
- Use traditional and Agile project/product approaches to execute projects and achieve business outcomes.
- Drive business results through process and informal leadership.

Qualifications:
- Experience in the design of data lake/warehouse solutions, preferably in the cloud.
- Experience in schema design for relational and non-relational data and messaging protocols.
- Experience in the design of data science and data analytics solutions.
- [Required] Ability to prioritize critical versus non-critical issues and communicate effectively to management.
- [Required] Proven ability to contribute consistently and positively in a dynamic, fast-paced, and highly regulated environment.
- [Required] Proven ability to facilitate project alignment between business and technical teams.
- [Required] Demonstrated ability to dig beyond the surface to uncover root causes and offer solutions that deliver effective and efficient outcomes.
- [Required] Experience operating in a collaborative environment to solve cross-functional problems.
- [Required] Self-directed and detail-oriented.
- [Required] Highly effective organization and planning skills.

Technical Skills:
- Experience with Kafka and Protocol Buffers.
- Expertise in both SQL and NoSQL databases.
- Expertise with BI tools (Tableau, Power BI, etc.).
- Expertise with federated query tools such as Presto/Trino.
- Experience with data lake file formats such as Avro, Parquet, and ORC.
- [Required] Experience in extracting and developing technical requirements from business goals and needs.
- [Required] Experience in solution integration and operability.
- [Required] Experience working with infrastructure technologies and teams.
- [Required] Proficiency with Microsoft Office products (Word, Excel, PowerPoint, Visio).
- [Required] Experience using ServiceNow or similar.
- [Required] 10+ years of progressive experience leading to a senior-level data architect, data engineer, DBA, consultant, or technical lead role.
- [Preferred] Bachelor's degree or higher in a technical field.
- [Preferred] Process improvement certifications such as Lean/Six Sigma.
- [Preferred] IT service or process management certifications such as ITIL or ITAM.
09/05/2024
Full time
Principal Python Engineer (Sell Side)
Up to £140,000
2 days a week on site in Central London

Summary:
You'll play a crucial role in delivering top-notch software solutions, meticulously tested and optimized. Your responsibilities will span from building innovative solutions and integrations to enhancing the existing code base, all while contributing to the continuous evolution of the platform's design and architecture. You'll be at the forefront of transforming raw data into actionable insights for data scientists and business analysts, designing and developing reusable data pipelines tailored for my client's data platform. Exposure to cutting-edge technologies such as serverless solutions, microservice architecture, Delta Lake, and cloud-based applications will be part of your journey, along with maintaining Infrastructure as Code (IaC).

You'll need:
- 3-5+ years of full-stack development experience, including design, development, testing, deployment, and version control.
- Proficiency in Python, SQL, ReactJS, and REST APIs.
- Hands-on experience with Amazon Web Services (AWS), including EC2, S3, RDS, DynamoDB, Lambda, and EBS, for designing scalable, cloud-native, distributed software using modern development architectures.
- Strong analytical and problem-solving skills, coupled with effective communication abilities.

What's in it for you:
- Up to £140,000 base salary
- Market-leading bonus and pension
- Flexible working arrangements

Cornwallis Elt is an Employment Agency & Employment Business and has been listed three times in The Sunday Times Virgin Fast Track 100 of the UK's fastest-growing private companies, as well as in the Recruitment International Top 250, Top 50 in IT, and the Recruiter Fast 50 & Hot 100 reports.
09/05/2024
Full time
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Company is currently seeking a Principal Data Architect with Kafka and Data Lakes experience. Candidate will be responsible for data architecture and design to support all IT areas of the business. This role will establish standards, coordinate solution design with subject matter owners, document and design solutions and ensure strategic goals are met at the operational level. The role will be a primary contributor to the Joint Technology Strategy and oversee and conduct evaluations of technology and process during proofs-of-concept/value. Responsibilities: Design the data architecture of organization to support data driven vision Create design and blueprint of the data capabilities for the organization within the data framework Analyze structural requirements for new solutions and applications Optimize new and current database systems Responsible for requirements-based analysis and selection of data tools Responsible for setting up and enforcement of Data Modeling standards Responsible for creating logical and conceptual data models Ensure that data architecture principles are adhered to across the enterprise Assist in building data taxonomy and aligning it with business processes Work with Data Governance, IT, and data stewards on design of the strategic solution to data quality issues Advises team of IT technology standards requirements, methodologies, and processes. Drives short and long term architecture strategy for the overall IT project portfolio for key business segments. Responsible for comprehensive infrastructure designs including all aspects of IT. Participates in proof of concepts to assist in defining technology direction and enabling business strategy. Communicates and validates program architecture with infrastructure, project management and technology services teams. Conducts end to end technical plan design. 
Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor platform environment. Design and develop infrastructure and solution documentation and blueprints. Responsible for impact analysis and design modifications of existing systems to support new solutions and integrations. Develops specifications for solutions integrations. Maintain documentation library on standard procedures and approved solution configurations. Communicate and coordinate between IT, Application Development, Operations, and Management. Uses traditional and Agile project/product approaches to execute projects and achieve business outcomes. Drive business results through process and informal leadership. Qualifications: Experience in design of data lake/warehouse solutions, preferably in the cloud Experience in schema design for relational and non-relational data and messaging protocols Experience in design of data science and data analytics solutions [Required] Ability to prioritize critical versus non-critical issues and communicate effectively to management. [Required] Proven ability to contribute consistently and positively in a dynamic, fast-paced, and highly regulated environment. [Required] Proven ability to facilitate project alignment between business and technical teams. [Required] Demonstrated ability to dig beyond the surface to uncover root causes and offer solutions that could deliver effective and efficient outcomes. [Required] Experience operating in a collaborative environment to solve cross-functional problems. [Required] Self-directed and detail oriented. [Required] Highly effective organization and planning skills. 
Technical Skills: Experience with Kafka and Protocol Buffers Expertise in both SQL and No SQL databases Expertise with BI tools (Tableau, Power BI etc) Expertise with federated query tools such as Presto/Trino Experience with data lake file formats such as Avro, Parquet, ORC [Required] Experience in extracting and developing technical requirements from business goals and needs. [Required] Experience in solution integration and operability. [Required] Experience working with Infrastructure Technologies and Teams. [Required] Proficiency in using Microsoft Office products (Word, Excel, PowerPoint, Visio) [Required] Experience using Service-now or similar [Required] 10+ years of progressive experience leading to a Senior-level Data Architect, Data Engineer, DBA, consultant, technical lead [Preferred] Bachelor's degree or higher in a technical field [Preferred] Process improvement certifications such as Lean/Six Sigma [Preferred] IT service or process management certifications such as ITIL or ITAM
08/05/2024
Full time
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Company is currently seeking a Principal Data Architect with Kafka and Data Lakes experience. The candidate will be responsible for data architecture and design in support of all IT areas of the business. This role will establish standards, coordinate solution design with subject matter owners, document and design solutions, and ensure strategic goals are met at the operational level. The role will be a primary contributor to the Joint Technology Strategy and will oversee and conduct evaluations of technology and process during proofs-of-concept/value.

Responsibilities:
- Design the data architecture of the organization to support its data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Perform requirements-based analysis and selection of data tools
- Establish and enforce data modeling standards
- Create logical and conceptual data models
- Ensure that data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Advise the team on IT technology standards, requirements, methodologies, and processes
- Drive short- and long-term architecture strategy for the overall IT project portfolio for key business segments
- Produce comprehensive infrastructure designs covering all aspects of IT
- Participate in proofs of concept to help define technology direction and enable business strategy
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor platform environment
- Design and develop infrastructure and solution documentation and blueprints
- Perform impact analysis and design modifications of existing systems to support new solutions and integrations
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations
- Communicate and coordinate between IT, Application Development, Operations, and Management
- Use traditional and Agile project/product approaches to execute projects and achieve business outcomes
- Drive business results through process and informal leadership

Qualifications:
- Experience in the design of data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience in the design of data science and data analytics solutions
- [Required] Ability to prioritize critical versus non-critical issues and communicate effectively with management
- [Required] Proven ability to contribute consistently and positively in a dynamic, fast-paced, and highly regulated environment
- [Required] Proven ability to facilitate project alignment between business and technical teams
- [Required] Demonstrated ability to dig beyond the surface to uncover root causes and offer solutions that deliver effective and efficient outcomes
- [Required] Experience operating in a collaborative environment to solve cross-functional problems
- [Required] Self-directed and detail-oriented
- [Required] Highly effective organization and planning skills

Technical Skills:
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, ORC
- [Required] Experience in extracting and developing technical requirements from business goals and needs
- [Required] Experience in solution integration and operability
- [Required] Experience working with infrastructure technologies and teams
- [Required] Proficiency with Microsoft Office products (Word, Excel, PowerPoint, Visio)
- [Required] Experience using ServiceNow or similar
- [Required] 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- [Preferred] Bachelor's degree or higher in a technical field
- [Preferred] Process improvement certifications such as Lean/Six Sigma
- [Preferred] IT service or process management certifications such as ITIL or ITAM
NO SPONSORSHIP Principal, Data Architecture SALARY: $195k - $200k plus 27% bonus LOCATION: Chicago, IL Hybrid 3 days in office and 2 days remote

Looking for a candidate who does data architecture and design: data lakes and data warehouse solutions; schema design for relational and non-relational data and messaging; design of data science and data analytics solutions; Kafka and Protocol Buffers; SQL and NoSQL; Tableau and Power BI; Presto/Trino; data lake file formats such as Avro, Parquet, ORC; infrastructure technologies; ServiceNow or similar. 10+ years as a senior data architect, data engineer, DBA, or technical lead. Logical and conceptual data models, data modeling standards, data taxonomy, data governance.

Qualifications: The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions.
- Experience in the design of data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience in the design of data science and data analytics solutions
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, ORC
- [Required] Experience in extracting and developing technical requirements from business goals and needs
- [Required] Experience in solution integration and operability
- [Required] Experience working with infrastructure technologies and teams
- [Required] Proficiency with Microsoft Office products (Word, Excel, PowerPoint, Visio)
- [Required] Experience using ServiceNow or similar

Education and/or Experience: 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
08/05/2024
Full time
Technical Author - Content writing, Generative AI, Technology, Financial Services, Documentation, Confluence

Overview: My client is looking for someone to join its London team to support the Public Cloud compute team by writing daily documentation, establishing practices, and assisting 200 engineers.

Key Responsibilities:
- Ensure consistency in design, workflows, and documentation
- Develop documentation best practices
- Collaborate with engineering teams to create accessible documentation
- Identify and address outdated content

Principal Responsibilities:
- Collaborate with engineering to understand product and documentation needs
- Create and maintain technical documentation
- Utilize generative AI tools for documentation when necessary
- Ensure documentation complies with internal standards

Qualifications and Skills:
- Proven experience as a Technical Writer
- Familiarity with cloud, networking, and Linux platforms
- Experience in financial services is advantageous
- Proficiency in wiki-style documentation and HTML
- Familiarity with version control (git) is beneficial

Permanent/Onsite London

By applying to this job you are sending us your CV, which may contain personal information. Please refer to our Privacy Notice to understand how we process this information. In short, in order to supply you with work-finding services, we will hold and process your personal data, and only with your express permission will we share this personal data with a client (or a third party working on behalf of the client) by email or by upload to the client's/third party's vendor management system. By giving us permission to send your CV to a client, you grant permission to share the personal data necessary to consider your application, interview you (phone/video/face to face) and, if successful, hire you. Scope AT acts as an employment agency for Permanent Recruitment and an employment business for the supply of temporary workers.
By applying for this job you accept the Terms and Conditions, Data Protection Policy, Privacy Notice and Disclaimers which can be found at our website.
08/05/2024
Full time
NO SPONSORSHIP Associate Principal, Software Engineering - Quantitative Risk Management (Automating Risk Models) Chicago - On site 3 days a week Salary - $185k - $195k + Bonus

Looking for a hands-on developer who works within the quantitative risk management area and can develop applications and solutions for the QRM team. You will not build models; you will automate models. You will need to come from a financial institution, trading company, exchange, etc., and will need experience with CI/CD pipelines, Infrastructure as Code, Kubernetes, Terraform, etc., preferably with Java, Python, or C++.

Responsibilities:
- Configure and manage resources in local and AWS cloud environments and deploy QRM's software on these resources
- Develop CI/CD pipelines
- Contribute to development of QRM's databases and ETLs
- Integrate model prototypes, the model library, and model testing tools using best industry practices and innovations
- Create unit and integration tests; build and enhance test automation tools
- Participate in code reviews and demo accomplishments
- Write technical documentation and user manuals
- Provide production support and perform troubleshooting

Qualifications:
- Strong programming skills: able to read and/or write code using a programming language (eg, Java, C++, Python) in a collaborative software development setting. The role requires advanced coding, database, and environment manipulation skills, including in a cloud environment.
- Financial products knowledge is a plus: understanding of markets and financial derivatives in equities, interest rate, and commodity products
- Background in financial mathematics is a plus: derivatives pricing models, stochastic calculus, statistics and probability theory, linear algebra

Technical Skills:
- Proficiency in Java (preferred) or another object-oriented language is required, including effective application of design patterns and best coding practices
- DevOps experience, with a good command of CI/CD processes and tools (eg, Git, GitHub, Gradle, Jenkins, Docker, Helm, Harness)
- Experience with containerized deployment in cloud environments
- Experience with cloud technology (AWS preferred), infrastructure-as-code (eg, Terraform), and managing and orchestrating containerized workloads (eg, Kubernetes)

Education and/or Experience: Master's degree or equivalent in a computational or numerical field such as computer science, information systems, mathematics, or physics. 7+ years of experience as a software developer with exposure to cloud or high-performance computing areas.
08/05/2024
Full time
NO SPONSORSHIP Principal, Software Engineering - Enterprise Monitoring (Splunk) SALARY: $200k - $215k base w/up to 30% bonus LOCATION: Chicago, IL 3 days onsite, 2 days remote

Looking for a technical team lead for the enterprise Splunk monitoring system. You will be the SME in Splunk monitoring and cloud-native applications running on Kubernetes within AWS.

Responsibilities:
- Translate middle and senior management strategic directives into workable technical directives
- Monitor project status and take remedial action on projects behind schedule and/or over budget
- Provide subject matter expertise for ongoing support of third-party tools like Splunk
- Provide expert-level technical mentoring to more junior members of the team
- Resolve complex support issues in non-production and production environments
- Understand cloud-native applications running on Kubernetes within AWS and how exposed APIs may be used to monitor them
- Assist production support and development staff in debugging environment defects using logging monitors and/or APM-related profiling data
- Create procedural and troubleshooting documentation for enterprise monitoring systems and the applications they monitor
- Write complex automation scripts using common automation tools, such as Jenkins, Ansible, and Terraform, for the installation, configuration, and/or upgrade of monitoring systems

Qualifications:
- Systems administration and change management practices
- Enterprise monitoring and reporting tools
- Experience Scripting and/or coding against APIs
- In-depth knowledge of commonly used management and monitoring technologies
- Internet/web-based technologies
- ITIL best practices
- Experience with technologies used to support microservices
- Network technologies
- AWS log collection such as CloudTrail, CloudWatch, VPC Flow Logs
- Monitoring and reporting using SNMP
- CI/CD tools such as Artifactory, Jenkins, and Git
- Cloud-native applications, including Terraform experience
- Encryption technologies (SSL/TLS, PKI infrastructure management)
- Security controls as applied to software technologies
- Bachelor's degree
- 10+ years of related experience; minimum 10 years working in a distributed multi-platform environment; minimum 3 years working with cloud-native applications; minimum 3 years managing technical projects
07/05/2024
Full time
NO SPONSORSHIP Principal, Software Engineering - Enterprise Cloud Monitoring (Splunk) SALARY: $200k - $215k base w/up to 30% bonus LOCATION: Dallas, TX 3 days onsite, 2 days remote

It is all about on-premises monitoring and cloud monitoring. The products they are looking for outside of Splunk are Datadog, Dynatrace, and New Relic. Heavy cloud: AWS, EC2, automation, application performance monitoring, enterprise monitoring, any BMC Patrol or Tivoli, and regulatory experience.

Responsibilities:
- Translate middle and senior management strategic directives into workable technical directives
- Monitor project status and take remedial action on projects behind schedule and/or over budget
- Provide subject matter expertise for ongoing support of third-party tools like Splunk
- Provide expert-level technical mentoring to more junior members of the team
- Resolve complex support issues in non-production and production environments
- Understand cloud-native applications running on Kubernetes within AWS and how exposed APIs may be used to monitor them
- Assist production support and development staff in debugging environment defects using logging monitors and/or APM-related profiling data
- Create procedural and troubleshooting documentation for enterprise monitoring systems and the applications they monitor
- Write complex automation scripts using common automation tools, such as Jenkins, Ansible, and Terraform, for the installation, configuration, and/or upgrade of monitoring systems

Qualifications:
- Systems administration and change management practices
- Enterprise monitoring and reporting tools
- Experience Scripting and/or coding against APIs
- In-depth knowledge of commonly used management and monitoring technologies
- Internet/web-based technologies
- ITIL best practices
- Experience with technologies used to support microservices
- Network technologies
- AWS log collection such as CloudTrail, CloudWatch, VPC Flow Logs
- Monitoring and reporting using SNMP
- CI/CD tools such as Artifactory, Jenkins, and Git
- Cloud-native applications, including Terraform experience
- Encryption technologies (SSL/TLS, PKI infrastructure management)
- Security controls as applied to software technologies
- Bachelor's degree
- 10+ years of related experience; minimum 10 years working in a distributed multi-platform environment; minimum 3 years working with cloud-native applications; minimum 3 years managing technical projects
07/05/2024
Full time
ASSOCIATE PRINCIPAL, APPIAN SOFTWARE ENGINEERING SALARY: $140k - $145k - $152k plus 15% bonus LOCATION: Chicago, IL Hybrid 3 days onsite, 2 days remote

Looking for someone to handle design, development, testing, and implementation of Appian software. You will need 5 years of Front End/user experience development; JavaScript; automating workflows inside Appian; AWS; Unix/Linux; Java, Python, NodeJS, Angular 2.0 or ReactJS; and middleware technologies. Working knowledge of DevOps tools: Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines. Must have a degree; Appian Certified Developer required.

- Contribute to design, technical direction, and architecture, including collaborating with various teams to build fit-for-purpose solutions
- Apply expert knowledge of Java, Python, JavaScript, NodeJS, Angular 2.0 or ReactJS, and middleware technologies in independently designing and developing key services with a focus on continuous integration and delivery
- Participate in code reviews, proactively identifying and mitigating potential issues and defects, as well as assisting with continuous improvement
- Drive continuous improvement efforts by identifying and championing practical means of reducing time to market while maintaining high quality

Qualifications:
- 5+ years of Front End/User Experience development (required)
- 5+ years of experience with JavaScript (required)
- 3+ years of experience automating workflows inside Appian and integrating with other tools (required)
- 3+ years of experience in React application development (required)
- 3+ years of hands-on HTML5/CSS3 experience (required)
- Experience with Java and/or Python (required)
- Experience with popular JavaScript frameworks such as React, NodeJS, Vue, Angular 2.0 (required)
- Experience working with WebSockets, HTTP 1.1, and HTTP/2 (required)
- Experience with RESTful APIs and JSON-RPC (required)
- Ability to write clean, bug-free code that is easy to understand and easily maintainable (required)
- Experience with BDD methodologies and automated acceptance testing (required)

Technical Skills:
- 5+ years of hands-on experience in Java, including a good understanding of Java fundamentals such as the memory model, Runtime environment, concurrency, and multithreading (required)
- 3+ years of past/current experience as Technical Lead on a large-scale cloud-native project (platform: Unix/Linux; type of systems: event-driven/transaction processing/high-performance computing), including developing/architecting core libraries or frameworks used by the platform to support fundamental services like storage, alert notifications, security, etc. (required)
- Appian Process Modeling, Smart Services, Rules and Tempo event services, database, and Web services (required)
- Experience with cloud technologies and migrations with a public cloud vendor, preferably using cloud foundational services such as AWS VPCs, Security Groups, EC2, RDS, S3 ACLs, KMS, AWS CLI, and IAM (required)
- Experience with distributed message brokers using Kafka (required)
- Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink (required)
- Experience working with various types of databases: relational, NoSQL, object-based, graph (required)
- Working knowledge of DevOps tools, eg, Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines (required)
- Familiarity with monitoring tools and frameworks such as Splunk, ElasticSearch, Prometheus, AppDynamics (required)

Education and/or Experience: BS degree in Computer Science or a similar technical field; Appian Certified Developer
06/05/2024
Full time
*Hybrid, 3 days onsite, 2 days remote* *We are unable to sponsor as this is a permanent Full time role* A prestigious company is looking for a Principal, Data Architecture. This principal will focus on data architecture and design to support all IT departments throughout the company. These responsibilities include design of data lakes, data warehouses, data messaging, data modelling, data science, etc. The company wants someone with 10+ years of data architect/engineering/DBA work experience. Responsibilities: Design the data architecture of the organization to support a data-driven vision Create the design and blueprint of the data capabilities for the organization within the data framework Analyze structural requirements for new solutions and applications Optimize new and current database systems Conduct requirements-based analysis and selection of data tools Set up and enforce Data Modeling standards Create logical and conceptual data models Ensure that data architecture principles are adhered to across the enterprise Assist in building the data taxonomy and aligning it with business processes Work with Data Governance, IT, and data stewards on design of the strategic solution to data quality issues Communicate and validate program architecture with infrastructure, project management, and technology services teams Conduct end-to-end technical plan design Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor platform environment Design and develop infrastructure and solution documentation and blueprints Develop specifications for solution integrations Maintain a documentation library on standard procedures and approved solution configurations 
Qualifications: 10+ years of progressive experience leading to a Senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role Bachelor's degree or higher in a technical field Experience in design of data lake/warehouse solutions, preferably in the cloud Experience in schema design for relational and non-relational data and messaging protocols Experience in design of data science and data analytics solutions Experience with Kafka and Protocol Buffers Expertise in both SQL and NoSQL databases Expertise with BI tools (Tableau, Power BI, etc.) Expertise with federated query tools such as Presto/Trino Experience with data lake file formats such as Avro, Parquet, ORC Experience in solution integration and operability Experience working with infrastructure technologies and teams Experience using ServiceNow or similar
01/05/2024
Full time