*We are unable to sponsor for this permanent full-time role*
*Position is bonus eligible*

A prestigious financial company is seeking a Principal Data Architect with Kafka and data lake experience. The candidate will be responsible for data architecture and design supporting all IT areas of the business. This role will establish standards, coordinate solution design with subject matter owners, document and design solutions, and ensure strategic goals are met at the operational level. The role will be a primary contributor to the Joint Technology Strategy and will oversee and conduct evaluations of technology and process during proofs of concept/value.

Responsibilities:
- Design the organization's data architecture to support its data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Perform requirements-based analysis and selection of data tools
- Establish and enforce data modeling standards
- Create logical and conceptual data models
- Ensure that data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Advise teams on IT technology standards, requirements, methodologies, and processes
- Drive short- and long-term architecture strategy for the overall IT project portfolio for key business segments
- Produce comprehensive infrastructure designs covering all aspects of IT
- Participate in proofs of concept to help define technology direction and enable business strategy
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor environment
- Design and develop infrastructure and solution documentation and blueprints
- Perform impact analysis and design modifications of existing systems to support new solutions and integrations
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations
- Communicate and coordinate between IT, Application Development, Operations, and Management
- Use traditional and Agile project/product approaches to execute projects and achieve business outcomes
- Drive business results through process and informal leadership

Qualifications:
- Experience designing data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience designing data science and data analytics solutions
- [Required] Ability to prioritize critical versus non-critical issues and communicate effectively to management
- [Required] Proven ability to contribute consistently and positively in a dynamic, fast-paced, and highly regulated environment
- [Required] Proven ability to facilitate project alignment between business and technical teams
- [Required] Demonstrated ability to dig beyond the surface to uncover root causes and offer solutions that deliver effective and efficient outcomes
- [Required] Experience operating in a collaborative environment to solve cross-functional problems
- [Required] Self-directed and detail oriented
- [Required] Highly effective organization and planning skills

Technical Skills:
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, and ORC
- [Required] Experience extracting and developing technical requirements from business goals and needs
- [Required] Experience in solution integration and operability
- [Required] Experience working with infrastructure technologies and teams
- [Required] Proficiency with Microsoft Office products (Word, Excel, PowerPoint, Visio)
- [Required] Experience using ServiceNow or similar
- [Required] 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- [Preferred] Bachelor's degree or higher in a technical field
- [Preferred] Process improvement certifications such as Lean/Six Sigma
- [Preferred] IT service or process management certifications such as ITIL or ITAM
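For candidates unfamiliar with the data lake file formats this posting names, here is a minimal sketch of what an Avro schema looks like. Avro schemas are plain JSON documents, which is why they pair naturally with Kafka and schema registries. All record and field names here are invented for illustration, not taken from the posting:

```python
import json

# Hypothetical Avro schema for a trade record; names are illustrative.
trade_schema = {
    "type": "record",
    "name": "Trade",
    "namespace": "com.example.marketdata",  # assumed namespace
    "fields": [
        {"name": "trade_id", "type": "string"},
        {"name": "symbol", "type": "string"},
        {"name": "price", "type": "double"},
        {"name": "quantity", "type": "long"},
        # Avro expresses optional fields as a union with "null"
        {"name": "venue", "type": ["null", "string"], "default": None},
    ],
}

# Serialize the schema itself to JSON, as it would be registered or
# shipped alongside Avro-encoded Kafka messages.
schema_json = json.dumps(trade_schema, indent=2)
print(json.loads(schema_json)["name"])
```

A schema like this is what producers and consumers would agree on before exchanging Avro-encoded records over Kafka.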
08/05/2024
Full time
NO SPONSORSHIP

Associate Principal, Software Engineering - QRM
SALARY: $135k - $150k plus 15% bonus
LOCATION: Chicago, IL (hybrid: 3 days onsite, 2 days remote)

SELLING POINTS: Develops and maintains risk models for margin, clearing fund, and stress testing, and the risk model software in production. AWS; CI/CD pipelines; Java, C#, Python; Agile/Scrum; financial products knowledge a plus (markets, financial derivatives, equities, interest rates, commodity products); Java preferred; infrastructure as code; Kubernetes; Terraform; Splunk; OpenTelemetry; SQL; big data; scripting in Python.

This role is responsible for one or more functions within Quantitative Risk Management (QRM), which develops and maintains risk models for margin, clearing fund, and stress testing, with a focus on developing and maintaining risk model software in production as well as the environments and infrastructure used in model implementation and testing. The role will collaborate with other developers, quantitative analysts, business users, and data & technology staff to expand QRM's technical capabilities for model development, backtesting, and monitoring.

Primary Duties and Responsibilities:
- Develop and maintain software and environments used to implement and test systems for pricing, margin risk, and stress testing of financial products and derivatives
- Configure and manage resources in local and AWS cloud environments and deploy QRM's software on these resources
- Develop CI/CD pipelines
- Configure, execute, and monitor execution pipelines for model testing, backtesting, and monitoring
- Contribute to the development of QRM's databases and ETLs
- Integrate model prototypes, the model library, and model testing tools using industry best practices and innovations
- Create unit and integration tests; build and enhance test automation tools
- Participate in code reviews and demo accomplishments
- Write technical documentation and user manuals
- Provide production support and perform troubleshooting

Qualifications:
- Strong programming skills: able to read and write code in a programming language (e.g., Java, C++, Python) in a collaborative software development setting; the role requires advanced coding, database, and environment manipulation skills
- Track record of complex production implementations and demonstrated ability to develop and maintain enterprise-level software, including in cloud environments
- Proficiency in technical and/or scientific documentation (e.g., white papers, user guides)
- Strong problem-solving skills: able to accurately identify a problem's source, severity, and impact to determine possible solutions and needed resources
- Experience with Agile/Scrum or another rapid development framework
- Financial products knowledge is a plus: understanding of markets and financial derivatives in equities, interest rate, and commodity products
- Background in financial mathematics is a plus: derivatives pricing models, stochastic calculus, statistics and probability theory, linear algebra

Technical Skills:
- Proficiency in Java (preferred) or another object-oriented language is required, including effective application of design patterns and best coding practices
- DevOps experience, with a good command of CI/CD processes and tools (e.g., Git, GitHub, Gradle, Jenkins, Docker, Helm, Harness)
- Experience with containerized deployment in cloud environments
- Experience with cloud technology (AWS preferred), infrastructure as code (e.g., Terraform), and managing and orchestrating containerized workloads (e.g., Kubernetes)
- Experience with logging, profiling, monitoring, and telemetry (e.g., Splunk, OpenTelemetry)
- Good command of database technology and query languages (SQL), non-relational databases, and other big data technology, including efficient storage and serialization protocols (e.g., Parquet, Avro, Protocol Buffers)
- Experience with automated quality assurance frameworks (e.g., JUnit, TestNG, PyTest)
- Experience with high-performance and distributed computing
- Experience with productivity tools such as Jira, Confluence, and MS Office
- Experience with scripting languages such as Python is a plus
- Experience with numerical libraries and/or scientific computing is a plus

Education and/or Experience:
- Master's degree or equivalent in a computational or numerical field such as computer science, information systems, mathematics, or physics
- 7+ years of experience as a software developer with exposure to cloud or high-performance computing

Certificates or Licenses:
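The duties above pair risk-model code with unit tests. A minimal sketch of what a stress-testing unit test might look like, using Python's stdlib unittest; the function, portfolio, and shock values are invented for illustration and are not QRM's actual model:

```python
import unittest

def stressed_value(positions, shock):
    """Apply a uniform price shock to a portfolio and return its value.

    `positions` maps symbol -> (quantity, price); `shock` is a fractional
    move, e.g. -0.10 for a 10% drop. Purely illustrative.
    """
    return sum(qty * price * (1.0 + shock) for qty, price in positions.values())

class StressTestCase(unittest.TestCase):
    def test_ten_percent_drop(self):
        # Unstressed value: 100*50 + 200*25 = 10,000
        portfolio = {"ABC": (100, 50.0), "XYZ": (200, 25.0)}
        self.assertAlmostEqual(stressed_value(portfolio, -0.10), 9000.0)

    def test_zero_shock_is_identity(self):
        portfolio = {"ABC": (10, 3.0)}
        self.assertAlmostEqual(stressed_value(portfolio, 0.0), 30.0)
```

Run with `python -m unittest`; in a CI/CD pipeline of the kind the posting describes, such tests would gate every deployment of the model software.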
08/05/2024
Full time
ASSOCIATE PRINCIPAL, APPIAN SOFTWARE ENGINEERING
SALARY: $140k - $152k plus 15% bonus
LOCATION: Chicago, IL (hybrid: 3 days onsite, 2 days remote)

Looking for someone to handle design, development, testing, and implementation of Appian software. You will need 5 years of front-end user experience development, JavaScript, and experience automating workflows inside Appian; AWS; Unix/Linux; Java; Python; Node.js; Angular 2.0 or React; and middleware technologies. Working knowledge of DevOps tools (Terraform, Ansible, Jenkins, Kubernetes, Helm) and CI/CD pipelines. A degree and Appian Certified Developer certification are required.

Responsibilities:
- Contribute to design, technical direction, and architecture, including collaborating with various teams to build fit-for-purpose solutions
- Apply expert knowledge of Java, Python, JavaScript, Node.js, Angular 2.0 or React, and middleware technologies in independently designing and developing key services with a focus on continuous integration and delivery
- Participate in code reviews, proactively identifying and mitigating potential issues and defects, and assist with continuous improvement
- Drive continuous improvement efforts by identifying and championing practical means of reducing time to market while maintaining high quality

Qualifications:
- 5+ years of front-end, user experience development (required)
- 5+ years of experience with JavaScript (required)
- 3+ years of experience automating workflows inside Appian, including integration with other tools (required)
- 3+ years of experience in React application development (required)
- 3+ years of hands-on HTML5/CSS3 experience (required)
- Experience with Java and/or Python (required)
- Experience with popular JavaScript frameworks such as React, Node.js, Vue, or Angular 2.0 (required)
- Experience working with WebSockets, HTTP/1.1, and HTTP/2 (required)
- Experience with RESTful APIs and JSON-RPC (required)
- Ability to write clean, bug-free code that is easy to understand and maintain (required)
- Experience with BDD methodologies and automated acceptance testing (required)

Technical Skills:
- 5+ years of hands-on experience in Java, including a good understanding of Java fundamentals such as the memory model, runtime environment, concurrency, and multithreading (required)
- 3+ years of past or current experience as a technical lead on a large-scale cloud-native project (platform: Unix/Linux; system types: event-driven, transaction processing, high-performance computing), including developing or architecting core libraries or frameworks used by the platform to support fundamental services such as storage, alert notifications, and security (required)
- Appian process modeling, smart services, rules and Tempo event services, database, and web services (required)
- Experience with cloud technologies and migrations with a public cloud vendor, preferably using foundational cloud services such as AWS VPCs, security groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, and IAM (required)
- Experience with distributed message brokers such as Kafka (required)
- Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, and Apache Flink (required)
- Experience working with various types of databases: relational, NoSQL, object-based, graph (required)
- Working knowledge of DevOps tools, e.g., Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines (required)
- Familiarity with monitoring tools and frameworks such as Splunk, ElasticSearch, Prometheus, and AppDynamics (required)

Education and/or Experience:
- BS degree in Computer Science or a similar technical field
- Appian Certified Developer
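The posting asks for RESTful API and JSON-RPC experience. For orientation, here is a minimal sketch of a JSON-RPC 2.0 request/response exchange serialized with Python's stdlib json; the method name "getWorkflowStatus" and its parameters are hypothetical, not a real Appian API:

```python
import json

# Build a JSON-RPC 2.0 request. Method and parameter names are invented
# for illustration only.
request = {
    "jsonrpc": "2.0",
    "method": "getWorkflowStatus",
    "params": {"workflowId": "wf-123"},
    "id": 1,
}
wire = json.dumps(request)  # the payload that would travel over HTTP

# A well-formed success response carries "result" and echoes the request id.
response = json.loads('{"jsonrpc": "2.0", "result": {"status": "RUNNING"}, "id": 1}')
assert response["id"] == request["id"]
print(response["result"]["status"])
```

The id echo is what lets a client correlate responses when several requests are in flight over one connection, e.g. a WebSocket.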
06/05/2024
Full time
*Hybrid, 3 days onsite, 2 days remote*
*We are unable to sponsor, as this is a permanent full-time role*

A prestigious company is looking for a Principal, Data Architecture. This principal will focus on data architecture and design to support all IT departments throughout the company. Responsibilities include design of data lakes, data warehouses, data messaging, data modeling, data science, etc. The company wants someone with 10+ years of data architect/engineer/DBA work experience.

Responsibilities:
- Design the organization's data architecture to support its data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Perform requirements-based analysis and selection of data tools
- Establish and enforce data modeling standards
- Create logical and conceptual data models
- Ensure that data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor environment
- Design and develop infrastructure and solution documentation and blueprints
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations

Qualifications:
- 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- Bachelor's degree or higher in a technical field
- Experience designing data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience designing data science and data analytics solutions
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, and ORC
- Experience in solution integration and operability
- Experience working with infrastructure technologies and teams
- Experience using ServiceNow or similar
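The logical and conceptual data modeling work this posting describes can be sketched in code. A tiny illustrative logical model with two entities and a one-to-many relationship, written as Python dataclasses; all entity and attribute names are invented, and a real model would capture keys, cardinality, and domain constraints in a modeling tool rather than code:

```python
from dataclasses import dataclass

# Hypothetical entities for illustration only.
@dataclass
class Customer:
    customer_id: int
    name: str

@dataclass
class Account:
    account_id: int
    customer_id: int          # foreign key -> Customer.customer_id
    currency: str = "USD"
    balance: float = 0.0

def accounts_for(customer, accounts):
    """Resolve the one-to-many Customer -> Account relationship."""
    return [a for a in accounts if a.customer_id == customer.customer_id]

alice = Customer(1, "Alice")
book = [Account(10, 1), Account(11, 2), Account(12, 1, "EUR")]
print(len(accounts_for(alice, book)))  # two of the three accounts belong to Alice
```

The same relationship would appear in a logical model diagram as Customer 1..* Account, and in a physical model as a foreign-key column.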
01/05/2024
Full time
*Hybrid, 3 days onsite, 2 days remote* *We are unable to sponsor as this is a permanent Full time role* A prestigious company is looking for a Principal, Data Architecture. This principal will focus on data architecture and design to support all IT departments throughout the company. These responsibilities include design of data lakes, data warehouses, data messaging, data modelling, data science, etc. The company wants someone with 10+ years of data architect/engineering/DBA work experience. Responsibilities: Design the data architecture of organization to support data driven vision Create design and blueprint of the data capabilities for the organization within the data framework Analyze structural requirements for new solutions and applications Optimize new and current database systems Responsible for requirements-based analysis and selection of data tools Responsible for setting up and enforcement of Data Modeling standards Responsible for creating logical and conceptual data models Ensure that data architecture principles are adhered to across the enterprise Assist in building data taxonomy and aligning it with business processes Work with Data Governance, IT, and data stewards on design of the strategic solution to data quality issues Communicates and validates program architecture with infrastructure, project management and technology services teams. Conducts end to end technical plan design. Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor platform environment. Design and develop infrastructure and solution documentation and blueprints. Develops specifications for solutions integrations. Maintain documentation library on standard procedures and approved solution configurations. 
Qualifications: 10+ years of progressive experience leading to a Senior-level Data Architect, Data Engineer, DBA, consultant, technical lead Bachelor's degree or higher in a technical field Experience in design of data lake/warehouse solutions, preferably in the cloud Experience in schema design for relational and non-relational data and messaging protocols Experience in design of data science and data analytics solutions Experience with Kafka and Protocol Buffers Expertise in both SQL and No SQL databases Expertise with BI tools (Tableau, Power BI etc) Expertise with federated query tools such as Presto/Trino Experience with data lake file formats such as Avro, Parquet, ORC Experience in solution integration and operability. Experience working with Infrastructure Technologies and Teams. Experience using Service-now or similar
*Hybrid, 3 days onsite, 2 days remote*
*We are unable to sponsor as this is a permanent Full time role*

A prestigious company is looking for a Principal, Data Architecture. This role will focus on data architecture and design to support all IT departments throughout the company. Responsibilities include the design of data lakes, data warehouses, data messaging, data modeling, and data science solutions. The company is seeking someone with 10+ years of data architect, data engineering, or DBA work experience.

Responsibilities:
- Design the data architecture of the organization to support its data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Perform requirements-based analysis and selection of data tools
- Establish and enforce data modeling standards
- Create logical and conceptual data models
- Ensure that data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor environment
- Design and develop infrastructure and solution documentation and blueprints
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations

Qualifications:
- 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- Bachelor's degree or higher in a technical field
- Experience designing data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience designing data science and data analytics solutions
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, and ORC
- Experience in solution integration and operability
- Experience working with infrastructure technologies and teams
- Experience using ServiceNow or similar
01/05/2024
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent Full time role*
*Position is bonus eligible*

A prestigious financial company is currently seeking a Principal Data Architect with Kafka and data lakes experience. The candidate will be responsible for data architecture and design to support all IT areas of the business. This role will establish standards, coordinate solution design with subject matter owners, document and design solutions, and ensure strategic goals are met at the operational level. The role will be a primary contributor to the Joint Technology Strategy and will oversee and conduct evaluations of technology and process during proofs of concept/value.

Responsibilities:
- Design the data architecture of the organization to support its data-driven vision
- Create the design and blueprint of the organization's data capabilities within the data framework
- Analyze structural requirements for new solutions and applications
- Optimize new and current database systems
- Perform requirements-based analysis and selection of data tools
- Establish and enforce data modeling standards
- Create logical and conceptual data models
- Ensure that data architecture principles are adhered to across the enterprise
- Assist in building a data taxonomy and aligning it with business processes
- Work with Data Governance, IT, and data stewards on the design of strategic solutions to data quality issues
- Advise the team on IT technology standards, requirements, methodologies, and processes
- Drive short- and long-term architecture strategy for the overall IT project portfolio for key business segments
- Produce comprehensive infrastructure designs covering all aspects of IT
- Participate in proofs of concept to help define technology direction and enable business strategy
- Communicate and validate program architecture with infrastructure, project management, and technology services teams
- Conduct end-to-end technical plan design
- Develop enterprise standards to ensure compatibility and integration of platforms in a multi-vendor environment
- Design and develop infrastructure and solution documentation and blueprints
- Perform impact analysis and design modifications of existing systems to support new solutions and integrations
- Develop specifications for solution integrations
- Maintain a documentation library of standard procedures and approved solution configurations
- Communicate and coordinate between IT, Application Development, Operations, and Management
- Use traditional and Agile project/product approaches to execute projects and achieve business outcomes
- Drive business results through process and informal leadership

Qualifications:
- Experience designing data lake/warehouse solutions, preferably in the cloud
- Experience in schema design for relational and non-relational data and messaging protocols
- Experience designing data science and data analytics solutions
- [Required] Ability to prioritize critical versus non-critical issues and communicate effectively with management
- [Required] Proven ability to contribute consistently and positively in a dynamic, fast-paced, and highly regulated environment
- [Required] Proven ability to facilitate project alignment between business and technical teams
- [Required] Demonstrated ability to dig beyond the surface to uncover root causes and offer solutions that deliver effective and efficient outcomes
- [Required] Experience operating in a collaborative environment to solve cross-functional problems
- [Required] Self-directed and detail oriented
- [Required] Highly effective organization and planning skills

Technical Skills:
- Experience with Kafka and Protocol Buffers
- Expertise in both SQL and NoSQL databases
- Expertise with BI tools (Tableau, Power BI, etc.)
- Expertise with federated query tools such as Presto/Trino
- Experience with data lake file formats such as Avro, Parquet, and ORC
- [Required] Experience in extracting and developing technical requirements from business goals and needs
- [Required] Experience in solution integration and operability
- [Required] Experience working with infrastructure technologies and teams
- [Required] Proficiency with Microsoft Office products (Word, Excel, PowerPoint, Visio)
- [Required] Experience using ServiceNow or similar
- [Required] 10+ years of progressive experience leading to a senior-level Data Architect, Data Engineer, DBA, consultant, or technical lead role
- [Preferred] Bachelor's degree or higher in a technical field
- [Preferred] Process improvement certifications such as Lean/Six Sigma
- [Preferred] IT service or process management certifications such as ITIL or ITAM
01/05/2024
Full time