Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this 6+ month straight contract role*

Prestigious Financial Institution is currently seeking a Business Technical Data Governance Analyst. The candidate will act as a liaison and translation layer between business and technical teams, operating at the system and detailed technical level for analysis purposes, and will work on new project requirements from initial requirements through the full development life cycle and implementation.

Responsibilities:
- Gather business and technical requirements to capture metadata and lineage, working with business and technical SMEs
- Lead Data Domain and Data Steward Workgroup meetings
- Work closely with Data Domain Owners and SMEs to identify Critical Data Elements (CDEs) and define data elements for the Business Glossary
- Collaborate with Data Domain Owners to identify and define appropriate business rules
- Collaborate with the metadata specialist to identify data sources
- Collaborate with data modelers to review definitions of business terms vs. technical terms
- Use data profiling and data quality tools to expose and determine the causes of data quality issues when defining business rules
- Participate in metadata management, using IBM Metadata Asset Manager and the IBM Information Governance Catalog to build out the business glossary
- Implement data governance controls, procedures, and standards
- Develop and deliver presentations for the department and senior leadership
- Perform other duties as assigned

Qualifications:
- Structured Query Language (SQL)
- Data governance tools, for example Informatica, IBM ISEE, Collibra, etc.
- Experience working with APIs and Kafka as data sources is preferred
- Proficient with Microsoft Office desktop tools (Word, Excel, etc.)
- Experience with databases (e.g., Oracle, SQL Server, DB2, Amazon Redshift)
- Strong data analysis capabilities
- Bachelor's or master's degree in data analytics, computer science, or a related field
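As a rough illustration of the data-profiling step described in this posting, the sketch below runs a completeness and uniqueness profile over a hypothetical Critical Data Element using plain JDBC. The JDBC URL, credentials, table name (trade) and column name (settlement_date) are illustrative assumptions, not details from the role.

```java
// Hypothetical sketch: profiling a Critical Data Element (CDE) with plain JDBC
// before writing a business rule for it. Table, column, and connection details
// are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CdeProfiler {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/trades"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "analyst", "secret");
             Statement stmt = conn.createStatement()) {

            // Completeness and uniqueness profile for a hypothetical CDE
            // "settlement_date" on a hypothetical "trade" table.
            String sql =
                "SELECT COUNT(*)                        AS total_rows, " +
                "       COUNT(settlement_date)          AS populated_rows, " +
                "       COUNT(DISTINCT settlement_date) AS distinct_values " +
                "FROM trade";

            try (ResultSet rs = stmt.executeQuery(sql)) {
                if (rs.next()) {
                    long total = rs.getLong("total_rows");
                    long populated = rs.getLong("populated_rows");
                    long distinct = rs.getLong("distinct_values");
                    double completeness = total == 0 ? 1.0 : (double) populated / total;
                    System.out.printf("completeness=%.2f%%, distinct=%d%n",
                            completeness * 100, distinct);
                }
            }
        }
    }
}
```

Numbers like these would typically feed the discussion with Data Domain Owners about whether a completeness or uniqueness rule is warranted for the element.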
08/02/2025
Project-based
NO SPONSORSHIP

Business Data Governance
Rate: 55-65/hr (C2C)
Duration: 6 months
Location: Chicago, IL (hybrid; 3 days onsite, 2 days remote)

We are looking for a Data Governance Business Technical Analyst. The role covers both the business and the technical perspective: you will define business rules and determine the causes of data quality issues as those rules are defined, so a holistic view of the entire data governance process from both sides is required.

Responsibilities:
- Act as a liaison and translation layer between business and technical teams, operating at the system and detailed technical level for analysis purposes
- Work on new project requirements from initial requirements through the full development life cycle and implementation
- Gather business and technical requirements to capture metadata and lineage, working with business and technical SMEs
- Lead Data Domain and Data Steward Workgroup meetings
- Work closely with Data Domain Owners and SMEs to identify Critical Data Elements (CDEs) and define data elements for the Business Glossary
- Collaborate with Data Domain Owners to identify and define appropriate business rules
- Collaborate with the metadata specialist to identify data sources
- Collaborate with data modelers to review definitions of business terms vs. technical terms
- Use data profiling and data quality tools to expose and determine the causes of data quality issues when defining business rules
- Participate in metadata management, using IBM Metadata Asset Manager and the IBM Information Governance Catalog to build out the business glossary
- Implement data governance controls, procedures, and standards
- Develop and deliver presentations for the department and senior leadership
- Perform other duties as assigned

Qualifications:
- Structured Query Language (SQL)
- Data governance tools, for example Informatica, IBM ISEE, Collibra, etc.
- Experience working with APIs and Kafka as data sources is preferred
- Proficient with Microsoft Office desktop tools (Word, Excel, etc.)
- Experience with databases (e.g., Oracle, SQL Server, DB2, Amazon Redshift)
- Strong data analysis capabilities
07/02/2025
Project-based
*Hybrid, 3 days onsite, 2 days remote*

A prestigious company is looking for a Java Developer - Metadata Lineage Analyst. This is a Java Developer position focused on data analysis, metadata data flows, data mappings, and data lineage solutions. Rather than application feature programming, the analyst will develop custom metadata connectors/scanners using Java, Python, etc., and will need hands-on experience with Java 8 onwards, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams.

Responsibilities:
- Develop metadata and data lineage solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies
- Work with technical SMEs/developers to understand application/system design and development, and create data flow diagrams/data mappings
- Create source-to-target mapping documents by reverse engineering the application Java code, BI tools, and SQL queries for the identified data flows
- Develop custom metadata connectors/scanners using programming tools to automate metadata extraction from disparate data sources
- Develop programs to automate metadata extraction and the production of data flow/data lineage/source-to-target mapping documents for complex applications, systems, and BI tools
- Work on metadata management, administration, and support; ingest data management assets using extension mappings, custom data assets, metadata bridges, connectors, and third-party metadata bridges, ensuring data lineage/source-to-target data mapping
- Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources in accordance with the Data Governance Standards, Policies and Procedures
- Design and build data capabilities such as data quality, metadata, data catalog, and data dictionary

Qualifications:
- 6 or more years of data analysis experience with a robust understanding of metadata, data flows, and mappings
- Ability to understand a Java code base; able to read and/or write code in a programming language (e.g., Java, Python)
- Proficient with SQL; experience working with Git and with data analysis using Python/PySpark
- Hands-on experience with Java 8 onwards, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams
- Experience working with various types of databases (relational, NoSQL, object-based)
- Ability to review application development code to ensure it meets functional requirements and architectural and data standards
- Proficiency in writing technical documentation for Java-based applications that process data in real time and batch
- Ability to develop and implement data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to databases, flat files, JSON, APIs, Kafka, etc.
- Experience working with Protobuf, APIs, and Kafka as data sources is preferred
- Experience with draw.io or other tools for creating architecture or data flow diagrams
- Ability to multitask and meet aggressive deadlines efficiently and effectively
- Experience in object-oriented design and software design patterns
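A minimal sketch of the custom metadata scanner idea this posting describes, using the standard JDBC DatabaseMetaData API to walk tables and columns that a lineage or catalog tool could then ingest. The connection URL, credentials, schema name ("public"), and the printed output format are illustrative assumptions, not the employer's actual design.

```java
// Hypothetical metadata scanner: walk a JDBC catalog with DatabaseMetaData and
// emit table/column metadata for downstream catalog or lineage ingestion.
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class JdbcMetadataScanner {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/riskdb"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "scanner", "secret")) {
            DatabaseMetaData meta = conn.getMetaData();

            // List tables in a hypothetical schema, then the columns of each.
            try (ResultSet tables = meta.getTables(null, "public", "%", new String[] {"TABLE"})) {
                while (tables.next()) {
                    String table = tables.getString("TABLE_NAME");
                    System.out.println("TABLE " + table);

                    try (ResultSet cols = meta.getColumns(null, "public", table, "%")) {
                        while (cols.next()) {
                            // COLUMN_NAME, TYPE_NAME and NULLABLE are standard JDBC metadata columns.
                            System.out.printf("  COLUMN %s %s nullable=%s%n",
                                    cols.getString("COLUMN_NAME"),
                                    cols.getString("TYPE_NAME"),
                                    cols.getInt("NULLABLE") == DatabaseMetaData.columnNullable);
                        }
                    }
                }
            }
        }
    }
}
```

A real connector would write this output to the catalog's ingestion format rather than to standard output, and sources such as Kafka topics or Protobuf schemas would need their own scanners.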
07/02/2025
Project-based
NO SPONSORSHIP

Technical Data Analysis - SQL and Metadata
Salary: $125k-$145k plus 10%-15% bonus
Location: Chicago, IL (3 days onsite, 2 days remote)

Looking for a data analyst with SQL, metadata management, and data quality experience. You will use tools within the Collibra platform and should have knowledge of data lineage, MDM, data catalog, data dictionary, and heavy SQL/data structures. Candidates must come from a financial firm; knowledge of data governance, data flows, and data mapping is a plus. The client will consider candidates from roughly three years of experience up to senior level.

Responsibilities:
- Identify data sources and build out the business glossary
- Communicate effectively with business and technical SMEs, architects, analysts, developers, and other IT and business teams
- Work on metadata management: ingest metadata using metadata bridges/connectors and create and ingest custom source-to-target mappings and custom metadata assets
- Develop metadata, data lineage, and data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies

Qualifications:
- Ability to multitask and meet aggressive deadlines efficiently and effectively
- Knowledge of data governance tools such as Collibra
- Proficient with SQL
- Strong data analysis capabilities
- Experience with databases
- Previous experience designing and building data capabilities such as data quality, metadata, data lineage, data catalog, and data dictionary
- Experience with business intelligence/reporting tools such as Tableau/Cognos
- Strong written and oral communication skills, with the ability to work with users, peers, and management
- Capital markets or banking domain experience is preferred
06/02/2025
Full time
Data Governance Tech Lead
Salary: open + bonus
Location: Chicago, IL
Hybrid: 3 days onsite, 2 days remote

*We are unable to provide sponsorship for this role*

Qualifications:
- Bachelor's degree
- 7+ years of experience in data governance disciplines: metadata management, data quality analysis, data quality remediation, data profiling, and data lineage
- Experience with data governance tools such as Collibra, IBM InfoSphere Information Server Suite, or Informatica
- Proficient with SQL
- Experience with databases (e.g., Oracle, SQL Server, DB2, Amazon Redshift, NoSQL, object-based) and ETL tools
- Previous experience designing and building data capabilities such as data quality, metadata, data lineage, data catalog, and data dictionary
- Experience with business intelligence/reporting tools such as Tableau/Cognos
- Financial services industry experience
- DAMA certified (preferred)

Responsibilities:
- Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources, working with business and technical SMEs/developers to understand application system/technical design and create data flow diagrams/data mappings
- Work on metadata management: ingest metadata using metadata bridges/connectors and create and ingest custom source-to-target mappings and custom metadata assets
- Develop metadata, data lineage, and data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies
- Design, build, and execute data quality rules over the identified CDEs according to business needs to ensure clean and healthy data
- Manage data quality exceptions and help the team with root cause analysis
- Implement data governance policies, procedures, controls, and standards
- Bring hands-on experience with data quality and data governance technology and tools
- Use data profiling and data quality tools to analyze and determine causes of data quality issues
- Design and implement data quality dashboards for monitoring and reporting
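A hedged sketch of what "data quality rules over the CDEs" can look like at the SQL level: each rule is expressed as a query that counts violating rows, and any failure would feed an exception queue or dashboard for root-cause analysis. The rule wording, table and column names, and connection details below are hypothetical.

```java
// Hypothetical data quality rule runner: each rule is a SQL query counting
// violations; non-zero counts represent data quality exceptions.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.LinkedHashMap;
import java.util.Map;

public class DataQualityRuleRunner {

    public static void main(String[] args) throws Exception {
        // Rule name -> SQL that counts violating rows (placeholders throughout).
        Map<String, String> rules = new LinkedHashMap<>();
        rules.put("CDE settlement_date must be populated",
                "SELECT COUNT(*) FROM trade WHERE settlement_date IS NULL");
        rules.put("CDE notional_amount must be non-negative",
                "SELECT COUNT(*) FROM trade WHERE notional_amount < 0");

        String url = "jdbc:postgresql://localhost:5432/trades"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "dq", "secret");
             Statement stmt = conn.createStatement()) {
            for (Map.Entry<String, String> rule : rules.entrySet()) {
                try (ResultSet rs = stmt.executeQuery(rule.getValue())) {
                    rs.next();
                    long violations = rs.getLong(1);
                    // In a real platform this would raise an exception record or
                    // update a dashboard; here we only print the outcome.
                    System.out.printf("%s -> %s (%d violations)%n",
                            rule.getKey(),
                            violations == 0 ? "PASS" : "FAIL",
                            violations);
                }
            }
        }
    }
}
```

In a governance tool such as Collibra or Informatica the same rules would normally be configured rather than hand-coded, but the underlying checks reduce to queries of this shape.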
05/02/2025
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent full-time role*
*Position is bonus eligible*

Prestigious Financial Company is currently seeking a Data Governance MDM Analyst. The candidate will act as a liaison and translation layer between business and technical teams, operate at the system and detailed technical level for analysis purposes, and implement and support metadata management, data lineage, data quality, and other essential data governance functions.

Responsibilities:
- Work closely with Data Domain Owners and SMEs to identify Critical Data Elements (CDEs), define data elements for the Business Glossary, and define business rules
- Identify data sources and build out the business glossary in collaboration with data owners/stewards; collaborate with data modelers to review definitions of business terms vs. technical terms
- Communicate effectively with business and technical SMEs, architects, analysts, developers, and other IT and business teams
- Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources, working with business and technical SMEs/developers to understand application system/technical design and create data flow diagrams/data mappings
- Work on metadata management: ingest metadata using metadata bridges/connectors and create and ingest custom source-to-target mappings and custom metadata assets
- Develop metadata, data lineage, and data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies
- Design, build, and execute data quality rules over the identified CDEs according to business needs to ensure clean and healthy data
- Manage data quality exceptions and help the team with root cause analysis
- Implement data governance policies, procedures, controls, and standards
- Bring hands-on experience with data quality and data governance technology and tools
- Use data profiling and data quality tools to analyze and determine causes of data quality issues
- Design and implement data quality dashboards for monitoring and reporting
- Perform other duties as assigned

Qualifications:
- Ability to work independently and as part of a team to successfully execute projects
- Ability to multitask and meet aggressive deadlines efficiently and effectively
- Experience with data governance tools such as Collibra, IBM InfoSphere Information Server Suite, or Informatica
- Proficient with SQL; strong data analysis capabilities
- Proficient with Microsoft Office desktop tools (Word, Excel, etc.)
- Experience with databases (e.g., Oracle, SQL Server, DB2, Amazon Redshift, NoSQL, object-based) and ETL tools
- Previous experience designing and building data capabilities such as data quality, metadata, data lineage, data catalog, and data dictionary
- Experience with business intelligence/reporting tools such as Tableau/Cognos
- Strong written and oral communication skills, with the ability to work with users, peers, and management
- Capital markets or banking domain experience is preferred
- Prior development/coding experience is preferred
- Experience working with Protobuf, APIs, and Kafka as data sources is preferred
- Experience with draw.io or other tools for creating architecture or data flow diagrams
- Data governance tools, for example Collibra, IBM ISEE, Informatica, etc.
- Bachelor's or master's degree in data analytics, computer science, or a related field
- 7+ years of experience in data governance disciplines: metadata management, data quality analysis, data quality remediation, data profiling, and data lineage
- DAMA certification preferred
05/02/2025
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent full-time role*
*Position is bonus eligible*

Prestigious Financial Institution is currently seeking a Database Administrator with strong PostgreSQL and preferably some DB2 LUW experience. The candidate will design and maintain databases and database solutions that operate within prescribed resource limits and meet company standards, participate in and contribute to all phases of systems development projects, and monitor and tune database and database application performance.

Responsibilities:
- Assist with the design, implementation, and maintenance of databases
- Manage database performance and disk usage
- Provide support in database access methods
- Provide consultation support in database analysis, modelling, coding, and production problem resolution
- Develop maintenance, backup, and recovery procedures and documentation
- Participate in Disaster Recovery drills
- Provide primary on-call support for production problems
- Understand and support corporate data standards
- Recommend and assist with new DBMS and operational standards
- Participate in testing and evaluation of new software and software release upgrades
- Support business studies, proposal teams, and costing/feasibility studies
- Prepare system documentation
- Maintain metadata repositories

Qualifications:
- [Required] 3+ years of experience developing and maintaining complex applications that make extensive use of a supported database technology, or 3+ years of experience as an associate DBA
- [Required] Well versed in all phases of systems analysis and design
- [Required] Experienced in two or more programming languages and two or more scripting languages
- [Required] Practiced at entity/relationship or object modelling and translation to physical database designs
- [Required] Proficient in DML, DDL, and database utilities for at least two DBMS technologies
- [Required] Proficient in all access methods of a DBMS as well as the underlying operating system access methods
- [Required] Knowledge of hardware and operating system capabilities within one environment
- [Required] Understanding of all software subsystems (DBMS, TP managers, etc.) for one environment
- [Required] Accepts ownership of assignments, team, and company, and takes initiative outside the immediate area of responsibility
- [Required] Speed/sense of urgency: contributes additional effort when necessary to get the job done and to help others meet their objectives
- [Required] Seeks additional responsibility, shows initiative to learn every aspect of the job, and strives to become a mentor to others in their area of expertise
- [Required] Communicates openly and effectively; challenges established practices appropriately
- [Required] Ability to maintain composure under pressure and avoid defensive or irritated reactions in challenging situations

Technical Skills:
- [Required] 7+ years of experience with PostgreSQL (EnterpriseDB (EDB) version preferred)
- [Required] 3+ years of Terraform, Ansible, Jenkins, and CI/CD skills
- [Preferred] 3+ years of experience with CTE (CipherTrust Transparent Encryption), Barman (EDB Backup and Recovery Manager), and AWS
- [Preferred] 5+ years of experience with DB2 LUW, preferably on Red Hat Linux
- [Preferred] 1+ years of experience with SQL Server
- [Preferred] 1+ years of experience with MySQL/MariaDB
- [Preferred] 1+ years of experience with DB2 in a z/OS environment
- [Required] Bachelor's degree (or equivalent) in Computer Science, Engineering, Mathematics, or Business
- [Preferred] Related financial industry experience
- [Preferred] PostgreSQL Professional Certification
- [Preferred] IBM Certified Database Administrator - DB2 for Linux, UNIX and Windows
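As a small, hedged example of the routine monitoring a production DBA in this kind of role might script, the sketch below queries PostgreSQL's standard pg_stat_activity view over JDBC for sessions that have been active for more than five minutes. The connection details and the five-minute threshold are illustrative assumptions.

```java
// Hypothetical DBA health check: report active PostgreSQL sessions running
// longer than five minutes using the standard pg_stat_activity view.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class LongRunningQueryCheck {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/postgres"; // placeholder
        String sql =
            "SELECT pid, usename, state, now() - query_start AS runtime, query " +
            "FROM pg_stat_activity " +
            "WHERE state = 'active' " +
            "  AND now() - query_start > interval '5 minutes' " +
            "ORDER BY runtime DESC";

        try (Connection conn = DriverManager.getConnection(url, "dba", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            while (rs.next()) {
                System.out.printf("pid=%d user=%s runtime=%s%n  %s%n",
                        rs.getInt("pid"),
                        rs.getString("usename"),
                        rs.getString("runtime"),
                        rs.getString("query"));
            }
        }
    }
}
```

In practice a check like this would usually be wired into the team's monitoring stack rather than run ad hoc, but the underlying query is the same.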
04/02/2025
Full time
*Hybrid, 3 days onsite, 2 days remote*
*We are unable to sponsor as this is a permanent full-time role*

A prestigious company is looking for an Associate Principal, Database Administrator. This DBA will focus on production support, performance, backups, and DBMS administration. The company needs someone with 7+ years of experience working with DB2 LUW on Red Hat and with PostgreSQL. This DBA is also expected to have heavy experience coding and reviewing SQL, plus some experience with scripting such as basic Java, Linux shell, Perl, etc.

Responsibilities:
- Assist with the design, implementation, and maintenance of databases
- Manage database performance and disk usage
- Provide support in database access methods
- Provide consultation support in database analysis, modelling, coding, and production problem resolution
- Develop maintenance, backup, and recovery procedures and documentation
- Participate in Disaster Recovery drills
- Provide primary on-call support for production problems
- Understand and support corporate data standards
- Recommend and assist with new DBMS and operational standards
- Participate in testing and evaluation of new software and software release upgrades
- Maintain metadata repositories

Qualifications:
- Bachelor's degree (or equivalent) in Computer Science, Engineering, Mathematics, or Business
- 3+ years of experience developing and maintaining complex applications that make extensive use of a supported database technology, or 3+ years of experience as an associate DBA
- Experienced in two or more programming languages and two or more scripting languages
- Practiced at entity/relationship or object modelling and translation to physical database designs
- Proficient in DML, DDL, and database utilities for at least two DBMS technologies
- Proficient in all access methods of a DBMS as well as the underlying operating system access methods
- Understanding of all software subsystems (DBMS, TP managers, etc.) for one environment

Technical Skills:
- 7+ years of experience with PostgreSQL
- 7+ years of experience with DB2 LUW, preferably on Red Hat Linux
- Proficient with coding and review of SQL, stored procedures, and triggers
- 1+ years of Terraform, Ansible, Jenkins, and CI/CD skills
- 1+ years of EDB Postgres and EDB Postgres Distributed experience
- Basic Java, Perl, and Linux shell scripting skills
- 1+ years of experience with SQL Server
- 1+ years of experience with DB2 in a z/OS environment
- 1+ years of experience with MySQL/MariaDB
- Experience with BMC tools for DB2 (Change/Catalog Manager, MainView, Log Master)
04/02/2025
Full time
*Hybrid, 3 days onsite, 2 days remote*
*We are unable to sponsor as this is a permanent full-time role*

A prestigious company is looking for an Associate Principal, Technical Data Analysis. This person will be the liaison between business and technical teams and will focus on supporting and implementing data quality, metadata management, data lineage, data mapping, data governance, etc. The candidate needs to be proficient with SQL and data structures, will work within the Collibra platform, and must come from a financial company.

Responsibilities:
- Work closely with Data Domain Owners and SMEs to identify Critical Data Elements (CDEs), define data elements for the Business Glossary, and define business rules
- Identify data sources and build out the business glossary in collaboration with data owners/stewards; collaborate with data modelers to review definitions of business terms vs. technical terms
- Communicate effectively with business and technical SMEs, architects, analysts, developers, and other IT and business teams
- Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources, working with business and technical SMEs/developers to understand application system/technical design and create data flow diagrams/data mappings
- Work on metadata management: ingest metadata using metadata bridges/connectors and create and ingest custom source-to-target mappings and custom metadata assets
- Develop metadata, data lineage, and data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies
- Design, build, and execute data quality rules over the identified CDEs according to business needs to ensure clean and healthy data
- Manage data quality exceptions and help the team with root cause analysis
- Implement data governance policies, procedures, controls, and standards
- Bring hands-on experience with data quality and data governance technology and tools
- Use data profiling and data quality tools to analyze and determine causes of data quality issues
- Design and implement data quality dashboards for monitoring and reporting

Qualifications:
- Bachelor's or master's degree in data analytics, computer science, or a related field
- 7+ years of experience in data governance disciplines: metadata management, data quality analysis, data quality remediation, data profiling, and data lineage
- DAMA certification preferred
- Experience with data governance tools such as Collibra, IBM InfoSphere Information Server Suite, or Informatica
- Proficient with SQL; strong data analysis capabilities
- Proficient with Microsoft Office desktop tools (Word, Excel, etc.)
- Experience with databases (e.g., Oracle, SQL Server, DB2, Amazon Redshift, NoSQL, object-based) and ETL tools
- Previous experience designing and building data capabilities such as data quality, metadata, data lineage, data catalog, and data dictionary
- Experience with business intelligence/reporting tools such as Tableau/Cognos
- Prior development/coding experience is preferred
- Experience working with Protobuf, APIs, and Kafka as data sources is preferred
- Experience with draw.io or other tools for creating architecture or data flow diagrams
- Data governance tools, for example Collibra, IBM ISEE, Informatica, etc.
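To make the "custom source-to-target mappings" in this posting concrete, here is a minimal, hedged sketch of how a lineage mapping row might be modelled and exported before ingestion into a governance catalog. The field names, sample systems, and CSV layout are illustrative assumptions, not a Collibra or IBM ingestion format.

```java
// Hypothetical model of a source-to-target mapping (one lineage edge per row),
// exported as CSV for later ingestion into a governance catalog.
import java.util.List;

public class SourceToTargetMapping {

    /** One lineage edge: a source column feeding a target column, with the transformation applied. */
    record MappingRow(String sourceSystem, String sourceTable, String sourceColumn,
                      String targetSystem, String targetTable, String targetColumn,
                      String transformation) {

        String toCsv() {
            return String.join(",", sourceSystem, sourceTable, sourceColumn,
                    targetSystem, targetTable, targetColumn, transformation);
        }
    }

    public static void main(String[] args) {
        List<MappingRow> mappings = List.of(
            new MappingRow("TradeCapture", "trade", "notional_amt",
                           "RiskWarehouse", "position", "notional_amount",
                           "direct copy"),
            new MappingRow("TradeCapture", "trade", "trade_ts",
                           "RiskWarehouse", "position", "trade_date",
                           "CAST(trade_ts AS DATE)"));

        System.out.println("source_system,source_table,source_column,"
                + "target_system,target_table,target_column,transformation");
        mappings.forEach(m -> System.out.println(m.toCsv()));
    }
}
```

Whatever the target tool, the analyst's job is to populate rows like these accurately from the application code, SQL, and BI artifacts they reverse engineer.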
04/02/2025
Full time
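The Technical Data Analysis listing above centres on defining data quality rules over Critical Data Elements and managing the resulting exceptions. A minimal sketch of such a rule is shown below, using Python and pandas; the column name, sample data, and expected format are hypothetical, and the role's actual tooling would be the Collibra platform rather than hand-written checks.

```python
"""Illustrative data quality rule over a single CDE: completeness plus format
validity. The column name, sample data, and expected pattern are hypothetical."""
import pandas as pd

# Hypothetical extract containing the CDE "account_id".
df = pd.DataFrame({"account_id": ["ACC-001", "ACC-002", None, "bad id"]})


def run_rules(frame: pd.DataFrame) -> pd.DataFrame:
    """Return the rows that fail either rule, ready for exception management."""
    completeness_fail = frame["account_id"].isna()
    # Validity: expect the pattern ACC- followed by digits (assumed format).
    validity_fail = ~frame["account_id"].fillna("").str.match(r"^ACC-\d+$")
    exceptions = frame[completeness_fail | validity_fail].copy()
    exceptions["failed_rule"] = [
        "completeness" if completeness_fail[i] else "validity" for i in exceptions.index
    ]
    return exceptions


if __name__ == "__main__":
    print(run_rules(df))  # flags the missing value and the malformed identifier
```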
Contract Technical Business Data Governance Analyst
Rate: Open
Location: Chicago, IL
Hybrid: 3 days onsite, 2 days remote
*We are unable to provide sponsorship for this role*
Qualifications:
Bachelor's or master's degree in data analytics, computer science, or a related field.
Structured Query Language (SQL)
Data governance tools, for example Informatica, IBM ISEE, Collibra, etc.
Experience working on APIs and Kafka as data sources is preferred.
Proficient with Microsoft Office desktop tools (Word, Excel, etc.)
Experience with databases (e.g., Oracle, SQL Server, DB2, Amazon Redshift).
Strong data analysis capabilities.
Experience with metadata management
Responsibilities:
Responsible for gathering business and technical requirements to capture metadata and lineage, working with business and technical SMEs.
Lead Data Domain and Data Steward Workgroup meetings
Work closely with Data Domain Owners and SMEs to identify CDEs (Critical Data Elements) and define data elements for the Business Glossary
Collaborate with Data Domain Owners to identify and define appropriate business rules
Collaborate with the metadata specialist to identify data sources
Collaborate with data modelers to review definitions of business terms vs. technical terms
Utilize data profiling and data quality tools to expose and determine causes of data quality issues when defining business rules
Participate in metadata management utilizing IBM Metadata Asset Manager and the IBM Information Governance Catalog to build out the business glossary
Implement data governance controls, procedures, and standards
Develop and deliver presentations for department and senior leadership
04/02/2025
Project-based
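The data governance analyst listing above asks for business glossary and metadata management experience. Purely as a sketch, the kind of glossary entry captured from Data Domain Owners might be represented as below; the field names are assumptions and do not reflect the IBM Information Governance Catalog schema.

```python
"""Sketch of a business glossary entry as it might be captured from Data Domain
Owners before loading into a governance catalog. Field names are hypothetical."""
from dataclasses import dataclass, field


@dataclass
class GlossaryTerm:
    name: str                      # business term, e.g. "Settlement Date"
    definition: str                # agreed business definition
    data_domain: str               # owning data domain
    steward: str                   # accountable data steward
    is_cde: bool = False           # flagged as a Critical Data Element
    business_rules: list[str] = field(default_factory=list)
    technical_names: list[str] = field(default_factory=list)  # mapped physical columns


term = GlossaryTerm(
    name="Settlement Date",
    definition="Date on which a trade is settled between counterparties.",
    data_domain="Trade",
    steward="Trade Data Steward",
    is_cde=True,
    business_rules=["Must not be earlier than the trade date."],
    technical_names=["TRADES.SETTLE_DT"],
)
print(term)
```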
Metadata Solutions Developer
Rate: Open
Location: Chicago, IL
Hybrid: 3 days onsite, 2 days remote
*We are unable to provide sponsorship for this role*
Qualifications:
6 or more years of data analysis experience with a robust understanding of metadata, data flows, and mappings.
Ability to understand a Java code base; read and/or write code using a programming language (e.g., Java, Python, etc.).
Proficient with SQL, experience working with Git, and experience with data analysis using Python/PySpark.
Hands-on experience with Java version 8 onwards, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams.
Experience working with various types of databases: relational, NoSQL, and object-based.
Ability to review application development code to ensure it meets functional requirements and architectural and data standards.
Proficiency in writing technical documentation for Java-based applications that process data in real time and batch.
Ability to develop and implement data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to databases, flat files, JSON, APIs, Kafka, etc.
Experience working on Protobuf, APIs, and Kafka as data sources is preferred.
Experience working with draw.io or other tools to create architecture or data flow diagrams.
Ability to multitask and meet aggressive deadlines efficiently and effectively.
Experience in object-oriented design and software design patterns.
Responsibilities:
Develop metadata and data lineage solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies.
Work with technical SMEs/developers to understand application/system design and development, and create data flow diagrams/data mappings.
Create source-to-target mapping documents by reverse engineering the application Java code, BI tools, and SQL queries for the identified data flows.
Develop custom metadata connectors/scanners using programming tools to automate metadata extraction from disparate data sources.
Develop programs to automate metadata extraction and the production of data flow, data lineage, and source-to-target mapping documents for complex applications, systems, and BI tools.
Work on metadata management, administration, and support; ingest data management assets using extension mappings, custom data assets, metadata bridges, connectors, and third-party metadata bridges, ensuring data lineage/source-to-target data mapping.
Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources in accordance with the Data Governance Standards, Policies and Procedures.
Design and build data capabilities like data quality, metadata, data catalog, and data dictionary.
04/02/2025
Project-based
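The Metadata Solutions Developer listing above calls for custom metadata connectors/scanners that automate metadata extraction. A minimal sketch of such a scanner over a relational source is shown below, assuming SQLAlchemy; the connection URL is a placeholder, and a real connector would also push the harvested assets into the governance catalog through its bridges or APIs.

```python
"""Minimal metadata scanner sketch: harvests table and column metadata from a
relational source with SQLAlchemy's inspector. The connection URL is a
placeholder; a real connector would also load these assets into the catalog."""
from sqlalchemy import create_engine, inspect


def scan_schema(url: str, schema: str | None = None) -> list[dict]:
    engine = create_engine(url)
    inspector = inspect(engine)
    assets = []
    for table in inspector.get_table_names(schema=schema):
        for column in inspector.get_columns(table, schema=schema):
            assets.append({
                "table": table,
                "column": column["name"],
                "type": str(column["type"]),
                "nullable": column["nullable"],
            })
    return assets


if __name__ == "__main__":
    # Hypothetical SQLite file, used only to demonstrate the call shape.
    for asset in scan_schema("sqlite:///example.db"):
        print(asset)
```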
MUST BE BASED IN BELGIUM. Fluent in English and at least one local language (Dutch or French). 50% on-site in Brussels, 50% remote working.
Our Brussels-based client is seeking a Senior Data Engineer who understands banking operations and enjoys coaching. This role focuses on creating digital solutions, dashboards, and reports for Corporate Banking users such as Sales, Metiers, and Management.
Key Responsibilities:
Data Integration and Security: Work with the Data Engineering Team to ensure timely and secure data ingestion from various systems.
Data Modelling: Design data models for accurate business process representation and analysis.
Data Lineage and Historization: Implement systems for tracking data flow and maintaining historical data.
Metadata Management: Build metadata management systems for well-documented data assets.
Monitoring and Simplification: Set up monitoring systems for data quality and simplify data flows for efficiency.
Multi-Instance Management: Oversee daily batch management for consistency and resource use.
Data Governance: Ensure compliance with company policies and standards.
Mentorship: Mentor junior colleagues in data techniques and support their growth.
Required Skills and Experience:
Education: Master's degree in Computer Science, Business, or equivalent experience.
Proficient in Alteryx, SQL (especially Transact-SQL), and data preparation tools.
Knowledge of relational databases, including SQL Server, and analytics.
Familiar with data visualisation techniques.
Experience in banking or financial institutions is a plus.
Hands-on experience with Agile methodology.
Strong analytical skills and the ability to see the big picture.
Ability to coach and share knowledge.
Language Requirements: Fluent in English and at least one local language (Dutch or French).
This is a great opportunity to use your technical skills in a collaborative and innovative environment. Apply today or email (see below)
04/02/2025
Project-based
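The data engineer listing above mentions data lineage and historization, i.e. tracking data flow and keeping historical versions of records. A simple SCD2-style historization sketch in Python/pandas is shown below; the table, column names, and dates are illustrative only, and production work would more likely run inside the warehouse (for example in Transact-SQL) than in pandas.

```python
"""Sketch of simple historization (SCD2-style): changed rows are closed out and
re-inserted with a new validity window. Table and column names are illustrative."""
from datetime import date

import pandas as pd

OPEN_END = date(9999, 12, 31)  # sentinel for "currently valid"


def historize(history: pd.DataFrame, snapshot: pd.DataFrame, as_of: date) -> pd.DataFrame:
    """Apply a new snapshot of (customer_id, segment) to the historized table."""
    history = history.copy()
    for _, row in snapshot.iterrows():
        open_mask = (history["valid_to"] == OPEN_END) & (
            history["customer_id"] == row["customer_id"]
        )
        current = history[open_mask]
        if not current.empty and current.iloc[0]["segment"] == row["segment"]:
            continue  # unchanged: keep the open record as is
        # Close any open record for this key, then insert the new version.
        history.loc[open_mask, "valid_to"] = as_of
        new_row = {"customer_id": row["customer_id"], "segment": row["segment"],
                   "valid_from": as_of, "valid_to": OPEN_END}
        history = pd.concat([history, pd.DataFrame([new_row])], ignore_index=True)
    return history


if __name__ == "__main__":
    history = pd.DataFrame([{"customer_id": 1, "segment": "Retail",
                             "valid_from": date(2024, 1, 1), "valid_to": OPEN_END}])
    snapshot = pd.DataFrame([{"customer_id": 1, "segment": "Corporate"},
                             {"customer_id": 2, "segment": "Retail"}])
    print(historize(history, snapshot, as_of=date(2025, 2, 4)))
```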
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this straight 6+ month contract role*
Prestigious financial institution is currently seeking a Java Metadata Data Lineage Analyst. Candidate will develop metadata and data lineage solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies.
Responsibilities:
Work with technical SMEs/developers to understand application/system design and development, and create data flow diagrams/data mappings.
Create source-to-target mapping documents by reverse engineering the application Java code, BI tools, and SQL queries for the identified data flows.
Develop custom metadata connectors/scanners using programming tools to automate metadata extraction from disparate data sources.
Develop programs to automate metadata extraction and the production of data flow, data lineage, and source-to-target mapping documents for complex applications, systems, and BI tools.
Work on metadata management, administration, and support; ingest data management assets using extension mappings, custom data assets, metadata bridges, connectors, and third-party metadata bridges, ensuring data lineage/source-to-target data mapping.
Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources in accordance with the Data Governance Standards, Policies and Procedures.
Design and build data capabilities like data quality, metadata, data catalog, and data dictionary.
Qualifications:
6 or more years of data analysis experience with a robust understanding of metadata, data flows, and mappings.
Ability to understand a Java code base; read and/or write code using a programming language (e.g., Java, Python, etc.).
Proficient with SQL, experience working with Git, and experience with data analysis using Python/PySpark.
Hands-on experience with Java version 8 onwards, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams.
Experience working with various types of databases: relational, NoSQL, and object-based.
Ability to review application development code to ensure it meets functional requirements and architectural and data standards.
Proficiency in writing technical documentation for Java-based applications that process data in real time and batch.
Ability to develop and implement data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to databases, flat files, JSON, APIs, Kafka, etc.
Experience working on Protobuf, APIs, and Kafka as data sources is preferred.
Experience working with draw.io or other tools to create architecture or data flow diagrams.
Ability to multitask and meet aggressive deadlines efficiently and effectively.
Experience in object-oriented design and software design patterns.
04/02/2025
Project-based
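The data lineage analyst listing above includes reverse engineering SQL queries to produce source-to-target mapping documents. A deliberately naive sketch of harvesting source and target tables from a SQL statement is shown below; the regex approach and the sample statement are illustrative only, and real lineage work would rely on a proper SQL parser and the catalog's metadata bridges.

```python
"""Naive sketch of harvesting source and target tables from a SQL statement to
seed a source-to-target mapping. A regex is used only for illustration."""
import re

SQL = """
INSERT INTO reporting.daily_positions
SELECT p.account_id, p.symbol, SUM(p.quantity)
FROM staging.positions p
JOIN reference.instruments i ON i.symbol = p.symbol
GROUP BY p.account_id, p.symbol
"""


def extract_lineage(sql: str) -> dict:
    # Target table of an INSERT, if any; every FROM/JOIN reference as a source.
    target = re.search(r"INSERT\s+INTO\s+([\w.]+)", sql, re.IGNORECASE)
    sources = re.findall(r"(?:FROM|JOIN)\s+([\w.]+)", sql, re.IGNORECASE)
    return {"target": target.group(1) if target else None,
            "sources": sorted(set(sources))}


if __name__ == "__main__":
    print(extract_lineage(SQL))
    # {'target': 'reporting.daily_positions',
    #  'sources': ['reference.instruments', 'staging.positions']}
```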