Head of Data Engineering | Data Engineer | Azure | Databricks | Data Factory | Synapse | ETL | Python | SQL

Head of Data Engineering - PropTech
Up to £90,000 + Benefits
Remote working - multiple UK offices

Method Resourcing has partnered with an industry-leading company in the PropTech space that is looking to bring in a Head of Data Engineering to drive the business's data capabilities forward. As the Head of Data Engineering you will be responsible for the end-to-end operation of the data platform.

A summary of the role includes:
- Development and maintenance of the company's data architecture, models and platform
- Implementation of, and advocacy for, Data Governance measures and policies
- Ensuring the timely delivery of accurate data through a service model consumed by a variety of stakeholders
- Setting standards and practices that allow for the continuous improvement of the company's data capabilities

The Head of Data Engineering will have proven experience in a majority of the following areas:
- Leading a team of Data Engineers and/or Data Architects
- Engaging with stakeholders at various levels to translate business needs into technical solutions
- Building and delivering data analytics and integration solutions in an enterprise environment
- Extensive experience in the Azure cloud - ADF, ADLS, Synapse, Databricks
- Scripting ability in languages such as SQL and Python/PySpark
- Experience with both ETL and CI/CD pipelines
- Understanding of various data architectures, including on-prem, cloud and hybrid-cloud, and the relevant data models for each use case

The Head of Data Engineering role pays up to £90,000 per year + Benefits and operates on a remote-working model to provide you with flexibility and a positive work-life balance. Please apply now for immediate consideration!
19/04/2024
Full time
Oracle Cloud Reporting Lead - £650 p/d outside IR35

You will need to be proficient in Oracle Cloud reporting tools such as OTBI (Oracle Transactional Business Intelligence), BIP (BI Publisher), OAC (Oracle Analytics Cloud) and FAW (Fusion Analytics Warehouse). As a Technical Lead, you will be responsible for leading the development and maintenance of reporting solutions within the Oracle Cloud environment, ensuring accurate and timely delivery of analytical insights to support decision-making processes.

Key Responsibilities:
- Lead the design, development and implementation of Oracle Cloud reporting solutions to meet business requirements.
- Collaborate with key stakeholders to understand reporting needs and translate them into technical specifications and design documents.
- Provide technical leadership and guidance to a team of developers and analysts involved in reporting solution development.
- Configure and customise OTBI reports, BIP templates and data models to support various reporting requirements.
- Stay current with Oracle Cloud updates, patches and enhancements, assessing their impact on existing reporting solutions and recommending necessary adjustments.
- Troubleshoot and resolve technical issues related to Oracle Cloud reporting tools and integrations.
- Conduct regular performance tuning and optimisation of reporting solutions to improve efficiency and responsiveness.
- Document technical specifications, configurations and procedures for reporting solutions, ensuring knowledge transfer and supportability.

Required Skills:
- Minimum of 8 years of experience in Oracle Cloud reporting tools, including OTBI, BIP, OAC and FAW.
- Strong proficiency in SQL for data querying and manipulation.
- Familiarity with Oracle Cloud Security and Role-Based Access Control (RBAC).
- Good working knowledge of Oracle Analytics Cloud (OAC) and Fusion Analytics Warehouse (FAW).
- Experience with data modelling and ETL processes for data integration.
- Solid understanding of Oracle Cloud applications and underlying data structures.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and as an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found on our website.
19/04/2024
Project-based
Job Title: Data Engineer
Location: Manchester
Package: £40,000 - £55,000 + Benefits
Type: Permanent

Sanderson Recruitment is recruiting for a Data Engineer on behalf of our leading insurance client based in Manchester.

Company Overview:
Are you interested in joining a leading insurance company headquartered in the UK? Established over a decade ago, my client specialises in providing a range of insurance services tailored to meet the diverse needs of their customers. With a primary focus on the motor insurance market, they offer comprehensive car insurance directly through their brand, as well as underwriting services to other insurers. In addition to motor insurance, they also provide various supporting services related to insurance, including financing, distribution and legal assistance. My client's commitment to utilising technology and data-driven strategies ensures they deliver high-quality products and services to their customers while mitigating risks effectively.

Role & Responsibilities:
As a Data Engineer, you will be actively participating in technical tasks, focusing on constructing data solutions for projects and ongoing data products. Your responsibilities will include:
- Developing secure, efficient data pipelines of varying complexity, integrating data from diverse sources, both on-premise and off-premise, internal and external.
- Ensuring data integrity and quality by cleansing, mapping, transforming and optimising data for storage, aligning with business and technical requirements.
- Incorporating data observability and quality measures into pipelines to facilitate self-testing and early detection of processing issues or discrepancies.
- Constructing solutions to transform and store data across different storage areas, including data lakes, databases and reporting structures, spanning data warehouse, Business Intelligence systems and analytics applications.
- Designing physical data models tailored to business needs and storage optimisation, emphasising reusability and scalability.
- Conducting thorough unit testing of your own code and peer testing to maintain high quality and integrity.
- Documenting pipelines and code comprehensively to ensure transparency and facilitate understanding.
- Adhering to coding standards, architectural principles and release management processes to ensure code safety, quality and compliance.
- Providing guidance and support to Associate Data Engineers through coaching and mentoring.
- Developing BI solutions of varying complexity, including data marts, semantic layers, and reporting & visualisation solutions using recognised BI tools such as Power BI.

Essential Requirements:
To thrive in this role, candidates must possess:
- Demonstrated proficiency in PySpark and SQL development, with a strong interest in advancing your career in data engineering.
- Enthusiasm for leveraging Azure best practices to facilitate seamless data delivery from source to consumption on a daily basis.
- An ability to translate customer requirements into actionable designs and timely delivery.
- 2-5 years of experience in designing and implementing end-to-end data solutions.
- Proficiency in SQL Server and Azure technologies such as Data Factory and Synapse, along with expertise in associated ETL technologies.
- Experience working with large, event-based datasets within an enterprise setting.
- Familiarity with testing techniques and tools to ensure data quality and integrity.
- Strong interpersonal and communication skills, with an ability to build strong relationships.
- Active engagement in the data community with a keen interest in leveraging data to drive business value.
- A comprehensive understanding of the complete data life cycle.
- Experience with Continuous Integration/Continuous Delivery (CI/CD) practices.
- A proven track record of thriving in agile environments and working in self-managing teams.

This role offers an exciting opportunity to drive data innovation within a forward-thinking organisation. If you're ready to make a direct and meaningful contribution to my client's dynamic work environment, apply now.
19/04/2024
Full time
Harvey Nash has partnered exclusively with the University of Sheffield as it continues to revolutionise its solutions and solidify its position as a leading Russell Group university. They are looking for a Senior Developer to join them.

IT Services are advertising a challenging and rewarding role as part of our growing Integration Team. We are building a modern API-led approach to integrations across our estate with the design and implementation of Spring Boot REST APIs using Kong API Gateway Enterprise products, as well as AWS infrastructure and tools. Over time we will have a full lifecycle API management framework in place as part of this work. The team also designs and builds ETL pipelines using a modern data- and event-driven architecture. We provide a central point of expertise to own and manage our integration tools, processes and standards, and to set our future approach to integration. As part of this we provide support to colleagues and suppliers who use our tooling to build their own integrations.

Essential criteria:
- Experience in developing systems using a variety of technologies. Spring Boot and Java are our current stack; experience with Python is a bonus.
- Expertise with relational and non-relational databases.
- Expertise in designing and building APIs (REST, GraphQL, etc.).
- Understanding of the life cycle of API management issues such as security and traffic management, access control, etc.
- Expertise in effective collaborative working as part of a team, and the associated tools (Git, Jira, etc.) and practices (Agile).
- Experience of driving continual improvements to systems, processes and working practices to deliver increased performance, efficiency and quality on the systems we maintain.
- Experience developing, monitoring, debugging and fault-handling complex integrations between different systems using a variety of methods and approaches.
- A wide range of knowledge of tools and techniques for developing high-quality software (e.g. continuous integration/deployment, software testing, containerisation, dependency management, etc.).
- Ability to learn new technologies and techniques, set standards and support team members in their use.
- Ability to manage your own time when working on several projects simultaneously, with an ability to prioritise and complete urgent fixes as they occur.
- Ability to support and mentor more junior members of the team.

What we offer:
- A minimum of 41 days annual leave including bank holiday and closure days (pro rata), with the ability to purchase more
- Generous pension scheme
- A wide range of discounts and rewards on shopping, eating out and travel
- A commitment to your development, with access to learning and mentoring schemes
- A range of generous family-friendly policies + more!

The University of Sheffield is a certified Disability Confident Employer. Disability Confident is a recognition given by the Government's Department for Work and Pensions (DWP) to employers based in Great Britain who have agreed to take action to meet thirteen commitments regarding the employment, retention, training and career development of disabled employees. One of these commitments is to offer an interview to disabled people who meet the minimum criteria for the job. A false declaration of disability to obtain an interview will result in the invalidation of any offer made. If you consider yourself to have a disability as defined by the Equality Act 2010 and would like your application to be considered under the Disability Confident Scheme, please make this visible in your application or send an email to the consultant (see below).

Criminal record:
A basic DBS check will be needed for this role. More details on the checks can be found on the Government website: gov.uk/criminal-record-checks-apply-role. Possession of a criminal record is not an automatic bar to employment at the University of Sheffield. We recognise the value of steady employment in the rehabilitation process and examine each case on its own merits. More information can be found on our Information for Candidates page.
19/04/2024
Full time
GPA's data ambition is to deliver high-quality, standardised, easily accessible data systems across all GPA functions that enable data flows across business processes. Data will be clearly owned, managed and maintained, it will be secure and assured, and it will be created with the purpose of enabling earlier, better decisions to drive value for money.

Client Details
The Government Property Agency is changing the way the Civil Service works and is at the forefront of Government's transformation agenda, reshaping the relationship civil servants have with their place of work. The Agency is central to the delivery of key Government policies, including moving 22,000 Civil Service roles out of London by 2030 and tackling climate change by contributing to the Net Zero agenda. To do this we are delivering a major change programme across the UK and consolidating our portfolio in order to save £1.4bn over 10 years. Beyond the bricks and mortar, the GPA is about providing great workplaces for our people. Through programmes like Hubs, Whitehall Campus and Smart Working you will be in the vanguard of creating model working environments and promoting flexible working practices. This is an ambitious and exciting task, for which we need innovative people with strong commercial acumen who are passionate about visualising and implementing customer needs. Launched as an Executive Agency of the Cabinet Office in 2018, we're a relatively new department and we are growing fast, so we also need people who thrive in ambiguity, can adapt quickly to change and are comfortable stepping outside of their remit to drive outcomes.

This role has a G6 salary. The package is broken down as follows:
National - £62,900 - £67,900 (Birmingham, Leeds, Nottingham, Manchester, Newport, Norwich, Swindon)
There is also a potential Recruitment and Retention Allowance of £5,000, which is non-pensionable.
GPA is also committed to recognising and rewarding staff who hold the "Gold Standard" accreditation relevant to their specialism, and offers a specific non-pensionable allowance to staff who have achieved this. This amounts to an extra £5,000, which is also non-pensionable.

Description
The Head of BI & Data will be integral to helping deliver this strategy, overseeing data governance and quality, information management, data maturity, data analytics, data architecture, data management, data integration, and data engineering and platforms. The individual will have 3 direct reports in the areas of Data Governance, Data Platforms & Integrations and Data Analytics, and will also support a wider team of professionals including Data Engineers, Data Architects and Data Developers.

The candidate will:
- Support the delivery of GPA's Information & Data Strategy
- Be responsible for the definition of the organisation's data strategy
- Champion data architecture across GPA
- Set the standards and ways of working for the data architecture community
- Oversee the design of multiple data models and have a broad understanding of how each model fulfils the needs of the business
- Provide advice to project teams and oversee the management of the full data product life cycle
- Be responsible for ensuring that GPA's systems are designed in accordance with the data architecture
- Data operations & integration - support the wider team that provides all sourcing, extraction, reference and onboarding of key data into GPA's data warehouse and between source systems
- Data analysis and synthesis - lead on the vision to embed new analytics initiatives that enhance user experience and decision-making, and be actively involved in the delivery
- Data governance and quality - provide strategic direction and support to focus on delivering the highest-quality data in a timely manner, supported by governance processes including data security
- Data platforms - be able to support and understand the operation of the AWS Redshift data warehouse as well as other industry-leading data platforms, to support master data management, governance, architecture and quality
- Data standards - have a strong understanding of standards across Government and/or within similar sectors, and experience of their adoption and integration
- Data architecture & integration - have a strong understanding of data integration between systems, transactional and reference data, and lead on the design and mapping
- Data modelling and engineering - be able to produce data models, understand where to use different types of data model, and be able to compare different tools and data models
- Programming and build (data engineering) - be able to lead by example and design, write and iterate code to support data operations, with an understanding of security, accessibility and version control, and the ability to use a range of coding tools and languages
- Business engagement - facilitate interactions between business divisions to optimise data usage

Profile
We encourage applications from people from all backgrounds and aim to have a workforce that represents the communities and wider society that we serve. We pride ourselves on being an employer of choice. We champion diversity, inclusion and wellbeing and aim to create a sense of belonging in a workplace where everyone feels valued.
Data Integration - ETL design & development Data Engineering - Implementation of performant models within an AWS and Azure data warehouse environment Datamodelling - Conceptual, logical and physical Datamodelling Strong experience of Data Governance & Quality Strong understanding of Data Architecture & Integration Strong experience of data analytics and Business Intelligence platforms Job Offer Alongside your salary of £62,900, GPA contributes £13,959 towards you being a member of the CS DBP Pension scheme. Learning and development tailored to your role An environment with flexible working options A culture encouraging inclusion and diversity A Civil Service pension with an average employer contribution of 27% Generous annual leave This vacancy is using Civil Service Success Profiles: These will assess your Behaviours, Strengths, Experience and Technical skills. We encourage applications from people from all backgrounds and aim to have a workforce that represents the communities and wider society that we serve. The Civil Service Code sets out the standards of behaviour expected of civil servants. We recruit by merit on the basis of fair and open competition, as outlined in the Civil Service Commission's recruitment principles () The Civil Service embraces diversity and promotes equal opportunities. As such, we run a Disability Confident Scheme (DCS) for candidates with disabilities who meet the minimum selection criteria. The Civil Service also offers a Redeployment Interview Scheme (RIS) to civil servants who are at risk of redundancy, and who meet the minimum requirements for the advertised vacancy. This vacancy is part of the Great Place to Work for Veterans initiative. The Civil Service welcomes applications from people who have recently left prison or have an unspent conviction. Read more about prison leaver recruitment on our website. Sift The closing date is 6.5.24, the sift is due to take place by 10.5.24 but is subject to change. 
(At interview, applicants will be scored against four behaviours: Managing a Quality Service, Seeing The Bigger Picture, Changing & Improving, and Leadership.) Applicants successful at sift will be invited to interviews, due to take place in the weeks commencing 13.5.24 and 20.5.24; these will be virtual interviews. This is subject to change depending on where the most successful candidates are based. Interview questions will be a blend of Behaviour, Experience, Strength and Technical (core skill) questions.
19/04/2024
Full time
GPA's data ambition is to deliver high quality, standardised, easily accessible data systems across all GPA functions that enable data flows across business processes. Data will be clearly owned, managed and maintained, it will be secure and assured, and it will be created with the purpose of enabling earlier, better decisions to drive value for money.
Client Details
The Government Property Agency is changing the way the Civil Service works and is at the forefront of Government's transformation agenda, reshaping the relationship civil servants have with their place of work. The Agency is central to the delivery of key Government policies, including moving 22,000 Civil Service roles out of London by 2030 and tackling climate change by contributing to the Net Zero agenda. To do this we are delivering a major change programme across the UK and consolidating our portfolio in order to save £1.4bn over 10 years. Beyond the bricks and mortar, the GPA is about providing great workplaces for our people. Through programmes like Hubs, Whitehall Campus and Smart Working you will be in the vanguard of creating model working environments and promoting flexible working practices. This is an ambitious and exciting task, for which we need innovative people with strong commercial acumen who are passionate about visualising and implementing customer needs. Launched as an Executive Agency of the Cabinet Office in 2018, we're a relatively new department and we are growing fast, so we also need people who thrive in ambiguity, can adapt quickly to change and are comfortable stepping outside of their remit to drive outcomes.
This role has a G6 salary; the package is broken down as follows:
National - £62,900 - £67,900 (Birmingham, Leeds, Nottingham, Manchester, Newport, Norwich, Swindon)
There is also a potential Recruitment and Retention Allowance of £5,000, which is non-pensionable.
GPA is also committed to recognising and rewarding where our staff hold the "Gold Standard" accreditation relevant to their specialism, and offers a specific non-pensionable allowance of an extra £5,000 to staff who have achieved this.
Description
GPA's data ambition is to deliver high quality, standardised, easily accessible data systems across all GPA functions that enable data flows across business processes. Data will be clearly owned, managed and maintained, it will be secure and assured, and it will be created with the purpose of enabling earlier, better decisions to drive value for money. The Head of BI & Data will be integral to helping deliver this strategy, overseeing data governance and quality, information management, data maturity, data analytics, data architecture, data management, data integration, and data engineering and platforms. The individual will have 3 direct reports in the areas of Data Governance, Data Platforms & Integrations and Data Analytics, also supporting a wider team of professionals including Data Engineers, Data Architects and Data Developers.
The candidate will:
Support the delivery of GPA's Information & Data Strategy
Be responsible for the definition of the organisation's data strategy
Champion data architecture across GPA
Set the standards and ways of working for the data architecture community
Oversee the design of multiple data models and have a broad understanding of how each model fulfils the needs of the business
Provide advice to project teams and oversee the management of the full data product life cycle
Be responsible for ensuring that GPA's systems are designed in accordance with the data architecture
Data operations & integration - support the wider team that provides all sourcing, extraction, reference and onboarding of key data into GPA's data warehouse and between source systems.
Data analysis and synthesis - lead on the vision to embed new analytics initiatives that enhance user experience and decision making, and be actively involved in the delivery.
Data Governance and Quality - provide strategic direction and support to deliver the highest-quality data in a timely manner, supported by governance processes including data security.
Data Platforms - able to support and understand the operation of the AWS Redshift data warehouse, as well as other industry-leading data platforms that support master data management, governance, architecture and quality.
Data Standards - strong understanding of standards across Government and/or similar sectors, and experience of their adoption and integration.
Data Architecture & Integration - strong understanding of data integration between systems, transactional and reference data, and leading on the design and mapping.
Data modelling and engineering - able to produce data models and understand where to use different types of data model. Understands different tools and is able to compare different data models.
Programming and build (data engineering) - able to lead by example and design, write and iterate code to support data operations, with an understanding of security, accessibility and version control. Can use a range of coding tools and languages.
Business engagement - facilitate interactions between business divisions to optimise data usage.
Profile
We encourage applications from people from all backgrounds and aim to have a workforce that represents the communities and wider society that we serve. We pride ourselves on being an employer of choice. We champion diversity, inclusion and wellbeing and aim to create a sense of belonging in a workplace where everyone feels valued.
Data Integration - ETL design & development
Data Engineering - implementation of performant models within AWS and Azure data warehouse environments
Data modelling - conceptual, logical and physical data modelling
Strong experience of Data Governance & Quality
Strong understanding of Data Architecture & Integration
Strong experience of data analytics and Business Intelligence platforms
Job Offer
Alongside your salary of £62,900, GPA contributes £13,959 towards you being a member of the CS DBP Pension scheme.
Learning and development tailored to your role
An environment with flexible working options
A culture encouraging inclusion and diversity
A Civil Service pension with an average employer contribution of 27%
Generous annual leave
This vacancy is using Civil Service Success Profiles, which will assess your Behaviours, Strengths, Experience and Technical skills. We encourage applications from people from all backgrounds and aim to have a workforce that represents the communities and wider society that we serve. The Civil Service Code sets out the standards of behaviour expected of civil servants. We recruit by merit on the basis of fair and open competition, as outlined in the Civil Service Commission's recruitment principles. The Civil Service embraces diversity and promotes equal opportunities. As such, we run a Disability Confident Scheme (DCS) for candidates with disabilities who meet the minimum selection criteria. The Civil Service also offers a Redeployment Interview Scheme (RIS) to civil servants who are at risk of redundancy and who meet the minimum requirements for the advertised vacancy. This vacancy is part of the Great Place to Work for Veterans initiative. The Civil Service welcomes applications from people who have recently left prison or have an unspent conviction. Read more about prison leaver recruitment on our website.
Sift
The closing date is 9.5.24; the sift is due to take place by 16.5.24 but is subject to change.
(At interview, applicants will be scored against four behaviours: Managing a Quality Service, Seeing The Bigger Picture, Changing & Improving, and Leadership.) Applicants successful at sift will be invited to interviews, due to take place in the weeks commencing 20.5.24 and 27.5.24; these will be virtual interviews. This is subject to change depending on where the most successful candidates are based. Interview questions will be a blend of Behaviour, Experience, Strength and Technical (core skill) questions.
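The governance and quality responsibilities the posting describes (delivering accurate, assured data supported by governance processes) typically come down to rule checks such as completeness and key uniqueness. A minimal sketch in Python is shown below; the property records, field names and the `uprn` key are invented purely for illustration, not taken from GPA's actual data model:

```python
def run_quality_checks(records, required_fields, key_field):
    """Apply two basic data-governance rules to a batch of records:
    completeness of required fields and uniqueness of the business key."""
    # Completeness: count records where any required field is empty or absent.
    incomplete = [r for r in records if any(not r.get(f) for f in required_fields)]
    # Uniqueness: flag any business-key value that occurs more than once.
    keys = [r.get(key_field) for r in records]
    duplicate_keys = {k for k in keys if keys.count(k) > 1}
    return {"incomplete": len(incomplete), "duplicate_keys": sorted(duplicate_keys)}

# Hypothetical property records for the demo.
properties = [
    {"uprn": "100", "address": "1 High St", "region": "Leeds"},
    {"uprn": "101", "address": "", "region": "Newport"},        # missing address
    {"uprn": "100", "address": "1 High St", "region": "Leeds"},  # duplicate key
]
report = run_quality_checks(properties, ["uprn", "address", "region"], "uprn")
print(report)  # one incomplete record, one duplicated key
```

Checks like these would normally run inside the warehouse load pipeline, with failures routed to data owners under the governance process.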
19/04/2024
Full time
NO SPONSORSHIP. Company is located in Chicago - can be hybrid or remote if needed. Contract-to-hire role.
Data Quality Test Engineer
This can be a straight contract role or a contract-to-hire role. The Test Engineer will be responsible for ensuring the successful implementation of new features as well as the improvement of existing functionality for data architecture projects. This includes integration testing, customer data quality testing, data verification testing, regression testing, etc. This is a hands-on testing position which requires deep knowledge and understanding of all aspects of developing and executing back-end testing. The Test Engineer must be an advocate for all things Testing and Quality Assurance while driving and implementing best practices. The ideal candidate must possess strong analytical and critical-thinking abilities. Data testing skills are required.
Position Requirements
Bachelor's degree
At least 3-5 years of testing experience in data
Strong experience writing queries in relational databases such as Oracle and SQL Server.
Familiarity with SQL Server queries, including calling stored procedures by passing parameters, and using functions, views and indexes during the ETL process.
Extensive and in-depth knowledge of developing SQL scripts using SQL functions, grouping operations, subqueries, analytical functions and joins to test ETL projects.
ETL testing using SQL Server Integration Services (SSIS) and SSRS.
Experience in testing database applications to validate source-to-destination data movement and transformation.
Knowledge of automated data testing is preferred.
Hands-on experience in planning and manual test execution.
Experience in regression test strategies, regression suite maintenance and execution.
Hands-on experience with Jira, ALM, TFS/ADO or similar systems of record.
Understanding of development and test cycles, including respective best practices.
Hands-on experience in Agile and Waterfall methodologies.
Ability to troubleshoot issues, identify root causes and support the development team in developing resolutions. Strong communication, client/business interfacing and interpersonal skills are a must.
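The source-to-destination validation this role centres on usually starts with reconciliation queries: comparing row counts and an aggregate over a numeric column between the source and the loaded target. A minimal sketch in Python follows, using an in-memory SQLite database as a stand-in for the Oracle/SQL Server environments the posting names; the `orders` table and `amount` column are invented for the example:

```python
import sqlite3

def validate_load(src, tgt, table, measure):
    """Reconcile a target table against its source after an ETL load."""
    # Row-count check: every source row should have landed in the target.
    src_rows = src.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt_rows = tgt.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    # Aggregate check on a numeric column catches silently altered values.
    src_sum = src.execute(f"SELECT COALESCE(SUM({measure}), 0) FROM {table}").fetchone()[0]
    tgt_sum = tgt.execute(f"SELECT COALESCE(SUM({measure}), 0) FROM {table}").fetchone()[0]
    return {"rows_match": src_rows == tgt_rows, "sum_match": src_sum == tgt_sum}

# Demo: seed a 'source' and a 'target' where one value drifted in transit.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
tgt.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.0)])

result = validate_load(src, tgt, "orders", "amount")
print(result)  # row counts match, sums do not
```

In practice the same pattern is expressed as SQL comparison scripts (joins, grouping, analytical functions) run against both ends of the SSIS pipeline, with mismatches raised as defects in Jira/ALM/ADO.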
18/04/2024
Cloud Support Lead - Azure
Location: London/Hybrid
An Azure Support Lead with significant experience managing applications within Azure is required for a prominent specialist insurer in the City of London. This would be a brand-new team engaged in transitioning and transforming the technology landscape of the organisation.
Role Overview: The organisation is undergoing a generational transformation and is looking for an experienced 2nd- or 3rd-line support analyst who can act as the Azure expert for the organisation. You will come with strong knowledge of reporting and fixing bugs within Azure, and of API support. You will have expert experience in Azure Logic Apps, Service Bus and Azure Functions. Initially this will involve working with and supporting vendors, but it will grow to driving the internal Cloud Integration and Orchestration platform. You will also have experience in cloud security to ensure a robust cyber security posture.
Key Responsibilities:
Support and maintain API services, ensuring seamless connectivity across applications.
Act as an Azure SME, able to fix bugs and issues within Azure; provide second- and third-line support, resolving incidents and fulfilling requests in line with defined SLAs.
Analyse technical and business requirements, designing enterprise solutions integrating various applications and systems.
Work closely with third-party suppliers to troubleshoot integration issues and identify improvement opportunities.
Maintain technical documentation and a knowledge base of solutions and procedures.
Experience with Azure Logic Apps, Service Bus and Azure Functions.
Good knowledge of cloud security and cyber security principles.
Desirable Skills:
Familiarity with Azure Data Factory, ETL processes and data manipulation.
Experience within the Financial Services sector or specialist insurance.
Understanding of ITIL-based service management concepts (Incident Management, Problem Management, Change Management).
Why Join:
Collaborate on a major technical transition for a brand-new team and business unit.
Hybrid work model with a City of London office presence.
Contribute to a transformative journey in the insurance domain.
Supportive and inclusive work environment valuing diverse perspectives.
This is a brand-new opening within a new team, so apply now for consideration!
17/04/2024
Full time
Seeburger Developer

Whitehall Resources are currently looking for a Seeburger Developer. You will be required to use an FCSA Accredited Umbrella Company for this role.

Key Requirements:
- Design, develop and implement Seeburger solutions which are scalable, resilient and performant.
- Understand the internals of Seeburger products and solutions.
- Extensive experience with the Business Integration Suite (BIS) and a solid grasp of its components, such as the adapter engines, transport engine, process engine, conversion engine, data store and system information layer.
- Well versed in the code management and deployment techniques used in Seeburger products.
- Expert-level knowledge of BIC MD and Landscape Manager.
- Able to build technically efficient and scalable application designs.
- Act as subject matter expert for Seeburger application development.
- Hands-on experience in mapping, validations, solution design and scripting.
- Delivery Management: accountable for the end-to-end delivery of all initiatives for the service, engaging vendors wherever applicable to accelerate delivery.
- Review, identify and manage design and implementation risks, issues and dependencies, ensuring action is taken to quantify and mitigate them. Constantly seek opportunities to improve, whether in design methods and governance or in pioneering new architectural patterns, frameworks and methods.
- Review the solutions and hotfixes provided by Seeburger, evaluate the implementation risks and make appropriate decisions.
- Be the face of engagement with the Seeburger team; stay informed of the latest developments in Seeburger products and understand their impact on implementations.
- Experience with other similar ETL/BPM tools besides Seeburger BIS, so that you can compare the benefits and challenges of each tool.
- Provide architectural and engineering direction to Seeburger product development so that solutions are built to scale and are resilient enough to meet business needs.
- Stakeholder Management: engage all relevant stakeholders (delivery heads, technical leads, etc.) to maintain visibility of the design, operability metrics and risk appetite, and to provide robust challenge to the design.

Key Experience:
- Minimum 12-15 years in IT, with 6+ years in the Banking and Financial Services domain and at least 3 years' experience with ETL and BPM tools.
- Hands-on experience with Seeburger tools.
- Leading deliveries in highly complex ecosystems.
- Strong technical knowledge and understanding of Seeburger products, their internal workings, and ETL/BPM tools in general.
- Technically strong in Java, Spring Frameworks and Oracle SQL.
- Expert in Unix shell scripting.
- Working knowledge of Python.
- Infrastructure implementation patterns such as containers and VM deployments.
- Exposure to implementations on-prem and on GCP.

All of our opportunities require that applicants are eligible to work in the specified country/location, unless otherwise stated in the job description. Whitehall Resources are an equal opportunities employer who values a diverse and inclusive working environment. All qualified applicants will receive consideration for employment without regard to race, religion, gender identity or expression, sexual orientation, national origin, pregnancy, disability, age, veteran status, or other characteristics.
16/04/2024
Project-based
We are IT Recruitment Specialists partnered with a prestigious Global Consultancy who requires Live Service Talend Engineers for one of their sector clients based in Telford.

Live Service Talend Engineers
Telford
6 months
£358 inside IR35
Candidates MUST hold active SC clearance

Responsibilities:
- Software development activity across the full development life cycle: requirements gathering, analysis, design, coding/development, testing, implementation and live support. This may be within new systems development projects or enhancements and fixes to existing applications.
- Carrying out development in accordance with the agreed requirements and development standards.
- Testing products in accordance with the test strategy to ensure that they are fit for purpose.
- Assisting the team in examining packages of work and giving realistic timescales for completion.
- Completing allocated work within agreed time, cost and quality criteria, and providing progress reports on assigned work as required.
- Managing and controlling problems and change within their area of responsibility, including negotiation with other team members.
- Problem analysis, investigation and resolution.
- Playing an active part in process improvement; awareness of, and compliance with, all relevant quality processes and procedures, including completion of all the specified quality records.

Mandatory Skills:
- FTEs will be used across Live Support systems wherever required.
- Technologies: Denodo/Talend/PDI/Git/MySQL/Redshift/Grafana.
- Dashboard creation/consolidation; gathering requirements, understanding the service, and tailoring delivery alerts to suit.
- Experience in Grafana monitoring: dashboard creation/consolidation.
- A software engineering background with proven ability to rapidly learn and apply application development processes, tools and programming/scripting languages.
- Awareness of key software engineering concepts and governance (design, development, config management and version control, quality control, continuous integration, release/deployment, documentation, built-in supportability, built-in maintainability, re-use and extensibility).
- Specific experience of data solutions development addressing data extraction, transformation and load (ETL) processing, and data analytics and reporting requirements.
- Familiarity with Agile software development principles and practices, and experience of delivery as a member of an Agile Scrum team utilising Scrum methodology.
15/04/2024
Project-based