Role: Lead Data Engineer (Hands-on)
Rate: £425.00 - £470.00 (Inside IR35)
Clearance: SC Cleared (active, used within the last 12 months)
Location: Hybrid/London (on site once a month)
Start date: ASAP

Company Overview: Recruiting for a leading cloud solutions provider specialising in digital transformation and data engineering services. Its mission is to empower organisations to unlock the full potential of their data through innovative cloud solutions. The company is seeking a Lead Data Engineer to support the data engineering team in designing and implementing solutions for complex migration projects.

Job Overview: The Lead Data Engineer will oversee the design and implementation of data engineering solutions, leading a team migrating legacy Oracle databases to AWS managed databases. Expertise in AWS services such as AWS Glue, AWS Managed Flink, S3, AWS RDS and Lambda is essential for mentoring, troubleshooting and optimising data solutions.

Key Responsibilities:
* Lead ETL processes using AWS Glue to migrate data from legacy Oracle databases to AWS managed databases, document databases and data lakes.
* Oversee the development and maintenance of scalable data pipelines.
* Provide technical leadership and mentorship to Data Engineers.
* Collaborate with solution architects, data scientists and stakeholders.
* Optimise data workflows using AWS services.
* Ensure adherence to data quality, security and governance standards.
* Automate and monitor data pipelines with proactive alerting mechanisms.
* Drive continuous improvement and stay up to date with emerging AWS services.
* Lead technical discussions and ensure alignment on design and implementation.
* Provide regular updates on project status and challenges to senior leadership.

Qualifications:
* Proven expertise in designing and developing ETL processes using AWS Glue.
* Experience migrating legacy Oracle databases to AWS managed databases.
* Strong knowledge of AWS services (S3, AWS Managed Flink, Lambda, CloudWatch).
* Expertise in building and maintaining data lakes, document databases and relational databases.
* Hands-on experience with data pipeline architecture and development.
* Proficiency in SQL and Python.
* Strong understanding of data governance, quality and security.
* Desirable: AWS certifications (AWS Certified Data Analytics, AWS Certified Solutions Architect).

Preferred Skills:
* Experience with big data processing technologies (Apache Spark, PySpark).
* Knowledge of DevOps practices and tools.
* Familiarity with data warehousing, dimensional modelling and data architecture best practices.
08/01/2025
Project-based
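Purely as an illustration of the AWS Glue migration work described in the listing above, here is a minimal sketch of a Glue PySpark job that copies one table from a legacy Oracle source into an S3 data lake as Parquet. The connection name, schema/table and bucket path are hypothetical placeholders; a real migration would be parameterised and driven by the Glue Data Catalog rather than hard-coded.

```python
"""Minimal sketch: legacy Oracle table -> S3 data lake via AWS Glue (hypothetical names)."""
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the source table through a pre-defined Glue JDBC connection.
# "legacy-oracle" and "HR.EMPLOYEES" are illustrative names, not real resources.
source = glue_context.create_dynamic_frame.from_options(
    connection_type="oracle",
    connection_options={
        "useConnectionProperties": "true",
        "connectionName": "legacy-oracle",
        "dbtable": "HR.EMPLOYEES",
    },
)

# Land the data in the S3 data lake as Parquet for downstream consumers.
glue_context.write_dynamic_frame.from_options(
    frame=source,
    connection_type="s3",
    connection_options={"path": "s3://example-data-lake/raw/employees/"},
    format="parquet",
)

job.commit()
```

In practice a job like this would typically take the schema, table and target path as job arguments so the same script can be reused across the many tables involved in a migration.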
Senior Python Developer (m/f/d) - programming/JSON parsing and generation/REST API in Python/ChatGPT API/RDF graph databases/Communication Skills/English

Project: For our customer, a large pharmaceutical company in Basel, we are looking for a Senior Python Developer (m/f/d).

Background: In Roche's Pharmaceutical Research and Early Development organisation (pRED), we make transformative medicines for patients in order to tackle some of the world's toughest unmet healthcare needs. At pRED, we are united by our mission to transform science into medicines. Together, we create a culture defined by curiosity, responsibility and humility, where our talented people are empowered and inspired to bring forward extraordinary life-changing innovation at speed. This position is located in Data Products & Platforms, a chapter within the Data & Analytics function, which pushes the boundaries of drug discovery and development, enabling pRED to achieve its goals.

The perfect candidate: The ideal candidate has strong, proven programming skills in Python and the ability to work independently. They can also manage multiple priorities and communicate effectively with both technical and non-technical stakeholders.

Tasks & Responsibilities:
* (Re-)implementation of an easy-to-use Python library that works on top of an existing REST API
* (Re-)implementation of loader scripts that perform bulk operations using the Python library
* Implementation of a PoC concept-mapping algorithm with input from a GraphDB RDF graph, using LLM services (ChatGPT API)
* Document the architecture, usage and operational procedures for future reference and maintenance
* Communicate with and train the scientists in using the library and scripts
* Collaborate with stakeholders to gather requirements and ensure the system meets the needs of the organisation
* Conduct testing and quality assurance to ensure the reliability and accuracy of the code

Must Haves:
* Minimum level of education: IT apprenticeship; Bachelor's or Master's degree preferred
* Strong, proven programming skills in Python (5-10 years)
* Understanding of, and ability to discuss, software architecture best practices
* Experience with the JSON format, especially parsing and generation in Python
* Experience using REST APIs from Python
* OPTIONAL: experience using the ChatGPT API
* OPTIONAL: experience with RDF graph databases, e.g. GraphDB
* Strong communication skills in English - speaking and writing
* Ability to write easy-to-understand documentation for the code
* Ability to explain to non-engineers how to use the code
* Ability to work independently, manage multiple priorities, and communicate effectively with both technical and non-technical stakeholders

Reference Nr.: 923907SDA
Role: Senior Python Developer (m/f/d)
Industry: Pharma
Workplace: Basel (60% onsite is a must)
Pensum: 100%
Start: ASAP (latest start date: 01.01.2025)
Duration: 3 months + extension
Deadline: 12/01/2025

If you are interested in this position, please send us your complete dossier via the link in this advertisement.

About us: ITech Consult is an ISO 9001:2015 certified Swiss company with offices in Germany and Ireland. ITech Consult specialises in the placement of highly qualified candidates for recruitment in the fields of IT, Life Science & Engineering. We offer staff leasing & payroll services, which are free of charge for our candidates; there are no additional fees for payroll either.
06/01/2025
Project-based
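To give a flavour of the first two responsibilities in the listing above, here is a minimal, hypothetical sketch of an "easy-to-use Python library on top of an existing REST API" together with a bulk-loader helper. The base URL, endpoint paths, field names and the `Record`/`ApiClient` names are invented for illustration; the real library would mirror whatever API the platform actually exposes.

```python
"""Sketch of a thin client library over a hypothetical REST API, with JSON parsing and generation."""
import json
from dataclasses import dataclass
from typing import Any, Iterable

import requests


@dataclass
class Record:
    """A single resource returned by the (hypothetical) /records endpoint."""
    id: str
    payload: dict[str, Any]


class ApiClient:
    def __init__(self, base_url: str, token: str, timeout: float = 30.0) -> None:
        self._session = requests.Session()
        self._session.headers["Authorization"] = f"Bearer {token}"
        self._base_url = base_url.rstrip("/")
        self._timeout = timeout

    def get_record(self, record_id: str) -> Record:
        """Fetch one record and parse the JSON response into a typed object."""
        resp = self._session.get(f"{self._base_url}/records/{record_id}", timeout=self._timeout)
        resp.raise_for_status()
        data = resp.json()
        return Record(id=data["id"], payload=data)

    def bulk_upload(self, records: Iterable[dict[str, Any]], batch_size: int = 100) -> None:
        """Loader-style helper: send records in batches, generating JSON for each batch."""
        batch: list[dict[str, Any]] = []
        for rec in records:
            batch.append(rec)
            if len(batch) >= batch_size:
                self._post_batch(batch)
                batch = []
        if batch:
            self._post_batch(batch)

    def _post_batch(self, batch: list[dict[str, Any]]) -> None:
        resp = self._session.post(
            f"{self._base_url}/records/bulk",
            data=json.dumps(batch),
            headers={"Content-Type": "application/json"},
            timeout=self._timeout,
        )
        resp.raise_for_status()
```

The point of a wrapper like this is that scientists call `client.get_record("...")` or `client.bulk_upload(rows)` without ever touching raw HTTP, headers or JSON serialisation themselves.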
Senior Platform Engineer - DV Cleared
Up to £100,000 + stock options
London, Hybrid (must have active DV clearance)

Would you be interested in joining a new Platform Engineering team at a self-funded scale-up that builds and delivers data analytics products for government bodies? This data scale-up operates mainly within the public sector, helping government bodies with vast swathes of data make better use of it. They're looking to build a new team of Platform Engineers for a long-standing secure government project, working alongside data engineers and data scientists to manage, support and build the infrastructure.

Tech stack, and therefore skills needed:
* Kubernetes or OpenShift
* ELK Stack, Elasticsearch or OpenSearch
* Linux
* CI/CD

Nice to haves, but definitely not essential:
* Cloudera
* Hadoop
* Spark
* Kafka
* HBase
* Hue
* Atlas

Logistics:
* Up to £100,000 base salary
* 30 days holiday + bank holidays
* Private health care
* Annual trips to see the northern lights!
* Location: Central London

Please note, candidates MUST already have DV clearance; SC clearance alone doesn't meet the requirement. Apply now, or contact (see below) for more information.

We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation or age.
02/01/2025
Full time
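As a small taste of the day-to-day platform work in the listing above (supporting data pipelines against an ELK/Elasticsearch/OpenSearch stack, with everything wired into CI/CD and Kubernetes), here is a hypothetical health-check script of the kind such a team might run from a CI job or a Kubernetes CronJob. The cluster endpoint, index pattern and alert threshold are invented placeholders.

```python
"""Sketch: count recent ERROR documents in an Elasticsearch/OpenSearch index and fail if too many."""
import sys

import requests

ES_URL = "https://elasticsearch.example.internal:9200"  # hypothetical cluster endpoint
INDEX = "ingest-logs-*"                                  # hypothetical index pattern


def recent_error_count(minutes: int = 15) -> int:
    """Count ERROR-level documents ingested in the last `minutes` minutes via the _count API."""
    query = {
        "query": {
            "bool": {
                "filter": [
                    {"term": {"level": "ERROR"}},
                    {"range": {"@timestamp": {"gte": f"now-{minutes}m"}}},
                ]
            }
        }
    }
    resp = requests.post(f"{ES_URL}/{INDEX}/_count", json=query, timeout=10)
    resp.raise_for_status()
    return resp.json()["count"]


if __name__ == "__main__":
    errors = recent_error_count()
    print(f"errors in the last 15 minutes: {errors}")
    # A non-zero exit code lets a CI stage or Kubernetes CronJob surface the failure as an alert.
    sys.exit(1 if errors > 100 else 0)
```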