Investment Banking Python/JavaScript AI Engineer - AI/ML Models/Risk/NLP - Glasgow (Contract)

Our IB client is looking for a skilled and experienced Developer to join their architecture delivery team. This role focuses on building an AI Architect platform that empowers architects and developers to make informed, data-driven decisions, automates repetitive architecture tasks, and streamlines documentation workflows.

Key Responsibilities:
* Design, develop, and implement a scalable, AI-driven architecture platform.
* Work closely with architects and data scientists to embed AI/ML models into the system for enhanced decision-making, such as recommendation engines.
* Drive the adoption of AI Architect and best practices across the development teams, ensuring consistency and alignment with enterprise standards.
* Participate in and lead architecture communities of practice to foster knowledge-sharing and innovation within the organization.
* Stay up to date on the latest architecture and technology trends relevant to financial services, such as cloud computing, data security, AI, and distributed systems.

Skills/Qualifications:
* 5+ years of experience in at least one of the following: JavaScript, Java, TypeScript, or Python.
* End-to-end systems development: proven ability to architect and build complex systems with a long-term vision.
* Expertise in financial services applications, including knowledge of transaction processing, risk management, and data security.
* Excellent communication skills, with the ability to present complex architectural ideas to diverse stakeholders.
* Strong problem-solving and critical thinking skills, with a track record of innovative solution design in complex environments.
* Understanding of experimental design, statistical analysis, and data-driven decision-making.
* Proficiency in collaborating with data scientists to translate advanced models into scalable production code.
* Familiarity with AI-driven frameworks such as knowledge graphs, natural language processing (NLP), or recommendation systems is a big plus.

Inside IR35 - Hybrid - Glasgow based - 12-month initial contract

By applying to this job you are sending us your CV, which may contain personal information. Please refer to our Privacy Notice to understand how we process this information. In short, in order to supply you with work-finding services, we will hold and process your personal data, and only with your express permission will we share this personal data with a client (or a third party working on behalf of the client) by email or by upload to the client's/third party's vendor management system. By giving us permission to send your CV to a client, you grant permission to share the personal data that would be necessary to consider your application, interview you (phone/video/face to face) and, if successful, hire you. Scope AT acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the Terms and Conditions, Data Protection Policy, Privacy Notice and Disclaimers, which can be found on our website.
03/04/2025
Project-based
Job Title: AI/ML Contractor
Location: Eastern Europe
Contract Duration: 6-12 months (with possibility of extension)

About Us: We are a fast-growing technology company working on innovative AI and ML solutions that are revolutionizing industries worldwide. Our team is dynamic, collaborative, and passionate about pushing the boundaries of what's possible with cutting-edge technologies. We're looking for a talented and motivated AI/ML Contractor to join our team and help us build advanced machine learning models and AI-driven products.

Key Responsibilities:
* Design, develop, and implement machine learning models to solve complex business problems.
* Work with large datasets, ensuring data preprocessing, feature engineering, and model validation.
* Collaborate closely with cross-functional teams, including engineers, product managers, and data scientists.
* Analyze model performance and provide insights for optimization and improvement.
* Research the latest trends in AI/ML technologies and incorporate them into the development process.
* Write clean, scalable, and efficient code for AI/ML solutions.
* Provide expertise in model deployment, testing, and monitoring in production environments.
03/04/2025
Project-based
Senior Data Scientist (Biostats Engineering) - Remote (RL7733)

Job Title - Senior Data Scientist (Biostats Engineering)
Location - Remote
Ref - RL7733
Salary - Competitive

The Client
We are partnering with a design-led data, software, and cloud company specializing in AI and advanced analytics. They design, build, and operate data- and AI-driven solutions, products, and experiences on Azure, enabling their business customers to tackle challenges and seize opportunities with greater efficiency and certainty.

The Candidate
We are looking for an experienced Data Scientist with extensive expertise in:
* Best practices for R package development
* Model development and deployment on Databricks
* Collaboration using version control systems
Additional knowledge of data architecture and cloud infrastructure is highly desirable.

The Role
You will work with one of our global biopharma clients, developing high-quality R packages and providing consultancy on biostatistics model development. Key responsibilities include reviewing and optimizing code, integrating existing modelling code into packages, designing and implementing end-to-end modelling and deployment processes on Databricks, and focusing on delivering high-impact solutions that exceed customer expectations.

Key Responsibilities
* Develop high-quality R packages
* Provide consultancy on biostatistics model development and deployment best practices
* Review and optimize code, integrating existing modelling code into packages
* Design and implement end-to-end modelling and deployment processes on Databricks
* Support and collaborate with adjacent teams (e.g. product and IT teams) to integrate modelling solutions
* Continuously innovate with the team and customer, utilizing modern tools to enhance model development and deployment

Skills & Experience
A successful candidate will demonstrate:
* A background or work experience in biostatistics or a related field
* Strong proficiency in R programming and R package development
* Experience in statistical model deployment and end-to-end MLOps (preferred)
* Extensive experience with cloud infrastructure, preferably Databricks and Azure
* Experience with Shiny development (preferred)
* Ability to work with customer stakeholders to understand business processes and workflows, designing solutions to optimize and automate them
* DevOps experience and familiarity with software release processes
* Familiarity with Agile delivery methods

To apply for this Senior Data Scientist permanent job, please click the button below and submit your latest CV. Curo Services endeavours to respond to all applications; however, this may not always be possible during periods of high volume. Thank you for your patience. Curo Services is a trading name of Curo Resourcing Ltd and acts as an Employment Business for contract and temporary recruitment as well as an Employment Agency in relation to permanent vacancies.
02/04/2025
Full time
Snowflake Engineer (m/f/d) - Cloud Platform - AWS/Python/Java/C++/Scala/SQL/ETL/ELT/Cloud-Native Tools and Services/English

Project: For our customer, a big pharmaceutical company in Basel, we are looking for a Snowflake Engineer (m/f/d).

Background: Clinical trials generate a lot of data; at Roche/Genentech, we use this data to make clinical trials more efficient and effective. Joining Roche/Genentech, you will be at the forefront of innovation within drug discovery and clinical trials. Whether it is developing open-source tools or developing tools that will be used by more than 1000 fellow Roche/Genentech data scientists, you will be working in a multifaceted international team that develops and tests software prototypes. The tool that you will be working on automatically generates discrepancies based on the metadata ingested, with minimal input from users. Those discrepancies can be optimized to speed up the data cleaning process and allow a fit-for-purpose data cleaning approach by sending the right query at the right time to the appropriate user, reducing the query burden at the sites. Joining a team of inspired professionals, with your skills as a Snowflake engineer you will design and implement data architecture, write functions, and integrate various data models. You will also focus on optimizing performance, ensuring security, and maintaining the health of the data warehouse through monitoring and troubleshooting. Additionally, you will automate data workflows, collaborate with data stakeholders, and provide documentation and training to ensure effective use of the Snowflake platform for data analytics and business intelligence.

The perfect candidate has a minimum of 1-2 years' experience working with Snowflake's cloud data platform and is proficient in programming languages such as Python. Additionally, we are looking for someone who is experienced with SQL for querying and managing databases. English must be fluent for this role.

Tasks & Responsibilities:
* Design and implement scalable and efficient data storage and data processing solutions within Snowflake.
* Write, optimize, and troubleshoot business-owned and business-developed tables, views, functions, and SQL queries within the Snowflake environment.
* Ensure data security and compliance with industry standards.
* Collaborate with cross-functional teams to meet data-driven business objectives.
* Potentially integrate Snowflake with ServiceNow.

Must Haves:
* Minimum 1-2 years' experience working with Snowflake's cloud data platform.
* Proficiency in programming languages such as Python (and/or Java, C++, or Scala).
* Experience with SQL for querying and managing databases.
* Understanding of data warehousing concepts and architecture.
* Familiarity with concepts like ETL/ELT processes.
* Performance tuning abilities.
* Knowledge of cloud platforms such as AWS.
* Experience using cloud-native tools and services.
* Fluent English.

Reference Nr.: 924106SDA
Role: Snowflake Engineer (m/f/d)
Industry: Pharma
Workplace: Basel
Pensum: 100% (during onboarding onsite only; afterwards 1-2 days remote, depending on performance)
Start: ASAP (latest start date: 1.6.25)
Duration: 6
Deadline: 06/04/2025

If you are interested in this position, please send us your complete dossier via the link in this advertisement.

About us: ITech Consult is an ISO 9001:2015-certified Swiss company with offices in Germany and Ireland. ITech Consult specialises in the placement of highly qualified candidates for recruitment in the fields of IT, Life Science & Engineering. We offer staff leasing & payroll services. For our candidates this is free of charge; for payroll we also do not charge you any additional fees.
31/03/2025
Project-based