Data Scientist/Engineer in Financial Markets - Trading/Azure/Python/Scala/Machine Learning Project/Role: For our client in the finance sector in Zurich, we are looking for a Data Scientist/Engineer in Financial Markets.

Main Tasks:
- Build data infrastructure for trading analytics, including databases and a data lake.
- Develop machine learning and financial models for trade execution, market making, and sales.
- Ensure data quality, including mining, cleansing, and analysing large datasets for use-case development.
- Write well-documented production code and algorithms, mainly in Python and Scala.
- Create software applications, user interfaces, APIs, dashboards, and automated reports for business users.
- Build systems for the collection and transformation of large data sets to enable model development.
- Understand machine learning literature and techniques in order to mentor team members.
- Assist with the execution of Big Data solutions and related technologies.
- Present results from AI and financial models to stakeholders proficiently.
- Collaborate with data architects, developers, and business developers to jointly meet deadlines.

Skills/Requirements:
- Master's or PhD in a technical field such as Mathematics, Physics, Engineering, Computer Science, or similar.
- 4+ years of work experience as a Data Scientist.
- Proven knowledge of machine learning and artificial intelligence, including advanced statistical analysis, data mining, Bayesian statistics, reinforcement learning, and deep learning.
- Advanced coding skills in programming languages such as Python, R, MATLAB, Scala, C++, and/or Java.
- Knowledge of web app development, Docker, Kubernetes, Git, and Dash (or similar).
- Knowledge of data architecture, databases (including Oracle, MSSQL, and MongoDB), data lakes, and cloud computing (e.g. MS Azure or Snowflake).
- Knowledge of Power BI, Tableau, and VBA.
- Knowledge of mathematical finance and financial markets, including algorithmic trading, financial derivatives, portfolio management, and dynamic optimisation methods.
- Knowledge of Spark, Databricks, Kafka, and real-time analytics is a plus.
- Collaborative team player, open-minded, and easy to work with.
- Proficient presentation skills and high attention to detail.
- Able to explain technical results to non-technical audiences.

Language: English C1
Reference No.: 923397OK
Role: Data Scientist/Engineer in Financial Markets
Industry: Finance
Location: Zurich and region
Workload: 80%-100%
Start: ASAP (latest 01.08.2024)
End: 31.12.2024 (with possibility of extension)

Should you find yourself suitable for this position, please send us your complete CV using the link in this advert.

About us: ITech Consult is an ISO 9001:2015-certified Swiss company with offices also located in Germany and Ireland. ITech Consult specialises in delivering IT candidates for contract work. We were founded in 1997 by IT professionals, so we understand well what it means to be professionally supported in your search for a new project and in employment.
25/04/2024
Project-based
Unlock a Promising Career Opportunity in the Environmental Technology Sector. Our client, a forward-thinking company in the CleanTech industry, is seeking a talented Senior Data Scientist to join their team. If you have a strong background in data analysis, machine learning, and time series forecasting, this is the perfect opportunity to make a lasting impact in the renewable energy space. Don't miss out on this exciting chance to take your career to new heights.

Key Responsibilities:
- Develop cutting-edge predictive models to analyse data related to energy consumption and renewable energy initiatives.
- Collaborate with skilled software engineers to effectively implement and deploy models.
- Establish and promote best practices in data science, ensuring continuous improvement and innovation.
- Dive deep into the vast ocean of available data sources and explore novel methodologies to uncover groundbreaking insights.

Required Skills:
- Proficiency in Python 3 and popular data science tools such as pandas and scikit-learn.
- Extensive experience in creating and fine-tuning machine learning models.
- Knowledge of time series forecasting and optimisation techniques for accurate predictions.
- Strong analytical skills combined with a true passion for research-driven development.
- Excellent communication skills to effectively convey findings and drive meaningful change.
- Bonus: a background in forecasting is a significant advantage.

Our client's organisation offers a nurturing and supportive work environment where your expertise and contributions are highly valued. With a commitment to work-life balance, they provide remote work options and flexible hours that empower you to excel both professionally and personally. This is a full-time, permanent position, working predominantly remotely.

As part of our client's visionary team, you'll enjoy a competitive salary ranging from £70,000 to £100,000 per year, commensurate with your experience and capabilities. Join our client's team today and become an essential force in shaping a more sustainable future. Let's work together to make a positive impact in the environmental technology sector. Ready to take the leap and secure this incredible opportunity? Submit your CV now and let us propel your career towards greatness. We can't wait to hear from you.
25/04/2024
Full time
This is your opportunity to join one of the most recognisable names in international financial services. With a presence in over 60 countries, they are one of Europe's biggest employers and have achieved Top Employer Europe certification. This means you'll be joining a responsible, positive, and thriving business that puts well-being and personal development at the top of its agenda.

Expectations:
- 50% on-site and 50% homeworking
- At least 5 years of relevant experience
- Master's degree in Computer Science or equivalent work experience
- Sound knowledge of English as well as at least one local language (Dutch or French); both are a strong plus

Duties include: As we migrate from on-premises servers to Domino Data Lab, we are looking for someone with experience in navigating constrained environments and identifying solutions with the available building blocks. The challenges of the platform are combining data availability, security, and traceability in order to build the data science platform of tomorrow. As a Product Owner, you must take into account the requirements of your stakeholders and clients (both data scientists and AI consumers), as well as current and upcoming regulations, in order to deliver a working AI platform for both model design and industrialisation. Domino Data Lab allows our data scientists to work mostly in a self-service manner, but we must nevertheless propose an industrialisation path for their outputs, with integration into the banking ecosystem and its constraints (integrity, availability, throughput, etc.). Another key challenge is to make data available to data scientists in a secure and compliant way. Access should be simple, forthcoming, and compliant while enabling both data exploration and use-case industrialisation scenarios. As a Product Owner, you build the roadmap for the coming year, making sure that the Development Team understands both the product and its target vision. With the Development Team's help, you make sure that work is prioritised in consistent Sprints.

Design phase and project methodology:
- Build the product's big picture, including its roadmap, and communicate this picture to the development team
- Clarify the need and draft the product's key functional specifications according to the agile methodology
- Plan iterations and define priorities to ensure proper content delivery
- Define priorities and monitor the Product Backlog, continuously prioritising the key business needs
- Make sure that the different stakeholders become and stay aligned on the product to be delivered

Project follow-up and support:
- Initiate a priority-management approach; ensure product consistency and quality
- Actively participate in Scrum ceremonies (agile retrospectives, demos, test labs, etc.)
- Act as the decision maker, with the authority to arbitrate on delivered functionalities
- Conduct live adjustments and follow-up of the Sprint Backlog

Language requirements: (Mandatory) Sound knowledge of English as well as at least one local language (Dutch or French); both are a strong plus.

Required experience/knowledge:
Technical experience: (Mandatory) 5 years working with AI models and their deployment
Business experience:
- (Mandatory) Expertise in agile methodologies
- (Mandatory) At least 2 years of recent banking/financial services experience
- Budget management
- Project management

Soft skills:
- Ability to see the overall picture (helicopter view) - strategic thinking
- Analytical mind - conceptual thinking
- Negotiation and persuasion skills
- Structured approach
- Quality-minded with an eye for detail
- Goal-oriented
- Open to change
- Leadership

Agile requirements: An analyst involved in Agile projects must have the "Agile mindset", which implies:
- A positive attitude and pragmatism
- Thirst for knowledge: Agile is about learning and adapting. Knowledge sharing is key to success.
- The goal of team success: Agile is about the success of the team, not individual success or heroic behaviour. It is more important for the team to succeed than for the individual to have completed his/her tasks.
- There is no failure, only feedback: Agile is about taking everything as a lesson and adjusting actions based on feedback, resulting in continuous improvement.

Contact Epiphany Hatch via e-mail at (see below) or call.
25/04/2024
Project-based
As a DevOps/Machine Learning Engineer, you will build and manage the Data Science ecosystem and integrate models into the bank's IT infrastructure. You will maintain and improve the data scientists' workbench environment as well as monitor and deploy A&AI models. As part of your duties, you will:
- Assist data scientists in setting up projects by helping them identify and install the correct dependencies in their environment, and provide them with a basic project that comes with all the required tooling for code quality and continuous testing.
- Act as the interface between data scientists and IT, bringing new models to production and guaranteeing that all quality checks are met.
- Act in case of incident: identify possible causes and dispatch requests to the appropriate squads. Communicate with project stakeholders on the status of those incidents and provide post-mortem analysis.
- Evolve in an environment where innovation and lean processes are praised, straightforward communication is encouraged, and peers understand the meaning of teaming up.
- Work with a team of colleagues who are ready to collaborate and share their experiences.

Requirements:
- Strong experience in Python
- Experience with HTTP REST APIs
- Experience with Git (version control)
- Experience in MLOps/DevOps

Language Requirements:
- English: fluent spoken and written
- Dutch/French: good level of either

Interested? Apply today: contact Frankie Mancini at (see below).
25/04/2024
Project-based
Data Analyst/Data Engineer - Cork, Ireland (Hybrid Working) - Contract

TEKsystems is thrilled to offer an exciting opportunity for a junior data engineer/analyst (2-3 years' experience) to join our dynamic team of software developers and data scientists in the Business Analytics team at one of the world's largest technology companies.

Why This Role Is Exciting:
- Innovative Culture and Collaboration: Our client fosters a creative and collaborative environment. Their visionary leadership, commitment to innovation, and unique culture contribute to employee contentment.
- Consumer-Centric Approach: Our client's focus on simplicity and a consumer-first attitude sets it apart. In a world filled with complex features and gadgets, our client stands out by prioritising what truly matters.

Key Requirements for Success: We are seeking a Data Engineer to support innovative data pipeline projects working across a broad, modern tech stack. You must have experience working with modern databases such as Snowflake and MySQL, and an interest in visualisation tools such as Tableau and Power BI. Any skills in Big Data and process orchestration are beneficial.

Role Details:
Job Title: Data Engineer
Location: Cork, Ireland
Job Type: Contract
Office Days: 3 days a week
Experience: 2-3 years

If you're a Data Analyst/Data Engineer seeking your next opportunity, apply directly or reach out.

Trading as TEKsystems. Allegis Group Limited, Bracknell, RG12 1RT, United Kingdom. Allegis Group Limited operates as an Employment Business and Employment Agency as set out in the Conduct of Employment Agencies and Employment Businesses Regulations 2003. TEKsystems is a company within the Allegis Group network of companies (collectively referred to as "Allegis Group"). Aerotek, Aston Carter, EASi, Talentis Solutions, TEKsystems, Stamford Consultants and The Stamford Group are Allegis Group brands.

If you apply, your personal data will be processed as described in the Allegis Group Online Privacy Notice, available on our website. The Online Privacy Notice explains what information we may collect, use, share, and store about you, and describes your rights and choices about this. We are part of a global network of companies; as a result, the personal data you provide will be shared within Allegis Group and transferred and processed outside the UK, Switzerland, and the European Economic Area, subject to the protections described in the Allegis Group Online Privacy Notice. We store personal data in the UK, EEA, Switzerland, and the USA. If you would like to exercise your privacy rights, please visit the "Contacting Us" section of our Online Privacy Notice on our website for details on how to contact us. To protect your privacy and security, we may take steps to verify your identity before proceeding with your request, such as a password and user ID if there is an account associated with your request, or identifying information such as your address or date of birth. We adhere to our commitments under the UK Data Protection Act, the EU-U.S. Privacy Shield, or the Swiss-U.S. Privacy Shield.
25/04/2024
Project-based
On-site 2-3 days per week (otherwise remote/off-site) in Måløv, Denmark. Duration: expected 12 months.

Expected experience: Strong Vue skills and experience with Python are a must.

Preferred experience:
- Hands-on experience with Titian Mosaic or a similar research sample inventory system is an advantage but not a must.
- Knowledge of Oracle DB, SQL, or PL/SQL (fundamental understanding).
- Able to communicate in English.
- Software Development: we would prefer someone with more of a workflow/Front End focus.
- Application Configuration: Titian Mosaic experience, or experience administering/configuring a COTS system.

Technologies to be used: Python, Vue, CI/CD, Git, REST APIs, AWS and ADO Cloud, Oracle database, SQL.

Responsibilities and skills:
- Contribute to an agile product team that is onboarding Pharma R&D wet labs to the Titian Mosaic sample inventory system.
- Development, bug fixing, and second-line support of our in-house-built application for sample management (built in Vue).
- Analyze complex lab workflows and translate the sample management needs of scientists into requirements for the IT solution.
- Implement changes in the configuration (user roles, data model) as well as metadata (dropdowns) in the Mosaic system.
- Be able to work in a highly changeable organization.
- Communicate and collaborate with R&D scientists (typically lab associates/scientists), Research IT experts, Software Engineers, and Product Owners.

You will work in our agile Inventory Management System product team together with eight other colleagues. They are implementing our new browser-based inventory system, Mosaic, from the software company Titian, as well as an in-house-developed web application for sample management. You will be in regular touch with laboratory teams in Denmark, the UK, and the US to ensure business consistency and continuity. Finally, you will collaborate with business analysts and developers in the team to share knowledge and develop new solutions.
22/04/2024
Project-based
Senior Data Scientist/Data Scientist/Machine Learning Engineer/Exeter/Torquay/Weymouth/Python/C#/Unity/Machine Learning - £45,000 - £55,000
My client is looking for a Data Scientist with experience in Artificial Intelligence and Machine Learning to join a growing development team as a Data and Machine Learning Developer. Your expertise will directly contribute to the advancement of their immersive technology solutions (working in Unity/C#) and their applications across various sectors. You MUST have Front End development and Machine Learning experience to be suitable for this position.
Requirements:
- Ph.D. in Computer Science, Data Science, Statistics, or a related field (ideally with an application to human factors, psychology or neuroscience).
- Prior exposure to and knowledge of physiological sensors, biosensors and data acquisition systems would be an advantage.
- Existing SC clearance would be an advantage.
- You must also have been based in the UK for the last 5 years.
Responsibilities:
- Experience with programming languages such as Python is needed; familiarity with C# is a plus.
- Create, deploy and refine algorithms that process, analyse, and learn from large psychophysiological data sets.
- Collaborate with team members and communicate with the relevant stakeholders on technical matters.
- Integrate Machine Learning models into production systems and deploy them for Real Time or batch processing.
Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK. Modis Europe Ltd provides a variety of international solutions that connect clients to the best talent in the world. For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law. Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers. By applying for this role your details will be submitted to Modis International Ltd and/or Modis Europe Ltd. Our Candidate Privacy Information Statement, which explains how we will use your information, is available on the Modis website.
22/04/2024
Full time
Digital Research Infrastructure Engineer - Linux Specialist
PML operations grade 4, £30,000 - £45,000 DOE, Full Time, Open Ended Appointment
The Role
We have an exciting opportunity at PML for an individual with skills in Linux system administration to join PML's Digital Innovation and Marine Autonomy (DIMA) group. The role provides a business-critical link between scientists, PML Applications (commercial work) and our IT Group to support the Linux computing infrastructure as it continues to evolve, underpinning PML science in multiple areas and across all levels. This ranges from data generation (storage technologies and data management), through processing and analysis (high-performance computing and technologies such as JupyterHub), to producing visual outputs for end users (web technologies and virtualisation) to increase the reach and impact of PML science.
About You
You will enjoy working with others to help deliver a modern and reliable digital infrastructure to underpin the world-leading research carried out at PML. You will understand the importance of stability in existing infrastructure but will also be keen to learn and try new technologies. You will have experience of administering Linux systems, ideally Ubuntu, and will be able to make use of scripts and common tools such as Ansible to manage this. You will understand the importance of taking a proactive approach to identifying and resolving problems and will be able to make use of monitoring software (e.g. Nagios, Grafana) to accomplish this. You will understand best practices in cybersecurity and be able to apply them.
Skills Required
- Linux systems administration and monitoring
- Linux scripting (e.g. Bash and Python)
- Experience in managing data at the terabyte-to-petabyte scale, and storage technologies such as NFS and S3
- Cybersecurity (understand and apply best practices)
- Container technologies (Docker and Kubernetes)
- High-performance computing (Slurm)
- Virtualisation (VMware)
Key Deliverables
- Maintain our storage infrastructure to ensure data is distributed across servers based on existing capacity and projected changes in data volumes. This includes regular data moves and liaising with stakeholders to ensure data is backed up and archiving projects are completed as needed.
- Monitor high-performance computing infrastructure to identify and resolve problems, either independently or by working with IT (depending on the nature of the problem).
- Act as a point of contact between scientists and IT to answer questions, help identify solutions and provide training.
- Work with the data architect to maintain and develop web infrastructure used to provide existing and planned data search and visualisation services.
- Manage the NEODAAS GPU cluster (MAGEO), including liaising with IT, vendors and system users.
About PML
As a marine-focused charity we develop and apply innovative science with a view to ensuring ocean sustainability. With over 40 years of experience, we offer evidence-based solutions to societal challenges. Our impact spans from research publications to informing policies and training future scientists. The science undertaken at PML contributes to the UN Sustainable Development Goals by promoting healthy, productive and resilient oceans and seas. To support its science, PML operates in-house Linux infrastructure used for processing satellite data, running models and making outputs accessible through web visualisation tools. This infrastructure includes a large amount of storage (6 PB), a High-Performance Computing cluster with over 1,500 cores, a 40-GPU cluster (the MAssive GPU cluster for Earth Observation; MAGEO) and a virtual machine cluster. The role will be part of the Digital Innovation and Marine Autonomy (DIMA) group within PML.
DIMA is a pioneering digital science group dedicated to advancing PML's world-class, cutting-edge environmental research through the use of state-of-the-art digital and autonomous technologies. The team comprises research software engineers, research infrastructure engineers, marine technologists and scientists who work on a variety of projects using autonomous vessels, satellite data, drones, Artificial Intelligence, High Performance Computing and data visualisation tools to help deliver PML's goals. The team have an enthusiasm for solving problems through collaboration and shared learning.
12/04/2024
Full time