Data Scientist vacancy for our Zurich-based client in the financial sector.

Your tasks:
- Developing data infrastructure for trading analytics, including databases and a data lake
- Creating machine learning and financial models for trading execution, market making, and sales
- Being responsible for data quality, including mining, cleansing, and analysing large datasets
- Writing well-documented production code and algorithms in Python and Scala
- Preparing software applications, user interfaces, APIs, dashboards, and automated reports
- Contributing to the delivery of Big Data solutions, and presenting results from AI and financial models to stakeholders

Your experience/knowledge:
- 4+ years of experience as a data scientist, with expertise in machine learning and AI
- Know-how in advanced statistical analysis, data mining, Bayesian statistics, and deep learning
- Proficiency in coding with Python, R, Matlab, Scala, C++, or Java
- A thorough understanding of web app development, Docker, Kubernetes, Git, and Dash
- Good knowledge of data architecture, databases, data lakes, and cloud computing
- Academic degree in a technical field, such as Mathematics, Engineering, or Computer Science
- Language skills: English - fluent in written and spoken

Your soft skills:
- Team player with excellent communication and presentation skills
- High attention to detail and a solution-oriented way of working

Location: Zurich, Switzerland
Sector: Finance
Start: ASAP
Duration: 6 months+
Ref. Nr.: BH21347

Take the next step and send us your resume along with a daytime phone number where we can reach you. Due to Swiss work permit restrictions, we can only consider applications from Swiss nationals, EU citizens, and current holders of a Swiss work permit. Ukrainian refugees are warmly welcomed; we will support you all the way. We welcome applications from individuals of all genders, age groups, sexual orientations, personal expressions, ethnic backgrounds, and religious beliefs. There is therefore no requirement to provide gender information or a photo in your application. As per client requirements, we do need information about your marital status, nationality, date of birth, and a valid Swiss work permit. For applicants with disabilities, we are happy to explore potential solutions with our end client.
02/05/2024
Project-based
Data Specialist - Portugal - Freelance Opportunity

RED has been an IT specialist provider for the past 20 years, supporting customers all over the EU. One of our key customers is currently building a data team to work on a long-term project on a freelance contract basis. They are looking to hire Data Engineers, Data Scientists, Data Analytics and BI consultants. This will be a 12-month contract with further extensions. The customer is based in Portugal, and we need someone local who speaks Portuguese.

Job title: Data Specialist
Duration: 12 months + extensions
Location: Part remote
Start: ASAP
Technologies: Azure, Power BI, SQL, Python

Candidates must be based in Portugal. If you are interested in discussing the role in more detail, send me your CV and I will call you to discuss the position.
01/05/2024
Project-based
Xpertise is on the lookout for a Data Scientist who can play a key role in creating real-world value through the use of data. The organisation is offering someone the autonomy to identify new ways of utilising data using the latest Microsoft tech stack.

Key details:
- Salary: £40,000-60,000 depending on experience
- Location: Birmingham + hybrid working model (remote)
- Future outlook: There's plenty of opportunity for career progression, and we'll offer a dedicated career plan to reach this goal. We'll give you the freedom to unlock value for your team and the wider organisation using the tech you think is required!

Skills desired/what you will learn:
- Microsoft Azure
- Azure SQL
- Microsoft Fabric
- Delta Lake, Databricks, and Spark
- Statistical modelling
- Azure ML Studio
- Python, with familiarity with libraries and frameworks for data analysis and machine learning (e.g. TensorFlow, scikit-learn, Pandas)
- Microsoft Power BI
- Analytical and problem-solving skills, with the ability to work with large, complex datasets and derive actionable insights
- Utilising AI tools to assist in the optimisation of algorithms and analysis of data, enhancing the efficiency and accuracy of management solutions
- Value generation and insights extraction

Role overview: Do you want free rein to explore new concepts and create value for the organisation and its customers? Then this one's for you. We want you to tell us where the interesting data is and how to utilise it effectively.

Interested? Please apply with your CV and Billy Hall will reach out with more details. Xpertise acts as an employment agency.
01/05/2024
Full time
Contract Length: 8 months initially (likely extensions)
Work Location: Midlands
Work Type: 5 days onsite
IR35 Status: Inside IR35
Industry: Manufacturing
Interview Process: 2-stage interview, MS Teams
Security Clearance: Must be SC cleared

Montash is collaborating with an IT partner to deliver services to one of the UK's largest manufacturing businesses. We are seeking a motivated software engineer to join a team of developers contributing to the development of a new data application product.

Role Description:
- Experience as a full-stack engineer, or a proficient front-end developer with exposure to at least one of the following: JavaScript, Python, or TypeScript.
- Technical proficiency in one or more of the following technologies: Python, VueJS, TypeScript, React.
- Familiarity with various Python modules for data manipulation and processing frameworks.
- Proficiency with Elasticsearch and Kibana.
- Back-end development experience with FastAPI and/or Flask, and SQLAlchemy and/or Pytest.
- Front-end development using JavaScript web frameworks (VueJS or React).
- Experience with front-end web application state management.
- Unit testing experience for front-end web applications.
- Familiarity with Microsoft SQL Server.

Responsibilities:
- Develop and administer functional databases and applications.
- Plan, execute, and troubleshoot implementation tests to achieve desired outcomes.
- Conduct code reviews and provide support.
- Troubleshoot and debug software issues.
- Support junior developers and contribute to design decisions.
- Collaborate with data engineers and scientists to improve software application structures.
- Operate in an Agile environment.

Required: SC clearance and British nationality. Please do not apply if you do not have SC clearance, as candidates must be SC cleared to qualify for this role.

If interested, please apply and attach your CV below.
01/05/2024
Project-based
AI Developer needed for a 4-month project in London, with potential to extend. The project will be onsite 5 days a week for the first month, then 2-3 days a week for the remaining duration, paying £550 a day inside IR35.

The ideal AI Developer will be responsible for designing and developing AI systems, including machine learning algorithms, natural language processing systems, and computer vision systems. This will involve collaborating with cross-functional teams to determine technical specifications, assessing data availability and quality, and selecting appropriate modelling techniques. You will be responsible for testing and validating the accuracy and reliability of AI models, ensuring that they perform in line with business requirements and expectations. You will also be responsible for maintaining and improving existing AI systems, including monitoring performance, identifying areas for optimization, and implementing improvements and updates as needed.

Role:
- Developing AI models and algorithms
- Analyzing large datasets to identify patterns and trends
- Creating and maintaining databases for AI-powered systems
- Collaborating with software developers to integrate AI models into broader systems
- Working with data scientists to ensure that AI systems align with broader business strategy
- Advising on AI-related legal and ethical considerations
- Developing and configuring chatbots and integrating them with the Azure platform
- Integrating chatbots with different data sources and APIs
- Harnessing Azure services to create, deploy, and scale chatbot applications

Experience and skills required:
- Chatbot experience, including text-to-speech and speech-to-text
- Azure stack knowledge
- AI-powered systems

Role - AI Developer
Location - London
Rate - £550 per day inside IR35
Duration - 4 months with potential extension
01/05/2024
Project-based
Nicoll Curtin Technology
Sankt Gallen, Sankt Gallen
Data Scientist

We are seeking a skilled Data Scientist, fluent in English, to join our client's team on a permanent basis. In this role, you'll craft machine learning pipelines for document processing, utilize large language models, assess the feasibility of innovative algorithms, and collaborate closely with diverse stakeholders and the machine learning team.

This opportunity is tailored for you if you have:
- A university degree in artificial intelligence, computer science, or mathematics with a focus on machine learning
- A minimum of five years' experience in machine and deep learning
- Experience in training, evaluating, and deploying neural networks
- A solid background in NLP, LLMs, OCR, Python, and SQL/NoSQL databases
- Experience with large vision models and graph neural networks
30/04/2024
Full time
We are seeking a highly skilled Senior Python Programmer with expertise in machine learning (ML) data wrangling, interfacing, and automation.

General Responsibilities
The ideal candidate will be proficient in building robust data pipelines and automating complex tasks to support ML initiatives. They will have a keen understanding of observability principles and hands-on experience with AWS and Linux, and preferably Kubernetes and Argo.

Responsibilities:
- Develop and maintain robust data pipelines for ML data wrangling, interfacing, and automation.
- Implement automation solutions to streamline data processing and model deployment workflows.
- Ensure observability and monitoring of systems, providing insights into performance and reliability.
- Utilize AWS services such as S3, Lambda, and networking components for data storage, processing, and permissions management.
- Collaborate with DevOps teams to deploy and manage applications in Linux environments.
- Support Kubernetes and Argo workflows for scalable and efficient ML model training and deployment.
- Manage AWS permissions and network configurations to ensure data security and compliance.
- Maintain version control of the codebase using Git and enforce best practices for code documentation and production readiness.
- Collaborate with data scientists to develop small UI tools for querying data from databases and AWS S3.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proficiency in Python with a focus on ML data wrangling and automation.
- Strong experience with AWS services, including S3, Lambda, networking, and permissions management.
- Hands-on experience with Linux environments and Shell Scripting.
- Pharmaceutical industry experience is a plus.

Working remotely, with occasional on-site meetings in Belgium. 6 months + extension.
29/04/2024
Project-based
A hybrid Python Developer with a minimum of 5 years' experience is needed for a 12-month contract based in Krakow, paying 1,000 to 1,500 PLN per day depending on experience, with 3 days a week on site. The ideal Python Developer will be based in Poland and able to:
- Write clean, maintainable Python code for various projects, including data analytics tools, financial models, and automation scripts.
- Collaborate with cross-functional teams (quants, product managers, architects, and data scientists) to implement feature enhancements.
- Conduct code reviews to ensure the quality and functionality of code produced by peers.
- Use version control systems, such as GitLab, to keep track of code changes.
- Write robust automated tests utilizing industry-standard tools such as PyTest, Robot Framework, or Selenium.
- Document code and procedures, assist in setting project milestones, and maintain organizational transparency.

Tech Stack: Python, Django, Flask, Pandas, AWS, Azure, Google Cloud, GitHub, GitLab, PyTest, Robot Framework, Selenium, CI/CD pipelines, Agile, CPLEX

Qualifications:
- Bachelor's degree in computer science, software engineering, or a related field, or equivalent experience.
- Minimum of 5 years of professional experience in Python programming.
- Familiarity with Python frameworks and libraries such as Django, Flask, or Pandas.
- Proficiency in cloud services and technologies such as AWS, Azure, or Google Cloud.

Preferred Qualifications:
- Industry-specific experience in finance or banking.
- Exposure to DevOps practices and CI/CD pipelines.
- Experience working in an Agile framework; familiarity with Agile ceremonies.
- Familiarity with linear programming tools such as CPLEX.

Role - Python Developer
Location - Krakow, Poland
Duration - 12 months
Rate - 1,000 to 1,500 PLN per day
29/04/2024
Project-based
A global medical device company is looking for an R&D Process Development Engineer to join their Research and Development team on a contract basis. Familiarity with processes associated with biomaterial coating is required.

The role will involve laboratory work: designing experiments, analysing data, and optimizing processes to meet performance and regulatory requirements. You'll be part of a cross-functional collaboration, working closely with a diverse team of scientists, design engineers, process engineers, and other functions to ensure project alignment with goals and milestones. You will also need to complete and maintain comprehensive records, documentation, and reports. The role will also involve project management: taking ownership of assigned tasks within projects and managing task-specific timelines and budgets effectively. You will work closely with vendors to ensure that the processes are compliant with regulatory requirements and that costs and timelines are maintained.

Essential skills:
- Biomaterial coating experience, e.g. chemical vapour deposition, physical vapour deposition, dip coating
- Experience working with external vendors
- High-level knowledge of the processes

Desirable skills:
- Combination devices experience
- Project management experience

The start date is ASAP. The initial contract length is 12 months (with options to extend). The role is based in Limerick and can be done mostly remotely; you will only need to come onsite every other week. The rate is €55 per hour, depending on experience. If you have any expenses, please let me know and I can factor them into the rate for you.

Please visit our website to find out more about our Key Information Documents. Please note that the documents provided contain generic information. If we are successful in finding you an assignment, you will receive a Key Information Document which will be specific to the vendor set-up you have chosen and your placement.
Real Staffing, a trading division of SThree Partnership LLP, is acting as an Employment Business in relation to this vacancy | Registered office: London, EC4N 7BE, United Kingdom | Partnership Number: OC387148 England and Wales
26/04/2024
Project-based
Digital Research Infrastructure Engineer - Linux Specialist PML operations grade 4 £30000 - £45000 DOE Full Time Open Ended Appointment The Role We have an exciting opportunity at PML for an individual with skills in Linux system administration to join the PML s Digital Innovation and Marine Autonomy (DIMA) group. The role provides a business critical link between scientists, PML Applications (commercial work) and our IT Group to support the Linux computing infrastructure as it continues to evolve, underpinning PML science in multiple areas and across all levels. This ranges from data generation, (storage technologies and data management), processing and analysis (high performance computing and technologies such as JupyterHub), to making visual outputs for end users (web technologies and virtualisation) to increase the reach and impact of PML science. About You You will enjoy working with others to help deliver a modern and reliable digital infrastructure to underpin the world leading research carried out at PML. You will understand the importance of stability from existing infrastructure but will also be keen to learn and try new technologies. You will have experience of administering Linux systems, ideally using Ubuntu, and will be able to make use of scripts and common tools such as ansible to manage this. You will understand the importance of taking a proactive approach to identify and resolve and problems and will be able to make use of monitoring software (e.g., Nagios, Grafana) to accomplish this. You will understand best practices in cybersecurity and be able to apply these. Skills Required Linux systems administration and monitoring Linux scripting (e.g., bash and Python) Experience in management of data at the Terrabyte to Petabyte scale and storage technologies such as NFS and S3. 
- Cybersecurity (understand and apply best practices)
- Container technologies (Docker and Kubernetes)
- High Performance Computing (Slurm)
- Virtualisation (VMware)

Key Deliverables
- Maintain our storage infrastructure to ensure data is distributed across servers based on existing capacity and projected changes in data volumes. This includes regular data moves and liaising with stakeholders to ensure data is backed up and archiving projects are completed as needed.
- Monitor high-performance computing infrastructure to identify and resolve problems, either independently or by working with IT (depending on the nature of the problem).
- Act as a point of contact between scientists and IT to answer questions, help identify solutions and provide training.
- Work with the data architect to maintain and develop web infrastructure used to provide existing and planned data search and visualisation services.
- Manage the NEODAAS GPU cluster (MAGEO), including liaising with IT, vendors and system users.

About PML
As a marine-focused charity, we develop and apply innovative science with a view to ensuring ocean sustainability. With over 40 years of experience, we offer evidence-based solutions to societal challenges. Our impact spans from research publications to informing policies and training future scientists. The science undertaken at PML contributes to the UN Sustainable Development Goals by promoting healthy, productive and resilient oceans and seas. To support PML's science, we operate in-house Linux infrastructure used for processing satellite data, running models and making outputs accessible through web visualisation tools. This infrastructure includes a large amount of storage (6 PB), a High Performance Computing cluster with over 1,500 cores, a 40-GPU cluster (the MAssive GPU cluster for Earth Observation; MAGEO) and a virtual machine cluster. The role will be part of the Digital Innovation and Marine Autonomy (DIMA) group within PML.
DIMA is a pioneering digital science group dedicated to advancing PML's world-class, cutting-edge environmental research through the use of state-of-the-art digital and autonomous technologies. The team comprises research software engineers, research infrastructure engineers, marine technologists and scientists who work on a variety of projects using autonomous vessels, satellite data, drones, Artificial Intelligence, High Performance Computing and data visualisation tools to help deliver PML's goals. The team has an enthusiasm for solving problems through collaboration and shared learning.
12/04/2024