Job Summary: We are looking for a talented Data Scientist who can characterise business problems, develop data-driven solutions, and communicate insights effectively to stakeholders. The successful candidate will have a strong foundation in statistics, programming skills, and experience with big data platforms. This role requires excellent problem-solving skills, leadership abilities, and the ability to work collaboratively with teams.
Requirements:
Education: Bachelor's/Master's degree in Machine Learning, Computer Science, Statistics, or a related field. Preference will be given to candidates with a strong educational background and relevant certifications in these fields.
Skills: Strong foundation in statistics and programming (R/Python). Experience with data preparation, visualisation, and model building. Knowledge of big data platforms (Hadoop, Spark) and SQL/NoSQL databases.
Experience: 3+ years of experience as a Data Scientist or in a related role.
Typical Responsibilities: Develop and maintain data products. Data Engineering teams are responsible for the delivery and operational stability of the data products built, and provide ongoing support for those products. Data Engineers work within, and contribute to, the overall data development life cycle as part of multi-functional Agile delivery teams focused on one or more products.
Data Scientists should have the following skills:
Data science foundation - a data scientist must be able to:
- Characterise a business problem
- Formulate a hypothesis
- Demonstrate the use of methodologies in the analytics cycle
- Plan for the execution
Understanding the data science workflow and recognising the importance of each element of the process is critical for successful implementations.
Statistics and programming foundation (Analysis & Visualisation) - the competencies in this area focus on knowledge of the key statistical concepts and methods essential to finding structure in data and making predictions. Programming skills (R/Python, or other statistical programming languages) are essential, as is the ability to visualise data, extract insights and communicate those insights clearly and concisely.
Data preparation - to ensure usable data sets, the key competencies required are:
- Identifying and collecting the data required
- Manipulating, transforming and cleaning the data
A data scientist must deal with data anomalies such as missing values, outliers, unbalanced data and data normalisation.
Model building - this stage is the core of data science execution, where different algorithms are trained on the data and the best one is selected. A data scientist should know:
- Multiple modelling techniques
- Model validation and selection techniques
A data scientist must understand how to use different methodologies to gain insight from the data and translate that insight into business value.
Model deployment - an ML model is valuable only when it is integrated into an existing production environment and used to make business decisions. Deploying a validated model and monitoring it to maintain the accuracy of its results is a key skill.
Big data foundation - a data scientist deals with large volumes of structured and unstructured data, and must demonstrate an understanding of how big data is used, the big data ecosystem and its major components. The data scientist must also demonstrate expertise with big data platforms such as Hadoop and Spark, and mastery of SQL and NoSQL.
Leadership and professional development - data scientists must be good problem solvers. They must understand the opportunity before implementing the solution, work in a rigorous and complete manner, and explain their findings.
A data scientist also needs to understand how to analyse business risk, how to improve processes, and how systems engineering works.
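The data preparation and model-building competencies described above can be sketched briefly in Python. Everything here is illustrative: the toy data set, column names, and model choices are assumptions for the example, not part of the posting.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Toy data exhibiting the anomalies mentioned above: a missing value and an outlier
df = pd.DataFrame({
    "age": [25.0, None, 40.0, 33.0, 900.0, 29.0],   # None = missing, 900 = outlier
    "income": [30.0, 45.0, 52.0, 38.0, 41.0, 36.0],
    "churned": [0, 1, 0, 0, 1, 1],
})

# Data preparation: impute missing values, then clip outliers to plausible bounds
df["age"] = df["age"].fillna(df["age"].median())
df["age"] = df["age"].clip(lower=18, upper=100)

# Model building: train several algorithms and select the best by cross-validation
X, y = df[["age", "income"]].values, df["churned"].values
models = {
    "logistic": LogisticRegression(max_iter=1000),
    "forest": RandomForestClassifier(n_estimators=25, random_state=0),
}
scores = {name: cross_val_score(m, X, y, cv=2).mean() for name, m in models.items()}
best = max(scores, key=scores.get)
```

In practice the same shape scales up: imputation and outlier handling happen once in a reusable preparation step, and model selection compares candidates on held-out data rather than training accuracy.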
20/12/2024
Full time
Python Developer - Hybrid working
We are working with one of our world-renowned clients, who are looking to recruit an experienced Python Developer.
Job Responsibilities:
- Develop robust ETL data pipelines for large-scale data into SQL and NoSQL systems
- Maintain and optimize AWS RDS, S3, and other data storage systems
- Collaborate with data analysts and data scientists to implement analytics and ML models through AWS
- Implement CI/CD standards and tools
- Oversee the work of junior developers
- Plan, oversee, and implement sprint plans for multiple development efforts using Agile processes
- Work within an AWS ecosystem, leveraging cloud services for scalable applications
- Develop and maintain Python-based back-end services
- Drive code reviews and contribute to technical documentation
About You:
- 5-7 years of experience in Python-centric development
- Excellent proficiency in Python and common ML-oriented packages
- Significant experience with AWS PostgreSQL RDS, including database design, optimization, and management
- Significant experience with AWS services and cloud architecture
- Strong understanding of database technologies (SQL and NoSQL)
- Expertise in machine learning concepts and data visualization techniques
- Strong understanding of RESTful API design and implementation
Due to the volume of applications received for positions, it will not be possible to respond to all applications, and only applicants who are considered suitable for interview will be contacted. Proactive Appointments Limited operates as an employment agency and employment business and is an equal opportunities organisation. We take our obligations to protect your personal data very seriously. Any information provided to us will be processed as detailed in our Privacy Notice, a copy of which can be found on our website.
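A minimal sketch of the kind of ETL pipeline this role describes, using only the standard library: an in-memory CSV stands in for an S3 object body, and SQLite stands in for an RDS/PostgreSQL target. Table names and data are invented for the example.

```python
import csv
import io
import sqlite3

# Extract: read raw records (stand-in for downloading an object from S3)
raw = io.StringIO("id,amount\n1,10.5\n2,\n3,7.25\n")

# Transform: parse types and drop records with a missing amount
rows = []
for rec in csv.DictReader(raw):
    if rec["amount"]:
        rows.append((int(rec["id"]), float(rec["amount"])))

# Load: write the cleaned records into SQL (stand-in for an RDS connection)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
```

A production version would swap the stand-ins for `boto3` and a PostgreSQL driver, add error handling and logging, and run under an orchestrator, but the extract/transform/load shape is the same.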
19/12/2024
Full time
An Edinburgh-based tech start-up, working in the tech for good space, is looking for a skilled Python Software Engineer to join their hybrid team - genuinely interesting subject matter and real variety in work. They've been running for a few years now and are really starting to make a name for themselves; they have one core product and develop a series of applications that are critical within the research community in their field. Everything you'll be working on will be used by academic researchers, providing them with tools and the ability to analyse their data to help form conclusions - they predominantly work in the environmental space. You'll work in a multidisciplinary team of Data Scientists and Software Engineers, and will experience real variety in your role. You'll spend part of your time working on their core product, be tasked with developing multiple tools and applications from scratch, and help to maintain and enhance existing applications. They work in a pretty fast-paced environment due to the nature of their project work, so they are looking for someone who enjoys this style of working. As the project work is pretty varied, their tech stack is similarly varied. Predominantly they work in Python (moving towards FastAPI), host applications on GCP within a Linux environment, and tend to use ReactJS for the front end with a MongoDB database. However, they're looking for a creative and curious Software Engineer to join the business, and if you feel a different technology would better suit a project you genuinely have the ability to suggest and implement it.
You'll ideally have commercial experience with most of the following:
* Python
* JavaScript
* Cloud Services
* CI/CD
* NoSQL Databases
The following experience is highly desirable:
* Working within academic research/strong academic background
* Working with Algorithms
* ReactJS
This role would suit an ambitious Software Engineer looking for a challenging role: you'll be able to pick up as much responsibility as you crave here, and will be expected to work pretty independently on technical projects. You'll also be able to get involved in requirements gathering, prototyping, system design and even suggesting new technologies. They're able to offer a salary of £40k to £50k for this role, with a series of benefits to match. Their offices are based in central Edinburgh, a short walk from Haymarket and Waverley train stations. The team supports hybrid working, where you'll be expected onsite about once a week (although most are regularly in more often - out of choice), and they also offer very flexible working arrangements. If you're keen to find out more, please apply or drop Doug Paget at Cathcart Technology a message.
19/12/2024
Full time
Senior Level - SFIA5
Salary: £50k-65k dependent on experience
Location: Coventry/Hybrid
About Scrumconnect: Scrumconnect is a leading force in technology consultancy, proudly contributing to over 20% of the UK's most significant citizen-facing public services. Our award-winning team has made a substantial impact, delivering more than 64 services in the past two years alone. This work has not only reached over 50 million citizens but also achieved considerable savings for the taxpayer, amounting to over £25 million. At Scrumconnect, we foster a community of talented consultants who thrive on collaboration, sharing knowledge, and continuous learning to address and solve complex challenges. Our mission is to combine advanced software engineering, human-focused design, and data-driven insights to deliver unparalleled service to our clients.
A lead data scientist is a leader of data science, quite often with responsibility for managing and developing teams. At this role level, you will:
- have a broad knowledge of data science techniques, use cases and potential impact, as well as the tools and technologies
- have extensive experience in scoping, designing and delivering data science outputs and products
- work collaboratively with a range of experts in support of organisational objectives
- communicate effectively and challenge delivery plans and priorities
- appreciate and understand data ethics, data preparation and manipulation
- appreciate and understand delivery methods, and how to deliver supported solutions at scale
Skills:
Applied maths practices (Level: Expert)
- identify opportunities to develop statistical insight, reports and models to support organisational objectives, while collaborating across the organisation
- effectively critique statistical analyses
- use a variety of data analytics techniques (such as data mining and prescriptive and predictive analytics) for complex data analysis through the whole data life cycle
- use model outputs to produce evidence and help design services and policies
- understand a broad range of statistical tools, particularly those deployed within the organisation, and can use these appropriately and help others to use them
Data Engineering (Level: Expert)
- help to identify the data engineering requirements for any data science product, while working with data engineers and data scientists to design and deliver those products into the organisation effectively
- understand the need to cleanse and prepare data before including it in data science products, and can put reusable processes and checks in place
- understand a broad range of architectures, including cloud and on-premise, and data manipulation and transformation tools deployed within the organisation, and can use these tools appropriately and help others use them
Data Science Innovation (Level: Expert)
- be a leader in the data science space
- demonstrate in-depth knowledge of data science tools and techniques, which you can use to solve problems creatively and to create opportunities for your team
- act as a coach, inspiring curiosity and creativity in others
- demonstrate in-depth knowledge of your chosen profession and keep up to date with changes in the industry
- challenge the status quo and always look for ways to improve data science
One example of such work is proven experience building a Recurrent Neural Network (in either R or Python) from underlying data sets, after performing EDA.
Delivering business impact (Level: Practitioner)
- lead and support your organisation area by using data science to create change
- identify opportunities to develop data science products to support organisational objectives, while collaborating across the organisation to fulfil goals
- show an understanding of the role of user research, and design and manage processes to gather and establish user needs
- communicate relevant and compelling stories effectively and present analysis and data visualisations clearly to get across complex messages
- work with colleagues to implement scalable data science products, and understand maintenance requirements
Ethics and Privacy (Level: Expert)
- show an understanding of how ethical issues fit into a wider context and work with relevant stakeholders
- stay up to date with developments in data ethics standards and legislation frameworks, using these to improve processes in your work area
- identify and respond to ethical concerns in your area of responsibility
Programming for Data Science (Level: Expert)
- write complex programs and scripts
- seek to make code open source where appropriate
- supervise junior analysts and set coding standards for your team
- understand software architecture and how to write efficient, optimised code
- perform user testing on products prior to launch
Product Delivery (Level: Expert)
- understand the differences between delivery methods, such as Agile and waterfall, and set out how your team should use and adapt these methods
- lead a team through the different phases of the product delivery life cycle
- collaborate with the product manager to influence the direction of work
- identify and involve relevant teams to smoothly deliver data science products into the organisation, ensuring these products inform decision making
- ensure products are monitored, maintained and continually improved, engaging and working with others where necessary
- have oversight of any data science features implemented within products or services
Knowledge of Public Sector Standards:
Government Digital Service (GDS): familiarity with GDS service standards and the Technology Code of Practice. These skills reflect the need for both technical depth and the ability to navigate the unique demands of the UK public sector environment.
Desired Qualifications:
- Certifications in Azure, Databricks, or related technologies
- Experience with public sector data initiatives and compliance requirements
- Proven expertise and experience with machine learning and artificial intelligence concepts
What our offer includes:
- 28 days holiday inc. bank holidays
- 1 day Birthday leave after 1 year's service
- 2 additional days after 2 years' service
- Pension: 4% employee, 3% employer
- BUPA Health Cover
- AIG Life Cover
- Rewards Gateway
- On-the-job training
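The Recurrent Neural Network example mentioned in the posting reduces to the recurrence h_t = tanh(W_xh x_t + W_hh h_{t-1} + b). The following is a minimal NumPy sketch of just that forward pass; the dimensions and random weights are illustrative, and a real delivery would use a framework (e.g. PyTorch or Keras) and include training on the prepared data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: 3 input features, 5 hidden units, a sequence of 4 steps
n_in, n_hidden, seq_len = 3, 5, 4
Wxh = rng.normal(scale=0.1, size=(n_hidden, n_in))   # input-to-hidden weights
Whh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden-to-hidden weights
b = np.zeros(n_hidden)

def rnn_forward(xs):
    """Vanilla RNN forward pass: h_t = tanh(Wxh @ x_t + Whh @ h_{t-1} + b)."""
    h = np.zeros(n_hidden)
    states = []
    for x in xs:
        h = np.tanh(Wxh @ x + Whh @ h + b)
        states.append(h)
    return np.stack(states)

xs = rng.normal(size=(seq_len, n_in))
states = rnn_forward(xs)  # one hidden state per time step
```

The key property the recurrence gives you is that each hidden state depends on all earlier inputs in the sequence, which is what makes RNNs suitable for ordered data.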
17/12/2024
Full time
NO SPONSORSHIP
Associate Principal, Data Analytics Engineering
SALARY: $110k flex plus 10% bonus
LOCATION: Chicago, IL - Hybrid, 3 days in office and 2 days remote
You will be expanding analytics capabilities to design and build internal analytics within the data warehouse using on-premises and cloud-based tools. You will create dashboards or visualizations using tools such as Tableau, Power BI, SQL queries, Alteryx, Jira, and ServiceNow. Git is a big plus, as is experience with AWS or a cloud data warehouse and Airflow. A BS degree is required, with a Master's preferred. This role sits within operational risk; 5 years' experience building dashboards is required, and any audit/risk knowledge is a plus.
This role will drive a team responsible for expanding analytics capabilities by making internal corporate data accessible and usable to analysts throughout the organization.
Primary Duties and Responsibilities:
- Work closely with data analysts and business stakeholders to understand their data needs and provide support in data access, data preparation, and ad hoc queries
- Automate data processes to reduce manual interventions, improve data processing efficiency and optimize data workflows for performance and scalability
- Integrate data from multiple sources and ensure data consistency and quality
- Build data models to ensure information is available in our analytics warehouse for downstream uses, such as analysis, and create dashboards or visualizations using Tableau or Power BI to present insights
- Maintain performance requirements of our analytics warehouse by tuning optimizations and processes
- Create documentation and testing to ensure data is accurate and easily understandable
- Promote self-service capabilities and data literacy for business users leveraging the platform through development of training presentations and resources
- Discover and share best practices for data and analytics engineering with members of the team
- Invest in your continued learning on data and analytics engineering best practices and evaluate them for fit in improving the maintainability and reliability of analytics infrastructure
Qualifications:
- Ability to collaborate with multiple partners (eg, Corporate Risk, Compliance, Audit, Production Operations, DBAs, Data Architecture, Security) to craft solutions that align business goals with internal security and development standards
- Ability to communicate technical concepts to audiences with varying levels of technical background and synthesize non-technical requests into technical output
- Comfortable supporting business analysts on high-priority projects
- High attention to detail and ability to think structurally about a solution
- Experience working within an agile environment
Technical Skills & Background:
- Ability to write and optimize complex analytical (SELECT) SQL queries
- Experience with data viz/prep tools Tableau and Alteryx [Preferred]
- Experience with SaaS tools and their backends, such as Jira and ServiceNow [Preferred]
- Applied knowledge of Python for writing custom pipeline code (virtual environments, functional programming, and unit testing) [Preferred]
- Experience with a source code repository system (preferably Git) [Preferred]
- Familiarity with at least one cloud data platform, such as AWS or GCP [Preferred]
- Experience creating and/or maintaining a cloud data warehouse or database [Preferred]
- Exposure to data orchestration tools, such as Airflow [Preferred]
- Understanding of applied statistics and hands-on experience applying these concepts
- Bachelor's degree in a quantitative discipline (eg, Statistics, Computer Science, Mathematics, Physics, Electrical Engineering, Industrial Engineering) or equivalent professional experience
- 5+ years of experience as a business analyst, data analyst, data engineer, research analyst, analytics engineer, Business Intelligence analyst, or data scientist
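As a small illustration of the "complex analytical (SELECT) SQL queries" requirement above, here is an aggregate query with grouping, a distinct count, and ordering, run through Python's sqlite3 module. The table and data are invented for the example; a warehouse version would run the same SQL against larger tables.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, category TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(1, "a", 10.0), (1, "b", 5.0), (2, "a", 7.5), (2, "a", 2.5)],
)

# Analytical SELECT: per-category distinct-user counts and totals, largest first
query = """
    SELECT category,
           COUNT(DISTINCT user_id) AS users,
           SUM(amount)             AS total
    FROM events
    GROUP BY category
    ORDER BY total DESC
"""
rows = conn.execute(query).fetchall()
```

Optimizing queries like this typically means checking the plan (`EXPLAIN QUERY PLAN` in SQLite, `EXPLAIN` elsewhere) and indexing the grouped and filtered columns.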
16/12/2024
Full time