G & L Consulting
Romania
Job Title: Data Specialist x 4 Open Positions
Location: Romania (remote working permissible; however, you MUST be able to travel to the client site in Romania if required)
Duration: Initial 3-month contract
Start Date: Immediate/ASAP
Daily Rate: Negotiable (paid in euros)

Job Description

Who are we:
We create incredible digital experiences that change the world for our customers. We develop and deploy disruptive solutions which help create more advocates, create a voice distinct from the competition and anticipate disruption. We are powered by our core values of Customer Success, We Care, Entrepreneurial (mindset) and Excellence, which form our DNA and drive everything that we do for customers and employees. As part of our Analytics practice, you will partner with clients to build experiential data science products at scale and speed to bring their vision to life. You will work with energetic and passionate technologists fuelled by data, fun and curiosity who discover solutions hidden in large datasets.

A day in the life of a Data Specialist with our company:
You will partner closely with clients to deliver state-of-the-art outcomes by engineering solutions. You will continue to expand your expertise in Data Science, Big Data, Analytics and Visualization tools and techniques daily to enable incremental innovation in ongoing projects. At all times, you will clearly articulate and communicate with a diverse set of internal and external customers with varying degrees of technical proficiency, and deliver critical business and process-related information to mitigate any risks or failures. You will persistently look for opportunities to address customer needs by being a thought partner in every moment of engagement.

For you to be successful, you must:
- Build partnerships within and outside the team regardless of formal authority.
- Break down personal knowledge silos and empower others by mentoring and fostering an environment of growth.
- Create value by anticipating and meeting the needs of internal and external customers, delivering high-quality results and being accountable for outcomes.
- Be open and flexible to accommodate and implement new ideas, understand business complexities, nurture innovation, and challenge the status quo persistently.
- Grasp the data that is available, pay attention to detail and strive to be a subject matter expert in your chosen area of specialty through continuous learning and improvement.

What you will bring to the table:
- Ability to work in ambiguous situations with unstructured problems and anticipate potential issues/risks.
- Demonstrated experience in building data pipelines in data analytics implementations such as Data Lake and Data Warehouse.
- At least 2 instances of end-to-end implementation of a data processing pipeline (a minimal illustration appears after this list).
- Experience configuring or developing custom code components for data ingestion, data processing and data provisioning, using big data and distributed computing platforms such as Hadoop/Spark, and cloud platforms such as AWS or Azure.
- Hands-on experience developing enterprise solutions, including designing and building frameworks, enterprise patterns, database design and development, in 2 or more of the following areas:
  - End-to-end implementation of a cloud data engineering solution on AWS (EC2, S3, EMR, Spectrum, DynamoDB, RDS, Redshift, Glue, Kinesis) or Azure (Azure SQL DW, Azure Data Factory, HDInsight, Cosmos DB, PostgreSQL, SQL on Azure)
  - End-to-end implementation of a big data solution on the Cloudera/Hortonworks/MapR ecosystem
  - Real-time solutions using Spark Streaming and Kafka/Apache Pulsar/Kinesis
  - Distributed compute solutions (Spark/Storm/Hive/Impala)
  - Distributed storage and NoSQL storage (Cassandra, MongoDB, DataStax)
  - Batch solutions and distributed computing using ETL/ELT (SSIS/Informatica/Talend/Spark SQL/Spark DataFrames/AWS Glue/ADF)
  - DW-BI (MSBI, Oracle, Teradata), data modelling, performance tuning, memory optimization/DB partitioning
  - Frameworks, reusable components, accelerators, CI/CD automation
  - Languages (Python, Scala)
- Proficiency in data modelling, for both structured and unstructured data, across the various layers of storage.
- Ability to collaborate closely with business analysts, architects and client stakeholders to create technical specifications.
- Ensure the quality of delivered code components by employing unit testing and test automation techniques, including CI in DevOps environments.
- Ability to profile data, assess data quality in the context of business rules, and incorporate validation and certification mechanisms to ensure data quality.
- Ability to review technical deliverables, and mentor and drive technical teams to deliver quality work.
- Understand system architecture and provide component-level design specifications, covering both high-level and low-level design.
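To give candidates a feel for the level expected, here is a minimal sketch of an end-to-end batch pipeline using the Spark DataFrame API (ingest, process, provision). It is purely illustrative: the bucket names, paths and columns are hypothetical assumptions, not details of any client engagement.

```python
# Illustrative only: a minimal batch pipeline with the Spark DataFrame API.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-batch-pipeline").getOrCreate()

# Ingest: read raw CSV landed in a (hypothetical) data-lake folder.
orders = (spark.read
          .option("header", True)
          .option("inferSchema", True)
          .csv("s3a://example-bucket/raw/orders/"))

# Process: basic data-quality filter plus a daily revenue aggregate.
daily_revenue = (orders
                 .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
                 .withColumn("order_date", F.to_date("order_ts"))
                 .groupBy("order_date")
                 .agg(F.sum("amount").alias("revenue"),
                      F.count("*").alias("order_count")))

# Provision: write a partitioned Parquet table for downstream BI consumption.
(daily_revenue.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3a://example-bucket/curated/daily_revenue/"))

spark.stop()
```

In practice the role covers the same ingest-process-provision pattern at larger scale, with orchestration, data-quality checks and CI/CD around it.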