Data Engineer

Upwork  ·  US  ·  $62k/yr - $83k/yr

About the Role

Location: Remote; must reside in Australia (Melbourne/Sydney)

Reports to: Program Manager and CTO

Type: Fixed Term Contract

Duration: 6+ months

Role Purpose

The Data Engineer will design, build, and integrate data from multiple sources to deliver a trusted data infrastructure that supports the data and analytics capabilities of one of our clients. The position develops processes for data acquisition, transformation, migration, verification, modelling, and mining.

The core purpose of this position is to migrate data from an existing AWS Redshift instance to a new greenfield Snowflake instance. This will include, but is not limited to, building SQL/Python scripts and processes, designing and modelling the environment, orchestration, and integrations. A sketch of the core migration pattern follows.
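To make that pattern concrete, here is a minimal Python sketch of one common approach: UNLOAD a table from Redshift to S3 as Parquet, then COPY it into Snowflake from an external stage. It is illustrative only; it assumes the psycopg2 and snowflake-connector-python libraries, and every connection detail, table, bucket, IAM role, and stage name is a hypothetical placeholder rather than anything from the client's environment.

    # Sketch: move one table from Redshift to Snowflake via S3.
    # All names, credentials, and ARNs below are hypothetical placeholders.
    import psycopg2                # Redshift speaks the Postgres wire protocol
    import snowflake.connector     # from the snowflake-connector-python package

    TABLE = "analytics.orders"                                   # hypothetical
    S3_PREFIX = "s3://example-migration-bucket/orders/"          # hypothetical
    IAM_ROLE = "arn:aws:iam::123456789012:role/redshift-unload"  # hypothetical

    def unload_from_redshift():
        # Export the table to S3 as Parquet with Redshift's native UNLOAD.
        conn = psycopg2.connect(
            host="example-cluster.ap-southeast-2.redshift.amazonaws.com",
            port=5439, dbname="prod", user="migrator", password="...",
        )
        with conn, conn.cursor() as cur:
            cur.execute(f"""
                UNLOAD ('SELECT * FROM {TABLE}')
                TO '{S3_PREFIX}'
                IAM_ROLE '{IAM_ROLE}'
                FORMAT AS PARQUET;
            """)
        conn.close()

    def load_into_snowflake():
        # Load the Parquet files from an external stage (assumed to point
        # at the same bucket) into the Bronze layer with COPY INTO.
        conn = snowflake.connector.connect(
            account="example_account", user="migrator", password="...",
            warehouse="LOAD_WH", database="BRONZE", schema="RAW",
        )
        with conn.cursor() as cur:
            cur.execute("""
                COPY INTO BRONZE.RAW.ORDERS
                FROM @migration_stage/orders/
                FILE_FORMAT = (TYPE = PARQUET)
                MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
            """)
        conn.close()

    if __name__ == "__main__":
        unload_from_redshift()
        load_into_snowflake()

In practice each table would be driven from a migration manifest and validated after load, but the UNLOAD-then-COPY shape is the core of the migration work described above.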

Key Responsibilities

• Analysis of stored procedures, functions, views, and other SQL objects.

• Analysis of Tableau reports/dashboards and their corresponding data sources.

• Translating reports into data and functional requirements.

• Designing and data modelling the to-be platform.

• Building of SQL code to process data between layers (Bronze, Silver, Gold) and apply transformations.

• Orchestration of SQL code using Airflow and/or Talend (see the sketch after this list).

• Organizing and managing tasks through a Scrum Agile methodology.

• Running UAT sessions with end stakeholders.

• Deployments of code between environments, including DevOps integration through Bitbucket.

• Writing of solution documentation and supporting maintenance of developed solutions.

• Executing test scripts and unit tests of developed code within the environment.

• Collaboration with overall project team and business stakeholders.
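The medallion-layer builds and their orchestration, referenced in the responsibilities above, might be wired together roughly as in the Airflow sketch below. The DAG id, connection id, schemas, tables, and SQL are hypothetical placeholders; SnowflakeOperator is assumed to come from the apache-airflow-providers-snowflake package, and Talend could fill the same orchestration role.

    # Sketch: orchestrate Bronze -> Silver -> Gold builds as an Airflow DAG.
    # Connection id, schemas, tables, and SQL are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    with DAG(
        dag_id="medallion_refresh",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Bronze: land raw files from the external stage unchanged.
        load_bronze = SnowflakeOperator(
            task_id="load_bronze",
            snowflake_conn_id="snowflake_default",  # assumed Airflow connection
            sql="""
                COPY INTO BRONZE.RAW.ORDERS
                FROM @migration_stage/orders/
                FILE_FORMAT = (TYPE = PARQUET)
                MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
            """,
        )

        # Silver: cleaned, typed, deduplicated records.
        build_silver = SnowflakeOperator(
            task_id="build_silver",
            snowflake_conn_id="snowflake_default",
            sql="""
                CREATE OR REPLACE TABLE SILVER.CORE.ORDERS AS
                SELECT DISTINCT
                    order_id,
                    customer_id,
                    CAST(order_ts AS TIMESTAMP) AS order_ts,
                    amount
                FROM BRONZE.RAW.ORDERS
                WHERE order_id IS NOT NULL;
            """,
        )

        # Gold: business-level aggregates that reports/dashboards read.
        build_gold = SnowflakeOperator(
            task_id="build_gold",
            snowflake_conn_id="snowflake_default",
            sql="""
                CREATE OR REPLACE TABLE GOLD.MART.DAILY_REVENUE AS
                SELECT DATE_TRUNC('day', order_ts) AS order_date,
                       SUM(amount) AS revenue
                FROM SILVER.CORE.ORDERS
                GROUP BY 1;
            """,
        )

        load_bronze >> build_silver >> build_gold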

Essential Knowledge, Skills, and Experience

• Bachelor’s Degree in Data Science, Computer Science, Engineering, Statistics, Mathematics, Actuarial Science, or a similar field.

• Minimum 3 years of experience as a Data Engineer designing, building, and optimizing data pipelines, architecture and data sets.

• Extensive experience building processes for data transformation and data structures, and for the manipulation, processing, and extraction of value from datasets.

• Extensive experience using SQL and Python.

• Experience with data pipeline and workflow management tools, and with cloud data platforms.

• Knowledge of data platform concepts such as data lakes, data warehousing, big data processing, real-time processing architecture, and the scheduling and monitoring of ETL/ELT workflows.

• Experience with Snowflake and Relational Databases.

• Experience with Talend and/or Airflow would be highly advantageous.

• Knowledge of Agile Software Development methodologies.

• Strong organizing and prioritizing skills with the ability to work across multiple work streams simultaneously.

• Strong process orientation with solid documentation skills.

• Strong analytical and problem-solving skills.
