Data Engineer

2 months ago


Remote, Austria • CYOS Solutions • Full-time
Application closing date: Wednesday, 11 September 2024 • 11:59 pm, Canberra time

Estimated start date: Tuesday, 01 October 2024

Location of work: ACT

Working arrangements: Onsite at least 3 days a week, plus a mandatory all-day face-to-face team meeting once a fortnight

Length of contract: Until 30 June 2025

Contract extensions: 1x 12 months

Security clearance: Must have Baseline

Rates: $120 - $150 per hour (inc. super)

The Department of Agriculture, Fisheries and Forestry (DAFF) is looking for Data Engineers to join the Digital Transformation Program in the Australian Bureau of Agricultural and Resource Economics and Sciences (ABARES) to work across several data and analytics platforms. We are seeking candidates with strong experience in developing Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) processes, and/or in the development of data products and complex data visualisations.

The role will be responsible for design, development and unit testing activities across several data movement, data transformation, and data visualisation processes within DAFF. The data movement and transformation processes focus on preparing data for use in decision-making across the department, utilising modern cloud technology (Azure) to enable operational analytics use cases.

The successful candidate will require experience with the following techniques and technologies:

MS Azure Stack
- Data Integration – Data Factory, SQL Server Integration Services and/or Databricks, Logic App and/or Function App
- Data Store – SQL Server and/or Data Lake Storage, Unity Catalog (Medallion Architecture)
- Analytics – Azure Databricks, Azure Machine Learning, ArcGIS Enterprise
- Development tools – DevOps, Visual Studio, VS Code
- Data technology solutions – sourcing (flat file, Ingres, Azure, SQL Server), collecting, ingesting and storing
- SQL, Python, R, .NET

Data Preparation
- Transformation of data into formats tailored to analytics use cases – Parquet and/or Delta (a brief illustrative sketch follows this list)

Data Visualisation
- Analyse and interpret complex data sets to identify trends and patterns
- Design principles to create appropriate visualisations for the target audience
- Visualisation tools – Power BI
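
To give a flavour of the transformation work described above, here is a minimal PySpark sketch of ingesting a flat file and landing it as a Delta table in a medallion-style layer. The storage paths, column names, and cleansing steps are invented for the example and do not reflect DAFF's actual pipelines.

```python
# Minimal PySpark sketch: ingest a flat file and land it as a Delta table.
# Paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw CSV read as-is from an illustrative landing zone
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("abfss://landing@examplelake.dfs.core.windows.net/commodities/*.csv")
)

# Silver: light cleansing and typing tailored to downstream analytics
cleaned = (
    raw.dropDuplicates()
       .withColumn("report_date", F.to_date("report_date", "yyyy-MM-dd"))
       .filter(F.col("report_date").isNotNull())
)

# Write as Delta (assumes a Databricks/Delta-enabled Spark session)
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://silver@examplelake.dfs.core.windows.net/commodities")
)
```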

Essential Criteria

Demonstrated experience developing ETL/ELT processes for complex and/or large data movement, transformation and/or visualisations, particularly in a cloud environment.
Experience preparing data optimised for query performance in cloud compute engines, e.g. distributed computing engines (Spark), Azure SQL, Python, R (see the partitioning sketch below).
Experience working with Engineering, Storage and Analytics services in cloud infrastructure.
Experience working collaboratively in an agile development team.
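
As an illustration of the query-performance criterion, the sketch below partitions a Delta table by a frequently filtered column so Spark can prune files at read time. The table locations and column names are hypothetical, not part of the role description.

```python
# Illustrative only: partition a Delta table by a commonly filtered column
# so queries that filter on it read far fewer files.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical silver-layer table
df = spark.read.format("delta").load("/mnt/silver/commodities")

# Derive a partition column and rewrite the table partitioned by it
partitioned = df.withColumn("report_year", F.year("report_date"))

(
    partitioned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("report_year")   # enables partition pruning on year filters
    .save("/mnt/gold/commodities_by_year")
)

# A query filtering on the partition column now scans only matching partitions:
# spark.read.format("delta").load("/mnt/gold/commodities_by_year") \
#      .where("report_year = 2024").count()
```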