Job details

Location
Parramatta
Salary
AU$1000 - AU$1100 per day + Super
Job Type
Contract
Ref
37808_1655186396
Contact
Laurie Weeks
Contact email
Email Laurie
Contact phone
0405 306 634
Posted
14 days ago

Role: Senior Azure Data Engineer
Daily Rate: AU$1000 - AU$1100 + Super
Duration: 6 months (high chance of extension)
Location: Parramatta/WFH flex

The Role

As Senior Data Engineer, you will be responsible for developing and maintaining the data architecture, data models and standards for various Data Integration and Data Warehouse projects in Azure. You will ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. In addition, you will develop and maintain documentation of the data architecture, data flows and data warehouse data models suitable for both technical and business audiences.

Key Accountabilities

  • Provide technical expertise in the Azure Data Platform
  • Deliver Data Warehouse, Data Lake and Lakehouse implementations
  • Develop ELT pipelines in and out of the data lake using Azure Data Factory, Databricks and Delta Lake
  • Promote changes between environments using Azure DevOps Pipelines
  • Integrate on-premises infrastructure with public cloud Azure infrastructure
  • Partner with Data Architects, Product Managers, Data Modellers, Business Users and Scrum Masters to deliver the data integrations and BI solutions required for various projects
  • Review work delivered by vendors and ensure it complies with Sydney Water guidelines
  • Proactively identify improvements to ways of working within the team and with stakeholders to achieve better outcomes for Sydney Water

About You

  • A minimum of 5 years in IT, including 4+ years of hands-on experience as an Azure Data Engineer and 4+ years on Data Warehousing, ETL/ELT, BI and Analytics projects
  • Extensive design and implementation experience with the Azure cloud data platform and modern data warehousing, including data security, Azure Logging and Monitoring, Azure Databricks, Azure Blob Storage, Azure Data Lake, Azure Data Factory, Azure SQL Database and Azure DevOps
  • Expertise in data modelling, ELT using ADF, implementing complex views and stored procedures, and standard DWH and ETL/ELT concepts
  • Proven experience working with Spark on large data volumes (preferably in Databricks)
  • Proven understanding of the Lakehouse concept and how Delta Lake is used to realise it
  • Deep understanding of relational and NoSQL data stores, methods and approaches (star and snowflake schemas, dimensional modelling)
  • Experience with data security and the design of data access controls
  • Proficiency in Python (PySpark), SQL (ANSI and Spark SQL) and Scala, including performance tuning and troubleshooting
  • Proficiency in building and optimising Azure DevOps Pipelines is desirable
  • Experience working in a DataOps model is preferred
  • Ability to resolve an extensive range of complex data pipeline-related problems, both proactively and as issues surface
  • Experience with Agile development methodologies

If this sounds like you, contact Laurie at Talenza, or APPLY today!