Multiple Operations Data Engineers are required to support and maintain enterprise data assets across a modern cloud-based data platform. These roles play a key part in ensuring the reliability, performance, and operational stability of data services, supporting both business-as-usual (BAU) delivery and critical processing cycles.
<br><br>
The positions focus on operational support of data pipelines, month-end processing, and responding to data-related requests, while ensuring service level agreements (SLAs) are consistently met across the platform.
<br><br>
<strong>Key Responsibilities</strong>
<ul>
<li>Support and maintain data assets across the cloud data lake, cloud EDW, legacy EDW, and analytics platforms</li>
<li>Provide Level 2/3 technical support across AWS, Control-M, Snowflake, Teradata (legacy), and ETL-based integration solutions</li>
<li>Support daily data delivery processes and month-end operational workloads</li>
<li>Develop and maintain SQL-based analytical and ETL code</li>
<li>Monitor and manage scheduling workflows using Control-M or similar tools</li>
<li>Work with file transfer solutions such as GoAnywhere or equivalent</li>
<li>Apply DevOps practices and tools in line with established delivery processes</li>
<li>Maintain system and technical documentation</li>
<li>Analyse, size, prioritise, and queue work items</li>
<li>Collaborate with data consumers, database developers, testers, and IT support teams</li>
<li>Drive continuous improvement in coding standards, delivery practices, and documentation, including knowledge sharing and training</li>
</ul>
<strong>Essential Skills & Experience</strong>
<ul>
<li>Strong experience with AWS data engineering services, including Glue, S3, Lambda, EC2, and RDS</li>
<li>Proven experience designing and developing data pipelines using ETL/ELT frameworks</li>
<li>Hands-on experience with Snowflake (cloud EDW) and/or Teradata (on-prem EDW)</li>
<li>Strong programming capability, particularly in Python</li>
<li>Experience with Control-M orchestration and monitoring, or similar scheduling tools</li>
<li>Strong background in operational support and BAU environments</li>
</ul>
<strong>Desirable Experience</strong>
<ul>
<li>Infrastructure-as-code exposure using CloudFormation or Terraform</li>
<li>Experience with ETL tools such as DBT, Talend, Informatica, or similar</li>
<li>Advanced SQL capability</li>
<li>Exposure to SAS platforms (Base, Enterprise Guide, or SAS Viya)</li>
</ul>
This role will be offered as an <strong>initial 12-month contract</strong> with the opportunity to extend.
<br><br>
Multiple roles are available in <strong>Sydney and Melbourne</strong> and are due to start in early February 2026.
<br><br>
To apply, please send your resume or contact Nat on 0430 292 875 or natalie@blackroc.co