We're partnering with a rapidly evolving leader in the Logistics and Supply Chain sector, dedicated to optimising complex global operations and improving efficiency through data. For the right candidate with the necessary skills and experience, we are pleased to offer 482 visa sponsorship.
This client requires a Senior Data Engineer to act as a technical authority for their entire data platform, built on AWS. You will take the lead in architecting, building, and governing highly scalable streaming and batch data pipelines that ingest, transform, and serve critical operational and analytical data. You will champion engineering best practices and mentor junior team members, directly enabling complex business intelligence and machine learning initiatives.
What You'll Do
- Architect and implement end-to-end data solutions on AWS using services such as S3, Glue, EMR, Kinesis, DynamoDB, and Redshift/Snowflake.
- Lead the design of the data lake and data warehouse, ensuring data governance, quality, security, and compliance are maintained.
- Develop and optimise robust ETL/ELT data pipelines using Python or Scala.
- Champion DevOps practices for data, including automated testing, CI/CD, and Infrastructure as Code (IaC) using Terraform or CloudFormation.
- Analyse complex data requirements and provide strategic guidance on data modelling, schema optimisation, and query performance tuning.
- Mentor and guide junior data engineers on technical design, coding standards, and cloud engineering best practices.
- Proactively troubleshoot and resolve complex data platform issues, ensuring high availability and reliability of critical data assets.
What You'll Bring
- 6+ years of progressive professional experience in Data Engineering, with at least 2 years operating in a senior or lead capacity.
- Expert-level, hands-on experience with the AWS data ecosystem (S3, Glue, EMR, Kinesis/Kafka, Redshift) is mandatory.
- Expert-level proficiency in Python or Scala for data processing and pipeline development.
- Advanced proficiency in SQL, dimensional modelling, and data warehousing concepts.
- Proven experience implementing Infrastructure as Code (IaC) using Terraform or CloudFormation.
- Experience with real-time data streaming technologies (e.g., Kafka, AWS Kinesis).
- Strong understanding of data governance, security principles, and cost optimisation within cloud data platforms.
- Exceptional communication and leadership skills, with the ability to drive technical decisions across teams.