6 June

Kafka & Integration Engineer

Little Birdie - Melbourne, VIC

IT
Source: uWorkin

JOB DESCRIPTION

Are you a strong Integration Engineer looking to make your mark? This is an opportunity not to be taken lightly! If you're passionate about retail, sales and building a business that puts the customer at the core, then we've got the best role going around.


We're a start-up born out of our passion for supporting retail, e-commerce and, of course, the customer. We always think outside the box to ensure we capture every opportunity to deliver the ultimate experience to our customers and our people. We want our people to feel confident and have a voice every day they come to work. This is not your ordinary job – it’s the opportunity to spread your wings (no pun intended).


Not to mention you'll be working with a small, fun and friendly team who have seen and done it all in the online space over the past 15 years! This is a great opportunity to work with and learn from some huge figures in the e-commerce space!


In this awesome role you will:

  • Enable the business to move data between platforms by using technologies such as Kafka, Lambda and Tray.io.
  • Build the resilient, secure, high-performance infrastructure required for CDC and high-volume data processing from a wide variety of data sources (see the sketch after this list).
  • Support the development and architecture of applications and new solutions with your deep knowledge of data structures.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
  • Contribute to building analytics tools that utilize the data pipeline to provide actionable insights into the planning system performance.
  • Work with stakeholders including the Executive, Product and Data teams to resolve data-related technical issues and support their data infrastructure needs.
  • Collaborate with data and analytics experts to deliver greater functionality in our data systems.
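
By way of illustration only (this is not part of the role description), here is a minimal sketch of the kind of CDC consumption mentioned above: reading Debezium-style change events from a Kafka topic in Python and routing them to downstream handling. The topic name, broker address, consumer group and routing logic are all hypothetical placeholders, and the client library (kafka-python) is just one of several you could use.

```python
# Minimal sketch: consume Debezium-style CDC events from Kafka (kafka-python).
# Topic name, broker address, group id and routing logic are illustrative only.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "inventory.public.products",           # hypothetical CDC topic
    bootstrap_servers=["localhost:9092"],  # assumed local broker
    group_id="cdc-sync",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")) if m else None,
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    if event is None:            # tombstone record (key-only delete marker)
        continue
    payload = event.get("payload", event)
    op = payload.get("op")       # 'c' = create, 'u' = update, 'd' = delete
    if op in ("c", "u"):
        row = payload.get("after")
        # hand the new/updated row to a downstream platform here
    elif op == "d":
        row = payload.get("before")
        # propagate the delete downstream here
```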


To get your resume noticed, your bag of tricks will include:

  • 3+ years in a similar data-driven role within automation, integration and implementation projects.
  • 3+ years of building, managing and maintaining Apache Kafka and associated technologies.
  • 3+ years' experience working within AWS or similar.
  • Experience with Debezium is highly advantageous.
  • Exposure to Tray.io or a similar workflow automation platform.
  • Ability to write Python for managing and manipulating data.
  • Experience building high-throughput, low-latency CDC pipelines that output data for multiple consumers.
  • Ability to articulate and present complex information in an easy-to-understand format.
  • Experience in writing, designing, developing and debugging complex SQL queries (a small Python/SQL sketch follows this list).
  • Ability to turn ideas into solutions.
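
Purely as an illustration of the Python-plus-SQL side of the list above (not an assessment task), here is a self-contained sketch using Python's standard-library sqlite3 module; the table, columns and sample rows are made up, and a real pipeline would of course target your actual warehouse or database rather than an in-memory SQLite store.

```python
# Self-contained sketch: managing data with Python and SQL (stdlib sqlite3).
# Table, columns and sample rows are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE price_events (product_id INTEGER, price REAL, seen_at TEXT);
    INSERT INTO price_events VALUES
        (1, 99.00, '2021-06-01'),
        (1, 89.00, '2021-06-02'),
        (2, 45.50, '2021-06-01');
""")

# Latest observed price per product, via a correlated subquery.
query = """
    SELECT product_id, price
    FROM price_events AS p
    WHERE seen_at = (SELECT MAX(seen_at)
                     FROM price_events
                     WHERE product_id = p.product_id)
    ORDER BY product_id;
"""
for product_id, price in conn.execute(query):
    print(product_id, price)

conn.close()
```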


What we are not:

  • Into building process for the sake of process.
  • Micro-managers! We want someone genuinely confident in their own abilities to bring their ideas to the table and really take ownership of this site.


The extra stuff we know you want to know

As part of the Little Birdie culture, we know that to soar in this industry, it’s important to celebrate our wins and recognise our mistakes. Without a doubt, we want someone who isn’t afraid to share their opinions and ideas no matter how silly they might be! 


In return, and aside from the amazing team you’ll get to work with, we offer a flexible working environment to ensure work-life balance. 


It’s a seriously exciting time to be getting in at the ground level - and a genuine career maker. If you think you want to join us and be part of something great, do it and apply now!