AI Talent

AI Talent is an Australian-owned company established in 2010, specialising in providing exceptional AI professionals for impactful projects. With over 25 years of industry experience, they offer seamless access to a curated pool of AI talent for short- to long-term engagements. They headhunt globally, bringing world-class AI professionals to Australia; they are approved suppliers to government, work with blue-chip and Fortune 500 corporations, and hold strong partnerships with AWS and Microsoft. Their business model centres on directly employing AI experts to ensure high standards, with a flexible subcontracting model that lets clients select experts for their projects, reflecting their niche focus on AI and local onsite availability.

AI Talent · Artificial Intelligence · Australia · Cloud · Data Science · IT Staffing · Machine Learning · Recruitment · aitalent.com.au
HQ: Sydney, AU
Team Size: 48
Open Jobs: 28
Total Funding: -
Latest Fundraise: Unknown
Join the Team
AI and Automation Specialist
Hybrid • Sydney, New South Wales, AU
Backend Developer
Internship • Jerusalem
Technical Writer
Full-time • Hamburg, DE
Machine Learning Engineer
Part-time • Haifa
Technical Writer
Contract • Berlin, DE
Frontend Developer
Internship • Belgrade, RS
Software Engineer
Part-time • Tel Aviv
We're partnering with a vital leader in the Government Health sector, dedicated to optimising public health outcomes, resource allocation, and policy analysis through a modern, cloud-native data platform. For the right candidate with the necessary skills and experience, we are pleased to offer 482 visa sponsorship.

This client requires a Senior Databricks Developer to serve as a technical leader for their advanced Analytics and Research workloads. You will be instrumental in architecting and building highly scalable data pipelines using PySpark/Scala within the Databricks Lakehouse Platform. The role demands expertise in Delta Lake, performance optimisation, and strict data governance and security for sensitive patient data, adhering to all compliance and privacy regulations.
What You'll Do

- Lead the design and development of large-scale, resilient, and performant ETL/ELT data pipelines using PySpark/Scala within Databricks notebooks and jobs.
- Architect and manage the Delta Lake environment, focusing on data ingestion, quality enforcement (using Delta Live Tables or similar), and schema evolution for complex public health datasets.
- Optimise Databricks clusters, notebooks, and Spark jobs for cost-efficiency and performance, specifically targeting bottlenecks in high-volume batch and streaming workloads.
- Define and enforce data governance practices within the Lakehouse, utilising Unity Catalog for centralised metadata and access control, adhering to government standards.
- Collaborate closely with government analysts and data scientists to transition analytical models and research findings into scalable, production-ready pipelines.
- Champion CI/CD and MLOps practices for Databricks notebooks and workflows, utilising tools like Azure DevOps or Jenkins.
- Mentor and guide junior engineers on Databricks development standards, Spark optimisation, and modern data engineering practices.

What You'll Bring
- 6+ years of progressive professional experience in Data Engineering, with at least 3 years dedicated to developing solutions on the Databricks Platform.
- Expert-level proficiency in PySpark and/or Scala for distributed data processing.
- Mandatory hands-on experience with Delta Lake architecture, including DLT, time travel, and VACUUM operations.
- Deep understanding of cloud infrastructure (Azure preferred) and how Databricks integrates with cloud storage (ADLS Gen2) and services.
- Expert proficiency in SQL and dimensional modelling principles.
- Proven experience with CI/CD, Infrastructure as Code (e.g., Terraform), and Databricks command-line tools for automation.
- Exceptional communication and problem-solving skills, with the ability to analyse complex requirements and design resilient solutions in a highly regulated environment.