
PeakData helps pharmaceutical companies learn more about their customers through our agile targeting and segmentation SaaS platform. PeakData is used by more than half of the top 20 pharmaceuticals…

Founded: 2018
Headquarters: Zug, Switzerland
Product: AI-driven HCP profiling, influence mapping, targeting and analytics for pharma
Customers: Used by over half of the top 20 pharmaceutical companies
Latest known funding: Series A announced Aug 22, 2022
Focus: Healthcare professional (HCP) identification, profiling and targeting for pharmaceutical commercial and medical teams
Sector: Healthcare analytics / SaaS
Funding: AlbionVC led the Series A; other investors include Octopus Ventures and Heal Capital
About PeakData
PeakData provides AI-powered market intelligence that helps pharmaceutical companies
optimize drug launch execution and resource allocation. Our platform delivers actionable
insights on healthcare professionals (HCPs) and healthcare organizations (HCOs), giving
commercial leaders data-driven tools for decision-making.
The Team
The engineering team in Poland is structured around three areas: Data Science (2 people),
Backend/Data Engineering (4 people), and Frontend (2 people). You'd be joining the
backend/data engineering team.
You'll work closely with the Data Science team on handoffs from experimentation to production, with the Product Manager on scope and priorities, and with the broader backend team on platform services where pipeline logic intersects with the product.
Role Overview
We're looking for a Senior Data Engineer to own data pipeline work end-to-end: from building and maintaining production pipelines to integrating new data sources and contributing to backend services that power the platform.
You'll be working on: developing new pipeline components as we revamp and upgrade our
data flows, cleaning up and sunsetting legacy services, maintaining and improving
infrastructure, and eventually integrating open data sources. When DS work produces a
new model or data package, you're the person who makes it production-ready.
This is a hands-on engineering role. Not research, not analytics.
Technical Environment
• Python: core language for everything
• SQL: daily use
• Docker: containerized services throughout
• AWS: Lambda, ECS, S3, Step Functions, CDK (experience a plus, not required from day one)
• Data Stores: PostgreSQL, DynamoDB, BigQuery
• IaC: Terraform / CDK (nice to have)
• Orchestration: Argo (or similar; willingness to learn matters more than prior experience)
What You'll Own
• Build new pipeline components as we upgrade and refactor data flows
• Maintain and improve existing data pipelines, including sunset work on legacy services
• Integrate new data package sources after DS teams define the spec; you make it production-grade
• Collaborate with data scientists on handoffs from experimentation to production
• Keep the infrastructure running: monitoring, alerting, reliability
• Contribute to backend services where data and platform logic intersect with the product
What We're Looking For
Must-have:
• 4+ years in data engineering or backend engineering
• Strong Python: production-grade, not just scripts or notebooks
• SQL: comfortable with complex queries, schema design, performance considerations
• Docker: you've shipped containerized services, not just run them locally
Nice to have:
• pandas / data wrangling experience
• AWS (any depth: Lambda, S3, ECS)
• Terraform or CDK
• Experience with orchestration tools (Argo, Airflow, Prefect, or similar)
• Prior work in regulated or data-sensitive environments (life sciences, healthcare, finance)
What matters more than stack coverage:
• You take ownership of what you ship and monitor it
• You're used to working from a data spec that's 80% complete: you can fill the gaps yourself and flag the ones you can't
• You can work across the DS-to-backend boundary without needing a translator
What We're Not Looking For
• Full-stack generalists who drift toward frontend
• People who wait for a ticket to tell them something is broken
What we offer
Work that matters: build solutions that support faster and more effective healthcare outcomes.
Hands-on AI work: practical experience with LLMs and modern AI applications used in real business scenarios.
Flexible B2B compensation: €30–40/h depending on seniority (sole proprietorship only). Net invoicing, VAT settled on the Swiss side. Payments in EUR.
Remote-first setup: work from anywhere in Poland. Love good coffee and teamwork? You're always welcome at our Wrocław office.
Long-term projects with global pharma partners: stability, clear goals and meaningful work.
Growth opportunities: develop your skills, explore new tools and deepen your expertise in healthcare and data.
Join us and make your data skills matter.
Build your career in a stable international team shaping the future of healthcare.