
The Rhino Federated Computing Platform allows enterprises to set up computation pipelines on distributed data sources in days, not months - while still respecting confidentiality, privacy and data sovereignty. Rhino FCP is powered by Edge Computing and Federated Learning - innovative techniques that ‘bring code to the data’, training AI models locally in order to arrive at better outcomes for applications such as drug discovery and fighting financial crime.
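The 'bring code to the data' idea described above is commonly realized as federated averaging (FedAvg): each site trains a model on its own data, only the model weights leave the site, and a coordinator averages them. The following is a minimal, self-contained sketch of that pattern using simple logistic-regression clients on synthetic data; the client setup, learning rate, and round counts are illustrative assumptions, not details of Rhino FCP itself.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: full-batch logistic-regression
    gradient descent. The raw data (X, y) never leaves the client."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid predictions
        grad = X.T @ (preds - y) / len(y)       # log-loss gradient
        w -= lr * grad
    return w

def federated_average(global_w, clients):
    """FedAvg round: every site trains locally from the shared weights,
    and the server averages the results weighted by sample count."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Two hypothetical sites, each holding private synthetic data.
rng = np.random.default_rng(0)
clients = []
for _ in range(2):
    X = rng.normal(size=(100, 3))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    clients.append((X, y))

w = np.zeros(3)
for _ in range(10):          # ten federated rounds
    w = federated_average(w, clients)
```

Only `w` is exchanged each round; the per-site `(X, y)` arrays stay local, which is the property that lets such pipelines respect data sovereignty.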

Headquarters: Boston (R&D center in Tel Aviv)
Core product: Rhino Federated Computing Platform — enterprise federated learning & federated computing
Founded / launch: Launched publicly February 2021
Industry: Software Development
Funding: $5M Seed announced at company launch; $15M Series A (May 22, 2025), led by AlleyCorp and reported as oversubscribed with multiple participating investors
Focus: Privacy-preserving distributed data collaboration and federated learning for regulated and siloed datasets.
About Rhino Federated Computing
Rhino solves one of the biggest challenges in AI: seamlessly connecting siloed data through federated computing. The Rhino Federated Computing Platform (Rhino FCP) serves as the ‘data collaboration tech stack’, extending from providing computing resources to data preparation & discoverability, to model development & monitoring - all in a secure, privacy-preserving environment.
To do this, Rhino FCP offers a flexible architecture (multi-cloud and on-prem hardware), end-to-end data management workflows (multimodal data, schema definition, harmonization, and visualization), and privacy-enhancing technologies (e.g., differential privacy), and supports the secure deployment of custom code & third-party applications via persistent data pipelines.
Rhino is trusted by more than 60 leading organizations worldwide - including 14 of the 20 hospitals on Newsweek’s ‘Best Smart Hospitals’ list and top-20 global biopharma companies - and is extending this foundation into financial services, ecommerce, and beyond. The company is headquartered in Boston, with an R&D center in Tel Aviv.
About the role
We are looking for an Applied Data Scientist to join our growing R&D team. You will play a key role in developing the AI capabilities that power our platform, while also acting as a hands-on practitioner who tests and validates our technology across diverse use cases.
In this role, you will balance the immediate needs of a fast-growing startup with longer-term data science initiatives. You will be responsible for building internal tools that automate complex data workflows, as well as developing and fine-tuning models that demonstrate the full potential of federated computing.
You will work with a wide range of technologies - from integrating off-the-shelf LLM APIs to fine-tuning state-of-the-art deep learning models - and collaborate closely with Product and Engineering to improve the platform based on your hands-on experience.
Day-to-day responsibilities:
Develop Internal AI Engines:
Research and implement intelligent tools to automate data mapping, harmonization, and user assistance pipelines using Generative AI and LLMs.
End-to-End Model Execution:
Take ownership of diverse modeling tasks (NLP, Computer Vision, Tabular) from data collection and preparation to training, fine-tuning, and validation.
Platform Validation & "Customer Zero":
Stress-test the Rhino platform by implementing various ML workflows (both federated and centralized) to ensure robustness and identify gaps before they reach the customer.
Support & Innovation:
Assist in solving complex data science challenges while simultaneously researching new methods to enhance our core technology.
Product Collaboration:
Provide feedback to the product team on UI/UX and feature requirements based on your deep technical usage of the system.
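The data mapping and harmonization responsibility above can be pictured as a column-mapping pipeline: given incoming column names from a source dataset, propose the matching field in a shared target schema. In a production pipeline the mapping decision would come from an LLM API call with a prompt like the one built below; this sketch stubs that call with standard-library fuzzy matching so it runs standalone. The target field names, cutoff threshold, and sample columns are all hypothetical placeholders.

```python
import difflib
import json

# Hypothetical target schema that incoming data is harmonized into.
TARGET_FIELDS = ["patient_id", "birth_date", "systolic_bp", "hemoglobin"]

def llm_map_column(column_name, candidates):
    """Placeholder for an LLM call. A real pipeline would send the
    prompt below to an LLM API and parse a structured JSON answer;
    here we fall back to fuzzy string matching to stay runnable."""
    prompt = (
        "Map the source column to the best target field, or null.\n"
        f"Source column: {column_name}\nTargets: {json.dumps(candidates)}"
    )  # built for illustration; unused by the stub below
    normalized = column_name.lower().replace(" ", "_")
    matches = difflib.get_close_matches(normalized, candidates, n=1, cutoff=0.6)
    return matches[0] if matches else None

def harmonize(source_columns):
    """Propose a source -> target mapping for every incoming column;
    unmappable columns map to None for human review."""
    return {col: llm_map_column(col, TARGET_FIELDS) for col in source_columns}

mapping = harmonize(["Patient ID", "Birth Date", "SystolicBP", "shoe_size"])
```

Keeping unmapped columns as `None` rather than forcing a best guess is a deliberate choice: harmonization tools of this kind typically surface low-confidence mappings for a human in the loop instead of silently mislabeling data.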
About the candidate
This role is for a fast learner who loves technology and is capable of executing quickly without losing sight of the bigger picture. We are looking for a versatile data scientist who can choose the right tool for the job - whether it’s prompt engineering for an LLM, statistical modeling, or training a deep neural network.
Requirements:
4+ years of professional experience in Data Science or Applied Machine Learning.
Strong proficiency in Python and experience with modern ML frameworks (e.g., PyTorch, TensorFlow, Scikit-learn).
Generative AI & LLM Expertise:
Proven experience working with LLM APIs (OpenAI, Anthropic, etc.), prompt engineering, and building functional AI-driven pipelines.
Strong software practices within Data/ML workflows:
including clean code structure, modular design, reproducibility, and the ability to transition exploratory work into well-organized, maintainable code.
Adaptability & Versatility:
Ability to switch contexts between different domains (NLP, Image Processing, Structured Data) and tasks.
Model Lifecycle Knowledge:
Experience with data curation, model fine-tuning, and rigorous evaluation.
Startup Mindset:
Ability to prioritize effectively in a dynamic environment, balancing "quick wins" for delivery with robust development for the long term.
Creative Problem Solving:
Demonstrated ability to find innovative solutions to complex data or modeling constraints.
Advantages: