Welcome to the Sarus Blog. Here we discuss product updates, industry examples, privacy, data protection, data governance, math, machine learning, and much more!
Thanks to Differentially Private LLM Fine Tuning
Deep dive into using Sarus inside Azure Confidential Clean Rooms to combine multiple parties' privacy-sensitive data.
This video series guides you through the steps of using Sarus, from installation to fine-tuning LLMs with privacy guarantees.
Lindsey Allen, VP of Engineering and GM of Azure-Databricks AI & Innovations, joins Sarus as an advisor, bringing 30+ years of experience in data and AI.
Based on work on Differentially Private In-Context Learning (DP-ICL), we developed and tested an algorithm that brings DP guarantees to Retrieval-Augmented Generation (RAG).
Learn how to use a Python library to fine-tune an LLM in Databricks, protecting sensitive data with Differential Privacy.
Fine-tuning is more accessible than ever, thanks to services such as OpenAI’s. But fine-tuned GPT-4o-mini models are blabbermouths.
The untapped value of connected car data.
You can fine-tune an LLM to learn new knowledge from private data, ensuring that no sensitive records are at risk of being regurgitated.
Language models classify well but memorize even better, posing privacy risks.
Sarus Activate lets data scientists analyze and act on private data without viewing it, ensuring privacy-by-design in workflows across various industries.
Sarus just released "medical_extended", a benchmark dataset for studying privacy-preserving AI.