ML with Differential Privacy using JAX and DP-SGD

A tutorial on JAX: a great tool for DP-SGD

Differential Privacy
Deep Learning
Machine Learning
Nicolas Grislain

In this notebook 📔 we demonstrate how powerful JAX-based DP-SGD is, combined with the dp-accounting library, for training deep learning models with Differential Privacy (DP). We show how to use these tools on a small toy problem, histogram estimation, and compare this generic approach to some well-known benchmarks.
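To give a flavour of what the notebook does, here is a rough sketch of a single DP-SGD update in JAX: per-example gradients are obtained with vmap, each is clipped to a fixed L2 norm, they are summed, and Gaussian noise calibrated to that clipping threshold is added before the parameter update. The function and parameter names below are illustrative placeholders, not the notebook's exact code.

import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def dp_sgd_step(params, x, y, loss_fn, l2_clip, noise_multiplier, lr, key):
    # Per-example gradients: vmap the gradient of the per-example loss
    # over the batch dimension of (x, y).
    per_example_grads = jax.vmap(jax.grad(loss_fn), in_axes=(None, 0, 0))(params, x, y)

    # Flatten each example's gradient pytree into a single vector.
    flat_params, unravel = ravel_pytree(params)
    flat_grads = jax.vmap(lambda g: ravel_pytree(g)[0])(per_example_grads)

    # Clip each per-example gradient to an L2 norm of at most l2_clip.
    norms = jnp.linalg.norm(flat_grads, axis=1, keepdims=True)
    clipped = flat_grads * jnp.minimum(1.0, l2_clip / (norms + 1e-12))

    # Sum, add Gaussian noise calibrated to the clipping threshold, and average.
    noise = noise_multiplier * l2_clip * jax.random.normal(key, flat_params.shape)
    noisy_mean = (clipped.sum(axis=0) + noise) / x.shape[0]

    # Plain SGD step on the noisy average gradient.
    return unravel(flat_params - lr * noisy_mean)

Wrapped in jax.jit, the whole update, per-example clipping included, compiles into a single XLA program, which is a large part of what makes JAX so convenient for DP-SGD.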

Along the way, we run into some of the problems commonly faced when tuning the hyper-parameters of DP-SGD, in particular its clipping threshold.
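The noise multiplier, batch size and number of steps, in turn, determine how much privacy budget is spent. The dp-accounting library can translate them into an (ε, δ) guarantee along these lines (the numerical values are only an example):

import dp_accounting

def spent_epsilon(noise_multiplier, batch_size, dataset_size, steps, delta=1e-5):
    # Each DP-SGD step is a Poisson-subsampled Gaussian mechanism.
    event = dp_accounting.PoissonSampledDpEvent(
        sampling_probability=batch_size / dataset_size,
        event=dp_accounting.GaussianDpEvent(noise_multiplier),
    )
    # Compose the per-step events with an RDP accountant and convert to epsilon.
    accountant = dp_accounting.rdp.RdpAccountant()
    accountant.compose(event, steps)
    return accountant.get_epsilon(delta)

# Illustrative values: sigma = 1.1, batches of 256 out of 50,000 records, 10,000 steps.
print(spent_epsilon(noise_multiplier=1.1, batch_size=256,
                    dataset_size=50_000, steps=10_000))

Note that the clipping threshold itself does not appear in the accounting: the privacy guarantee depends only on the noise multiplier (the ratio of noise scale to clipping norm), the sampling probability and the number of steps. The threshold is purely a utility hyper-parameter, and getting it wrong hurts the model rather than the privacy.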

If you own privacy-sensitive data and would like more people to use it without having to worry about privacy risks, you might be interested in what Sarus Technologies does.

Read the notebook 📔

Per-sample gradient clipping may add a strong bias to our model.
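A toy computation with made-up numbers illustrates this: when the threshold sits far below the typical per-example gradient norm, the clipped average is shrunk well away from the true average.

import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
# Hypothetical per-example gradients, all pointing in roughly the same direction,
# with L2 norms around 8-9.
grads = 5.0 + 0.5 * jax.random.normal(key, (1000, 3))

clip_norm = 1.0  # a threshold well below the typical gradient norm
norms = jnp.linalg.norm(grads, axis=1, keepdims=True)
clipped = grads * jnp.minimum(1.0, clip_norm / norms)

print(grads.mean(axis=0))    # close to [5, 5, 5]
print(clipped.mean(axis=0))  # shrunk to roughly [0.58, 0.58, 0.58]

Raising the threshold reduces this bias, but since the noise scale is proportional to the threshold, it also increases the noise injected at a fixed privacy level.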

About the author

Nicolas Grislain

Cofounder & CSO @ Sarus
