llbbcc/DPDFD

Code snippet of DPDFD (IJCAI23: Model Conversion via Differentially Private Data-Free Distillation)


DPDFD

This repository contains a code snippet from our IJCAI 2023 paper *Model Conversion via Differentially Private Data-Free Distillation*. It is the core code for achieving differential privacy and can be applied directly to any distillation process to train privacy-preserving student models.

Running this code requires a pre-trained teacher and images (either private data or synthetic data produced by a generator); the remaining arguments are hyperparameters. Running it yields a differentially private output `s_out_new`, which can be regarded as a differentially private label predicted by the teacher(s) and used to update the student.
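As a rough illustration of how such a differentially private teacher output is typically produced, the sketch below clips each per-sample teacher output to a bounded L2 norm and adds Gaussian noise. The function and parameter names (`dp_teacher_output`, `clip_norm`, `sigma`) are illustrative assumptions, not the repository's actual API; see the code and paper for the exact mechanism.

```python
import numpy as np

def dp_teacher_output(s_out, clip_norm=1.0, sigma=1.0, rng=None):
    """Hedged sketch of a Gaussian-mechanism step on teacher outputs.

    Clips each row of `s_out` to L2 norm at most `clip_norm` (bounding
    per-sample sensitivity), then adds Gaussian noise scaled by
    `sigma * clip_norm`. All names here are illustrative assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    s_out = np.asarray(s_out, dtype=float)
    # Per-row L2 norms, kept as a column so broadcasting works below.
    norms = np.linalg.norm(s_out, axis=-1, keepdims=True)
    # Scale down any row whose norm exceeds clip_norm; leave others as-is.
    clipped = s_out * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    # Gaussian noise calibrated to the clipping bound (the sensitivity).
    noise = rng.normal(0.0, sigma * clip_norm, size=clipped.shape)
    return clipped + noise
```

The noisy result plays the role of `s_out_new`: a privatized label the student can be trained against with an ordinary distillation loss.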

The calculation of the privacy budget can be found in our paper.
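The paper's own budget analysis should be followed for this method; purely as general background, the classical calibration of the Gaussian mechanism relates the noise scale to an (ε, δ) guarantee for a single query. The helper below is standard textbook material, not the paper's derivation.

```python
import math

def gaussian_mechanism_sigma(epsilon, delta, sensitivity=1.0):
    """Classical single-query Gaussian-mechanism calibration (background only).

    Returns the noise standard deviation satisfying
    sigma >= sqrt(2 ln(1.25 / delta)) * sensitivity / epsilon,
    which gives (epsilon, delta)-differential privacy for one release.
    """
    return math.sqrt(2.0 * math.log(1.25 / delta)) * sensitivity / epsilon
```

For example, ε = 1 and δ = 1e-5 with unit sensitivity require σ ≈ 4.84; halving ε roughly doubles the required noise.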
