Differential Privacy
Differential privacy is a framework for sharing aggregate information about a dataset: it describes the patterns of groups within the dataset while withholding information about individuals. Differential privacy bounds how much any single record in a training dataset can influence the output of a machine-learning model. Concretely, if a single record is added to or removed from the dataset, the distribution of the model's outputs must not change beyond a fixed threshold, controlled by a privacy parameter ε. This guarantee limits membership inference: an observer looking at the output cannot reliably tell whether any particular record was part of the training data.
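A minimal sketch of this idea is the Laplace mechanism applied to a count query. The example below is illustrative, not a production implementation: the helper names (`laplace_noise`, `private_count`) are assumptions for the sketch, and it relies on the standard fact that a count query has sensitivity 1 (adding or removing one record changes the true count by at most 1), so noise drawn from Laplace(1/ε) masks any individual's contribution.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Sample from Laplace(0, scale) via inverse-CDF transform,
    # using only the standard library.
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    # A count query has sensitivity 1: one record changes the true
    # count by at most 1, so Laplace(1/epsilon) noise suffices.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical usage: count records with value >= 40, with epsilon = 0.5.
data = list(range(100))
noisy = private_count(data, lambda x: x >= 40, epsilon=0.5)
```

Smaller ε means a larger noise scale and stronger privacy; larger ε means a more accurate count but a weaker guarantee.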