A shorter but still interesting mix this week, with two pillars of private machine learning: homomorphic encryption and differential privacy!
- Model-Agnostic Private Learning via Stability
More work on ensuring privacy of training data via differentially private query mechanisms. Compared to the paper from a few weeks ago, this one focuses on "algorithms that are agnostic to the underlying learning problem [with] formal utility guarantees [and] provable accuracy guarantees".
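A common building block behind this line of work (e.g. in PATE-style private aggregation) is answering a query by a noisy vote over models trained on disjoint data splits. The sketch below is illustrative only, not the paper's algorithm; `noisy_argmax`, the vote counts, and the noise scale `2/epsilon` are my own assumptions for the demo.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample centered Laplace noise via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_argmax(votes: list[int], epsilon: float) -> int:
    """Differentially private argmax over vote counts.

    One changed training example moves at most two counts by one, so
    Laplace noise of scale 2/epsilon on each count is a standard choice.
    """
    noisy = [v + laplace_noise(2.0 / epsilon) for v in votes]
    return max(range(len(noisy)), key=noisy.__getitem__)

random.seed(0)  # deterministic for the demo
# 100 of 103 "teacher" models agree on label 0; the margin dwarfs the noise.
print(noisy_argmax([100, 2, 1], epsilon=1.0))  # 0
```

The privacy cost is paid per answered query, which is why these schemes budget a fixed number of labeling queries.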
- Homomorphic Encryption for Speaker Recognition: Protection of Biometric Templates and Vendor Model Parameters
The Paillier cryptosystem is used to securely evaluate simplified similarity functions, so that users don't leak biometric information during authentication. Performance numbers are included.
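Paillier is additively homomorphic: multiplying ciphertexts adds plaintexts, and raising a ciphertext to a scalar multiplies the plaintext. That is enough to evaluate a linear similarity score over an encrypted template. A minimal sketch, assuming toy fixed primes (completely insecure, for illustration only) and a made-up 3-entry template; the paper's actual protocol and parameters will differ:

```python
import math
import random

# Toy Paillier keypair: small fixed primes, no real security.
p, q = 65537, 65539
n = p * q
n2 = n * n
g = n + 1                                        # standard generator choice
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)

def L(x: int) -> int:
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

def he_add(c1: int, c2: int) -> int:
    """Enc(m1) * Enc(m2) mod n^2  ==  Enc(m1 + m2)."""
    return (c1 * c2) % n2

def he_scale(c: int, k: int) -> int:
    """Enc(m) ^ k mod n^2  ==  Enc(k * m)."""
    return pow(c, k, n2)

# Encrypted inner product: the server combines encrypted template
# entries with its plaintext weights without seeing the template.
template = [3, 1, 4]   # client's secret biometric features (made up)
weights = [2, 5, 7]    # server-side model weights (made up)
cs = [encrypt(x) for x in template]
acc = encrypt(0)
for c, w in zip(cs, weights):
    acc = he_add(acc, he_scale(c, w))
print(decrypt(acc))  # 2*3 + 5*1 + 7*4 = 39
```

Note this sketch only hides the template from the server; protecting the vendor's model parameters as well, as the paper's title promises, requires more machinery.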
- Efficient Determination of Equivalence for Encrypted Data
A reminder that even a simpler task, such as privately linking identities and records together, is still relevant in industry.
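One common industrial pattern for equality-only matching (not necessarily the paper's scheme) is tokenizing identifiers with a keyed hash under a shared secret, so records can be joined on tokens without either side revealing raw identifiers to the matcher. A sketch, with a made-up key and email lists:

```python
import hmac
import hashlib

def link_token(key: bytes, identifier: str) -> str:
    """Keyed hash of a normalized identifier. Equal inputs give equal
    tokens, so parties can join on tokens instead of raw emails."""
    norm = identifier.strip().lower()
    return hmac.new(key, norm.encode(), hashlib.sha256).hexdigest()

key = b"shared-secret-between-the-two-parties"  # hypothetical
party_a = ["alice@example.com", "Bob@Example.com", "carol@example.com"]
party_b = ["bob@example.com", "dave@example.com"]

tokens_a = {link_token(key, e) for e in party_a}
tokens_b = {link_token(key, e) for e in party_b}
print(len(tokens_a & tokens_b))  # 1 -- only Bob appears in both lists
```

Normalization before hashing matters: "Bob@Example.com" and "bob@example.com" must map to the same token or the link is silently missed.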
- The Morning Paper: When coding style survives compilation
Anonymity is hard! Random forests can be trained to identify your coding style from source code as well as from compiled programs.
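The paper uses random forests over rich lexical and syntactic features; as a much cruder stand-in, here is a nearest-profile classifier over character trigrams, just to make the stylometry idea concrete. The two "authors" and all snippets are invented for the demo:

```python
from collections import Counter
import math

def trigrams(code: str) -> Counter:
    """Character-trigram counts: a crude proxy for real stylometric
    features (AST patterns, layout, identifier habits)."""
    return Counter(code[i:i + 3] for i in range(len(code) - 2))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical training snippets with distinctive habits per author.
profiles = {
    "A": trigrams("for(int i=0;i<n;i++){sum+=arr[i];}")
         + trigrams("for(int j=0;j<m;j++){tot+=vec[j];}"),
    "B": trigrams("for i in range(n):\n    total += arr[i]")
         + trigrams("for j in range(m):\n    result += vec[j]"),
}

def attribute(snippet: str) -> str:
    t = trigrams(snippet)
    return max(profiles, key=lambda author: cosine(t, profiles[author]))

print(attribute("for(int k=0;k<p;k++){acc+=buf[k];}"))  # "A"
```

Even this toy picks up on habits like brace placement and loop idioms, which is exactly why stripping comments and renaming identifiers is not enough to stay anonymous.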