Traditional collaborative learning approaches are based on sharing model weights between clients and a server. However, schemes based on sharing activations offer advantages in resource efficiency. Several differentially private methods have been developed for sharing weights, but no such mechanisms yet exist for sharing activations. We propose Power-Learning, which learns a privacy encoding network in conjunction with a small utility generation network such that the resulting activations carry formal differential privacy guarantees. These privatized activations are then shared with a more powerful server, which learns a post-processing model that achieves higher accuracy on machine learning tasks. We show that our co-design of collaborative and private learning requires only one round of privatized communication and less client-side compute than traditional methods. The privatized activations shared by the client are agnostic to the type of model the server uses to process them for a downstream task.
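To make the client/server split concrete, below is a minimal sketch of the pipeline described above, assuming the privatization step is a Gaussian mechanism applied to norm-clipped activations; the names `PrivacyEncoder`, `clip_and_privatize`, and the parameters `clip_norm`, `epsilon`, and `delta` are illustrative placeholders, not the paper's actual components or API.

```python
# Sketch of one-round privatized activation sharing (assumptions noted above).
import math
import torch
import torch.nn as nn

class PrivacyEncoder(nn.Module):
    """Small client-side network mapping raw inputs to activations (hypothetical)."""
    def __init__(self, in_dim: int, emb_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                 nn.Linear(128, emb_dim))

    def forward(self, x):
        return self.net(x)

def clip_and_privatize(acts, clip_norm, epsilon, delta):
    """Clip each activation to an L2 ball, then add calibrated Gaussian noise.

    Clipping bounds each record's L2 sensitivity by `clip_norm`, so the
    Gaussian mechanism with the noise scale below yields an
    (epsilon, delta)-DP release of each record's activation.
    """
    norms = acts.norm(dim=1, keepdim=True).clamp(min=1e-12)
    clipped = acts * (clip_norm / norms).clamp(max=1.0)
    sigma = clip_norm * math.sqrt(2 * math.log(1.25 / delta)) / epsilon
    return clipped + torch.randn_like(clipped) * sigma

# --- Client: a single round of privatized communication --------------------
encoder = PrivacyEncoder(in_dim=32, emb_dim=16)
x = torch.randn(64, 32)                        # private client data
private_acts = clip_and_privatize(encoder(x), clip_norm=1.0,
                                  epsilon=2.0, delta=1e-5)

# --- Server: post-processing cannot weaken the DP guarantee ----------------
server_head = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 10))
logits = server_head(private_acts)             # any server model may be used
```

Because differential privacy is closed under post-processing, the server is free to apply any model to the received activations, which is why the shared activations are agnostic to the server-side architecture.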
Our method transforms private data into embeddings through a carefully designed two-step process: