Photo: Kevin Murnane
The result is immediate performance improvements for the user, subsequent improvements for all users, and increased privacy. Google calls the technique federated learning, and it has the potential to be a game changer.
There are two main processes involved in deep learning: training and inference. The network learns during training and uses what it has learned to draw inferences from data. Sophisticated deep learning networks are trained on massive data sets that often reside on multiple machines located in data centers. The amount of data and computation needed for training precludes training on mobile devices.
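The split between the two phases can be made concrete with a minimal sketch. The toy model below (a single weight `w` fit by gradient descent, an illustrative stand-in, not any production network) is first trained on a data set and then used for inference on new input:

```python
import numpy as np

# Toy "data set": inputs x and targets y, where the true relationship is y = 3x.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 100)
y = 3.0 * x

# Training phase: learn the weight w from the data by gradient descent on
# mean squared error. This is the expensive, data-hungry step.
w = 0.0
for _ in range(200):
    grad = np.mean(2 * (w * x - y) * x)  # d/dw of mean((w*x - y)^2)
    w -= 0.5 * grad

# Inference phase: apply the learned weight to a new input. This step is
# cheap and needs no training data at all.
prediction = w * 2.0  # close to 6.0
```

The asymmetry is the point: training loops over the whole data set many times, while inference is a single cheap pass, which is why inference can live on a phone even when training cannot.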
This doesn’t mean that apps can’t benefit from deep learning; they can and do, as anyone who has used Google Search or Assistant on a smartphone knows. The way this usually works is that training takes place in the cloud and inference happens on the user’s phone. User data is sent from the phone to the cloud, where the deep learning model lives. Data from millions of users is used to train the model, and an improved version of this shared model is pushed down to users’ phones. The version of the model that lives on the user’s phone carries out inference.
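That cloud-train, device-infer architecture can be sketched in a few lines. The class names (`CloudModel`, `Phone`) and the single-weight model are hypothetical simplifications for illustration, not any real Google API:

```python
import numpy as np

class CloudModel:
    """Shared model that lives in the cloud and trains on pooled user data."""
    def __init__(self):
        self.w = 0.0

    def train(self, x, y, steps=200, lr=0.5):
        # Gradient descent on mean squared error for the toy model y = w*x.
        for _ in range(steps):
            self.w -= lr * np.mean(2 * (self.w * x - y) * x)

class Phone:
    """Holds a local copy of the shared model and runs inference on-device."""
    def __init__(self):
        self.w = None

    def receive_model(self, w):
        self.w = w  # improved shared model pushed down from the cloud

    def infer(self, x):
        return self.w * x  # inference happens locally

# Data from many users is uploaded and pooled in the cloud (simulated here
# with a true relationship of y = 2x).
rng = np.random.default_rng(1)
pooled_x = rng.uniform(0, 1, 1000)
pooled_y = 2.0 * pooled_x

cloud = CloudModel()
cloud.train(pooled_x, pooled_y)

# The trained shared model is pushed to each user's phone for inference.
phone = Phone()
phone.receive_model(cloud.w)
result = phone.infer(1.0)  # close to 2.0
```

Note what moves over the network in this scheme: raw user data goes up, and model weights come down. Federated learning, described next, reverses the first half of that exchange.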
In a system like this, the model on the user’s device doesn’t get better by learning directly from the user; it improves by learning from all users. In other words, the user’s model isn’t tuned to how the user uses the app; it’s tuned to how everyone uses the app. Federated learning changes this by allowing the user’s model to learn from both the user and everyone else.
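The core idea can be sketched with a minimal federated-averaging loop, in the spirit of Google's FedAvg algorithm: each device fine-tunes the shared model on its own private data, and only the resulting weights (never the raw data) are sent up to be averaged into a new shared model. All names and the single-weight model here are illustrative assumptions:

```python
import numpy as np

def local_update(w, x, y, steps=50, lr=0.5):
    """Fine-tune the shared weight w on one user's private, on-device data."""
    for _ in range(steps):
        w -= lr * np.mean(2 * (w * x - y) * x)
    return w

# Three users with slightly different behavior: their private data follow
# y = t*x for t = 1.8, 2.0, 2.2. The data never leaves each "phone".
rng = np.random.default_rng(2)
user_truths = [1.8, 2.0, 2.2]
user_data = []
for t in user_truths:
    x = rng.uniform(0, 1, 200)
    user_data.append((x, t * x))

shared_w = 0.0
for _ in range(5):  # federated rounds
    # Each device trains locally, starting from the current shared model.
    local_ws = [local_update(shared_w, x, y) for x, y in user_data]
    # The server averages the uploaded weights into a new shared model.
    shared_w = float(np.mean(local_ws))
# shared_w ends up near 2.0, the average of the users' behavior.
```

Each phone's locally fine-tuned weights are tuned to that user, while the averaged `shared_w` captures what everyone has in common, which is how the model learns from both the user and everyone else at once.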