Advancing Private Federated Learning: Insights and Innovations from Research at Apple
Private Federated Learning (PFL) is an approach to collaboratively training a machine learning model across edge devices, coordinated by a central server, whilst preserving the privacy of the data on each device. PFL is an emerging field that has seen rapid growth in the number of papers published over the past few years, and several major technology companies are investing heavily in its practical applications.
In this talk, we will introduce Federated Learning and describe techniques for preserving the privacy of participating edge devices. We will discuss the unique challenges encountered in PFL and advocate for further research in the areas where it can have the greatest impact. We will briefly outline the methods Apple has chosen to implement a privacy-preserving Federated Learning system, and present key results from deployed real-world applications and our published research.
Finally, we will show how to get started with PFL research using simulations and open-source tools.
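To give a flavour of what such a simulation involves, the sketch below runs a toy federated averaging loop in plain NumPy: each simulated device computes a local update on data that never leaves it, clips the update, and adds Gaussian noise before the server averages the contributions. This is purely illustrative and is not Apple's PFL framework or API; all function names, hyperparameters, and the linear-regression task are hypothetical choices for the example.

```python
# Illustrative sketch only: a minimal federated averaging round with clipped,
# noised client updates, simulated in NumPy. Not Apple's PFL framework or API;
# all names and parameters here are hypothetical.
import numpy as np

def local_update(global_weights, client_data, lr=0.1):
    """One step of local least-squares SGD on a client's private data."""
    X, y = client_data
    grad = X.T @ (X @ global_weights - y) / len(y)
    return global_weights - lr * grad

def clip_and_noise(update, clip_norm=1.0, noise_scale=0.5, rng=None):
    """Clip the update's L2 norm and add Gaussian noise before it leaves the device."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, noise_scale * clip_norm, size=update.shape)

def federated_round(global_weights, clients, rng):
    """Server averages the privatised client updates (federated averaging)."""
    updates = []
    for data in clients:
        local_weights = local_update(global_weights, data)
        updates.append(clip_and_noise(local_weights - global_weights, rng=rng))
    return global_weights + np.mean(updates, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim, n_clients = 5, 100
    true_w = rng.normal(size=dim)
    # Each simulated "device" holds a small private dataset that never leaves it.
    clients = []
    for _ in range(n_clients):
        X = rng.normal(size=(20, dim))
        clients.append((X, X @ true_w + 0.01 * rng.normal(size=20)))
    w = np.zeros(dim)
    for _ in range(50):
        w = federated_round(w, clients, rng)
    print("distance to true weights:", np.linalg.norm(w - true_w))
```

In a real deployment the clipping and noise parameters would be chosen to meet a formal differential privacy budget, and aggregation would typically be performed with additional protections on the server side; the sketch only shows the overall shape of one training round.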
Filip Granqvist has been a core contributor to Apple’s Private Federated Learning (PFL) project since its inception in 2018, focusing on applied ML research, software design, and architecture. He leads the development of Apple’s PFL framework for modelling, both in a simulation environment and in real-world applications on customer devices while preserving privacy. Filip’s expertise in PFL extends to consulting on a range of products at Apple, spanning domains such as language models, vision, and time series analysis. Some of his work has been published at NeurIPS, ICASSP, and Interspeech. Filip is passionate about federated and decentralised technologies and software architecture for ML, and believes that privacy-preserving technologies are essential for a well-functioning society.