Federated learning has been gaining adoption in mobile and cloud computing as a mechanism for decentralizing both computation and trust when building machine learning models from user data. Proposals for using federated learning on IoT devices have been emerging but, even though they can be used to gain valuable insights from IoT devices faster and more efficiently, their adoption remains lackluster.

We argue that this slow adoption is due to two challenges: first, federated learning methods are designed for cloud-server and mobile-device class processors, which are significantly more powerful than typical IoT devices; second, the level of trust we can place in code executed on IoT devices is significantly lower than on mobile devices or cloud servers.

We aim to overcome these challenges by redesigning the relevant algorithms to run more efficiently on IoT device hardware. As an additional benefit of this redesign, the resulting applications have a small memory footprint and low communication overhead, which makes it easier to secure their execution and establish it as trusted on the IoT device.
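To illustrate the kind of footprint reduction this targets, below is a minimal Python sketch (an illustrative assumption, not the project's actual implementation) of one federated-averaging round in which each device transmits an 8-bit quantized model delta plus a single scale factor instead of full-precision floats, cutting the per-round communication roughly fourfold:

```python
def quantize(update, bits=8):
    # Map each float to a small signed integer plus one shared scale,
    # shrinking the payload a device must transmit to the server.
    levels = 2 ** (bits - 1) - 1
    scale = max((abs(x) for x in update), default=1.0) or 1.0
    return [round(x / scale * levels) for x in update], scale

def dequantize(q, scale, bits=8):
    # Server-side reconstruction of the approximate float delta.
    levels = 2 ** (bits - 1) - 1
    return [v * scale / levels for v in q]

def federated_average(global_model, device_payloads, bits=8):
    # One federated-averaging round over quantized device deltas.
    deltas = [dequantize(q, s, bits) for q, s in device_payloads]
    n = len(deltas)
    return [w + sum(d[i] for d in deltas) / n
            for i, w in enumerate(global_model)]

# Toy round: three simulated devices, each sending the same
# (hypothetical) 4-parameter model delta.
model = [0.0] * 4
local_delta = [0.1, -0.2, 0.05, 0.0]
payloads = [quantize(local_delta) for _ in range(3)]
new_model = federated_average(model, payloads)
```

A fixed, small integer payload like this is also simpler to bound and attest inside a constrained trusted-execution environment than an arbitrary-precision one.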

More information: Project webpage