Federated Learning (FL) is an emerging ML training paradigm where clients own their data and collaborate to train a global model without revealing any data to the server or other participants.
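To make the paradigm concrete, here is a minimal sketch of one round of federated averaging (FedAvg), the canonical FL training scheme: each client computes a model update on its private data, and the server only ever sees the averaged updates. The toy linear model, `client_update`, and `server_round` below are illustrative assumptions for this sketch, not pfl-research's actual API.

```python
# Minimal FedAvg sketch: clients train locally on private data,
# the server averages the resulting weights. Raw data never leaves a client.
import numpy as np

def client_update(global_weights, local_data, lr=0.1):
    """One local SGD step on a client's private (x, y) data.

    A linear least-squares loss stands in for a real model; only the
    updated weights are returned to the server, never the data itself.
    """
    x, y = local_data
    grad = 2 * x.T @ (x @ global_weights - y) / len(y)
    return global_weights - lr * grad

def server_round(global_weights, client_datasets):
    """Collect client updates and aggregate them by averaging."""
    updates = [client_update(global_weights, d) for d in client_datasets]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])
# Each client holds its own private dataset.
clients = []
for _ in range(10):
    x = rng.normal(size=(20, 2))
    clients.append((x, x @ true_w + 0.01 * rng.normal(size=20)))

w = np.zeros(2)
for _ in range(50):
    w = server_round(w, clients)
print(w)  # converges toward true_w without any client sharing raw data
```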
Researchers commonly perform experiments in a simulation environment to quickly iterate on ideas. However, existing open-source tools do not offer the efficiency required to simulate FL on larger and more realistic FL datasets. We introduce pfl-research, a fast, modular, and easy-to-use Python framework for simulating FL. It supports TensorFlow, PyTorch, and non-neural-network models, and is tightly integrated with state-of-the-art privacy algorithms.
We study the speed of open-source FL frameworks and show that pfl-research is 7-72× faster than alternative open-source frameworks on common cross-device setups. Such speedups will significantly boost the productivity of the FL research community and enable testing hypotheses on realistic FL datasets that were previously too resource-intensive. We also release a suite of benchmarks that evaluates an algorithm's overall performance on a diverse set of realistic scenarios.