Work Package Update: Decentralised learning under information constraints, Summer 2025

24th July, 2025 | Research blog

This work package focuses on problems of statistical estimation, hypothesis testing and decision-making in distributed systems, subject to additional constraints such as communication bandwidth and privacy.

The researchers' work to date includes a new method for securely and reliably storing information in a distributed setting when a malicious adversary can see and corrupt only an (unknown) part of the system. See Optimal Information Security Against Limited-View Adversaries: The Benefits of Causality and Feedback – University of Bristol, and Codes for Adversaries: Between Worst-Case and Average-Case Jamming.
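To give a flavour of the secrecy side of this setting, the sketch below shows additive secret sharing, a standard textbook construction in which a secret is split across several nodes so that an adversary who views any strict subset of them learns nothing. This is a minimal illustration of the general idea only, not the construction from the cited papers, which additionally handle corruption by the adversary.

```python
import secrets

def share(secret_byte: int, n: int) -> list[int]:
    """Split a byte into n additive shares mod 256.
    Any n-1 shares are uniformly random, so a limited-view
    adversary seeing fewer than n shares learns nothing."""
    shares = [secrets.randbelow(256) for _ in range(n - 1)]
    # Final share is chosen so all shares sum to the secret mod 256
    shares.append((secret_byte - sum(shares)) % 256)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recover the secret by summing all shares mod 256."""
    return sum(shares) % 256

recovered = reconstruct(share(42, n=5))
print(recovered)  # 42
```

Tolerating an adversary who can also corrupt shares requires adding redundancy (e.g. error-correcting codes) on top of the secrecy mechanism, which is the regime the papers above study.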

In addition, the researchers have explored how to split large datasets into smaller, less similar batches to improve machine learning performance: intentionally mixing dissimilar data points within each batch helps models learn more general patterns and avoid overfitting to repetitive or redundant data. See Dissimilar Batch Decompositions of Random Datasets | Sankhya A.
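As a rough illustration of the batching idea, the sketch below orders data points along a random projection and deals them out round-robin, so that each batch spans the full range of the data rather than containing near-duplicates. This is a simple heuristic for the general principle, not the decomposition studied in the paper; the function name and parameters are illustrative.

```python
import numpy as np

def dissimilar_batches(data: np.ndarray, num_batches: int, seed: int = 0):
    """Heuristic sketch: sort points along a random 1-D projection,
    then assign them round-robin so similar points land in
    different batches and each batch mixes dissimilar points."""
    rng = np.random.default_rng(seed)
    direction = rng.standard_normal(data.shape[1])
    order = np.argsort(data @ direction)          # cheap ordering
    return [data[order[i::num_batches]] for i in range(num_batches)]

# Toy dataset: two redundant clusters at 0 and 5
data = np.vstack([np.full((8, 2), 0.0), np.full((8, 2), 5.0)])
for batch in dissimilar_batches(data, num_batches=4):
    print(batch.mean(axis=0))  # every batch mixes both clusters
```

Because the sorted order places one cluster before the other, round-robin dealing guarantees each batch draws equally from both, which is the "dissimilar within, similar across" property the batching idea aims for.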