OpenAlex · Updated hourly · Last updated: 27.03.2026, 19:51

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Knowledge distillation in federated learning: a comprehensive survey

2025 · 7 citations · Discover Computing · Open Access

7 citations · 6 authors · Year: 2025

Abstract

Federated Learning (FL) is an approach that has recently emerged as a promising method for training machine learning models in a distributed manner without requiring central data storage. However, the inherent heterogeneity of, and discrepancies among, the data contributed by the many FL participants can be a substantial obstacle when aggregating information. To address this problem, researchers have proposed various solutions, one of which is knowledge distillation (KD). KD seeks to transfer knowledge from a larger, more accurate model to a smaller model, thereby enhancing the smaller model's performance. This study provides a detailed examination of the effectiveness of KD in addressing these challenges posed by FL. We comprehensively review existing research, emphasizing the benefits and limitations of using these techniques in FL and discussing the numerous open challenges and research questions in this field.
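The knowledge transfer the abstract describes is commonly realized by training the smaller (student) model to match the temperature-softened output distribution of the larger (teacher) model. The sketch below is a generic illustration of that distillation loss (in the style of Hinton et al.), not the specific method surveyed in this paper; all function names and the temperature value are illustrative choices.

```python
import math

def softmax(logits, temperature=1.0):
    # Soften the logits by the temperature before normalizing;
    # higher temperatures expose more of the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between the teacher's softened distribution and the
    # student's, scaled by T^2 so gradients keep a comparable magnitude.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that already matches the teacher incurs zero loss;
# a disagreeing student incurs a positive loss to minimize.
loss_match = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
loss_diff = distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0])
```

In a federated setting this loss is typically combined with the usual task loss on each client's local labels, letting clients learn from a shared teacher's predictions without exchanging raw data.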
