- A machine learning technique where multiple devices or servers collaboratively train a shared model without sharing raw data.
- Instead of sending data to a central server, only the model updates (gradients/parameters) are sent, keeping sensitive information local.
Key Concepts
- Decentralized Training: Data stays on local devices (e.g., smartphones, IoT, edge devices).
- Model Aggregation: A central server collects and averages model updates to improve the global model (see the weighted-average sketch after this list).
- Privacy-Preserving: Minimizes risk of exposing personal or sensitive data.
- Communication Efficiency: Reduces the need for large-scale raw data transfer.
- Edge AI Integration: Often paired with edge computing for real-time AI.
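The aggregation step is easy to show in code. Below is a minimal sketch of sample-size-weighted averaging (the core of the FedAvg rule) in plain NumPy; the `aggregate` function and the example numbers are illustrative, not taken from any particular framework.

```python
import numpy as np

def aggregate(client_params, client_sizes):
    """Weighted average of client parameter vectors (the heart of FedAvg).

    client_params: one 1-D NumPy array of model parameters per client.
    client_sizes: number of training samples each client used.
    """
    total = sum(client_sizes)
    return sum((n / total) * p for n, p in zip(client_sizes, client_params))

# Example: three clients with different amounts of local data.
params = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [100, 50, 50]
print(aggregate(params, sizes))  # -> [2.5 3.5]
```

Clients with more data pull the average harder, which is why sample counts travel with the updates.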
How Federated Learning Works
- The central server creates a base model.
- Devices train the model using their own local data.
- Devices send model updates (not raw data) back to the server.
- The server aggregates the updates (e.g., via the FedAvg algorithm).
- The updated global model is sent back to the devices.
- The process repeats until the model converges (a toy simulation of this loop follows below).
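To make the loop concrete, here is a toy single-process simulation of these six steps, assuming a one-parameter linear model trained by gradient descent on synthetic local data. All names, constants, and data are illustrative; a real deployment would run each client on a separate device.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each "device" privately holds (x, y) pairs drawn from y = 2x + noise.
client_data = []
for _ in range(3):
    x = rng.normal(size=50)
    client_data.append((x, 2.0 * x + rng.normal(scale=0.1, size=50)))

w_global = 0.0  # step 1: the server creates a base model

for _ in range(20):  # step 6: repeat until convergence
    updates, sizes = [], []
    for x, y in client_data:
        w = w_global  # step 5 (previous round): device receives the global model
        for _ in range(5):  # step 2: local gradient descent on mean squared error
            grad = 2.0 * np.mean((w * x - y) * x)
            w -= 0.1 * grad
        updates.append(w)  # step 3: only the model update leaves the device
        sizes.append(len(x))
    w_global = np.average(updates, weights=sizes)  # step 4: FedAvg aggregation

print(f"learned w = {w_global:.3f} (true value 2.0)")
```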
Benefits of Federated Learning
- Data Privacy & Security – Raw data never leaves the device.
- Reduced Latency – Local training allows faster processing.
- Compliance – Helps with regulations like GDPR & HIPAA.
- Scalability – Works across millions of distributed devices.
- Personalization – Models can adapt to individual users.
Popular Frameworks & Tools
- TensorFlow Federated (TFF)
- PySyft (OpenMined)
- FedML
- Flower (flwr) – see the minimal client sketch below
- OpenFL (Intel)
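As a taste of one of these tools, a minimal Flower client might look like the sketch below. It uses Flower's classic `NumPyClient` interface and `start_numpy_client` entry point; the Flower API has evolved across releases, so treat this as a sketch and check the current flwr docs. The single-weight "model" and the server address are placeholders.

```python
import flwr as fl
import numpy as np

class ToyClient(fl.client.NumPyClient):
    """Placeholder client: the 'model' is a single NumPy weight."""

    def __init__(self):
        self.w = np.zeros(1)

    def get_parameters(self, config):
        return [self.w]

    def fit(self, parameters, config):
        self.w = parameters[0]
        self.w = self.w + 0.1  # real local training would go here
        return [self.w], 1, {}  # updated params, num examples, metrics

    def evaluate(self, parameters, config):
        return 0.0, 1, {}  # loss, num examples, metrics

# Connect to a running Flower server (address is illustrative).
fl.client.start_numpy_client(server_address="127.0.0.1:8080", client=ToyClient())
```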