All Accepted Papers

FLASC: Federated LoRA with Sparse Communication

Kevin Kuo (Carnegie Mellon University), Arian Raje (Carnegie Mellon University), Kousik Rajesh (Carnegie Mellon University), Virginia Smith (Carnegie Mellon University)

System Optimization & Efficiency

Abstract

Low-rank adaptation (LoRA) is a promising method for finetuning models in communication-constrained settings such as cross-device federated learning (FL). Prior work has explored ways to improve the efficiency of LoRA in federated settings by imposing additional sparsity constraints. However, existing methods for sparse LoRA not only harm accuracy but can in fact increase overall communication costs. We instead propose FLASC, a simple composite method that pairs a PEFT method with a compression algorithm. First, we demonstrate that FLASC, as a combination of LoRA and sparse Top-K communication, outperforms baselines that use a lower LoRA rank or prune LoRA weights. Second, FLASC-Search efficiently searches the space of rank-and-sparsity configurations by first tuning sparsity at a low rank and then transferring the result to higher ranks. Across four FL datasets, we demonstrate that FLASC outperforms existing sparse LoRA methods with up to 20% higher accuracy or 10x less communication. Overall, FLASC is a simple yet competitive baseline which can be easily extended to more advanced PEFT and compression methods in the future.
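The abstract describes FLASC's core mechanism: clients finetune low-rank LoRA factors locally, then communicate only the largest-magnitude (Top-K) entries of their updates. The following is a minimal sketch of that idea, not the authors' implementation; the function name topk_sparsify, the density parameter, and the per-tensor flattening strategy are illustrative assumptions.

import torch

def topk_sparsify(delta: torch.Tensor, density: float) -> torch.Tensor:
    # Keep only the largest-magnitude `density` fraction of entries,
    # zeroing the rest (an assumed per-tensor Top-K scheme).
    k = max(1, int(density * delta.numel()))
    flat = delta.flatten()
    _, idx = torch.topk(flat.abs(), k)       # indices of the k largest-magnitude entries
    sparse = torch.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.view_as(delta)

# Hypothetical client round: for a weight W of shape (d_out, d_in), LoRA learns
# factors B (d_out x r) and A (r x d_in). Only sparsified updates are sent.
d_out, d_in, r = 768, 768, 8
A_update = torch.randn(r, d_in)    # stand-in for the local change to LoRA's A
B_update = torch.randn(d_out, r)   # stand-in for the local change to LoRA's B
message = {name: topk_sparsify(u, density=0.1)
           for name, u in [("A", A_update), ("B", B_update)]}

In an actual FL round, the server would presumably aggregate these sparse messages and apply them to the global LoRA factors; FLASC-Search would additionally tune the density at a low rank before transferring that setting to higher ranks, as the abstract describes.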
