Full metadata record

DC Field                               Value
dc.contributor.author                  Minjae Kim
dc.contributor.author                  Sangyoon Yu
dc.contributor.author                  Suhyun Kim
dc.contributor.author                  Soo-Mook Moon
dc.date.accessioned                    2024-01-12T02:46:37Z
dc.date.available                      2024-01-12T02:46:37Z
dc.date.created                        2023-11-26
dc.date.issued                         2023-05
dc.identifier.uri                      https://pubs.kist.re.kr/handle/201004/76459
dc.identifier.uri                      https://iclr.cc/virtual/2023/poster/11878
dc.description.abstract                Federated learning trains a global model without collecting private local data from clients. Because clients must instead repeatedly upload locally updated weights or gradients, they need sufficient computation and communication resources to participate in learning, but in reality their resources are heterogeneous. To enable resource-constrained clients to train smaller local models, width-scaling techniques have been used, which reduce the channels of the global model. Unfortunately, width scaling suffers from the heterogeneity of local models when averaging them, leading to lower accuracy than simply excluding resource-constrained clients from training. This paper proposes a new approach based on depth scaling, called DepthFL. DepthFL defines local models of different depths by pruning the deepest layers off the global model and allocates them to clients according to their available resources. Since many clients lack the resources to train deep local models, the deep layers would be only partially trained with insufficient data, unlike the shallow layers, which are fully trained. DepthFL alleviates this problem through mutual self-distillation of knowledge among the classifiers of various depths within a local model. Our experiments show that depth-scaled local models build a better global model than width-scaled ones, and that self-distillation is highly effective in training the data-insufficient deep layers.
dc.publisher                           ICLR
dc.title                               DepthFL: Depthwise Federated Learning for Heterogeneous Clients
dc.type                                Conference
dc.description.journalClass            1
dc.identifier.bibliographicCitation    The Eleventh International Conference on Learning Representations
dc.citation.title                      The Eleventh International Conference on Learning Representations
dc.citation.conferencePlace            RW
dc.citation.conferencePlace            Kigali, Rwanda
dc.citation.conferenceDate             2023-05-01
dc.relation.isPartOf                   The Eleventh International Conference on Learning Representations
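
The abstract above describes two components: depth-scaled local models, obtained by pruning the deepest layers off the global model, and mutual self-distillation among the classifiers attached at different depths of a local model. Below is a minimal sketch of the second component, assuming a PyTorch-style multi-exit model. It is not the authors' implementation; the names (DepthScaledNet, mutual_self_distillation_loss) and the temperature and weighting choices are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DepthScaledNet(nn.Module):
    """Backbone split into blocks, with a classifier after every block.
    A resource-constrained client keeps only the first `depth` blocks
    (a hypothetical stand-in for the depth-scaled local models above)."""

    def __init__(self, depth: int, num_classes: int = 10, width: int = 64):
        super().__init__()
        self.blocks = nn.ModuleList(
            [nn.Sequential(nn.Linear(width, width), nn.ReLU()) for _ in range(depth)]
        )
        self.classifiers = nn.ModuleList(
            [nn.Linear(width, num_classes) for _ in range(depth)]
        )

    def forward(self, x):
        logits = []
        for block, classifier in zip(self.blocks, self.classifiers):
            x = block(x)
            logits.append(classifier(x))
        return logits  # one prediction per exit depth


def mutual_self_distillation_loss(logits_list, targets, temperature=3.0, kd_weight=1.0):
    """Each exit learns from the labels and is additionally distilled
    toward the (detached) softened predictions of every other exit."""
    ce = sum(F.cross_entropy(z, targets) for z in logits_list)
    kd = 0.0
    for i, student in enumerate(logits_list):
        for j, teacher in enumerate(logits_list):
            if i == j:
                continue
            kd = kd + F.kl_div(
                F.log_softmax(student / temperature, dim=1),
                F.softmax(teacher.detach() / temperature, dim=1),
                reduction="batchmean",
            ) * temperature ** 2
    return ce + kd_weight * kd


# Example: a client with resources for 3 blocks of a deeper global model.
model = DepthScaledNet(depth=3)
logits = model(torch.randn(32, 64))
loss = mutual_self_distillation_loss(logits, torch.randint(0, 10, (32,)))
loss.backward()
```

In this reading, each client instantiates only as many blocks as its resources allow, and the server would average each block's parameters over just the clients deep enough to hold it. The detach() on the teacher logits keeps each classifier from being pulled toward its own gradients through the other exits' predictions.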
Appears in Collections:
KIST Conference Paper > 2023
Files in This Item:
There are no files associated with this item.