Full metadata record

DC Field | Value | Language
dc.contributor.author | Kim, Woojeong | -
dc.contributor.author | Kim, Suhyun | -
dc.contributor.author | Park, Mincheol | -
dc.contributor.author | Jeon, Geonseok | -
dc.date.accessioned | 2024-01-12T04:08:47Z | -
dc.date.available | 2024-01-12T04:08:47Z | -
dc.date.created | 2023-11-26 | -
dc.date.issued | 2020-12 | -
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/77789 | -
dc.description.abstract | Network pruning is widely used to lighten and accelerate neural network models. Structured network pruning discards the whole neuron or filter, leading to accuracy loss. In this work, we propose a novel concept of neuron merging applicable to both fully connected layers and convolution layers, which compensates for the information loss due to the pruned neurons/filters. Neuron merging starts with decomposing the original weights into two matrices/tensors. One of them becomes the new weights for the current layer, and the other is what we name a scaling matrix, guiding the combination of neurons. If the activation function is ReLU, the scaling matrix can be absorbed into the next layer under certain conditions, compensating for the removed neurons. We also propose a data-free and inexpensive method to decompose the weights by utilizing the cosine similarity between neurons. Compared to the pruned model with the same topology, our merged model better preserves the output feature map of the original model; thus, it maintains the accuracy after pruning without fine-tuning. We demonstrate the effectiveness of our approach over network pruning for various model architectures and datasets. As an example, for VGG-16 on CIFAR-10, we achieve an accuracy of 93.16% while reducing 64% of total parameters, without any fine-tuning. The code can be found here: https://github.com/friendshipkim/neuron-merging. | -
dc.language | English | -
dc.publisher | Neural Information Processing Systems Foundation | -
dc.title | Neuron merging: Compensating for pruned neurons | -
dc.type | Conference | -
dc.description.journalClass | 1 | -
dc.identifier.bibliographicCitation | 34th Conference on Neural Information Processing Systems, NeurIPS 2020 | -
dc.citation.title | 34th Conference on Neural Information Processing Systems, NeurIPS 2020 | -
dc.citation.conferencePlace | US | -
dc.citation.conferencePlace | Online | -
dc.citation.conferenceDate | 2020-12-06 | -
dc.relation.isPartOf | Advances in Neural Information Processing Systems | -
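
The abstract above describes the merging step only at a high level. The following is a minimal NumPy sketch of that idea for a pair of fully connected layers: the kept neurons' weights become the new current-layer weights, a scaling matrix maps each pruned neuron to its most cosine-similar kept neuron, and, because ReLU commutes with non-negative scaling, the scaling matrix is folded into the next layer. Function and variable names are illustrative and are not taken from the authors' released code at https://github.com/friendshipkim/neuron-merging; biases and convolution layers are omitted.

import numpy as np

def merge_fc_layer(W1, W2, keep_ratio=0.5):
    # Data-free neuron merging for y = W2 @ relu(W1 @ x) (biases omitted).
    # W1: (n_neurons, d_in), W2: (d_out, n_neurons).
    # Returns (Y, W2_new) with Y: (n_keep, d_in), W2_new: (d_out, n_keep).
    n_neurons = W1.shape[0]
    n_keep = max(1, int(n_neurons * keep_ratio))

    # Choose neurons to keep by weight norm (a simple stand-in criterion;
    # the selection rule is independent of the merging step itself).
    norms = np.linalg.norm(W1, axis=1)
    keep = np.sort(np.argsort(norms)[::-1][:n_keep])
    prune = np.setdiff1d(np.arange(n_neurons), keep)

    # Y: the kept neurons' weights become the new current-layer weights.
    Y = W1[keep]

    # Z: scaling matrix with one non-negative entry per row, mapping every
    # original neuron onto one kept neuron, so that W1 is approximated by Z @ Y.
    Z = np.zeros((n_neurons, n_keep))
    Z[keep, np.arange(n_keep)] = 1.0
    unit = W1 / (norms[:, None] + 1e-12)
    for j in prune:
        cos = unit[keep] @ unit[j]                     # cosine similarity to kept neurons
        i = int(np.argmax(cos))                        # most similar kept neuron
        Z[j, i] = norms[j] / (norms[keep[i]] + 1e-12)  # match the pruned neuron's scale

    # relu(Z @ h) == Z @ relu(h) because each row of Z holds a single
    # non-negative value, so Z can be absorbed into the next layer's weights.
    W2_new = W2 @ Z
    return Y, W2_new

# Tiny usage check: compare the merged pair's output with the original mapping.
rng = np.random.default_rng(0)
W1, W2, x = rng.normal(size=(64, 32)), rng.normal(size=(10, 64)), rng.normal(size=32)
Y, W2_new = merge_fc_layer(W1, W2, keep_ratio=0.5)
out_original = W2 @ np.maximum(W1 @ x, 0.0)
out_merged = W2_new @ np.maximum(Y @ x, 0.0)

How closely out_merged tracks out_original depends on how similar the pruned neurons are to the kept ones, which is exactly the property the paper's cosine-similarity decomposition exploits.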
Appears in Collections:
KIST Conference Paper > 2020
Files in This Item:
There are no files associated with this item.