Full metadata record

DC Field | Value | Language
dc.contributor.author | Issenhuth, Thibaut | -
dc.contributor.author | Lee, Sangchul | -
dc.contributor.author | Dos Santos, Ludovic | -
dc.contributor.author | Franceschi, Jean-Yves | -
dc.contributor.author | Kim, Chan soo | -
dc.contributor.author | Rakotomamonjy, Alain | -
dc.date.accessioned | 2025-11-28T04:30:42Z | -
dc.date.available | 2025-11-28T04:30:42Z | -
dc.date.created | 2025-11-28 | -
dc.date.issued | 2025-07 | -
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/153710 | -
dc.identifier.uri | https://openreview.net/forum?id=GKqoqGCHTq | -
dc.description.abstract | Consistency models imitate the multi-step sampling of score-based diffusion in a single forward pass of a neural network. They can be learned in two ways: consistency distillation and consistency training. The former relies on the true velocity field of the corresponding differential equation, approximated by a pre-trained neural network. In contrast, the latter uses a single-sample Monte Carlo estimate of this velocity field. The related estimation error induces a discrepancy between consistency distillation and training that, we show, still holds in the continuous-time limit. To alleviate this issue, we propose a novel flow that transports noisy data towards their corresponding outputs derived from a consistency model. We prove that this flow reduces the previously identified discrepancy and the noise-data transport cost. Consequently, our method not only accelerates consistency training convergence but also enhances its overall performance. The code is available at https://github.com/thibautissenhuth/consistency_GC. | -
dc.publisher | ICML | -
dc.title | Improving Consistency Models with Generator-Augmented Flows | -
dc.type | Conference | -
dc.description.journalClass | 1 | -
dc.identifier.bibliographicCitation | International Conference on Machine Learning (ICML) 2025 | -
dc.citation.title | International Conference on Machine Learning (ICML) 2025 | -
dc.citation.conferencePlace | CN | -
dc.citation.conferencePlace | Vancouver Convention Center | -
dc.citation.conferenceDate | 2025-07-13 | -
dc.relation.isPartOf | Proceedings of the 42nd International Conference on Machine Learning (ICML) | -

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.