Improving Consistency Models with Generator-Augmented Flows
- Authors
- Issenhuth, Thibaut; Lee, Sangchul; Dos Santos, Ludovic; Franceschi, Jean-Yves; Kim, Chansoo; Rakotomamonjy, Alain
- Issue Date
- 2025-07
- Publisher
- ICML
- Citation
- International Conference on Machine Learning (ICML) 2025
- Abstract
- Consistency models imitate the multi-step sampling of score-based diffusion in a single forward pass of a neural network. They can be learned in two ways: consistency distillation and consistency training. The former relies on the true velocity field of the corresponding differential equation, approximated by a pre-trained neural network. In contrast, the latter uses a single-sample Monte Carlo estimate of this velocity field. The related estimation error induces a discrepancy between consistency distillation and training that, we show, still holds in the continuous-time limit. To alleviate this issue, we propose a novel flow that transports noisy data towards their corresponding outputs derived from a consistency model. We prove that this flow reduces the previously identified discrepancy and the noise-data transport cost. Consequently, our method not only accelerates consistency training convergence but also enhances its overall performance. The code is available at https://github.com/thibautissenhuth/consistency_GC.
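- The following PyTorch sketch illustrates the contrast the abstract draws, under assumptions not stated in this record: a linear (rectified-flow-style) interpolation between data and noise, and hypothetical function names (`standard_ct_pair`, `generator_augmented_pair`). It is not the authors' implementation (see their repository for that); the paper's actual schedule, loss, and coupling details may differ.

```python
import torch

def standard_ct_pair(x0: torch.Tensor, eps: torch.Tensor, t: torch.Tensor):
    # Independent (data, noise) coupling used in standard consistency training.
    # xt lies on the straight line between x0 and eps; (eps - x0) is the
    # single-sample Monte Carlo estimate of the velocity field whose error
    # the abstract identifies as the distillation/training discrepancy.
    xt = (1 - t) * x0 + t * eps
    v_estimate = eps - x0
    return xt, v_estimate

@torch.no_grad()
def generator_augmented_pair(generator, eps: torch.Tensor, t: torch.Tensor):
    # Generator-augmented coupling (hedged sketch): the independent data
    # sample is replaced by the consistency model's own output for this
    # noise, so the flow transports noisy points toward their corresponding
    # generated samples, as the abstract describes.
    x0_hat = generator(eps)            # e.g. a frozen/EMA one-step generator
    xt = (1 - t) * x0_hat + t * eps
    v_estimate = eps - x0_hat
    return xt, v_estimate
```

- Because the two endpoints of each generator-augmented pair are coupled rather than sampled independently, the resulting flow has lower noise-data transport cost, which is the mechanism the abstract credits for faster convergence.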
- Appears in Collections:
- KIST Conference Paper > 2025