Li Z, Cechnicka S, Ouyang C, Breininger K, Schüffler P, Kainz B (2025). Stochastic latent feature distillation: Enhancing dataset distillation via structured uncertainty modeling. Journal of Visual Communication and Image Representation.
Publication Type: Journal article
Publication year: 2025
Volume: 113
Article Number: 104623
DOI: 10.1016/j.jvcir.2025.104623
Abstract:
As deep learning models continue to scale in complexity and data size, reducing storage and training costs has become increasingly important. Dataset distillation addresses this challenge by synthesizing a small set of synthetic samples that effectively substitute for the original dataset in downstream tasks. Existing approaches typically rely on matching gradients or features either in pixel space or in the latent space of a pretrained generative model. We propose a novel stochastic distillation method that models the joint distribution of latent features using a low-rank multivariate normal distribution, parameterized by a lightweight neural network. This formulation captures spatial correlations in the feature space, which are then projected into class probability space to generate more diverse and informative predictions. The proposed module integrates seamlessly with existing distillation pipelines. Our method achieves state-of-the-art cross-architecture results, improving test accuracy by up to 7.47% in gradient matching and 35.71% in distribution matching over baselines.
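The mechanism the abstract describes, a low-rank multivariate normal over latent features parameterized by a lightweight network, with stochastic samples projected into class-probability space, maps naturally onto standard deep-learning primitives. The following is a minimal PyTorch sketch of that idea, not the authors' implementation: the module name StochasticLatentHead, the rank and sample-count defaults, and the choice to average class probabilities over samples are illustrative assumptions; only torch.distributions.LowRankMultivariateNormal is a real PyTorch API.

```python
# Minimal illustrative sketch, not the paper's code. Module name, rank and
# sample-count defaults, and the sample-averaging step are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.distributions import LowRankMultivariateNormal

class StochasticLatentHead(nn.Module):
    """Models latent features as N(mu, W W^T + diag(s)) and projects
    stochastic samples into class-probability space."""

    def __init__(self, feat_dim: int, num_classes: int, rank: int = 8):
        super().__init__()
        self.rank = rank
        self.loc = nn.Linear(feat_dim, feat_dim)                # mean mu
        self.cov_factor = nn.Linear(feat_dim, feat_dim * rank)  # low-rank factor W (d x k)
        self.cov_diag = nn.Linear(feat_dim, feat_dim)           # diagonal term of Sigma
        self.classifier = nn.Linear(feat_dim, num_classes)      # projection to classes

    def forward(self, feats: torch.Tensor, num_samples: int = 4) -> torch.Tensor:
        b, d = feats.shape
        dist = LowRankMultivariateNormal(
            loc=self.loc(feats),
            cov_factor=self.cov_factor(feats).view(b, d, self.rank),
            cov_diag=F.softplus(self.cov_diag(feats)) + 1e-5,   # keep variances positive
        )
        z = dist.rsample((num_samples,))        # (S, B, d); reparameterized, gradients flow
        probs = self.classifier(z).softmax(-1)  # (S, B, C) per-sample class probabilities
        return probs.mean(0)                    # (B, C) averaged prediction

# Usage: attach after the feature extractor in a distillation pipeline.
head = StochasticLatentHead(feat_dim=512, num_classes=10)
probs = head(torch.randn(32, 512))              # -> shape (32, 10)
```

The low-rank-plus-diagonal covariance keeps the head at O(d·k) parameters instead of the O(d²) a full covariance would need, which is consistent with the abstract's "lightweight" framing, and reparameterized sampling keeps the module differentiable so it can sit inside a gradient-matching or distribution-matching loss.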
APA:
Li, Z., Cechnicka, S., Ouyang, C., Breininger, K., Schüffler, P., & Kainz, B. (2025). Stochastic latent feature distillation: Enhancing dataset distillation via structured uncertainty modeling. Journal of Visual Communication and Image Representation, 113, Article 104623. https://doi.org/10.1016/j.jvcir.2025.104623
MLA:
Li, Zhe, et al. "Stochastic Latent Feature Distillation: Enhancing Dataset Distillation via Structured Uncertainty Modeling." Journal of Visual Communication and Image Representation, vol. 113, 2025, article 104623, https://doi.org/10.1016/j.jvcir.2025.104623.