Abstract:
We propose DIVERSE, a framework for systematically exploring the Rashomon set of deep neural networks, the collection of models that match a reference model's accuracy while differing in their predictive behavior. DIVERSE augments a pretrained model with Feature-wise Linear Modulation (FiLM) layers and uses Covariance Matrix Adaptation Evolution Strategy (CMA-ES) to search a latent modulation space, generating diverse model variants without retraining or gradient access. Across MNIST, PneumoniaMNIST, and CIFAR-10, DIVERSE uncovers multiple high-performing yet functionally distinct models. Our experiments show that DIVERSE offers a competitive and efficient exploration of the Rashomon set, making it feasible to construct diverse sets that maintain robustness and performance while supporting well-balanced model multiplicity. While retraining remains the baseline for generating Rashomon sets, DIVERSE achieves comparable diversity at reduced computational cost.
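The core mechanism described above can be sketched in a few lines: a frozen network is wrapped with FiLM scale-and-shift parameters, and a black-box evolutionary search over the latent modulation vector rewards variants that disagree with the reference model. This is a minimal illustration, not the paper's implementation: the toy two-layer model (`W`, `V`), the latent dimension, and the fitness function are assumptions, and full CMA-ES is replaced with a simple Gaussian evolution strategy for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def film(features, gamma, beta):
    # Feature-wise Linear Modulation: per-channel scale and shift.
    return gamma * features + beta

# Frozen "pretrained" network (random stand-ins for illustration).
W = rng.normal(size=(4, 8))   # hypothetical feature layer
V = rng.normal(size=(8, 2))   # hypothetical classifier head

def predict(x, z):
    # The latent vector z parameterises the FiLM layer;
    # z = 0 recovers the unmodified reference model.
    gamma, beta = z[:8], z[8:]
    h = np.tanh(x @ W)
    return (film(h, 1.0 + gamma, beta) @ V).argmax(axis=1)

# Toy inputs and the reference model's predictions.
X = rng.normal(size=(64, 4))
ref = predict(X, np.zeros(16))

def fitness(z):
    # Illustrative objective: reward disagreement with the reference.
    # (DIVERSE also constrains accuracy; omitted here for brevity.)
    return (predict(X, z) != ref).mean()

# Simplified Gaussian evolution strategy (a stand-in for CMA-ES):
# sample a population around the mean, keep the best, recentre.
mean, sigma = np.zeros(16), 0.3
for _ in range(20):
    pop = mean + sigma * rng.normal(size=(12, 16))
    scores = np.array([fitness(z) for z in pop])
    elite = pop[np.argsort(scores)[-3:]]   # top 3 by disagreement
    mean = elite.mean(axis=0)
```

Because the search only needs forward passes through `predict`, no gradients or retraining are required, which is what makes the black-box exploration of the Rashomon set cheap.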
Cite (BibTeX):
@inproceedings{eerlings2026diverse,
  author    = {Eerlings, Gilles and Zoomers, Brent and Liesenborgs, Jori and Rovelo Ruiz, Gustavo and Luyten, Kris},
  title     = {{DIVERSE}: Disagreement-Inducing Vector Evolution for {Rashomon} Set Exploration},
  booktitle = {International Conference on Learning Representations (ICLR 2026)},
  year      = {2026},
  publisher = {OpenReview.net},
  url       = {https://arxiv.org/abs/2601.20627}
}