Rupprecht C, Laina I, Dipietro R, Baust M, Tombari F, Navab N, Hager GD (2017)
Learning in an Uncertain World: Representing Ambiguity Through Multiple Hypotheses
Publication Type: Conference contribution
Publication year: 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Book Volume: 2017-October
Pages Range: 3611-3620
Conference Proceedings Title: Proceedings of the IEEE International Conference on Computer Vision
Event location: Venice, ITA
ISBN: 9781538610329
Many prediction tasks contain uncertainty. In some cases, uncertainty is inherent in the task itself. In future prediction, for example, many distinct outcomes are equally valid. In other cases, uncertainty arises from the way data is labeled. For example, in object detection, many objects of interest often go unlabeled, and in human pose estimation, occluded joints are often labeled with ambiguous values. In this work, we focus on a principled approach for handling such scenarios. In particular, we propose a framework for reformulating existing single-prediction models as multiple hypothesis prediction (MHP) models and an associated meta loss and optimization procedure to train them. To demonstrate our approach, we consider four diverse applications: human pose estimation, future prediction, image classification and segmentation. We find that MHP models outperform their single-hypothesis counterparts in all cases, and that MHP models simultaneously expose valuable insights into the variability of predictions.
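The abstract describes the meta loss only at a high level. As a rough illustration of a relaxed winner-takes-all meta loss of the kind the paper describes (not the authors' exact formulation), a PyTorch sketch could look like the following; the function name mhp_meta_loss, the tensor shapes, the squared-error base loss, and the relaxation weight eps are illustrative assumptions:

import torch

def mhp_meta_loss(hypotheses, target, eps=0.05):
    # hypotheses: (batch, M, D) -- M > 1 predictions per sample
    # target:     (batch, D)    -- single ground-truth label
    batch, M = hypotheses.shape[:2]
    # Per-hypothesis base loss (here: mean squared error per hypothesis).
    diff = hypotheses - target.unsqueeze(1)        # broadcast over the M axis
    per_hyp = diff.pow(2).flatten(2).mean(dim=2)   # (batch, M)
    # The hypothesis closest to the target "wins" most of the weight;
    # the others share a small eps weight so every head keeps receiving gradient.
    winner = per_hyp.argmin(dim=1, keepdim=True)   # (batch, 1)
    weights = torch.full_like(per_hyp, eps / (M - 1))
    weights.scatter_(1, winner, 1.0 - eps)
    return (weights * per_hyp).sum(dim=1).mean()

# Example usage with made-up shapes: batch of 8, M=5 hypotheses, 16-dim output.
hyps = torch.randn(8, 5, 16)
tgt = torch.randn(8, 16)
loss = mhp_meta_loss(hyps, tgt)

As we understand the scheme, eps relaxes the hard winner-takes-all assignment so that no hypothesis is starved of gradient and collapses; letting eps go to zero recovers pure winner-takes-all training.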
APA:
Rupprecht, C., Laina, I., Dipietro, R., Baust, M., Tombari, F., Navab, N., & Hager, G. D. (2017). Learning in an Uncertain World: Representing Ambiguity Through Multiple Hypotheses. In Proceedings of the IEEE International Conference on Computer Vision (pp. 3611-3620). Venice, Italy: Institute of Electrical and Electronics Engineers Inc.
MLA:
Rupprecht, Christian, et al. "Learning in an Uncertain World: Representing Ambiguity Through Multiple Hypotheses." Proceedings of the 16th IEEE International Conference on Computer Vision, ICCV 2017, Venice, Italy, Institute of Electrical and Electronics Engineers Inc., 2017, pp. 3611-3620.