
Generalization Error Bound for Hyperbolic Ordinal Embedding


Suzuki, Atsushi, Nitanda, Atsushi, Wang, Jing, Xu, Linchuan, Yamanishi, Kenji and Cavazza, Marc (2021) Generalization Error Bound for Hyperbolic Ordinal Embedding. In: Proceedings of the 38th International Conference on Machine Learning. Volume 139: International Conference on Machine Learning, 18th - 24th July 2021, Virtual. Proceedings of Machine Learning Research (PMLR) Press - Journal of Machine Learning Research (JMLR), Cambridge MA, USA, pp. 10011-10021. ISSN 1938-7228 (Print), 2640-3498 (Online)

PDF (Conference paper): 48651 WANG_Generalization_Error_Bound_For_Hyperbolic_Ordinal_Embedding_(OA)_2021.pdf - Published Version
Available under License Creative Commons Attribution. Download (344kB)

PDF (Supplemental material): 48651 WANG_Generalization_Error_Bound_For_Hyperbolic_Ordinal_Embedding_(SUPPLEMENTAL)_2021 (1).pdf - Supplemental Material
Available under License Creative Commons Attribution. Download (273kB)

Abstract

Hyperbolic ordinal embedding (HOE) represents entities as points in hyperbolic space so that they agree as well as possible with given constraints of the form "entity i is more similar to entity j than to entity k." It has been shown experimentally that HOE can effectively obtain representations of hierarchical data, such as knowledge bases and citation networks, owing to hyperbolic space's exponential growth property. However, its theoretical analysis has been limited to ideal noiseless settings, and the generalization error it incurs in exchange for hyperbolic space's exponential representation ability has not been guaranteed. The difficulty is that existing generalization error bound derivations for ordinal embedding, which are based on the Gramian matrix, are not applicable to HOE, since hyperbolic space is not an inner-product space. In this paper, through a novel characterization of HOE with decomposed Lorentz Gramian matrices, we provide a generalization error bound for HOE for the first time, which is at most exponential with respect to the embedding space's radius. Our comparison between the bounds of HOE and Euclidean ordinal embedding shows that HOE's generalization error comes at a reasonable cost considering its exponential representation ability.
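To make the setup concrete, the following is a minimal illustrative sketch (not the authors' code) of the Lorentz (hyperboloid) model that underlies the paper's Lorentz Gramian characterization: points live on the hyperboloid <x, x>_L = -1, distances are computed from the Lorentzian inner product, and a triplet constraint "i is more similar to j than to k" is checked by comparing distances. The helper names (`lift`, `triplet_satisfied`) are hypothetical, chosen for illustration.

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product on R^{d+1}: -x0*y0 + sum_{i>=1} xi*yi
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lift(v):
    # Map a Euclidean vector v in R^d onto the hyperboloid
    # {x : <x, x>_L = -1, x0 > 0} in R^{d+1}.
    x0 = np.sqrt(1.0 + np.dot(v, v))
    return np.concatenate(([x0], v))

def hyperbolic_distance(x, y):
    # d(x, y) = arccosh(-<x, y>_L); clip guards against
    # arguments slightly below 1 due to floating-point error.
    return np.arccosh(np.clip(-lorentz_inner(x, y), 1.0, None))

def triplet_satisfied(x_i, x_j, x_k):
    # Ordinal constraint: "entity i is more similar to j than to k".
    return hyperbolic_distance(x_i, x_j) < hyperbolic_distance(x_i, x_k)
```

Note that the comparison in `triplet_satisfied` depends only on Lorentzian inner products of embedded points, which is why the paper can reason about HOE through (decomposed) Lorentz Gramian matrices rather than the ordinary Gramian used in Euclidean ordinal embedding.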

Item Type: Conference Proceedings
Title of Proceedings: Proceedings of the 38th International Conference on Machine Learning. Volume 139: International Conference on Machine Learning, 18th - 24th July 2021, Virtual
Uncontrolled Keywords: hyperbolic, representation learning, machine learning
Subjects: Q Science > Q Science (General)
Q Science > QA Mathematics
Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Faculty / School / Research Centre / Research Group: Faculty of Engineering & Science
Faculty of Engineering & Science > School of Computing & Mathematical Sciences (CMS)
Related URLs:
Last Modified: 19 Nov 2024 16:27
URI: http://gala.gre.ac.uk/id/eprint/48651

