Conference item

Sampling-based Nyström approximation and kernel quadrature

Abstract:
We analyze the Nyström approximation of a positive definite kernel associated with a probability measure. We first prove an improved error bound for the conventional Nyström approximation with i.i.d. sampling and singular-value decomposition in the continuous regime; the proof techniques are borrowed from statistical learning theory. We further introduce a refined selection of subspaces in Nyström approximation with theoretical guarantees that is applicable to non-i.i.d. landmark points. Finally, we discuss their application to convex kernel quadrature and give novel theoretical guarantees as well as numerical observations.
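The conventional Nyström approximation with i.i.d. sampling that the abstract refers to can be sketched generically as follows. This is an illustrative NumPy implementation, not the authors' code; the Gaussian kernel, the bandwidth, and the landmark count are arbitrary choices for demonstration.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Squared-exponential kernel matrix between rows of X and rows of Y.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * sigma**2))

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))   # points drawn i.i.d. from the underlying measure
K = gaussian_kernel(X, X)       # full kernel matrix (for comparison only)

# Nyström: choose m landmark points i.i.d. and approximate
#   K  ≈  K_nm @ pinv(K_mm) @ K_nm.T
m = 50
idx = rng.choice(len(X), size=m, replace=False)
K_nm = K[:, idx]
K_mm = K[np.ix_(idx, idx)]

# SVD-based pseudoinverse of the landmark block, truncating
# near-zero singular values for numerical stability.
U, s, Vt = np.linalg.svd(K_mm)
r = int(np.sum(s > 1e-10 * s[0]))
K_mm_pinv = (Vt[:r].T / s[:r]) @ U[:, :r].T
K_approx = K_nm @ K_mm_pinv @ K_nm.T

# Relative spectral-norm error of the approximation.
err = np.linalg.norm(K - K_approx, 2) / np.linalg.norm(K, 2)
```

For a smooth kernel such as the Gaussian, the spectrum of `K` decays quickly, so even `m = 50` landmarks out of 500 points typically give a small relative error; the paper's contribution is sharper error bounds for this scheme and a refined subspace selection beyond i.i.d. landmarks.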
Publication status:
Published
Peer review status:
Peer reviewed

Publication website:
https://proceedings.mlr.press/v202/hayakawa23a.html

Authors


Institution:
University of Oxford
Division:
MPLS
Department:
Mathematical Institute
Oxford college:
St Anne's College
Role:
Author
ORCID:
0000-0002-9972-2809


Publisher:
Proceedings of Machine Learning Research
Host title:
Proceedings of the 40th International Conference on Machine Learning
Volume:
202
Pages:
12678-12699
Publication date:
2023-05-08
Acceptance date:
2023-04-24
Event title:
The 40th International Conference on Machine Learning
Event series:
International Conference on Machine Learning
Event location:
Honolulu, Hawaii, USA
Event website:
https://icml.cc/Conferences/2023
Event start date:
2023-07-23
Event end date:
2023-07-29
ISSN:
2640-3498


Language:
English
Pubs id:
1532598
Local pid:
pubs:1532598
Deposit date:
2023-09-19
