Conference item
Sampling-based Nyström approximation and kernel quadrature
- Abstract:
- We analyze the Nyström approximation of a positive definite kernel associated with a probability measure. We first prove an improved error bound for the conventional Nyström approximation with i.i.d. sampling and singular-value decomposition in the continuous regime; the proof techniques are borrowed from statistical learning theory. We further introduce a refined selection of subspaces in Nyström approximation with theoretical guarantees that is applicable to non-i.i.d. landmark points. Finally, we discuss their application to convex kernel quadrature and give novel theoretical guarantees as well as numerical observations.
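The conventional Nyström approximation mentioned in the abstract can be sketched as follows. This is a generic illustration, not the paper's refined method: it draws i.i.d. landmark points, forms the standard low-rank factorization K ≈ K_nm K_mm⁺ K_nmᵀ, and measures the relative error; the Gaussian kernel, sample sizes, and variable names are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2), evaluated between rows of X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))   # points drawn i.i.d. from the underlying measure
m = 50                          # number of landmark points (illustrative choice)
landmarks = X[rng.choice(len(X), size=m, replace=False)]

# Low-rank Nyström factors: cross-kernel and landmark Gram matrix
K_nm = gaussian_kernel(X, landmarks)
K_mm = gaussian_kernel(landmarks, landmarks)

# K ≈ K_nm K_mm^+ K_nm^T (pseudo-inverse guards against ill-conditioning)
K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

K_full = gaussian_kernel(X, X)
rel_err = np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full)
```

The paper's contribution concerns how the landmark subspace is selected and the resulting error bounds; the uniform i.i.d. sampling above is only the baseline scheme the abstract contrasts against.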
- Publication status:
- Published
- Peer review status:
- Peer reviewed
- Files:
- Version of record (pdf, 502.0 KB)
- Publication website:
- https://proceedings.mlr.press/v202/hayakawa23a.html
- Publisher:
- Proceedings of Machine Learning Research
- Host title:
- Proceedings of the 40th International Conference on Machine Learning
- Volume:
- 202
- Pages:
- 12678-12699
- Publication date:
- 2023-05-08
- Acceptance date:
- 2023-04-24
- Event title:
- The 40th International Conference on Machine Learning
- Event series:
- International Conference on Machine Learning
- Event location:
- Honolulu, Hawaii, USA
- Event website:
- https://icml.cc/Conferences/2023
- Event start date:
- 2023-07-23
- Event end date:
- 2023-07-29
- ISSN:
- 2640-3498
- Language:
- English
- Keywords:
- Pubs id:
- 1532598
- Local pid:
- pubs:1532598
- Deposit date:
- 2023-09-19
Terms of use
- Copyright holder:
- Hayakawa et al.
- Copyright date:
- 2023
- Rights statement:
- ©2023 by the author(s). This is an Open Access article under the CC BY 4.0 license.
- Licence:
- CC Attribution (CC BY)