Journal article

Scalable subspace methods for derivative-free nonlinear least-squares optimization

Abstract:

We introduce a general framework for large-scale model-based derivative-free optimization based on iterative minimization within random subspaces. We present a probabilistic worst-case complexity analysis for our method, where in particular we prove high-probability bounds on the number of iterations before a given level of optimality is achieved. This framework is specialized to nonlinear least-squares problems, using a model-based approach built on the Gauss–Newton method. This method achieves scala...
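
The abstract outlines the core mechanism: at each iteration, a local model is minimized over a randomly drawn low-dimensional subspace, with Gauss–Newton-style models in the nonlinear least-squares case. The sketch below is a minimal, hypothetical illustration of that idea only, not the authors' algorithm or implementation; the function name subspace_dfo_gn_step, its parameters, and the toy residual in the usage snippet are assumptions made purely for this example.

```python
# Minimal sketch (assumed, illustrative only) of a derivative-free Gauss-Newton
# step computed in a random subspace:
#   - residual function r : R^n -> R^m accessed only through evaluations,
#   - subspace dimension p << n chosen by the user,
#   - a finite-difference linear model of r restricted to the subspace.
import numpy as np

def subspace_dfo_gn_step(r, x, p, delta, rng):
    """One illustrative random-subspace Gauss-Newton trust-region step.

    r     : callable mapping x (n,) -> residuals (m,)
    x     : current iterate, shape (n,)
    p     : subspace dimension (p <= n)
    delta : trust-region radius / sampling distance
    rng   : numpy random Generator
    """
    n = x.size
    # Draw a random subspace basis Q (n x p) with orthonormal columns.
    Q, _ = np.linalg.qr(rng.standard_normal((n, p)))

    r0 = r(x)  # residuals at the current point
    # Linear model of r along each subspace direction:
    # Js[:, j] ~ (r(x + h*Q e_j) - r(x)) / h approximates the reduced Jacobian J @ Q.
    h = delta
    Js = np.column_stack([(r(x + h * Q[:, j]) - r0) / h for j in range(p)])

    # Gauss-Newton step in the subspace: minimize ||r0 + Js s||^2 over s in R^p,
    # then truncate to the trust region ||s|| <= delta.
    s, *_ = np.linalg.lstsq(Js, -r0, rcond=None)
    if np.linalg.norm(s) > delta:
        s *= delta / np.linalg.norm(s)

    x_new = x + Q @ s
    # Accept the step only if the sum of squared residuals decreases.
    return x_new if np.sum(r(x_new)**2) < np.sum(r0**2) else x

# Example usage on a toy least-squares problem (hypothetical residual):
rng = np.random.default_rng(0)
r = lambda x: np.concatenate([10.0 * (x[1:] - x[:-1]**2), 1.0 - x[:-1]])
x = np.zeros(100)
for _ in range(50):
    x = subspace_dfo_gn_step(r, x, p=5, delta=0.1, rng=rng)
```

A full method along these lines would also manage the trust-region radius adaptively and reuse residual evaluations across iterations; those details are omitted here.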

Publication status:
Published
Peer review status:
Peer reviewed

Publisher copy:
10.1007/s10107-022-01836-1

Authors


Institution:
University of Oxford
Division:
MPLS
Department:
Mathematical Institute
Oxford college:
Balliol College
Role:
Author
ORCID:
0000-0002-0963-5550
Publisher:
Springer
Journal:
Mathematical Programming
Volume:
199
Pages:
461-524
Publication date:
2022-06-09
Acceptance date:
2022-05-10
DOI:
10.1007/s10107-022-01836-1
EISSN:
1436-4646
ISSN:
0025-5610
Language:
English
Keywords:
Pubs id:
1265676
Local pid:
pubs:1265676
Deposit date:
2022-07-24
