Academic literature on the topic 'Adaptive cubic regularization'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Adaptive cubic regularization.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Adaptive cubic regularization"

1

Zhu, Zhi, and Jingya Chang. "Solving the Adaptive Cubic Regularization Sub-Problem Using the Lanczos Method." Symmetry 14, no. 10 (October 18, 2022): 2191. http://dx.doi.org/10.3390/sym14102191.

Abstract:
The adaptive cubic regularization method solves an unconstrained optimization model by using a third-order regularization term to approximate the objective function at each iteration. As in the trust-region method, the computation of the sub-problem strongly affects the overall efficiency. The Lanczos method is a useful tool for simplifying the objective function of the sub-problem. In this paper, we implement the adaptive cubic regularization method with the aid of the Lanczos method and analyze the error of the Lanczos approximation. We show that both the error between the Lanczos objective function and the original cubic term, and the error between the solution of the Lanczos approximation and the solution of the original cubic sub-problem, are bounded above in terms of the condition number of the Hessian matrix at the optimum. Furthermore, we compare the numerical performance of the adaptive cubic regularization algorithm with and without the Lanczos approximation on unconstrained optimization problems. Numerical experiments show that the Lanczos method remarkably improves the computational efficiency of the adaptive cubic regularization method.
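The abstract above revolves around minimizing the cubic model g^T s + 0.5 s^T B s + (sigma/3)*||s||^3 over a Krylov subspace built by the Lanczos process. The following NumPy/SciPy snippet is a minimal sketch of that idea, assuming a fixed number of Lanczos steps, no reorthogonalization, and a generic BFGS call for the reduced problem; the function names and parameter choices are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def lanczos(B, g, k):
    """Run k Lanczos steps (no reorthogonalization) to build an orthonormal
    basis Q of the Krylov subspace K_k(B, g) and the tridiagonal T = Q^T B Q."""
    n = g.size
    Q = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k)
    q = g / np.linalg.norm(g)
    q_prev = np.zeros(n)
    for j in range(k):
        Q[:, j] = q
        w = B @ q
        alpha[j] = q @ w
        w = w - alpha[j] * q - (beta[j - 1] * q_prev if j > 0 else 0.0)
        beta[j] = np.linalg.norm(w)
        if beta[j] < 1e-12:                      # invariant subspace found: stop early
            Q, alpha, beta = Q[:, : j + 1], alpha[: j + 1], beta[: j + 1]
            break
        q_prev, q = q, w / beta[j]
    T = np.diag(alpha) + np.diag(beta[:-1], 1) + np.diag(beta[:-1], -1)
    return Q, T

def cubic_subproblem_lanczos(B, g, sigma, k=10):
    """Approximately minimize g^T s + 0.5 s^T B s + (sigma/3)*||s||^3 over s = Q y."""
    Q, T = lanczos(B, g, k)
    g_red = Q.T @ g                              # reduced gradient (= ||g|| e_1 in exact arithmetic)
    def reduced_model(y):
        # ||Q y|| = ||y|| because Q has orthonormal columns, so the cubic term carries over exactly
        return g_red @ y + 0.5 * y @ (T @ y) + (sigma / 3.0) * np.linalg.norm(y) ** 3
    y0 = np.zeros(T.shape[0])
    y = minimize(reduced_model, y0, method="BFGS").x   # generic solver stands in for a dedicated one
    return Q @ y

# Tiny usage example on a random positive semidefinite Hessian approximation
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 30))
B = M.T @ M
g = rng.standard_normal(30)
s = cubic_subproblem_lanczos(B, g, sigma=1.0, k=8)
print("model value at s:", g @ s + 0.5 * s @ (B @ s) + (1.0 / 3.0) * np.linalg.norm(s) ** 3)
```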
2

Gould, N. I. M., M. Porcelli, and P. L. Toint. "Updating the regularization parameter in the adaptive cubic regularization algorithm." Computational Optimization and Applications 53, no. 1 (December 21, 2011): 1–22. http://dx.doi.org/10.1007/s10589-011-9446-7.

3

He, Lingyun, Peng Wang, and Detong Zhu. "Projected Adaptive Cubic Regularization Algorithm with Derivative-Free Filter Technique for Box Constrained Optimization." Discrete Dynamics in Nature and Society 2021 (October 30, 2021): 1–13. http://dx.doi.org/10.1155/2021/1496048.

Abstract:
An adaptive projected affine-scaling cubic regularization algorithm with a filter technique is put forward in this paper for solving box-constrained optimization problems without derivatives. The affine-scaling interior-point cubic model is based on quadratic probabilistic interpolation of the objective function. New iterates are obtained from the solutions of the projected adaptive cubic regularization sub-problems combined with the filter technique. We prove convergence of the proposed algorithm under suitable assumptions. Finally, experimental results show that the presented algorithm is effective.
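The abstract above keeps iterates feasible for the box constraints by projecting trial points back onto the box. Only that projection is concrete enough to sketch here; the snippet below is a generic illustration with hypothetical bounds and step values, and it does not reproduce the paper's affine-scaling or filter machinery.

```python
import numpy as np

def project_onto_box(x, lower, upper):
    """Componentwise projection of x onto the box [lower, upper]."""
    return np.clip(x, lower, upper)

# Hypothetical iterate, trial step from a cubic-regularized model, and bounds
x = np.array([0.2, 1.5, -0.3])
s = np.array([0.9, -0.4, -1.0])            # illustrative trial step
lower, upper = np.zeros(3), np.ones(3)     # box constraints 0 <= x <= 1
x_trial = project_onto_box(x + s, lower, upper)
print(x_trial)                             # [1. 1. 0.]
```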
4

Zhang, Junyu, Lin Xiao, and Shuzhong Zhang. "Adaptive Stochastic Variance Reduction for Subsampled Newton Method with Cubic Regularization." INFORMS Journal on Optimization 4, no. 1 (January 2022): 45–64. http://dx.doi.org/10.1287/ijoo.2021.0058.

Abstract:
The cubic regularized Newton method of Nesterov and Polyak has become increasingly popular for nonconvex optimization because of its capability of finding an approximate local solution with a second-order guarantee and its low iteration complexity. Several recent works extend this method to the setting of minimizing the average of N smooth functions by replacing the exact gradients and Hessians with subsampled approximations. It is shown that the total Hessian sample complexity can be reduced to be sublinear in N per iteration by leveraging stochastic variance reduction techniques. We present an adaptive variance reduction scheme for a subsampled Newton method with cubic regularization and show that the expected Hessian sample complexity is [Formula: see text] for finding an [Formula: see text]-approximate local solution (in terms of first- and second-order guarantees, respectively). Moreover, we show that the same Hessian sample complexity is retained with fixed sample sizes if exact gradients are used. The techniques of our analysis are different from previous works in that we do not rely on high-probability bounds based on matrix concentration inequalities. Instead, we derive and utilize new bounds on the third- and fourth-order moments of the average of random matrices, which are of independent interest.
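The abstract above is built around replacing the exact Hessian of a finite-sum objective f(x) = (1/N) * sum_i f_i(x) with a subsampled average inside the cubic-regularized Newton model. The sketch below shows only that basic subsampling step on a toy quadratic finite sum; the sample size, component functions, and helper names are assumptions made for illustration, and the paper's variance-reduction scheme is deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
N, d = 1000, 5

# Toy finite-sum objective f(x) = (1/N) * sum_i 0.5 * x^T A_i x with random PSD A_i
G = rng.standard_normal((N, d, d))
A = np.einsum("nij,nkj->nik", G, G) / d        # A_i = G_i G_i^T / d, symmetric PSD

def hessian_i(i, x):
    """Hessian of the i-th component (constant in x for this toy quadratic)."""
    return A[i]

def subsampled_hessian(x, sample_size):
    """Average the Hessians of a uniformly sampled subset of components."""
    idx = rng.choice(N, size=sample_size, replace=False)
    return sum(hessian_i(i, x) for i in idx) / sample_size

x = rng.standard_normal(d)
H_sub = subsampled_hessian(x, sample_size=64)
H_full = A.mean(axis=0)
print("relative Hessian error:", np.linalg.norm(H_sub - H_full) / np.linalg.norm(H_full))
```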
5

Cartis, Coralia, Nicholas I. M. Gould, and Philippe L. Toint. "Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization." Optimization Methods and Software 27, no. 2 (April 2012): 197–219. http://dx.doi.org/10.1080/10556788.2011.602076.

6

Park, Seonho, Seung Hyun Jung, and Panos M. Pardalos. "Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization." Journal of Optimization Theory and Applications 184, no. 3 (December 24, 2019): 953–71. http://dx.doi.org/10.1007/s10957-019-01624-6.

7

Doikov, Nikita, and Yurii Nesterov. "Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method." Journal of Optimization Theory and Applications 189, no. 1 (March 10, 2021): 317–39. http://dx.doi.org/10.1007/s10957-021-01838-7.

Abstract:
In this paper, we study the iteration complexity of cubic regularization of Newton method for solving composite minimization problems with uniformly convex objective. We introduce the notion of second-order condition number of a certain degree and justify the linear rate of convergence in a nondegenerate case for the method with an adaptive estimate of the regularization parameter. The algorithm automatically achieves the best possible global complexity bound among different problem classes of uniformly convex objective functions with Hölder continuous Hessian of the smooth part of the objective. As a byproduct of our developments, we justify an intuitively plausible result that the global iteration complexity of the Newton method is always better than that of the gradient method on the class of strongly convex functions with uniformly bounded second derivative.
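The abstract above relies on an adaptive estimate of the cubic regularization parameter. A common way to realize such an estimate is a doubling/halving rule: increase the parameter until the cubic model overestimates the objective at the trial step, accept the step, then shrink the parameter again. The sketch below implements that simplified rule on a strongly convex toy problem; the update constants, test function, and stopping rule are assumptions and are not the exact procedure analyzed by Doikov and Nesterov.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative strongly convex test problem: f(x) = 0.5 x^T Q x + 0.25 * sum(x_i^4)
rng = np.random.default_rng(2)
d = 10
C = rng.standard_normal((d, d))
Q = C @ C.T + np.eye(d)                          # symmetric positive definite

def f(x):    return 0.5 * x @ (Q @ x) + 0.25 * np.sum(x ** 4)
def grad(x): return Q @ x + x ** 3
def hess(x): return Q + np.diag(3.0 * x ** 2)

def cubic_step(g, H, M):
    """Global minimizer of g^T s + 0.5 s^T H s + (M/6)*||s||^3 for positive definite H:
    it satisfies (H + (M/2) r I) s = -g with r = ||s||, found here by scalar root-finding."""
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return np.zeros_like(g)
    I = np.eye(len(g))
    phi = lambda r: np.linalg.norm(np.linalg.solve(H + 0.5 * M * r * I, g)) - r
    r_hi = np.sqrt(2.0 * gnorm / M) + 1.0        # phi changes sign on [0, r_hi]
    r = brentq(phi, 0.0, r_hi)
    return -np.linalg.solve(H + 0.5 * M * r * I, g)

def adaptive_cubic_newton(x, M=1.0, tol=1e-8, max_iter=50):
    """Cubic Newton with a doubling/halving estimate of the regularization parameter."""
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        while True:
            s = cubic_step(g, H, M)
            model = f(x) + g @ s + 0.5 * s @ (H @ s) + (M / 6.0) * np.linalg.norm(s) ** 3
            if f(x + s) <= model:                # cubic model overestimates f: accept the step
                break
            M *= 2.0                             # otherwise increase the regularization estimate
        x = x + s
        M = max(M / 2.0, 1e-6)                   # be optimistic again for the next iteration
    return x

x_star = adaptive_cubic_newton(rng.standard_normal(d))
print("final gradient norm:", np.linalg.norm(grad(x_star)))
```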
8

Lu, Sha, Zengxin Wei, and Lue Li. "A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization." Computational Optimization and Applications 51, no. 2 (October 19, 2010): 551–73. http://dx.doi.org/10.1007/s10589-010-9363-1.

9

Li, Qun, Bing Zheng, and Yutao Zheng. "An efficient nonmonotone adaptive cubic regularization method with line search for unconstrained optimization problem." Applied Mathematics Letters 98 (December 2019): 74–80. http://dx.doi.org/10.1016/j.aml.2019.05.040.

10

Bergou, El Houcine, Youssef Diouane, and Serge Gratton. "A Line-Search Algorithm Inspired by the Adaptive Cubic Regularization Framework and Complexity Analysis." Journal of Optimization Theory and Applications 178, no. 3 (July 10, 2018): 885–913. http://dx.doi.org/10.1007/s10957-018-1341-2.


Dissertations / Theses on the topic "Adaptive cubic regularization"

1

Göllner, Thea. "Geometry Optimization of Branched Sheet Metal Structures with a Globalization Strategy by Adaptive Cubic Regularization." München: Verlag Dr. Hut, 2014. http://d-nb.info/1064560539/34.

