Journal articles on the topic 'Adaptive cubic regularization'

Consult the top 20 journal articles for your research on the topic 'Adaptive cubic regularization.'

1

Zhu, Zhi, and Jingya Chang. "Solving the Adaptive Cubic Regularization Sub-Problem Using the Lanczos Method." Symmetry 14, no. 10 (October 18, 2022): 2191. http://dx.doi.org/10.3390/sym14102191.

Abstract:
The adaptive cubic regularization method solves an unconstrained optimization model by using a third-order regularization term to approximate the objective function at each iteration. As in the trust-region method, the solution of the sub-problem strongly affects computational efficiency. The Lanczos method is a useful tool for simplifying the objective function in the sub-problem. In this paper, we implement the adaptive cubic regularization method with the aid of the Lanczos method and analyze the error of the Lanczos approximation. We show that both the error between the Lanczos objective function and the original cubic term, and the error between the solution of the Lanczos approximation and the solution of the original cubic sub-problem, are bounded in terms of the condition number of the Hessian matrix at the optimum. Furthermore, we compare the numerical performance of the adaptive cubic regularization algorithm with and without the Lanczos approximation on unconstrained optimization problems. Numerical experiments show that the Lanczos method markedly improves the computational efficiency of the adaptive cubic method.
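For readers who want to see the mechanism, here is a minimal sketch of the Lanczos reduction the abstract describes: it projects the cubic sub-problem min_s g^T s + (1/2) s^T H s + (sigma/3)||s||^3 onto a small Krylov subspace, where the model keeps its cubic form because the Lanczos basis is orthonormal. This is an illustrative Python sketch, not the authors' code; the function names, the BFGS inner solver, and all tolerances are assumptions.

```python
# A minimal sketch (not the paper's code) of solving the ARC sub-problem
#   min_s  g^T s + 0.5 * s^T H s + (sigma/3) * ||s||^3
# on a Krylov subspace generated by the Lanczos process.
import numpy as np
from scipy.optimize import minimize

def lanczos(H, g, k):
    """k steps of Lanczos on the symmetric matrix H, started from g.
    Returns an orthonormal basis Q and the tridiagonal T = Q^T H Q."""
    Q = [g / np.linalg.norm(g)]
    alpha, beta = [], []
    for j in range(k):
        w = H @ Q[j]
        alpha.append(Q[j] @ w)
        for q in Q:                      # full reorthogonalization
            w = w - (q @ w) * q
        b = np.linalg.norm(w)
        if j == k - 1 or b < 1e-12:      # done, or Krylov space exhausted
            break
        beta.append(b)
        Q.append(w / b)
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return np.column_stack(Q), T

def cubic_subproblem_lanczos(H, g, sigma, k=25):
    """Approximately minimize the cubic model over the Lanczos subspace.
    Since Q has orthonormal columns, ||Q y|| = ||y||, so the cubic term
    keeps its form in the reduced coordinates."""
    Q, T = lanczos(H, g, min(k, g.size))
    gnorm = np.linalg.norm(g)

    def reduced_model(y):
        # g expressed in the basis Q is gnorm * e_1, because the first
        # Lanczos vector is g / ||g||
        return gnorm * y[0] + 0.5 * y @ (T @ y) + (sigma / 3.0) * np.linalg.norm(y) ** 3

    y = minimize(reduced_model, np.zeros(T.shape[0]), method="BFGS").x
    return Q @ y                         # lift the minimizer back to R^n
```

When the Krylov space is exhausted, the reduced problem is equivalent to the full one; otherwise, choosing k much smaller than the dimension trades sub-problem accuracy for cost, which is the trade-off the paper's error bounds quantify.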
2

Gould, N. I. M., M. Porcelli, and P. L. Toint. "Updating the regularization parameter in the adaptive cubic regularization algorithm." Computational Optimization and Applications 53, no. 1 (December 21, 2011): 1–22. http://dx.doi.org/10.1007/s10589-011-9446-7.

3

He, Lingyun, Peng Wang, and Detong Zhu. "Projected Adaptive Cubic Regularization Algorithm with Derivative-Free Filter Technique for Box Constrained Optimization." Discrete Dynamics in Nature and Society 2021 (October 30, 2021): 1–13. http://dx.doi.org/10.1155/2021/1496048.

Abstract:
This paper puts forward an adaptive projected affine-scaling cubic regularization algorithm that uses a filter technique to solve box-constrained optimization problems without derivatives. The affine-scaling interior-point cubic model is based on a probabilistic quadratic interpolation approach to the objective function. New iterates are obtained as solutions of the projected adaptive cubic regularization sub-problems, combined with the filter technique. We prove convergence of the proposed algorithm under suitable assumptions. Finally, experimental results show that the presented algorithm is effective.
4

Zhang, Junyu, Lin Xiao, and Shuzhong Zhang. "Adaptive Stochastic Variance Reduction for Subsampled Newton Method with Cubic Regularization." INFORMS Journal on Optimization 4, no. 1 (January 2022): 45–64. http://dx.doi.org/10.1287/ijoo.2021.0058.

Abstract:
The cubic regularized Newton method of Nesterov and Polyak has become increasingly popular for nonconvex optimization because of its capability of finding an approximate local solution with a second-order guarantee and its low iteration complexity. Several recent works extend this method to the setting of minimizing the average of N smooth functions by replacing the exact gradients and Hessians with subsampled approximations. It has been shown that the total Hessian sample complexity can be made sublinear in N per iteration by leveraging stochastic variance reduction techniques. We present an adaptive variance reduction scheme for a subsampled Newton method with cubic regularization and show that the expected Hessian sample complexity is [Formula: see text] for finding an [Formula: see text]-approximate local solution (in terms of first- and second-order guarantees, respectively). Moreover, we show that the same Hessian sample complexity is retained with fixed sample sizes if exact gradients are used. The techniques of our analysis differ from previous works in that we do not rely on high-probability bounds based on matrix concentration inequalities. Instead, we derive and use new bounds on the third- and fourth-order moments of the average of random matrices, which are of independent interest.
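The variance-reduction idea in this abstract can be illustrated with an SVRG-style Hessian estimator: keep a full (or large-sample) Hessian at an anchor point and correct it with a small subsample at the current point. This is a common template offered only as orientation, not necessarily the paper's exact scheme; hess_i is an assumed per-component Hessian oracle and the batch size is arbitrary.

```python
# Illustrative SVRG-style variance-reduced Hessian estimator for
# f(x) = (1/N) * sum_i f_i(x). A common template, NOT necessarily the
# paper's exact scheme; hess_i(x, i) is an assumed oracle returning
# the Hessian of f_i at x.
import numpy as np

def subsampled_hessian(hess_i, x, idx):
    """Average of the per-component Hessians over the index sample idx."""
    return sum(hess_i(x, i) for i in idx) / len(idx)

def vr_hessian(hess_i, x, x_anchor, H_anchor, N, batch, rng):
    """SVRG-style estimate: anchor Hessian plus a subsampled correction,
    whose variance shrinks as x approaches the anchor point."""
    idx = rng.choice(N, size=batch, replace=False)
    return H_anchor + (subsampled_hessian(hess_i, x, idx)
                       - subsampled_hessian(hess_i, x_anchor, idx))

# Hypothetical usage: refresh the anchor occasionally, reuse it in between.
# rng = np.random.default_rng(0)
# H_anchor = subsampled_hessian(hess_i, x_anchor, range(N))  # full pass
# H_k = vr_hessian(hess_i, x_k, x_anchor, H_anchor, N, batch=64, rng=rng)
# H_k then replaces the exact Hessian in a cubic-regularized Newton step.
```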
5

Cartis, Coralia, Nicholas I. M. Gould, and Philippe L. Toint. "Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization." Optimization Methods and Software 27, no. 2 (April 2012): 197–219. http://dx.doi.org/10.1080/10556788.2011.602076.

6

Park, Seonho, Seung Hyun Jung, and Panos M. Pardalos. "Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization." Journal of Optimization Theory and Applications 184, no. 3 (December 24, 2019): 953–71. http://dx.doi.org/10.1007/s10957-019-01624-6.

7

Doikov, Nikita, and Yurii Nesterov. "Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method." Journal of Optimization Theory and Applications 189, no. 1 (March 10, 2021): 317–39. http://dx.doi.org/10.1007/s10957-021-01838-7.

Abstract:
In this paper, we study the iteration complexity of cubic regularization of the Newton method for solving composite minimization problems with a uniformly convex objective. We introduce the notion of a second-order condition number of a certain degree and justify a linear rate of convergence in the nondegenerate case for the method with an adaptive estimate of the regularization parameter. The algorithm automatically achieves the best possible global complexity bound among the different problem classes of uniformly convex objective functions with Hölder continuous Hessian of the smooth part of the objective. As a byproduct of our developments, we justify the intuitively plausible result that the global iteration complexity of the Newton method is always better than that of the gradient method on the class of strongly convex functions with uniformly bounded second derivative.
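The "adaptive estimate of the regularization parameter" mentioned here, like the update rules studied in entry 2 above, typically follows a trust-region-style ratio test: compare the actual decrease of the objective with the decrease predicted by the cubic model, then accept or reject the step and loosen or tighten sigma accordingly. Below is a minimal hedged sketch of that generic mechanism, not the specific rule of this paper; the constants are illustrative, and it reuses cubic_subproblem_lanczos from the sketch under entry 1.

```python
# Minimal sketch of an ARC-style outer loop with an adaptively updated
# regularization parameter sigma. Constants and names are illustrative;
# this shows the generic mechanism, not this paper's specific rule.
import numpy as np

def arc_minimize(f, grad, hess, x, sigma=1.0, tol=1e-6, max_iter=200,
                 eta1=0.1, eta2=0.9, gamma=2.0, sigma_min=1e-8):
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x)
        s = cubic_subproblem_lanczos(H, g, sigma)   # from the entry-1 sketch
        pred = -(g @ s + 0.5 * s @ (H @ s)
                 + (sigma / 3.0) * np.linalg.norm(s) ** 3)  # model decrease
        rho = (f(x) - f(x + s)) / max(pred, 1e-16)          # agreement ratio
        if rho >= eta1:            # successful: accept the step
            x = x + s
            if rho >= eta2:        # very successful: loosen regularization
                sigma = max(sigma / gamma, sigma_min)
        else:                      # unsuccessful: reject and regularize more
            sigma *= gamma
    return x
```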
8

Lu, Sha, Zengxin Wei, and Lue Li. "A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization." Computational Optimization and Applications 51, no. 2 (October 19, 2010): 551–73. http://dx.doi.org/10.1007/s10589-010-9363-1.

9

Li, Qun, Bing Zheng, and Yutao Zheng. "An efficient nonmonotone adaptive cubic regularization method with line search for unconstrained optimization problem." Applied Mathematics Letters 98 (December 2019): 74–80. http://dx.doi.org/10.1016/j.aml.2019.05.040.

10

Bergou, El Houcine, Youssef Diouane, and Serge Gratton. "A Line-Search Algorithm Inspired by the Adaptive Cubic Regularization Framework and Complexity Analysis." Journal of Optimization Theory and Applications 178, no. 3 (July 10, 2018): 885–913. http://dx.doi.org/10.1007/s10957-018-1341-2.

11

Bergou, E., Y. Diouane, and S. Gratton. "On the use of the energy norm in trust-region and adaptive cubic regularization subproblems." Computational Optimization and Applications 68, no. 3 (July 21, 2017): 533–54. http://dx.doi.org/10.1007/s10589-017-9929-2.

12

Cartis, C., N. I. M. Gould, and P. L. Toint. "An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity." IMA Journal of Numerical Analysis 32, no. 4 (January 17, 2012): 1662–95. http://dx.doi.org/10.1093/imanum/drr035.

13

Dussault, Jean-Pierre. "ARCq: a new adaptive regularization by cubics." Optimization Methods and Software 33, no. 2 (May 25, 2017): 322–35. http://dx.doi.org/10.1080/10556788.2017.1322080.

14

Curtis, Frank E., Daniel P. Robinson, and Mohammadreza Samadi. "An inexact regularized Newton framework with a worst-case iteration complexity of ${\mathscr O}(\varepsilon^{-3/2})$ for nonconvex optimization." IMA Journal of Numerical Analysis 39, no. 3 (May 8, 2018): 1296–327. http://dx.doi.org/10.1093/imanum/dry022.

Abstract:
An algorithm for solving smooth nonconvex optimization problems is proposed that, in the worst case, takes ${\mathscr O}(\varepsilon^{-3/2})$ iterations to drive the norm of the gradient of the objective function below a prescribed positive real number $\varepsilon$ and can take ${\mathscr O}(\varepsilon^{-3})$ iterations to drive the leftmost eigenvalue of the Hessian of the objective above $-\varepsilon$. The proposed algorithm is a general framework that covers a wide range of techniques, including quadratically and cubically regularized Newton methods such as the Adaptive Regularization using Cubics (ARC) method and the recently proposed Trust-Region Algorithm with Contractions and Expansions (TRACE). The generality of our method is achieved through the introduction of generic conditions that each trial step is required to satisfy, which in particular allows inexact regularized Newton steps to be used. These conditions center around a new subproblem that can be approximately solved to obtain trial steps satisfying the conditions. A new instance of the framework, distinct from ARC and TRACE, is described that may be viewed as a hybrid between quadratically and cubically regularized Newton methods. Numerical results demonstrate that our hybrid algorithm outperforms a cubically regularized Newton method.
15

Dehghan Niri, T., M. Heydari, and M. M. Hosseini. "An improvement of adaptive cubic regularization method for unconstrained optimization problems." International Journal of Computer Mathematics, March 11, 2020, 1–17. http://dx.doi.org/10.1080/00207160.2020.1738406.

16

Pei, Yonggang, Shaofang Song, and Detong Zhu. "A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization." Numerical Algorithms, December 8, 2022. http://dx.doi.org/10.1007/s11075-022-01475-9.

17

Dehghan Niri, T., M. Heydari, and M. M. Hosseini. "Two modified adaptive cubic regularization algorithms by using the nonmonotone Armijo-type line search." Optimization, May 15, 2022, 1–24. http://dx.doi.org/10.1080/02331934.2022.2075746.

18

Bellavia, Stefania, Gianmarco Gurioli, and Benedetta Morini. "Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization." IMA Journal of Numerical Analysis, April 23, 2020. http://dx.doi.org/10.1093/imanum/drz076.

Abstract:
We consider the adaptive regularization with cubics (ARC) approach for solving nonconvex optimization problems and propose a new variant based on inexact Hessian information chosen dynamically. A theoretical analysis of the proposed procedure is given. The key property of the ARC framework, namely optimal worst-case function/derivative evaluation bounds for reaching first- and second-order critical points, is preserved. The application to large-scale finite-sum minimization based on subsampled Hessians is discussed and analyzed in both a deterministic and a probabilistic manner, and supported by numerical experiments on synthetic and real datasets.
19

Bellavia, Stefania, and Gianmarco Gurioli. "Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy." Optimization, February 28, 2021, 1–35. http://dx.doi.org/10.1080/02331934.2021.1892104.

20

Agarwal, Naman, Nicolas Boumal, Brian Bullins, and Coralia Cartis. "Adaptive regularization with cubics on manifolds." Mathematical Programming, May 13, 2020. http://dx.doi.org/10.1007/s10107-020-01505-1.
