Scientific literature on the topic "Adaptive cubic regularization"

Create a correct reference in APA, MLA, Chicago, Harvard, and several other styles

Choose a source:

Consult the thematic lists of journal articles, books, theses, conference reports, and other academic sources on the topic "Adaptive cubic regularization".

Next to each source in the list of references there is an "Add to bibliography" button. Click this button, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication in PDF format and read its abstract online when this information is included in the metadata.

Journal articles on the topic "Adaptive cubic regularization"

1

Zhu, Zhi, and Jingya Chang. "Solving the Adaptive Cubic Regularization Sub-Problem Using the Lanczos Method." Symmetry 14, no. 10 (October 18, 2022): 2191. http://dx.doi.org/10.3390/sym14102191.

Abstract:
The adaptive cubic regularization method solves an unconstrained optimization problem by using a third-order regularization term to approximate the objective function at each iteration. As in the trust-region method, the computation of the sub-problem strongly affects overall efficiency. The Lanczos method is a useful tool for simplifying the objective function in the sub-problem. In this paper, we implement the adaptive cubic regularization method with the aid of the Lanczos method and analyze the error of the Lanczos approximation. We show that both the error between the Lanczos objective function and the original cubic term, and the error between the solution of the Lanczos approximation and the solution of the original cubic sub-problem, are bounded in terms of the condition number of the optimal Hessian matrix. Furthermore, we compare the numerical performance of the adaptive cubic regularization algorithm with and without the Lanczos approximation on unconstrained optimization problems. Numerical experiments show that the Lanczos method markedly improves the computational efficiency of the adaptive cubic regularization method.
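For orientation, here is a minimal sketch of the sub-problem in question and of its Lanczos reduction, written in standard adaptive-cubic-regularization notation that is assumed here rather than taken from the article (f_k, g_k, B_k, and sigma_k denote the current function value, gradient, Hessian approximation, and regularization weight):

```latex
% Cubic-regularized model (approximately) minimized at each iteration:
\[
  m_k(s) \;=\; f_k + g_k^{\top} s + \tfrac{1}{2}\, s^{\top} B_k s
               + \tfrac{\sigma_k}{3}\, \|s\|^{3}, \qquad s \in \mathbb{R}^{n}.
\]
% Lanczos reduction: build an orthonormal Krylov basis Q_j from g_k, so that
% Q_j^{\top} B_k Q_j = T_j is tridiagonal and Q_j^{\top} g_k = \|g_k\|\, e_1.
% Restricting s = Q_j y (hence \|s\| = \|y\|) yields the small sub-problem
\[
  \min_{y \in \mathbb{R}^{j}} \;\; \|g_k\|\, e_1^{\top} y
      + \tfrac{1}{2}\, y^{\top} T_j y + \tfrac{\sigma_k}{3}\, \|y\|^{3}.
\]
```

Because T_j is tridiagonal and j is typically much smaller than n, the reduced problem is cheap to solve; this is the simplification the abstract attributes to the Lanczos method.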
2

Gould, N. I. M., M. Porcelli, and P. L. Toint. "Updating the regularization parameter in the adaptive cubic regularization algorithm." Computational Optimization and Applications 53, no. 1 (December 21, 2011): 1–22. http://dx.doi.org/10.1007/s10589-011-9446-7.

3

He, Lingyun, Peng Wang, and Detong Zhu. "Projected Adaptive Cubic Regularization Algorithm with Derivative-Free Filter Technique for Box Constrained Optimization." Discrete Dynamics in Nature and Society 2021 (October 30, 2021): 1–13. http://dx.doi.org/10.1155/2021/1496048.

Abstract:
This paper puts forward an adaptive projected affine-scaling cubic regularization algorithm with a filter technique for solving box-constrained optimization problems without derivatives. The affine-scaling interior-point cubic model is based on a quadratic probabilistic interpolation of the objective function. New iterates are obtained from the solutions of the projected adaptive cubic regularization algorithm with the filter technique. We prove convergence of the proposed algorithm under certain assumptions. Finally, numerical results show that the presented algorithm is effective.
4

Zhang, Junyu, Lin Xiao, and Shuzhong Zhang. "Adaptive Stochastic Variance Reduction for Subsampled Newton Method with Cubic Regularization." INFORMS Journal on Optimization 4, no. 1 (January 2022): 45–64. http://dx.doi.org/10.1287/ijoo.2021.0058.

Abstract:
The cubic regularized Newton method of Nesterov and Polyak has become increasingly popular for nonconvex optimization because of its capability of finding an approximate local solution with a second-order guarantee and its low iteration complexity. Several recent works extend this method to the setting of minimizing the average of N smooth functions by replacing the exact gradients and Hessians with subsampled approximations. It is shown that the total Hessian sample complexity can be reduced to be sublinear in N per iteration by leveraging stochastic variance reduction techniques. We present an adaptive variance reduction scheme for a subsampled Newton method with cubic regularization and show that the expected Hessian sample complexity is [Formula: see text] for finding an [Formula: see text]-approximate local solution (in terms of first- and second-order guarantees, respectively). Moreover, we show that the same Hessian sample complexity is retained with fixed sample sizes if exact gradients are used. The techniques of our analysis differ from those of previous works in that we do not rely on high-probability bounds based on matrix concentration inequalities. Instead, we derive and use new bounds on the third- and fourth-order moments of the average of random matrices, which are of independent interest.
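As a concrete illustration of a subsampled cubic-regularized Newton step of the kind discussed above, here is a minimal Python/NumPy sketch. It is not the authors' algorithm: the variance-reduction scheme and sample-size rules from the paper are omitted, the cubic sub-problem is solved by bisection on the secular equation while ignoring the so-called hard case, and all names (solve_cubic_subproblem, subsampled_cubic_newton_step, sigma, m) are illustrative.

```python
import numpy as np

def solve_cubic_subproblem(g, H, sigma, tol=1e-10, max_iter=200):
    """Minimize g^T s + 0.5 s^T H s + (sigma/3)*||s||^3 (small, dense case).

    Uses the optimality condition (H + lam*I) s = -g with lam = sigma*||s||
    and H + lam*I positive semidefinite, located by bisection on lam.
    The degenerate "hard case" is ignored in this sketch.
    """
    n = len(g)
    eig_min = np.linalg.eigvalsh(H)[0]
    lo = max(0.0, -eig_min) + 1e-12        # smallest admissible lam
    hi = max(lo, 1.0)

    def step_norm(lam):
        return np.linalg.norm(np.linalg.solve(H + lam * np.eye(n), -g))

    while sigma * step_norm(hi) > hi:      # grow hi until the root is bracketed
        hi *= 2.0
    for _ in range(max_iter):
        lam = 0.5 * (lo + hi)
        if sigma * step_norm(lam) > lam:
            lo = lam
        else:
            hi = lam
        if hi - lo <= tol * max(1.0, hi):
            break
    return np.linalg.solve(H + 0.5 * (lo + hi) * np.eye(n), -g)

def subsampled_cubic_newton_step(x, full_grad, hess_i, n_terms, m, sigma, rng):
    """One step: exact gradient, Hessian averaged over a random minibatch."""
    idx = rng.choice(n_terms, size=m, replace=False)
    H = sum(hess_i(x, i) for i in idx) / m  # plain subsampled Hessian estimate
    s = solve_cubic_subproblem(full_grad(x), H, sigma)
    return x + s
```

A caller would supply full_grad and hess_i for a finite-sum objective f(x) = (1/N) Σ_i f_i(x); the adaptive sample sizes and the variance-reduced Hessian estimator analyzed in the paper would replace the plain minibatch average used here.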
5

Cartis, Coralia, Nicholas I. M. Gould, and Philippe L. Toint. "Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization." Optimization Methods and Software 27, no. 2 (April 2012): 197–219. http://dx.doi.org/10.1080/10556788.2011.602076.

6

Park, Seonho, Seung Hyun Jung, and Panos M. Pardalos. "Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization." Journal of Optimization Theory and Applications 184, no. 3 (December 24, 2019): 953–71. http://dx.doi.org/10.1007/s10957-019-01624-6.

7

Doikov, Nikita, and Yurii Nesterov. "Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method." Journal of Optimization Theory and Applications 189, no. 1 (March 10, 2021): 317–39. http://dx.doi.org/10.1007/s10957-021-01838-7.

Abstract:
In this paper, we study the iteration complexity of cubic regularization of Newton method for solving composite minimization problems with uniformly convex objective. We introduce the notion of second-order condition number of a certain degree and justify the linear rate of convergence in a nondegenerate case for the method with an adaptive estimate of the regularization parameter. The algorithm automatically achieves the best possible global complexity bound among different problem classes of uniformly convex objective functions with Hölder continuous Hessian of the smooth part of the objective. As a byproduct of our developments, we justify an intuitively plausible result that the global iteration complexity of the Newton method is always better than that of the gradient method on the class of strongly convex functions with uniformly bounded second derivative.
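The "adaptive estimate of the regularization parameter" mentioned in this abstract is, in the broader adaptive cubic regularization literature, usually driven by a trust-region-style ratio test. The sketch below shows that generic mechanism only: the constants and the acceptance rule are illustrative, not the specific scheme analyzed by Doikov and Nesterov, and solve_subproblem stands for any cubic sub-problem solver such as the one sketched earlier.

```python
import numpy as np

def adaptive_cubic_regularization(f, grad, hess, x0, solve_subproblem,
                                  sigma0=1.0, eta1=0.1, eta2=0.9,
                                  gamma=2.0, tol=1e-6, max_iter=100):
    """Generic ARC-style outer loop with a ratio test for updating sigma.

    solve_subproblem(g, H, sigma) should return an (approximate) minimizer
    of g^T s + 0.5 s^T H s + (sigma/3)*||s||^3. The update rule below is a
    textbook-style sketch, not the scheme analyzed in the cited paper.
    """
    x, sigma = np.asarray(x0, dtype=float), sigma0
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) <= tol:
            break
        s = solve_subproblem(g, H, sigma)
        model_decrease = -(g @ s + 0.5 * s @ H @ s
                           + sigma / 3.0 * np.linalg.norm(s) ** 3)
        rho = (f(x) - f(x + s)) / max(model_decrease, 1e-16)
        if rho >= eta1:           # successful: accept the trial step
            x = x + s
            if rho >= eta2:       # very successful: relax regularization
                sigma = max(sigma / gamma, 1e-16)
        else:                     # unsuccessful: reject and regularize harder
            sigma *= gamma
    return x
```

Successful steps with a large ratio relax the regularization, while unsuccessful steps increase it; this adaptivity is what the complexity analyses in these papers quantify.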
8

Lu, Sha, Zengxin Wei, and Lue Li. "A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization." Computational Optimization and Applications 51, no. 2 (October 19, 2010): 551–73. http://dx.doi.org/10.1007/s10589-010-9363-1.

9

Li, Qun, Bing Zheng, and Yutao Zheng. "An efficient nonmonotone adaptive cubic regularization method with line search for unconstrained optimization problem." Applied Mathematics Letters 98 (December 2019): 74–80. http://dx.doi.org/10.1016/j.aml.2019.05.040.

10

Bergou, El Houcine, Youssef Diouane, and Serge Gratton. "A Line-Search Algorithm Inspired by the Adaptive Cubic Regularization Framework and Complexity Analysis." Journal of Optimization Theory and Applications 178, no. 3 (July 10, 2018): 885–913. http://dx.doi.org/10.1007/s10957-018-1341-2.


Theses on the topic "Adaptive cubic regularization"

1

Göllner, Thea [Author]. "Geometry Optimization of Branched Sheet Metal Structures with a Globalization Strategy by Adaptive Cubic Regularization / Thea Göllner." München: Verlag Dr. Hut, 2014. http://d-nb.info/1064560539/34.

