Journal articles on the topic "Adaptive cubic regularization"

Listed below are the top 20 journal articles on the topic "Adaptive cubic regularization".

1

Zhu, Zhi, and Jingya Chang. "Solving the Adaptive Cubic Regularization Sub-Problem Using the Lanczos Method." Symmetry 14, no. 10 (October 18, 2022): 2191. http://dx.doi.org/10.3390/sym14102191.

Abstract:
The adaptive cubic regularization method solves an unconstrained optimization model by using a third-order regularization term to approximate the objective function at each iteration. As in the trust-region method, the computation of the sub-problem strongly affects overall efficiency. The Lanczos method is a useful tool for simplifying the objective function in the sub-problem. In this paper, we implement the adaptive cubic regularization method with the aid of the Lanczos method and analyze the error of the Lanczos approximation. We show that both the error between the Lanczos objective function and the original cubic term, and the error between the solution of the Lanczos approximation and the solution of the original cubic sub-problem, are bounded in terms of the condition number of the Hessian at the optimum. Furthermore, we compare the numerical performance of the adaptive cubic regularization algorithm with and without the Lanczos approximation on unconstrained optimization problems. Numerical experiments show that the Lanczos method improves the computational efficiency of the adaptive cubic method remarkably.
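
For reference, the sub-problem in question is the minimization of the cubic model below, written in standard ARC notation (assumed here; the paper's own notation may differ):

\[
  \min_{s \in \mathbb{R}^n} \; m_k(s) = f(x_k) + g_k^{\top} s + \tfrac{1}{2}\, s^{\top} H_k s + \tfrac{\sigma_k}{3}\, \lVert s \rVert^3 .
\]

The Lanczos method restricts $s$ to a Krylov subspace: with an orthonormal Lanczos basis $Q_j$ and $s = Q_j u$, the model reduces (dropping the constant $f(x_k)$) to a problem with a small tridiagonal matrix, since $\lVert Q_j u \rVert = \lVert u \rVert$:

\[
  \min_{u \in \mathbb{R}^j} \; \lVert g_k \rVert\, e_1^{\top} u + \tfrac{1}{2}\, u^{\top} T_j u + \tfrac{\sigma_k}{3}\, \lVert u \rVert^3 ,
  \qquad T_j = Q_j^{\top} H_k Q_j .
\]
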
2

Gould, N. I. M., M. Porcelli, and P. L. Toint. "Updating the regularization parameter in the adaptive cubic regularization algorithm." Computational Optimization and Applications 53, no. 1 (December 21, 2011): 1–22. http://dx.doi.org/10.1007/s10589-011-9446-7.

3

He, Lingyun, Peng Wang, and Detong Zhu. "Projected Adaptive Cubic Regularization Algorithm with Derivative-Free Filter Technique for Box Constrained Optimization." Discrete Dynamics in Nature and Society 2021 (October 30, 2021): 1–13. http://dx.doi.org/10.1155/2021/1496048.

Abstract:
An adaptive projected affine-scaling algorithm for cubic regularization, using a filter technique to solve box-constrained optimization problems without derivatives, is put forward in this paper. The affine-scaling interior-point cubic model is based on a quadratic probabilistic interpolation approach to the objective function. New iterates are obtained from the solutions of the projected adaptive cubic regularization sub-problems combined with the filter technique. We prove the convergence of the proposed algorithm under some assumptions. Finally, experimental results show that the presented algorithm is effective.
4

Zhang, Junyu, Lin Xiao, and Shuzhong Zhang. "Adaptive Stochastic Variance Reduction for Subsampled Newton Method with Cubic Regularization." INFORMS Journal on Optimization 4, no. 1 (January 2022): 45–64. http://dx.doi.org/10.1287/ijoo.2021.0058.

Abstract:
The cubic regularized Newton method of Nesterov and Polyak has become increasingly popular for nonconvex optimization because of its capability of finding an approximate local solution with a second-order guarantee and its low iteration complexity. Several recent works extend this method to the setting of minimizing the average of N smooth functions by replacing the exact gradients and Hessians with subsampled approximations. It has been shown that the total Hessian sample complexity can be made sublinear in N per iteration by leveraging stochastic variance reduction techniques. We present an adaptive variance reduction scheme for a subsampled Newton method with cubic regularization and show that the expected Hessian sample complexity is [Formula: see text] for finding an [Formula: see text]-approximate local solution (in terms of first- and second-order guarantees, respectively). Moreover, we show that the same Hessian sample complexity is retained with fixed sample sizes if exact gradients are used. The techniques of our analysis differ from previous works in that we do not rely on high-probability bounds based on matrix concentration inequalities. Instead, we derive and use new bounds on the third and fourth moments of the average of random matrices, which are of independent interest.
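
For intuition, the variance-reduction idea for Hessians can be sketched as an SVRG-style estimator: subsample only the difference between per-sample Hessians at the current point and at a snapshot point. The sketch below is a minimal illustration under these assumptions, not the authors' algorithm; hess_i is a hypothetical per-component Hessian oracle.

import numpy as np

def vr_hessian(hess_i, n, x, x_snap, H_snap, batch_size, rng):
    """Variance-reduced subsampled Hessian estimate at x (schematic).

    hess_i(i, x): hypothetical oracle for the Hessian of the i-th of
    the n component functions; H_snap is a full (or high-accuracy)
    Hessian computed once at the snapshot point x_snap.
    """
    idx = rng.choice(n, size=batch_size, replace=False)
    # Only the *difference* of per-sample Hessians is subsampled, so the
    # estimator is unbiased and its variance shrinks as x approaches x_snap.
    diff = sum(hess_i(i, x) - hess_i(i, x_snap) for i in idx) / batch_size
    return H_snap + diff

Because the snapshot Hessian is refreshed only occasionally, each iteration needs only a small batch of per-sample Hessians, which is where the sublinear-in-N per-iteration sample complexity comes from.
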
5

Cartis, Coralia, Nicholas I. M. Gould, and Philippe L. Toint. "Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization." Optimization Methods and Software 27, no. 2 (April 2012): 197–219. http://dx.doi.org/10.1080/10556788.2011.602076.

6

Park, Seonho, Seung Hyun Jung, and Panos M. Pardalos. "Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization." Journal of Optimization Theory and Applications 184, no. 3 (December 24, 2019): 953–71. http://dx.doi.org/10.1007/s10957-019-01624-6.

7

Doikov, Nikita, and Yurii Nesterov. "Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method." Journal of Optimization Theory and Applications 189, no. 1 (March 10, 2021): 317–39. http://dx.doi.org/10.1007/s10957-021-01838-7.

Abstract:
In this paper, we study the iteration complexity of the cubic regularization of Newton's method for solving composite minimization problems with a uniformly convex objective. We introduce the notion of a second-order condition number of a certain degree and justify a linear rate of convergence in the nondegenerate case for the method with an adaptive estimate of the regularization parameter. The algorithm automatically achieves the best possible global complexity bound among different problem classes of uniformly convex objective functions with Hölder continuous Hessian of the smooth part of the objective. As a byproduct of our developments, we justify an intuitively plausible result: the global iteration complexity of the Newton method is always better than that of the gradient method on the class of strongly convex functions with uniformly bounded second derivative.
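
For reference, uniform convexity of degree $q \ge 2$ with parameter $\sigma > 0$ (one common normalization, assumed here) means

\[
  f(y) \;\ge\; f(x) + \langle \nabla f(x),\, y - x \rangle + \tfrac{\sigma}{q}\, \lVert y - x \rVert^{q}
  \quad \text{for all } x, y ,
\]

so $q = 2$ recovers strong convexity.
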
8

Lu, Sha, Zengxin Wei, and Lue Li. "A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization." Computational Optimization and Applications 51, no. 2 (October 19, 2010): 551–73. http://dx.doi.org/10.1007/s10589-010-9363-1.

9

Li, Qun, Bing Zheng, and Yutao Zheng. "An efficient nonmonotone adaptive cubic regularization method with line search for unconstrained optimization problem." Applied Mathematics Letters 98 (December 2019): 74–80. http://dx.doi.org/10.1016/j.aml.2019.05.040.

10

Bergou, El Houcine, Youssef Diouane, and Serge Gratton. "A Line-Search Algorithm Inspired by the Adaptive Cubic Regularization Framework and Complexity Analysis." Journal of Optimization Theory and Applications 178, no. 3 (July 10, 2018): 885–913. http://dx.doi.org/10.1007/s10957-018-1341-2.

11

Bergou, E., Y. Diouane, and S. Gratton. "On the use of the energy norm in trust-region and adaptive cubic regularization subproblems." Computational Optimization and Applications 68, no. 3 (July 21, 2017): 533–54. http://dx.doi.org/10.1007/s10589-017-9929-2.

12

Cartis, C., N. I. M. Gould, and P. L. Toint. "An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity." IMA Journal of Numerical Analysis 32, no. 4 (January 17, 2012): 1662–95. http://dx.doi.org/10.1093/imanum/drr035.

13

Dussault, Jean-Pierre. "ARCq: a new adaptive regularization by cubics." Optimization Methods and Software 33, no. 2 (May 25, 2017): 322–35. http://dx.doi.org/10.1080/10556788.2017.1322080.

14

Curtis, Frank E., Daniel P. Robinson, and Mohammadreza Samadi. "An inexact regularized Newton framework with a worst-case iteration complexity of ${\mathscr O}(\varepsilon^{-3/2})$ for nonconvex optimization." IMA Journal of Numerical Analysis 39, no. 3 (May 8, 2018): 1296–327. http://dx.doi.org/10.1093/imanum/dry022.

Abstract:
An algorithm for solving smooth nonconvex optimization problems is proposed that, in the worst case, takes ${\mathscr O}(\varepsilon^{-3/2})$ iterations to drive the norm of the gradient of the objective function below a prescribed positive real number $\varepsilon$ and can take ${\mathscr O}(\varepsilon^{-3})$ iterations to drive the leftmost eigenvalue of the Hessian of the objective above $-\varepsilon$. The proposed algorithm is a general framework that covers a wide range of techniques, including quadratically and cubically regularized Newton methods such as the Adaptive Regularization using Cubics (ARC) method and the recently proposed Trust-Region Algorithm with Contractions and Expansions (TRACE). The generality of our method is achieved through the introduction of generic conditions that each trial step is required to satisfy, which in particular allow inexact regularized Newton steps to be used. These conditions center around a new subproblem that can be approximately solved to obtain trial steps satisfying the conditions. A new instance of the framework, distinct from ARC and TRACE, is described that may be viewed as a hybrid between quadratically and cubically regularized Newton methods. Numerical results demonstrate that our hybrid algorithm outperforms a cubically regularized Newton method.
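
As context for the quadratic/cubic hybrid mentioned above, the two regularized Newton models being blended take the following generic forms (illustrative only, not the paper's exact subproblem):

\[
  m_k^{\mathrm{quad}}(s) = g_k^{\top} s + \tfrac{1}{2}\, s^{\top} H_k s + \tfrac{\lambda_k}{2}\, \lVert s \rVert^2 ,
  \qquad
  m_k^{\mathrm{cub}}(s) = g_k^{\top} s + \tfrac{1}{2}\, s^{\top} H_k s + \tfrac{\sigma_k}{3}\, \lVert s \rVert^3 .
\]
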
15

Dehghan Niri, T., M. Heydari, and M. M. Hosseini. "An improvement of adaptive cubic regularization method for unconstrained optimization problems." International Journal of Computer Mathematics, March 11, 2020, 1–17. http://dx.doi.org/10.1080/00207160.2020.1738406.

16

Pei, Yonggang, Shaofang Song, and Detong Zhu. "A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization." Numerical Algorithms, December 8, 2022. http://dx.doi.org/10.1007/s11075-022-01475-9.

17

Dehghan Niri, T., M. Heydari, and M. M. Hosseini. "Two modified adaptive cubic regularization algorithms by using the nonmonotone Armijo-type line search." Optimization, May 15, 2022, 1–24. http://dx.doi.org/10.1080/02331934.2022.2075746.

18

Bellavia, Stefania, Gianmarco Gurioli, and Benedetta Morini. "Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization." IMA Journal of Numerical Analysis, April 23, 2020. http://dx.doi.org/10.1093/imanum/drz076.

Abstract:
We consider the adaptive regularization with cubics (ARC) approach for solving nonconvex optimization problems and propose a new variant based on inexact Hessian information chosen dynamically. A theoretical analysis of the proposed procedure is given. The key property of the ARC framework, namely optimal worst-case function/derivative evaluation bounds for reaching first- and second-order critical points, is guaranteed. An application to large-scale finite-sum minimization based on subsampled Hessians is discussed and analyzed in both a deterministic and a probabilistic manner, and is supported by numerical experiments on synthetic and real datasets.
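
As background, the acceptance test and regularization update at the heart of the ARC framework that this paper builds on can be sketched as follows. This is a generic schematic, not the paper's variant; solve_cubic_model is a placeholder for any sub-problem solver (for example, the Lanczos approach of entry 1), and in the inexact-Hessian setting hess(x) would return a subsampled approximation.

import numpy as np

def arc_step(f, grad, hess, x, sigma, solve_cubic_model, eta=0.1, gamma=2.0):
    """One adaptive-cubic-regularization iteration (generic sketch)."""
    g, H = grad(x), hess(x)
    # Trial step: approximate minimizer of the cubic model
    # g@s + 0.5*s@H@s + (sigma/3)*||s||^3 (the solver must ensure decrease).
    s = solve_cubic_model(g, H, sigma)
    predicted = -(g @ s + 0.5 * s @ (H @ s)
                  + sigma / 3.0 * np.linalg.norm(s) ** 3)
    rho = (f(x) - f(x + s)) / predicted   # actual vs. predicted decrease
    if rho >= eta:                        # successful: accept, relax sigma
        return x + s, sigma / gamma
    return x, sigma * gamma               # unsuccessful: retry, larger sigma

Successful iterations relax the regularization parameter while unsuccessful ones inflate it; the papers in this family refine exactly this update and the accuracy demanded of the gradient and Hessian.
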
19

Bellavia, Stefania, and Gianmarco Gurioli. "Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy." Optimization, February 28, 2021, 1–35. http://dx.doi.org/10.1080/02331934.2021.1892104.

20

Agarwal, Naman, Nicolas Boumal, Brian Bullins, and Coralia Cartis. "Adaptive regularization with cubics on manifolds." Mathematical Programming, May 13, 2020. http://dx.doi.org/10.1007/s10107-020-01505-1.
