Ready-made bibliography on the topic "Subgradient descent"

Create accurate references in APA, MLA, Chicago, Harvard, and many other styles


See the lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Subgradient descent".

Next to every work in the bibliography there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, when these are available in the source metadata.

Journal articles on the topic "Subgradient descent"

1

Krutikov, Vladimir, Svetlana Gutova, Elena Tovbis, Lev Kazakovtsev, and Eugene Semenkin. "Relaxation Subgradient Algorithms with Machine Learning Procedures." Mathematics 10, no. 21 (2022): 3959. http://dx.doi.org/10.3390/math10213959.

Full text source
Abstract:
In the modern digital economy, optimal decision support systems, as well as machine learning systems, are becoming an integral part of production processes. Artificial neural network training as well as other engineering problems generate such problems of high dimension that are difficult to solve with traditional gradient or conjugate gradient methods. Relaxation subgradient minimization methods (RSMMs) construct a descent direction that forms an obtuse angle with all subgradients of the current minimum neighborhood, which reduces to the problem of solving systems of inequalities. Having form…
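The relaxation subgradient methods described in entry 1 build on the plain subgradient step for nonsmooth convex minimization. As a hedged point of reference, here is a minimal sketch of that basic step; the test function, step-size rule, and all names below are illustrative assumptions, not the cited paper's algorithm:

```python
import numpy as np

def subgradient_descent(subgrad, x0, steps=500):
    """Minimize a nonsmooth convex function given a subgradient oracle.

    Uses the classic diminishing step size a_k = 1/sqrt(k+1) and tracks
    the best iterate seen, since subgradient steps are not monotone descent.
    """
    x = np.asarray(x0, dtype=float)
    best, best_f = x.copy(), None
    for k in range(steps):
        g, fx = subgrad(x)           # a subgradient and the function value at x
        if best_f is None or fx < best_f:
            best, best_f = x.copy(), fx
        x = x - g / np.sqrt(k + 1)   # diminishing, non-summable step size
    return best, best_f

# Illustrative example: f(x) = |x1| + 2|x2|, minimized at the origin;
# w * sign(x) is a valid subgradient at every point, including the kinks.
def oracle(x):
    w = np.array([1.0, 2.0])
    return w * np.sign(x), float(w @ np.abs(x))

x_best, f_best = subgradient_descent(oracle, [3.0, -2.0])
```

Because individual steps can increase the objective, returning the best iterate (rather than the last) is the standard convergence guarantee for this scheme.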
2

Tovbis, Elena, Vladimir Krutikov, Predrag Stanimirović, Vladimir Meshechkin, Aleksey Popov, and Lev Kazakovtsev. "A Family of Multi-Step Subgradient Minimization Methods." Mathematics 11, no. 10 (2023): 2264. http://dx.doi.org/10.3390/math11102264.

Full text source
Abstract:
For solving non-smooth multidimensional optimization problems, we present a family of relaxation subgradient methods (RSMs) with a built-in algorithm for finding the descent direction that forms an acute angle with all subgradients in the neighborhood of the current minimum. Minimizing the function along the opposite direction (with a minus sign) enables the algorithm to go beyond the neighborhood of the current minimum. The family of algorithms for finding the descent direction is based on solving systems of inequalities. The finite convergence of the algorithms on separable bounded sets is p…
3

Li, Gang, Minghua Li, and Yaohua Hu. "Stochastic quasi-subgradient method for stochastic quasi-convex feasibility problems." Discrete & Continuous Dynamical Systems - S 15, no. 4 (2022): 713. http://dx.doi.org/10.3934/dcdss.2021127.

Full text source
Abstract:
The feasibility problem is at the core of the modeling of many problems in various disciplines of mathematics and physical sciences, and the quasi-convex function is widely applied in many fields such as economics, finance, and management science. In this paper, we consider the stochastic quasi-convex feasibility problem (SQFP), which is to find a common point of infinitely many sublevel sets of quasi-convex functions. Inspired by the idea of a stochastic index scheme, we propose a stochastic quasi-subgradient method to solve the SQFP, in which the quasi-subg…
4

Chu, Wenqing, Yao Hu, Chen Zhao, Haifeng Liu, and Deng Cai. "Atom Decomposition Based Subgradient Descent for matrix classification." Neurocomputing 205 (September 2016): 222–28. http://dx.doi.org/10.1016/j.neucom.2016.03.069.

Full text source
5

Bedi, Amrit Singh, and Ketan Rajawat. "Network Resource Allocation via Stochastic Subgradient Descent: Convergence Rate." IEEE Transactions on Communications 66, no. 5 (2018): 2107–21. http://dx.doi.org/10.1109/tcomm.2018.2792430.

Full text source
6

Nedić, Angelia, and Soomin Lee. "On Stochastic Subgradient Mirror-Descent Algorithm with Weighted Averaging." SIAM Journal on Optimization 24, no. 1 (2014): 84–107. http://dx.doi.org/10.1137/120894464.

Full text source
7

Cui, Yun-Ling, Lu-Chuan Ceng, Fang-Fei Zhang, et al. "Modified Mann-Type Subgradient Extragradient Rules for Variational Inequalities and Common Fixed Points Implicating Countably Many Nonexpansive Operators." Mathematics 10, no. 11 (2022): 1949. http://dx.doi.org/10.3390/math10111949.

Full text source
Abstract:
In a real Hilbert space, let the CFPP, VIP, and HFPP denote the common fixed-point problem of countable nonexpansive operators and asymptotically nonexpansive operator, variational inequality problem, and hierarchical fixed point problem, respectively. With the help of the Mann iteration method, a subgradient extragradient approach with a linear-search process, and a hybrid deepest-descent technique, we construct two modified Mann-type subgradient extragradient rules with a linear-search process for finding a common solution of the CFPP and VIP. Under suitable assumptions, we demonstrate the s…
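The subgradient extragradient rules in entry 7 extend Korpelevich's classical extragradient method for monotone variational inequalities. As a hedged illustration of that underlying two-step scheme only (the operator, feasible set, and step size below are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np

def extragradient(F, project, x0, tau=0.5, steps=300):
    """Korpelevich extragradient method for a monotone variational
    inequality: find x* in C with <F(x*), y - x*> >= 0 for all y in C.

    Requires tau < 1/L for an L-Lipschitz monotone operator F.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        y = project(x - tau * F(x))   # predictor (extrapolation) step
        x = project(x - tau * F(y))   # corrector step re-evaluates F at y
    return x

# Illustrative example: F(x) = A @ x with a rotation-like (monotone,
# non-gradient) operator, C = the box [-1, 1]^2; the solution is the origin.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
proj = lambda z: np.clip(z, -1.0, 1.0)
x_star = extragradient(lambda x: A @ x, proj, [1.0, 1.0])
```

The second evaluation of F at the predicted point y is what lets the method converge on rotational fields where a plain projected step would spiral outward.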
8

Montonen, O., N. Karmitsa, and M. M. Mäkelä. "Multiple subgradient descent bundle method for convex nonsmooth multiobjective optimization." Optimization 67, no. 1 (2017): 139–58. http://dx.doi.org/10.1080/02331934.2017.1387259.

Full text source
9

Beck, Amir, and Marc Teboulle. "Mirror descent and nonlinear projected subgradient methods for convex optimization." Operations Research Letters 31, no. 3 (2003): 167–75. http://dx.doi.org/10.1016/s0167-6377(02)00231-6.

Full text source
10

Ceng, Lu-Chuan, Li-Jun Zhu, and Tzu-Chien Yin. "Modified subgradient extragradient algorithms for systems of generalized equilibria with constraints." AIMS Mathematics 8, no. 2 (2023): 2961–94. http://dx.doi.org/10.3934/math.2023154.

Full text source
Abstract:
In this paper, we introduce the modified Mann-like subgradient-like extragradient implicit rules with linear-search process for finding a common solution of a system of generalized equilibrium problems, a pseudomonotone variational inequality problem and a fixed-point problem of an asymptotically nonexpansive mapping in a real Hilbert space. The proposed algorithms are based on the subgradient extragradient rule with linear-search process, Mann implicit iteration approach, and hybrid deepest-descent technique. Under mild restrictions, we demonstrate the strong converge…

Doctoral dissertations on the topic "Subgradient descent"

1

Beltran, Royo César. "Generalized unit commitment by the radar multiplier method." Doctoral thesis, Universitat Politècnica de Catalunya, 2001. http://hdl.handle.net/10803/6501.

Full text source
Abstract:
This operations research thesis should be situated in the field of the power generation industry. The general objective of this work is to efficiently solve the Generalized Unit Commitment (GUC) problem by means of specialized software. The GUC problem generalizes the Unit Commitment (UC) problem by simultaneously solving the associated Optimal Power Flow (OPF) problem. There are many approaches to solve the UC and OPF problems separately, but approaches to solve them jointly, i.e. to solve the GUC problem, are quite scarce. One of these GUC solving approaches is due to professors Batut and R…
2

Yaji, Vinayaka Ganapati. "Stochastic approximation with set-valued maps and Markov noise: Theoretical foundations and applications." Thesis, 2017. https://etd.iisc.ac.in/handle/2005/5461.

Full text source
Abstract:
Stochastic approximation algorithms produce estimates of a desired solution using noisy real world data. Introduced by Robbins and Monro, in 1951, stochastic approximation techniques have been instrumental in the asymptotic analysis of several adaptive algorithms in learning, signal processing and control. A popular method for the analysis of stochastic approximation schemes is the dynamical systems approach or the o.d.e. method introduced by Ljung and developed further extensively by Benaim and Hirsch. We build upon the works of Benaim et al. and Bhatnagar et al., and present the asympto…

Book chapters on the topic "Subgradient descent"

1

Kiwiel, Krzysztof C. "Aggregate subgradient methods for unconstrained convex minimization." In Methods of Descent for Nondifferentiable Optimization. Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/bfb0074502.

Full text source
2

Kiwiel, Krzysztof C. "Methods with subgradient locality measures for minimizing nonconvex functions." In Methods of Descent for Nondifferentiable Optimization. Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/bfb0074503.

Full text source
3

Kiwiel, Krzysztof C. "Methods with subgradient deletion rules for unconstrained nonconvex minimization." In Methods of Descent for Nondifferentiable Optimization. Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/bfb0074504.

Full text source

Conference papers on the topic "Subgradient descent"

1

Gez, Tamir L. S., and Kobi Cohen. "Subgradient Descent Learning with Over-the-Air Computation." In ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2023. http://dx.doi.org/10.1109/icassp49357.2023.10095134.

Full text source
2

Zhang, Honghui, Jingdong Wang, Ping Tan, Jinglu Wang, and Long Quan. "Learning CRFs for Image Parsing with Adaptive Subgradient Descent." In 2013 IEEE International Conference on Computer Vision (ICCV). IEEE, 2013. http://dx.doi.org/10.1109/iccv.2013.382.

Full text source
3

Lucchi, Aurelien, Yunpeng Li, and Pascal Fua. "Learning for Structured Prediction Using Approximate Subgradient Descent with Working Sets." In 2013 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2013. http://dx.doi.org/10.1109/cvpr.2013.259.

Full text source
4

Singhal, Manmohan, and Saurabh Khanna. "Proximal Subgradient Descent Method for Cancelling Cross-Interference in FMCW Radars." In 2023 IEEE Statistical Signal Processing Workshop (SSP). IEEE, 2023. http://dx.doi.org/10.1109/ssp53291.2023.10208039.

Full text source
5

Wu, Songwei, Hang Yu, and Justin Dauwels. "Efficient Stochastic Subgradient Descent Algorithms for High-dimensional Semi-sparse Graphical Model Selection." In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2019. http://dx.doi.org/10.1109/icassp.2019.8683823.

Full text source
6

Wan, Yuanyu, Nan Wei, and Lijun Zhang. "Efficient Adaptive Online Learning via Frequent Directions." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/381.

Full text source
Abstract:
By employing time-varying proximal functions, adaptive subgradient methods (ADAGRAD) have improved the regret bound and been widely used in online learning and optimization. However, ADAGRAD with full matrix proximal functions (ADA-FULL) cannot deal with large-scale problems due to the impractical time and space complexities, though it has better performance when gradients are correlated. In this paper, we propose ADA-FD, an efficient variant of ADA-FULL based on a deterministic matrix sketching technique called frequent directions. Following ADA-FULL, we incorporate our ADA-FD into both prima…
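The ADAGRAD family discussed in the abstract above adapts per-coordinate step sizes from accumulated squared (sub)gradients. As a hedged illustration, here is a minimal sketch of the standard diagonal variant only (not the paper's ADA-FULL or ADA-FD matrix versions; the test problem and names are illustrative assumptions):

```python
import numpy as np

def adagrad(grad, x0, lr=0.5, steps=200, eps=1e-8):
    """Diagonal ADAGRAD: each coordinate's effective step size shrinks
    with the accumulated squared gradients observed in that coordinate."""
    x = np.asarray(x0, dtype=float)
    G = np.zeros_like(x)                  # running sum of squared gradients
    for _ in range(steps):
        g = grad(x)
        G += g * g
        x -= lr * g / (np.sqrt(G) + eps)  # per-coordinate adaptive step
    return x

# Illustrative example: badly scaled quadratic f(x) = 0.5 * x @ (D * x)
# with D = diag(1, 100); the adaptive scaling equalizes progress per axis.
D = np.array([1.0, 100.0])
x_final = adagrad(lambda x: D * x, [1.0, 1.0])
```

On this problem a single global step size would have to be tuned to the stiffest coordinate, whereas the per-coordinate denominator makes both coordinates shrink at the same normalized rate.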