Selected scientific literature on the topic "Inertial Bregman proximal gradient"

Below is a list of current journal articles, theses, book chapters, conference papers, and other scholarly sources relevant to the topic "Inertial Bregman proximal gradient".


Journal articles on the topic "Inertial Bregman proximal gradient"

1. Mukkamala, Mahesh Chandra, Peter Ochs, Thomas Pock, and Shoham Sabach. "Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Nonconvex Optimization". SIAM Journal on Mathematics of Data Science 2, no. 3 (January 2020): 658–82. http://dx.doi.org/10.1137/19m1298007.

2. Kabbadj, S. "Inexact Version of Bregman Proximal Gradient Algorithm". Abstract and Applied Analysis 2020 (April 1, 2020): 1–11. http://dx.doi.org/10.1155/2020/1963980.

Abstract:
The Bregman Proximal Gradient (BPG) algorithm minimizes the sum of two convex functions, one of which is nonsmooth. Supercoercivity of the objective function is necessary for the convergence of this algorithm, which precludes its use in many applications. In this paper, we give an inexact version of the BPG algorithm that circumvents the supercoercivity condition by replacing it with a simple condition on the parameters of the problem. Our study covers the existing results while providing new ones.
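
For orientation, the core BPG update that this inexact variant builds on can be sketched as follows (the standard form with smooth part f, nonsmooth part g, kernel h, and step size λ; a generic statement, not the paper's exact scheme):

```latex
x^{k+1} \in \operatorname*{arg\,min}_{x}\; g(x)
  + \langle \nabla f(x^k),\, x - x^k \rangle
  + \tfrac{1}{\lambda}\, D_h(x, x^k),
\qquad
D_h(x, y) = h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle .
```
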
3. Zhou, Yi, Yingbin Liang, and Lixin Shen. "A simple convergence analysis of Bregman proximal gradient algorithm". Computational Optimization and Applications 73, no. 3 (April 4, 2019): 903–12. http://dx.doi.org/10.1007/s10589-019-00092-y.

4. Hanzely, Filip, Peter Richtárik, and Lin Xiao. "Accelerated Bregman proximal gradient methods for relatively smooth convex optimization". Computational Optimization and Applications 79, no. 2 (April 7, 2021): 405–40. http://dx.doi.org/10.1007/s10589-021-00273-8.

5. Mahadevan, Sridhar, Stephen Giguere, and Nicholas Jacek. "Basis Adaptation for Sparse Nonlinear Reinforcement Learning". Proceedings of the AAAI Conference on Artificial Intelligence 27, no. 1 (June 30, 2013): 654–60. http://dx.doi.org/10.1609/aaai.v27i1.8665.

Abstract:
This paper presents a new approach to representation discovery in reinforcement learning (RL) using basis adaptation. We introduce a general framework for basis adaptation as nonlinear separable least-squares value function approximation, based on finding Fréchet gradients of an error function using variable projection functionals. We then present a scalable proximal gradient-based approach for basis adaptation using the recently proposed mirror-descent framework for RL. Unlike traditional temporal-difference (TD) methods for RL, mirror-descent-based RL methods undertake proximal gradient updates of weights in a dual space, which is linked to the primal space by a Legendre transform involving the gradient of a strongly convex function. Mirror descent RL can be viewed as a proximal TD algorithm using a Bregman divergence as the distance-generating function. We present a new class of regularized proximal-gradient-based TD methods, which combine feature selection through sparse L1 regularization with basis adaptation. Experimental results are provided to illustrate and validate the approach.
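
The dual-space update described here is easiest to see in a small sketch. The following uses the negative-entropy mirror map on the probability simplex as an illustrative kernel (an assumption for concreteness; it is not the basis-adaptation method of the paper):

```python
import numpy as np

def mirror_descent_simplex(grad_f, x0, step, n_iters):
    """Mirror descent with the negative-entropy mirror map.
    The gradient step happens in the dual space (log coordinates),
    and the Legendre transform maps back to the simplex; the induced
    Bregman divergence is the KL divergence."""
    x = x0.copy()
    for _ in range(n_iters):
        x = x * np.exp(-step * grad_f(x))  # dual-space gradient step
        x /= x.sum()                       # map back onto the simplex
    return x

# Hypothetical usage: minimize a quadratic over the simplex.
A = np.diag([1.0, 2.0, 3.0])
x_star = mirror_descent_simplex(lambda x: A @ x, np.ones(3) / 3, 0.5, 200)
```
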
6. Yang, Lei, and Kim-Chuan Toh. "Bregman Proximal Point Algorithm Revisited: A New Inexact Version and Its Inertial Variant". SIAM Journal on Optimization 32, no. 3 (July 13, 2022): 1523–54. http://dx.doi.org/10.1137/20m1360748.

7. Li, Jing, Xiao Wei, Fengpin Wang, and Jinjia Wang. "IPGM: Inertial Proximal Gradient Method for Convolutional Dictionary Learning". Electronics 10, no. 23 (December 3, 2021): 3021. http://dx.doi.org/10.3390/electronics10233021.

Abstract:
Inspired by the recent success of the proximal gradient method (PGM) and recent efforts to develop inertial algorithms, we propose an inertial PGM (IPGM) for convolutional dictionary learning (CDL) that jointly optimizes an ℓ2-norm data fidelity term and a sparsity term enforcing an ℓ1 penalty. Contrary to other CDL methods, in the proposed approach the dictionary and needles are updated with an inertial force by the PGM. We obtain a novel derivative formula for the needles and dictionary with respect to the data fidelity term. At the same time, a gradient descent step is designed to add an inertial term. The proximal operation uses the thresholding operation for the needles and projects the dictionary onto a unit-norm sphere. We prove the convergence of the proposed IPGM algorithm in the backtracking case. Simulation results show that the proposed IPGM achieves better performance than PGM and slice-based methods that possess the same structure and are optimized using the alternating direction method of multipliers (ADMM).
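
A minimal sketch of the two ingredients named above, inertial extrapolation followed by a proximal step, in an ℓ1-regularized setting (the function names and the simple setup are illustrative, not the paper's implementation):

```python
import numpy as np

def ipgm_step(x, x_prev, grad_f, step, lam, beta):
    """One inertial proximal gradient step: extrapolate with the
    inertial term, take a gradient step on the smooth fidelity term,
    then soft-threshold (the proximal map of lam * ||.||_1)."""
    y = x + beta * (x - x_prev)   # inertial extrapolation
    z = y - step * grad_f(y)      # forward (gradient) step
    return np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

def normalize_atoms(D, eps=1e-12):
    """Map each dictionary atom (column) onto the unit-norm sphere,
    matching the dictionary update described in the abstract."""
    return D / np.maximum(np.linalg.norm(D, axis=0, keepdims=True), eps)
```
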
8. Xiao, Xiantao. "A Unified Convergence Analysis of Stochastic Bregman Proximal Gradient and Extragradient Methods". Journal of Optimization Theory and Applications 188, no. 3 (January 8, 2021): 605–27. http://dx.doi.org/10.1007/s10957-020-01799-3.

9. Wang, Qingsong, Zehui Liu, Chunfeng Cui, and Deren Han. "A Bregman Proximal Stochastic Gradient Method with Extrapolation for Nonconvex Nonsmooth Problems". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 14 (March 24, 2024): 15580–88. http://dx.doi.org/10.1609/aaai.v38i14.29485.

Abstract:
In this paper, we explore a specific optimization problem that involves the combination of a differentiable nonconvex function and a nondifferentiable function. The differentiable component lacks a globally Lipschitz continuous gradient, posing challenges for optimization. To address this issue and accelerate convergence, we propose a Bregman proximal stochastic gradient method with extrapolation (BPSGE), which only requires smooth adaptivity of the differentiable part. Under a variance-reduction framework, we analyze the subsequential and global convergence of the proposed algorithm under certain conditions, as well as the sublinear convergence rate of the subsequence and the complexity of the algorithm, revealing that BPSGE requires at most O(ε^(-2)) iterations in expectation to attain an ε-stationary point. To validate the effectiveness of the proposed algorithm, we conduct numerical experiments on three real-world applications: graph-regularized nonnegative matrix factorization (NMF), matrix factorization with weakly convex regularization, and NMF with nonconvex sparsity constraints. These experiments demonstrate that BPSGE is faster than the baselines without extrapolation. The code is available at: https://github.com/nothing2wang/BPSGE-Algorithm.
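
In a Bregman geometry, extrapolation is often applied through the mirror map ∇h rather than directly in the primal space; a generic step of that form (a sketch consistent with the abstract, not necessarily the paper's exact scheme) reads:

```latex
\nabla h(y^k) = \nabla h(x^k) + \beta_k \left( \nabla h(x^k) - \nabla h(x^{k-1}) \right),
\qquad
x^{k+1} \in \operatorname*{arg\,min}_{x}\; g(x)
  + \langle \tilde{\nabla} f(y^k),\, x \rangle
  + \tfrac{1}{\lambda}\, D_h(x, y^k),
```

where \tilde{\nabla} f(y^k) denotes a (variance-reduced) stochastic gradient estimate.
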
10. He, Lulu, Jimin Ye, and Jianwei E. "Nonconvex optimization with inertial proximal stochastic variance reduction gradient". Information Sciences 648 (November 2023): 119546. http://dx.doi.org/10.1016/j.ins.2023.119546.


Theses / dissertations on the topic "Inertial Bregman proximal gradient"

1. Godeme, Jean-Jacques. "Phase retrieval with non-Euclidean Bregman based geometry". Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMC214.

Abstract:
In this work, we investigate the phase retrieval problem for real-valued signals in finite dimension, a challenge encountered across various scientific and engineering disciplines. We explore two complementary approaches: retrieval with and without regularization. In both settings, our work focuses on relaxing the Lipschitz-smoothness assumption generally required by first-order splitting algorithms, which does not hold for phase retrieval cast as a minimization problem. The key idea is to replace the Euclidean geometry by a non-Euclidean Bregman divergence associated with an appropriate kernel. We use a Bregman gradient/mirror descent algorithm with this divergence to solve the phase retrieval problem without regularization, and we show exact (up to a global sign) recovery both in a deterministic setting and with high probability for a sufficient number of random measurements (Gaussian and Coded Diffraction Patterns). Furthermore, we establish the robustness of this approach against small additive noise. Shifting to regularized phase retrieval, we first develop and analyze an Inertial Bregman Proximal Gradient algorithm for minimizing the sum of two functions in finite dimension, one of which is convex and possibly nonsmooth and the second relatively smooth in the Bregman geometry. We provide both global and local convergence guarantees for this algorithm. Finally, we study noiseless and stable recovery of low-complexity regularized phase retrieval. For this, we formulate the problem as the minimization of an objective functional involving a nonconvex smooth data fidelity term and a convex regularizer promoting solutions conforming to some notion of low complexity related to their nonsmoothness points. We establish conditions for exact and stable recovery and provide sample complexity bounds for random measurements ensuring that these conditions hold. These sample bounds depend on the low complexity of the signals to be recovered. Our new results go far beyond the case of sparse phase retrieval.
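
Since this thesis is the most direct match for the topic, a small sketch of an inertial Bregman gradient iteration may help fix ideas. The quartic kernel below is a standard relative-smoothness kernel for phase retrieval (an assumption for illustration; consult the thesis for the actual algorithm and step-size rules):

```python
import numpy as np

def grad_h(x):
    # Mirror map of the kernel h(x) = 0.25*||x||^4 + 0.5*||x||^2.
    return (np.dot(x, x) + 1.0) * x

def inv_grad_h(v):
    # Invert grad_h: x = v / (t**2 + 1), where t = ||x|| is the unique
    # real root of t**3 + t = ||v|| (via Cardano's formula).
    r = np.linalg.norm(v)
    s = np.sqrt(r**2 / 4.0 + 1.0 / 27.0)
    t = np.cbrt(r / 2.0 + s) + np.cbrt(r / 2.0 - s)
    return v / (t**2 + 1.0)

def inertial_bregman_gradient(grad_f, x0, lam, beta, n_iters):
    """Unregularized (g = 0) inertial Bregman/mirror gradient iteration:
    extrapolate in the dual space, then take a Bregman gradient step."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(n_iters):
        y = inv_grad_h((1 + beta) * grad_h(x) - beta * grad_h(x_prev))
        x_prev, x = x, inv_grad_h(grad_h(y) - lam * grad_f(y))
    return x

# Hypothetical usage for intensity measurements b = (A @ x_true)**2:
# grad_f = lambda x: A.T @ (((A @ x)**2 - b) * (A @ x)) / len(b)
```
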

Book chapters on the topic "Inertial Bregman proximal gradient"

1. Mukkamala, Mahesh Chandra, Felix Westerkamp, Emanuel Laude, Daniel Cremers, and Peter Ochs. "Bregman Proximal Gradient Algorithms for Deep Matrix Factorization". In Lecture Notes in Computer Science, 204–15. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-75549-2_17.


Conference papers on the topic "Inertial Bregman proximal gradient"

1. Li, Huan, Wenjuan Zhang, Shujian Huang, and Feng Xiao. "Poisson Noise Image Restoration Based on Bregman Proximal Gradient". In 2023 6th International Conference on Computer Network, Electronic and Automation (ICCNEA). IEEE, 2023. http://dx.doi.org/10.1109/iccnea60107.2023.00058.

2. Pu, Wenqiang, Jiawei Zhang, Rui Zhou, Xiao Fu, and Mingyi Hong. "A Smoothed Bregman Proximal Gradient Algorithm for Decentralized Nonconvex Optimization". In ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2024. http://dx.doi.org/10.1109/icassp48485.2024.10448285.
