
Journal articles on the topic 'Convex optimization'

Consult the top 50 journal articles for your research on the topic 'Convex optimization.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Luethi, Hans-Jakob. "Convex Optimization." Journal of the American Statistical Association 100, no. 471 (September 2005): 1097. http://dx.doi.org/10.1198/jasa.2005.s41.

2

Ceria, Sebastián, and João Soares. "Convex programming for disjunctive convex optimization." Mathematical Programming 86, no. 3 (December 1, 1999): 595–614. http://dx.doi.org/10.1007/s101070050106.

3

Lasserre, Jean B. "On convex optimization without convex representation." Optimization Letters 5, no. 4 (April 13, 2011): 549–56. http://dx.doi.org/10.1007/s11590-011-0323-1.

4

Ben-Tal, A., and A. Nemirovski. "Robust Convex Optimization." Mathematics of Operations Research 23, no. 4 (November 1998): 769–805. http://dx.doi.org/10.1287/moor.23.4.769.

5

Tilahun, Surafel Luleseged. "Convex Grey Optimization." RAIRO - Operations Research 53, no. 1 (January 2019): 339–49. http://dx.doi.org/10.1051/ro/2018088.

Abstract:
Many optimization problems are formulated from real scenarios involving incomplete information due to uncertainty. The uncertainties can be expressed with appropriate probability distributions, or with fuzzy numbers and a membership function, if enough information is available to construct either the probability density function or the membership function of the fuzzy numbers. In some cases, however, there may not be enough information for that, and grey numbers need to be used. A grey number is an interval number representing the value of a quantity whose exact value or likelihood is not known, but whose maximum and/or minimum possible values are. Applications involving such scenarios arise in space exploration, robotics, and engineering. An optimization problem is called a grey optimization problem if it involves a grey number in the objective function and/or constraint set. Despite its wide applications, not much research has been done in the field. Hence, in this paper, a convex grey optimization problem is discussed. It is shown that an optimal solution of a convex grey optimization problem is a grey number whose lower and upper limits are computed by solving the problem in an optimistic and a pessimistic way: the optimistic way treats the grey numbers as additional decision variables and optimizes the objective function over all of them, whereas the pessimistic way solves a minimax or maximin problem over the decision variables and over the grey numbers.
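
A minimal sketch of this optimistic/pessimistic construction, assuming SciPy and a toy objective of my own choosing (not the paper's formulation): the lower limit of the grey optimum treats the grey coefficient as a free decision variable, while the upper limit takes the worst case over it.

```python
# Illustrative sketch: optimistic vs. pessimistic limits of a convex
# grey optimization problem (not the paper's code).
# Objective: f(x, a) = (x - a)**2 + x, with grey coefficient a in [1, 3].
import numpy as np
from scipy.optimize import minimize, minimize_scalar

a_lo, a_hi = 1.0, 3.0

# Optimistic limit: optimize jointly over x AND the grey number a.
opt = minimize(lambda z: (z[0] - z[1])**2 + z[0],
               x0=[0.0, 2.0],
               bounds=[(None, None), (a_lo, a_hi)])

# Pessimistic limit: min over x of the worst case (max) over a.
# For fixed x, the inner max of this convex quadratic in a is attained
# at an endpoint of [a_lo, a_hi].
def worst_case(x):
    return max((x - a_lo)**2 + x, (x - a_hi)**2 + x)

pess = minimize_scalar(worst_case, bounds=(-10, 10), method="bounded")

print("grey optimum ~ [%.4f, %.4f]" % (opt.fun, pess.fun))
```
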
6

Ubhaya, Vasant A. "Quasi-convex optimization." Journal of Mathematical Analysis and Applications 116, no. 2 (June 1986): 439–49. http://dx.doi.org/10.1016/s0022-247x(86)80008-7.

7

Onn, Shmuel. "Convex Matroid Optimization." SIAM Journal on Discrete Mathematics 17, no. 2 (January 2003): 249–53. http://dx.doi.org/10.1137/s0895480102408559.

8

Pardalos, Panos M. "Convex optimization theory." Optimization Methods and Software 25, no. 3 (June 2010): 487. http://dx.doi.org/10.1080/10556781003625177.

9

Onn, Shmuel, and Uriel G. Rothblum. "Convex Combinatorial Optimization." Discrete & Computational Geometry 32, no. 4 (August 19, 2004): 549–66. http://dx.doi.org/10.1007/s00454-004-1138-y.

10

Mayeli, Azita. "Non-convex Optimization via Strongly Convex Majorization-minimization." Canadian Mathematical Bulletin 63, no. 4 (December 10, 2019): 726–37. http://dx.doi.org/10.4153/s0008439519000730.

Abstract:
In this paper, we introduce a class of nonsmooth non-convex optimization problems, and we propose to use a local iterative majorization-minimization (MM) algorithm to find an optimal solution for the optimization problem. The cost functions in our optimization problems are an extension of convex functions with the MC separable penalty previously introduced by Ivan Selesnick. These functions are not convex; therefore, convex optimization methods cannot be applied here to prove the existence of an optimal minimum point for these functions. For our purpose, we use convex analysis tools to first construct a class of convex majorizers, which approximate the value of the non-convex cost function locally, and then use the MM algorithm to prove the existence of a local minimum. The convergence of the algorithm is guaranteed when the iterative points $x^{(k)}$ are obtained in a ball centred at $x^{(k-1)}$ with small radius. We prove that the algorithm converges to a stationary point (local minimum) of the cost function when the surrogates are strongly convex.
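
A toy illustration of the majorization-minimization template (my own one-dimensional sketch, not the paper's algorithm): for the non-convex cost f(x) = x^2 - |x|, the concave term -|x| is majorized by its linearization at the current iterate, leaving a strongly convex surrogate with a closed-form minimizer.

```python
# Toy majorization-minimization (MM) iteration for f(x) = x**2 - |x|
# (a non-convex cost): majorize the concave term -|x| by its tangent
# at x_k, then minimize the resulting strongly convex surrogate.
import numpy as np

def mm(x0, iters=50):
    x = x0
    for _ in range(iters):
        s = np.sign(x) if x != 0 else 1.0   # subgradient of |x| at x_k
        # Surrogate g(x) = x**2 - |x_k| - s*(x - x_k); minimizer: x = s/2.
        x = s / 2.0
    return x

x_star = mm(x0=0.3)
print(x_star, x_star**2 - abs(x_star))  # local minimum x = 0.5, f = -0.25
```
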
11

Agrawal, Akshay, Shane Barratt, and Stephen Boyd. "Learning Convex Optimization Models." IEEE/CAA Journal of Automatica Sinica 8, no. 8 (August 2021): 1355–64. http://dx.doi.org/10.1109/jas.2021.1004075.

12

Wiesemann, Wolfram, Daniel Kuhn, and Melvyn Sim. "Distributionally Robust Convex Optimization." Operations Research 62, no. 6 (December 2014): 1358–76. http://dx.doi.org/10.1287/opre.2014.1314.

13

Kryazhimskii, Arkadii V. "Convex Optimization Via Feedbacks." SIAM Journal on Control and Optimization 37, no. 1 (January 1998): 278–302. http://dx.doi.org/10.1137/s036301299528030x.

14

Tatarenko, Tatiana, and Behrouz Touri. "Non-Convex Distributed Optimization." IEEE Transactions on Automatic Control 62, no. 8 (August 2017): 3744–57. http://dx.doi.org/10.1109/tac.2017.2648041.

15

Gershman, Alex, Nicholas Sidiropoulos, Shahram Shahbazpanahi, Mats Bengtsson, and Bjorn Ottersten. "Convex Optimization-Based Beamforming." IEEE Signal Processing Magazine 27, no. 3 (May 2010): 62–75. http://dx.doi.org/10.1109/msp.2010.936015.

16

Bard, Jonathan F. "Convex two-level optimization." Mathematical Programming 40, no. 1-3 (January 1988): 15–27. http://dx.doi.org/10.1007/bf01580720.

17

Lesage-Landry, Antoine, Iman Shames, and Joshua A. Taylor. "Predictive online convex optimization." Automatica 113 (March 2020): 108771. http://dx.doi.org/10.1016/j.automatica.2019.108771.

18

Heaton, Howard, Xiaohan Chen, Zhangyang Wang, and Wotao Yin. "Safeguarded Learned Convex Optimization." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 6 (June 26, 2023): 7848–55. http://dx.doi.org/10.1609/aaai.v37i6.25950.

Abstract:
Applications abound in which optimization problems must be repeatedly solved, each time with new (but similar) data. Analytic optimization algorithms can be hand-designed to provably solve these problems in an iterative fashion. On one hand, data-driven algorithms can "learn to optimize" (L2O) with far fewer iterations and similar cost per iteration as general-purpose optimization algorithms. On the other hand, many L2O algorithms unfortunately lack convergence guarantees. To fuse the advantages of these approaches, we present a Safe-L2O framework. Safe-L2O updates incorporate a safeguard to guarantee convergence for convex problems with proximal and/or gradient oracles. The safeguard is simple and computationally cheap to implement, and it is activated only when the data-driven L2O updates would perform poorly or appear to diverge. This yields the numerical benefits of employing machine learning to create rapid L2O algorithms while still guaranteeing convergence. Our numerical examples show convergence of Safe-L2O algorithms even when the provided data is not from the distribution of training data.
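
The safeguarding idea can be sketched as follows (a paraphrase under simplifying assumptions, not the authors' code): accept the learned update only if it sufficiently shrinks a convergence certificate, here the proximal-gradient fixed-point residual for a LASSO problem, and otherwise fall back to the provably convergent update. The `learned_step` callable is a stand-in for an arbitrary black-box L2O model.

```python
# Sketch of a safeguarded learned-optimizer step for the LASSO problem
# min_x 0.5*||Ax - b||^2 + lam*||x||_1 (illustrative assumptions only).
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_step(x, A, b, lam, alpha):
    # Proximal-gradient (ISTA) update: the provably convergent fallback.
    return soft(x - alpha * A.T @ (A @ x - b), alpha * lam)

def residual(x, A, b, lam, alpha):
    # Fixed-point residual of the proximal-gradient map; zero at a solution.
    return np.linalg.norm(x - ista_step(x, A, b, lam, alpha))

def safe_l2o_step(x, A, b, lam, alpha, learned_step, theta=0.99):
    x_learned = learned_step(x)
    # Safeguard: keep the learned update only if it shrinks the residual.
    if residual(x_learned, A, b, lam, alpha) <= theta * residual(x, A, b, lam, alpha):
        return x_learned
    return ista_step(x, A, b, lam, alpha)
```
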
19

Lasserre, Jean B. "Erratum to: On convex optimization without convex representation." Optimization Letters 8, no. 5 (April 6, 2014): 1795–96. http://dx.doi.org/10.1007/s11590-014-0735-9.

20

Belotti, Pietro, Christian Kirches, Sven Leyffer, Jeff Linderoth, James Luedtke, and Ashutosh Mahajan. "Mixed-integer nonlinear optimization." Acta Numerica 22 (April 2, 2013): 1–131. http://dx.doi.org/10.1017/s0962492913000032.

Abstract:
Many optimal decision problems in scientific, engineering, and public sector applications involve both discrete decisions and nonlinear system dynamics that affect the quality of the final design or plan. These decision problems lead to mixed-integer nonlinear programming (MINLP) problems that combine the combinatorial difficulty of optimizing over discrete variable sets with the challenges of handling nonlinear functions. We review models and applications of MINLP, and survey the state of the art in methods for solving this challenging class of problems.

Most solution methods for MINLP apply some form of tree search. We distinguish two broad classes of methods: single-tree and multitree methods. We discuss these two classes of methods first in the case where the underlying problem functions are convex. Classical single-tree methods include nonlinear branch-and-bound and branch-and-cut methods, while classical multitree methods include outer approximation and Benders decomposition. The most efficient class of methods for convex MINLP are hybrid methods that combine the strengths of both classes of classical techniques.

Non-convex MINLPs pose additional challenges because they contain non-convex functions in the objective function or the constraints; hence, even when the integer variables are relaxed to be continuous, the feasible region is generally non-convex, resulting in many local minima. We discuss a range of approaches for tackling this challenging class of problems, including piecewise linear approximations, generic strategies for obtaining convex relaxations of non-convex functions, spatial branch-and-bound methods, and a small sample of techniques that exploit particular types of non-convex structures to obtain improved convex relaxations.

We finish our survey with a brief discussion of three important aspects of MINLP. First, we review heuristic techniques that can obtain good feasible solutions in situations where the search tree has grown too large or real-time solutions are required. Second, we describe an emerging area of mixed-integer optimal control that adds systems of ordinary differential equations to MINLP. Third, we survey the state of the art in software for MINLP.
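
A pocket-sized illustration of the single-tree idea, nonlinear branch-and-bound, on a two-variable convex MINLP (an illustrative sketch assuming SciPy for the continuous relaxations, far from a production solver):

```python
# Minimal nonlinear branch-and-bound for a convex MINLP (illustrative):
# min (x0 - 1.3)**2 + (x1 - 2.6)**2  s.t.  x integer, 0 <= x <= 5.
import math
from scipy.optimize import minimize

def solve_relaxation(bounds):
    # Continuous (NLP) relaxation over the current box.
    res = minimize(lambda x: (x[0] - 1.3)**2 + (x[1] - 2.6)**2,
                   x0=[b[0] for b in bounds], bounds=bounds)
    return res.x, res.fun

best_x, best_val = None, math.inf
stack = [[(0.0, 5.0), (0.0, 5.0)]]
while stack:
    bounds = stack.pop()
    x, val = solve_relaxation(bounds)
    if val >= best_val:                  # prune by bound
        continue
    frac = [i for i in range(2) if abs(x[i] - round(x[i])) > 1e-6]
    if not frac:                         # integer-feasible: new incumbent
        best_x, best_val = x, val
        continue
    i = frac[0]                          # branch on a fractional variable
    lo, hi = bounds[i]
    for new in [(lo, math.floor(x[i])), (math.ceil(x[i]), hi)]:
        if new[0] <= new[1]:
            child = list(bounds)
            child[i] = new
            stack.append(child)

print(best_x, best_val)  # expect the integer point (1, 3), value 0.25
```
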
21

Lan, Yu, and Ji Li. "Mars Ascent Trajectory Optimization Based on Convex Optimization." Journal of Physics: Conference Series 2364, no. 1 (November 1, 2022): 012013. http://dx.doi.org/10.1088/1742-6596/2364/1/012013.

Abstract:
In this paper, a Mars ascent trajectory optimization algorithm based on convex optimization is designed. The main accomplishment is the trajectory optimization of the second-stage flight based on convex optimization. For the first stage, open-loop guidance is used: the ascent vehicle flies at maximum thrust while tracking an attitude profile, thus completing the ascent flight control. To account for errors in the first stage, the experiment generates multiple sets of first-stage shutdown-point data and completes simulation verification experiments based on them. The simulations verify that the second-stage trajectory optimization based on the convex optimization algorithm can achieve high orbit-insertion accuracy despite large deviations in the starting point.
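
For flavor, here is the kind of discretized convex trajectory problem such guidance schemes solve, reduced to a generic double integrator (an illustrative CVXPY sketch with made-up numbers; the paper's dynamics, mass depletion, and thrust model are more involved):

```python
# Generic convex ascent-style trajectory sketch (illustrative, CVXPY):
# double-integrator dynamics, bounded acceleration, fixed end state,
# minimum control effort.
import cvxpy as cp
import numpy as np

N, dt = 50, 0.5
r = cp.Variable((N + 1, 2))   # position (downrange, altitude)
v = cp.Variable((N + 1, 2))   # velocity
u = cp.Variable((N, 2))       # commanded acceleration

g = np.array([0.0, -3.71])    # Mars surface gravity, m/s^2
cons = [r[0] == 0, v[0] == 0,
        r[N] == np.array([2000.0, 1000.0]),   # target end position
        v[N] == np.array([100.0, 0.0])]       # target end velocity
for k in range(N):
    cons += [v[k + 1] == v[k] + dt * (u[k] + g),   # Euler dynamics
             r[k + 1] == r[k] + dt * v[k],
             cp.norm(u[k]) <= 20.0]                # acceleration bound

prob = cp.Problem(cp.Minimize(cp.sum_squares(u)), cons)
prob.solve()
print(prob.status, prob.value)
```
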
22

Popovici, Nicolae. "Convexité au sens direct ou inverse et applications dans l'optimisation vectorielle." Journal of Numerical Analysis and Approximation Theory 29, no. 1 (February 1, 2000): 75–82. http://dx.doi.org/10.33993/jnaat291-656.

Abstract:
The aim of this paper is to study vector optimization problems involving objective functions which are convex in some direct or inverse sense (i.e., a special class of cone-quasiconvex functions). In particular, sufficient conditions are given for the image of a convex set under such cone-quasiconvex objective functions to be a cone-convex set, a property that plays an important role in most scalarization methods used in vector optimization.
23

Ni, Zhitong, Andrew Jian Zhang, Ren-Ping Liu, and Kai Yang. "Doubly Constrained Waveform Optimization for Integrated Sensing and Communications." Sensors 23, no. 13 (June 28, 2023): 5988. http://dx.doi.org/10.3390/s23135988.

Abstract:
This paper investigates threshold-constrained joint waveform optimization for an integrated sensing and communication (ISAC) system. Unlike existing studies, we employ mutual information (MI) and sum rate (SR) as the sensing and communication metrics, respectively, and optimize the waveform under constraints on both metrics simultaneously. This provides significant flexibility in meeting system performance requirements. We formulate three different optimization problems that constrain the radar performance only, the communication performance only, and the ISAC performance, respectively. New techniques are developed to solve the original problems, which are NP-hard and cannot be directly solved by conventional semi-definite programming (SDP) techniques. Novel gradient descent methods are developed to solve the first two problems. For the third, non-convex, optimization problem, we transform it into a convex problem and solve it via convex toolboxes. We also disclose the connections between the three optimizations using numerical results. Finally, simulation results are provided that validate the proposed optimization solutions.
24

Chen, Po-Yu, and Ivan W. Selesnick. "Group-Sparse Signal Denoising: Non-Convex Regularization, Convex Optimization." IEEE Transactions on Signal Processing 62, no. 13 (July 2014): 3464–78. http://dx.doi.org/10.1109/tsp.2014.2329274.

25

Dutta, Joydeep. "Barrier method in nonsmooth convex optimization without convex representation." Optimization Letters 9, no. 6 (October 10, 2014): 1177–85. http://dx.doi.org/10.1007/s11590-014-0811-1.

26

Wang, Xuanxuan, Wujun Ji, and Yun Gao. "Optimization Strategy of the Electric Vehicle Power Battery Based on the Convex Optimization Algorithm." Processes 11, no. 5 (May 6, 2023): 1416. http://dx.doi.org/10.3390/pr11051416.

Abstract:
With the development of the electric vehicle industry, electric vehicles offer consumers more choices. However, their performance still needs improvement, which leads many consumers to take a wait-and-see attitude. Finding a method that can effectively improve the performance of electric vehicles is therefore of great significance. To this end, a convex optimization algorithm is proposed to optimize the motor model and power battery parameters of electric vehicles, improving their overall performance. The performance of the proposed convex optimization algorithm is compared with that of a dual-loop DP (DCDP) optimization algorithm and a nonlinear optimization algorithm. The results show that the hydrogen consumption of electric vehicles optimized by the convex optimization algorithm is 95.364 g, lower than the 98.165 g of the DCDP optimization algorithm and the 105.236 g of the nonlinear optimization algorithm, and significantly better than the 125.59 g of the vehicles before optimization. The computation time of the convex optimization algorithm is 4.9 s, lower than those of the DCDP and nonlinear optimization algorithms. These results indicate that the convex optimization algorithm has better optimization performance. After optimizing the power battery with the convex optimization algorithm, the overall performance of electric vehicles is higher. This method can therefore effectively improve the performance of current electric vehicle power batteries, help new energy vehicles develop rapidly, and alleviate the increasingly serious environmental pollution and energy crisis in China.
27

Sun, Xiang-Kai, and Hong-Yong Fu. "A Note on Optimality Conditions for DC Programs Involving Composite Functions." Abstract and Applied Analysis 2014 (2014): 1–6. http://dx.doi.org/10.1155/2014/203467.

Abstract:
By using the formula for the ε-subdifferential of the sum of a convex function with a composition of convex functions, some necessary and sufficient optimality conditions for a DC programming problem involving a composite function are obtained. As applications, a composed convex optimization problem, a DC optimization problem, and a convex optimization problem with a linear operator are examined at the end of this paper.
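
For context, the sum rule alluded to here is classical and can be stated as follows (written from memory under a standard Moreau-Rockafellar type qualification condition, not quoted from the paper):

```latex
% Exact epsilon-subdifferential sum rule for convex f, g under a
% qualification condition, with the epsilon-subdifferential defined below.
\[
  \partial_{\varepsilon}(f+g)(x)
  \;=\;
  \bigcup_{\substack{\varepsilon_1,\varepsilon_2 \ge 0 \\ \varepsilon_1+\varepsilon_2=\varepsilon}}
  \bigl( \partial_{\varepsilon_1} f(x) + \partial_{\varepsilon_2} g(x) \bigr),
\]
\[
  \partial_{\varepsilon} f(x)
  := \{\, x^* : f(y) \ge f(x) + \langle x^*, y-x\rangle - \varepsilon
     \ \text{for all } y \,\}.
\]
```
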
28

Salman, Abbas Musleh, Ahmed Alridha, and Ahmed Hadi Hussain. "Some Topics on Convex Optimization." Journal of Physics: Conference Series 1818, no. 1 (March 1, 2021): 012171. http://dx.doi.org/10.1088/1742-6596/1818/1/012171.

29

Ghavamzadeh, Mohammad, Shie Mannor, Joelle Pineau, and Aviv Tamar. "Bayesian Reinforcement Learning: A Survey." Foundations and Trends® in Machine Learning 8, no. 5-6 (2015): 359–483. http://dx.doi.org/10.1561/2200000049.

30

Bubeck, Sébastien. "Convex Optimization: Algorithms and Complexity." Foundations and Trends® in Machine Learning 8, no. 3-4 (2015): 231–357. http://dx.doi.org/10.1561/2200000050.

31

Hazan, Elad. "Introduction to Online Convex Optimization." Foundations and Trends® in Optimization 2, no. 3-4 (2016): 157–325. http://dx.doi.org/10.1561/2400000013.

32

Boyd, Stephen P. "Real-time Embedded Convex Optimization." IFAC Proceedings Volumes 42, no. 11 (2009): 9. http://dx.doi.org/10.3182/20090712-4-tr-2008.00004.

33

Drusvyatskiy, Dmitriy, and Adrian S. Lewis. "Generic nondegeneracy in convex optimization." Proceedings of the American Mathematical Society 139, no. 7 (December 21, 2010): 2519–27. http://dx.doi.org/10.1090/s0002-9939-2010-10692-5.

34

Selim, S. Z. "Optimization of linear-convex programs." Optimization 29, no. 4 (January 1994): 319–31. http://dx.doi.org/10.1080/02331939408843961.

35

Simpkins, Alex. "Convex Optimization [On the Shelf]." IEEE Robotics & Automation Magazine 20, no. 4 (December 2013): 164–65. http://dx.doi.org/10.1109/mra.2013.2283189.

36

Hu, T. C., Victor Klee, and David Larman. "Optimization of Globally Convex Functions." SIAM Journal on Control and Optimization 27, no. 5 (September 1989): 1026–47. http://dx.doi.org/10.1137/0327055.

37

Temlyakov, V. N. "Greedy expansions in convex optimization." Proceedings of the Steklov Institute of Mathematics 284, no. 1 (May 2014): 244–62. http://dx.doi.org/10.1134/s0081543814010180.

38

Adivar, Murat, and Shu-Cherng Fang. "Convex optimization on mixed domains." Journal of Industrial & Management Optimization 8, no. 1 (2012): 189–227. http://dx.doi.org/10.3934/jimo.2012.8.189.

39

Park, Poogyeon, and Thomas Kailath. "H∞ filtering via convex optimization." International Journal of Control 66, no. 1 (January 1997): 15–22. http://dx.doi.org/10.1080/002071797224793.

40

Chen, Niangjun, Anish Agarwal, Adam Wierman, Siddharth Barman, and Lachlan L. H. Andrew. "Online Convex Optimization Using Predictions." ACM SIGMETRICS Performance Evaluation Review 43, no. 1 (June 24, 2015): 191–204. http://dx.doi.org/10.1145/2796314.2745854.

41

Gawlitza, Thomas Martin, Helmut Seidl, Assalé Adjé, Stéphane Gaubert, and Éric Goubault. "Abstract interpretation meets convex optimization." Journal of Symbolic Computation 47, no. 12 (December 2012): 1416–46. http://dx.doi.org/10.1016/j.jsc.2011.12.048.

42

Joshi, S., and S. Boyd. "Sensor Selection via Convex Optimization." IEEE Transactions on Signal Processing 57, no. 2 (February 2009): 451–62. http://dx.doi.org/10.1109/tsp.2008.2007095.

43

Rantzer, Anders. "Dynamic programming via convex optimization." IFAC Proceedings Volumes 32, no. 2 (July 1999): 2059–64. http://dx.doi.org/10.1016/s1474-6670(17)56349-1.

44

Luan, Nguyen Ngoc, and Jen-Chih Yao. "Generalized polyhedral convex optimization problems." Journal of Global Optimization 75, no. 3 (March 11, 2019): 789–811. http://dx.doi.org/10.1007/s10898-019-00763-4.

45

Kryazhimskii, A. V., and R. A. Usachev. "Convex two-level optimization problem." Computational Mathematics and Modeling 19, no. 1 (January 2008): 73–101. http://dx.doi.org/10.1007/s10598-008-0007-6.

46

Pintér, J. "Global optimization on convex sets." Operations-Research-Spektrum 8, no. 4 (December 1986): 197–202. http://dx.doi.org/10.1007/bf01721128.

47

Tsitsiklis, John N., and Zhi-Quan Luo. "Communication complexity of convex optimization." Journal of Complexity 3, no. 3 (September 1987): 231–43. http://dx.doi.org/10.1016/0885-064x(87)90013-6.

48

Kulikov, A. N., and V. R. Fazylov. "Convex optimization with prescribed accuracy." USSR Computational Mathematics and Mathematical Physics 30, no. 3 (January 1990): 16–22. http://dx.doi.org/10.1016/0041-5553(90)90185-u.

49

Nguyen, Hao, and Guergana Petrova. "Greedy Strategies for Convex Optimization." Calcolo 54, no. 1 (March 30, 2016): 207–24. http://dx.doi.org/10.1007/s10092-016-0183-2.

50

DeVore, R. A., and V. N. Temlyakov. "Convex Optimization on Banach Spaces." Foundations of Computational Mathematics 16, no. 2 (February 14, 2015): 369–94. http://dx.doi.org/10.1007/s10208-015-9248-x.
