Journal articles on the topic "Global function optimization"

Follow this link to see other types of publications on the topic: Global function optimization.

Create a correct reference in APA, MLA, Chicago, Harvard, and many other styles


Consult the top 50 journal articles on the topic "Global function optimization."

Next to every work in the list there is an "Add to bibliography" button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a ".pdf" file and read its abstract online whenever the relevant details are available in the metadata.

Browse journal articles from many different disciplines and compile your bibliography correctly.

1

Liu, Y., and K. L. Teo. "A bridging method for global optimization." Journal of the Australian Mathematical Society. Series B. Applied Mathematics 41, no. 1 (July 1999): 41–57. http://dx.doi.org/10.1017/s0334270000011024.

Abstract:
In this paper a bridging method is introduced for numerical solutions of one-dimensional global optimization problems where a continuously differentiable function is to be minimized over a finite interval which can be given either explicitly or by constraints involving continuously differentiable functions. The concept of a bridged function is introduced. Some properties of the bridged function are given. On this basis, several bridging algorithms are developed for the computation of global optimal solutions. The algorithms are demonstrated by solving several numerical examples.
2

Zhu, Jinghao, Jiani Zhou, and David Gao. "Global optimization by canonical dual function." Journal of Computational and Applied Mathematics 234, no. 2 (May 2010): 538–44. http://dx.doi.org/10.1016/j.cam.2009.12.045.
3

Norkin, Vladimir. "A Stochastic Smoothing Method for Nonsmooth Global Optimization." Cybernetics and Computer Technologies, no. 1 (March 31, 2020): 5–14. http://dx.doi.org/10.34229/2707-451x.20.1.1.

Abstract:
The paper presents the results of testing the stochastic smoothing method for global optimization of a multiextremal function over a convex feasible subset of Euclidean space. Preliminarily, the objective function is extended outside the admissible region so that its global minimum does not change and the function becomes coercive. The smoothing of the function at any point is carried out by averaging its values over some neighborhood of that point; the size of the neighborhood is the smoothing parameter. Smoothing eliminates small local extrema of the original function, and with a sufficiently large value of the smoothing parameter the averaged function can have only one minimum. The smoothing method consists of replacing the original function with a sequence of smoothed approximations, with the smoothing parameter vanishing to zero, and optimizing the latter functions by contemporary stochastic optimization methods. Passing from the minimum of one smoothed function to a close minimum of the next one, we can gradually come to the region of the global minimum of the original function. The smoothing method is also applicable to the optimization of nonsmooth nonconvex functions. It is shown that the smoothing method reliably solves small-dimensional test global optimization problems from the literature.
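The smoothing-by-averaging scheme described above lends itself to a short illustration. The sketch below is not Norkin's implementation: it estimates the smoothed function by Monte Carlo averaging over a Gaussian neighborhood, drives the smoothing parameter to zero along a fixed schedule, and minimizes each smoothed function with a plain accept-if-better random search; the schedule, sample count, and step size are illustrative assumptions.

```python
import numpy as np

def smoothed(f, x, sigma, n_samples, rng):
    """Monte Carlo estimate of the averaged function
    f_sigma(x) = E[ f(x + sigma * u) ],  u ~ N(0, I)."""
    u = rng.normal(size=(n_samples, len(x)))
    return np.mean([f(x + sigma * ui) for ui in u])

def smoothing_method(f, x0, sigmas=(1.0, 0.5, 0.2, 0.05, 0.0),
                     n_samples=64, iters=300, step=0.2, seed=0):
    """Minimize a sequence of smoothed approximations with a vanishing
    smoothing parameter, carrying the minimizer from one level to the
    next (illustrative random local search, not the author's solver)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for sigma in sigmas:
        g = (lambda z, s=sigma: f(z) if s == 0.0
             else smoothed(f, z, s, n_samples, rng))
        gx = g(x)
        for _ in range(iters):
            cand = x + step * rng.normal(size=x.shape)
            gc = g(cand)
            if gc < gx:          # accept only improving moves
                x, gx = cand, gc
    return x, f(x)

# usage on a multiextremal 1-D test function:
# f = lambda x: np.sin(5 * x[0]) + 0.1 * x[0] ** 2
# print(smoothing_method(f, [3.0]))
```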
4

Satapathy, Suresh Chandra. "Improved teaching learning based optimization for global function optimization." Decision Science Letters 2, no. 1 (January 1, 2013): 23–34. http://dx.doi.org/10.5267/j.dsl.2012.10.005.
5

Satapathy, Suresh Chandra, Anima Naik, and K. Parvathi. "Weighted Teaching-Learning-Based Optimization for Global Function Optimization." Applied Mathematics 04, no. 03 (2013): 429–39. http://dx.doi.org/10.4236/am.2013.43064.
6

Wang, Wei, Xiaoshan Zhang, and Min Li. "A Filled Function Method Dominated by Filter for Nonlinearly Global Optimization." Journal of Applied Mathematics 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/245427.

Abstract:
This work presents a filled function method based on the filter technique for global optimization. The filled function method is an effective approach to nonlinear global optimization, since it can escape a current local minimizer in search of a better one. The filter technique is applied to local optimization methods because of its excellent numerical results. To strengthen the filled function method, the filter technique is employed here within the global phase. A new filled function is proposed first, then the algorithm is stated and its properties are proved, and numerical results are listed at the end.
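Several papers in this list build on the same two-phase filled function loop: minimize f locally, then minimize an auxiliary "filled" function that penalizes the basin of the current minimizer so that its own minimizer falls into a lower basin of f, and repeat. The sketch below illustrates that generic loop with a classical Ge-type filled function, not the filter-dominated function of this particular paper; the parameters r and rho are illustrative, and r is assumed large enough that r + f(x) > 0 on the box.

```python
import numpy as np
from scipy.optimize import minimize

def filled_function_search(f, x0, bounds, r=1.0, rho=1.0, n_restarts=20, seed=0):
    """Generic two-phase filled-function loop (sketch).
    Phase 1: local minimization of f.
    Phase 2: minimization of a classical Ge-type filled function
        P(x) = exp(-||x - x*||^2 / rho^2) / (r + f(x))
    started near the current minimizer, to escape its basin.
    Assumes r + f(x) > 0 on the box; all parameters are illustrative."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])

    def local_min(func, start):
        res = minimize(func, start, method="L-BFGS-B", bounds=bounds)
        return res.x, res.fun

    x_best, f_best = local_min(f, np.asarray(x0, dtype=float))
    for _ in range(n_restarts):
        def P(x, xstar=x_best):
            return np.exp(-np.sum((x - xstar) ** 2) / rho ** 2) / (r + f(x))
        # phase 2: start from a random perturbation of the current minimizer
        start = np.clip(x_best + 0.5 * rng.normal(size=len(x_best)), lo, hi)
        x_fill, _ = local_min(P, start)
        # phase 1 again, restarted from the filled-function minimizer
        x_new, f_new = local_min(f, x_fill)
        if f_new < f_best - 1e-12:
            x_best, f_best = x_new, f_new
    return x_best, f_best
```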
7

DJALIL, BOUDJEHEM, BOUDJEHEM BADREDDINE, and BOUKAACHE ABDENOUR. "REDUCING DIMENSION IN GLOBAL OPTIMIZATION." International Journal of Computational Methods 08, no. 03 (September 2011): 535–44. http://dx.doi.org/10.1142/s0219876211002460.

Abstract:
In this paper, we propose an interesting idea in global optimization that makes it an easier and lower-cost task. The main idea is to reduce the dimension of the optimization problem at hand to a one-dimensional one using variable coding. At this level, the algorithm looks for the global optimum of a one-dimensional cost function. The new algorithm has the ability to avoid local optima, reduces the number of evaluations, and improves convergence speed. The method is suitable for functions that have many extrema. Our algorithm can determine a narrow space around the global optimum in very restricted time, based on stochastic tests and an adaptive partition of the search space. Illustrative examples are presented to show the efficiency of the proposed idea. It was found that the algorithm was able to locate the global optimum even though the objective function has a large number of optima.
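The abstract does not describe the specific variable coding, so the sketch below is only a generic illustration of the reduction idea, not the authors' scheme: the binary digits of a single variable t in [0, 1) are distributed round-robin over the coordinates (a Morton-style coding chosen here as an assumption), after which any one-dimensional global search, here a plain grid scan, can be applied to the reduced problem.

```python
def decode(t, dim, bits=16):
    """Map t in [0, 1) to a point of [0, 1)^dim by distributing the
    binary digits of t across the coordinates (Morton-style coding).
    This is one possible 'variable coding', assumed for illustration."""
    coords = [0.0] * dim
    scale = [0.5] * dim
    frac = t
    for i in range(bits * dim):
        frac *= 2.0
        bit = int(frac)
        frac -= bit
        k = i % dim                # round-robin digit assignment
        coords[k] += bit * scale[k]
        scale[k] *= 0.5
    return coords

def reduced_search(f, dim, n_points=20000):
    """One-dimensional scan over t; f receives the decoded dim-dimensional point."""
    best_x, best_val = None, float("inf")
    for i in range(n_points):
        x = decode(i / n_points, dim)
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# usage: minimize a 2-D function over the unit square
# print(reduced_search(lambda x: (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2, dim=2))
```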
8

Xiao, Jin-ke, Wei-min Li, Wei Li, and Xin-rong Xiao. "Optimization on Black Box Function Optimization Problem." Mathematical Problems in Engineering 2015 (2015): 1–10. http://dx.doi.org/10.1155/2015/647234.

Abstract:
There are a large number of engineering optimization problems in the real world whose input-output relationships are vague and indistinct. Here, they are called black box function optimization problems (BBFOP). Inspired by the mechanism of the neuroendocrine system regulating the immune system, a BP neural network modified immune optimization algorithm (NN-MIA) is proposed. NN-MIA consists of two phases: the first phase trains a BP neural network to the expected precision to capture the input-output relationship, and the second is an immune optimization phase whose aim is to search for global optima. The BP neural network fitting could also be replaced with polynomial fitting or other fitting methods that reach the expected precision. Experimental simulation confirms the global optimization capability of the algorithm and the practical applicability of the BBFOP optimization method.
9

Zou, Mou-Yan, and Xi Zou. "Global optimization: an auxiliary cost function approach." IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans 30, no. 3 (May 2000): 347–54. http://dx.doi.org/10.1109/3468.844358.
10

Han, Qiaoming, and Jiye Han. "Revised filled function methods for global optimization." Applied Mathematics and Computation 119, no. 2-3 (April 2001): 217–28. http://dx.doi.org/10.1016/s0096-3003(99)00266-0.
11

Wu, Z. Y., L. S. Zhang, K. L. Teo, and F. S. Bai. "New Modified Function Method for Global Optimization." Journal of Optimization Theory and Applications 125, no. 1 (April 2005): 181–203. http://dx.doi.org/10.1007/s10957-004-1718-2.
12

Liang, Y. M., L. S. Zhang, M. M. Li, and B. S. Han. "A filled function method for global optimization." Journal of Computational and Applied Mathematics 205, no. 1 (August 2007): 16–31. http://dx.doi.org/10.1016/j.cam.2006.04.038.
13

Isshiki, Masaki, Hiroki Ono, Kouichi Hiraga, Jun Ishikawa, and Suezou Nakadate. "Lens Design: Global Optimization with Escape Function." Optical Review 2, no. 6 (November 1995): 463–70. http://dx.doi.org/10.1007/s10043-995-0463-6.
14

Pandiya, Ridwan, Widodo Widodo, Salmah, and Irwan Endrayanto. "Non parameter-filled function for global optimization." Applied Mathematics and Computation 391 (February 2021): 125642. http://dx.doi.org/10.1016/j.amc.2020.125642.
15

Bemporad, Alberto. "Global optimization via inverse distance weighting and radial basis functions." Computational Optimization and Applications 77, no. 2 (July 27, 2020): 571–95. http://dx.doi.org/10.1007/s10589-020-00215-w.

Abstract:
Global optimization problems whose objective function is expensive to evaluate can be solved effectively by recursively fitting a surrogate function to function samples and minimizing an acquisition function to generate new samples. The acquisition step trades off between seeking a new optimization vector where the surrogate is minimum (exploitation of the surrogate) and looking for regions of the feasible space that have not yet been visited and that may potentially contain better values of the objective function (exploration of the feasible space). This paper proposes a new global optimization algorithm that uses inverse distance weighting (IDW) and radial basis functions (RBF) to construct the acquisition function. Rather arbitrary constraints that are simple to evaluate can easily be taken into account. Compared to Bayesian optimization, the proposed algorithm, which we call GLIS (GLobal minimum using Inverse distance weighting and Surrogate radial basis functions), is competitive and computationally lighter, as we show in a set of benchmark global optimization and hyperparameter tuning problems. MATLAB and Python implementations of GLIS are available at http://cse.lab.imtlucca.it/~bemporad/glis.
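As a rough illustration of the surrogate/acquisition loop the abstract describes (and not the actual GLIS package, which is linked above), the sketch below fits an RBF interpolant to the samples, adds an inverse-distance-weighting exploration bonus that grows far from existing samples, and picks the next sample by minimizing the resulting acquisition over a random candidate pool; the exploration weight alpha, the pool size, and the exact IDW formula are assumptions made for illustration.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def glis_like(f, bounds, n_init=8, n_iter=30, alpha=1.0, pool=2000, seed=0):
    """Sketch of a surrogate/acquisition loop in the spirit of GLIS:
    fit an RBF surrogate s(x) to the samples, subtract an inverse-distance-
    weighting exploration bonus z(x) that grows far from existing samples,
    and take the next sample where a(x) = s(x) - alpha * z(x) is smallest
    over a random candidate pool."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(n_init, len(lo)))
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        surrogate = RBFInterpolator(X, y)              # interpolating RBF surrogate
        cand = rng.uniform(lo, hi, size=(pool, len(lo)))
        s = surrogate(cand)
        # IDW exploration term: large where the candidate is far from all samples
        d2 = ((cand[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
        z = (y.max() - y.min() + 1e-12) * np.arctan(1.0 / np.sum(1.0 / (d2 + 1e-12), axis=1))
        x_next = cand[np.argmin(s - alpha * z)]
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next))
    best = int(np.argmin(y))
    return X[best], y[best]

# usage on a simple 2-D multimodal test function:
# f = lambda x: (x[0] ** 2 - 2) ** 2 + x[1] ** 2 + 0.5 * np.sin(5 * x[0])
# print(glis_like(f, [(-3, 3), (-3, 3)]))
```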
16

Aaid, Djamel, and Özen Özer. "New technique for solving multivariate global optimization." Journal of Numerical Analysis and Approximation Theory 52, no. 1 (July 10, 2023): 3–16. http://dx.doi.org/10.33993/jnaat521-1287.

Abstract:
In this paper, we propose an algorithm based on the branch and bound method to underestimate the objective function, together with a reductive transformation that converts multivariable functions into univariable ones. We also propose several quadratic lower bound functions and show that they are preferable to others well known in the literature. Our experimental results prove more effective when we face different nonconvex functions.
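The branch and bound idea of underestimating the objective with quadratic lower bounds can be sketched in one dimension. The code below is a generic illustration, not the authors' algorithm: it assumes a known constant K with |f''(x)| <= K on [a, b], so that on each subinterval with midpoint m the concave quadratic q(x) = f(m) + f'(m)(x - m) - (K/2)(x - m)^2 underestimates f and attains its minimum over the subinterval at an endpoint.

```python
import heapq

def branch_and_bound_1d(f, fprime, a, b, K, tol=1e-6, max_nodes=100000):
    """1-D branch and bound sketch with a quadratic underestimator.
    Assumes |f''(x)| <= K on [a, b]; the derivative fprime, K, and tol
    are inputs supplied by the user (illustrative assumptions)."""
    def lower_bound(lo, hi):
        m = 0.5 * (lo + hi)
        fm, gm = f(m), fprime(m)
        q = lambda x: fm + gm * (x - m) - 0.5 * K * (x - m) ** 2
        return min(q(lo), q(hi)), m, fm   # concave q: minimum at an endpoint

    # incumbent from the endpoints and the first midpoint
    best_x, best_f = min(((a, f(a)), (b, f(b))), key=lambda p: p[1])
    lb0, m0, fm0 = lower_bound(a, b)
    if fm0 < best_f:
        best_x, best_f = m0, fm0
    heap = [(lb0, a, b)]
    nodes = 0
    while heap and nodes < max_nodes:
        lb, lo, hi = heapq.heappop(heap)
        if lb > best_f - tol:              # this box cannot improve the incumbent
            continue
        mid = 0.5 * (lo + hi)
        for left, right in ((lo, mid), (mid, hi)):
            lbc, mc, fmc = lower_bound(left, right)
            if fmc < best_f:
                best_x, best_f = mc, fmc
            if lbc < best_f - tol:
                heapq.heappush(heap, (lbc, left, right))
        nodes += 1
    return best_x, best_f

# usage: multiextremal function on [-10, 10]; f'' = 1 - 5 sin(x), so K = 6 works
# import math
# f = lambda x: 0.5 * x ** 2 + 5 * math.sin(x)
# g = lambda x: x + 5 * math.cos(x)
# print(branch_and_bound_1d(f, g, -10.0, 10.0, K=6.0))
```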
17

Wang, Weixiang, Youlin Shang, and Shenhua Gui. "Application of Global Descent Function Method to Discrete Global Optimization." INTERNATIONAL JOURNAL ON Advances in Information Sciences and Service Sciences 5, no. 7 (April 15, 2013): 684–91. http://dx.doi.org/10.4156/aiss.vol5.issue7.80.
18

Wang, Wei-xiang, You-lin Shang, and Ying Zhang. "Global Minimization of Nonsmooth Constrained Global Optimization with Filled Function." Mathematical Problems in Engineering 2014 (2014): 1–5. http://dx.doi.org/10.1155/2014/563860.

Abstract:
A novel filled function is constructed to locate a global optimizer or an approximate global optimizer of smooth or nonsmooth constrained global minimization problems. The constructed filled function contains only one parameter, which can be easily adjusted during the minimization. The theoretical properties of the filled function are discussed and a corresponding solution algorithm is proposed. The solution algorithm comprises two phases: local minimization and filling. The first phase minimizes the original problem and obtains one of its local optimizers, while the second phase minimizes the constructed filled function and identifies a better initial point for the first phase. Some preliminary numerical results are also reported.
19

Pandiya, Ridwan, Atina Ahdika, Siti Khomsah, and Rima Dias Ramadhani. "A New Integral Function Algorithm for Global Optimization and Its Application to the Data Clustering Problem." MENDEL 29, no. 2 (December 20, 2023): 162–68. http://dx.doi.org/10.13164/mendel.2023.2.162.

Abstract:
The filled function method is an approach to finding global minimum points of multidimensional unconstrained global optimization problems. Conventional parametric filled functions have computational weaknesses when employed on some benchmark optimization functions. This paper proposes a new integral function algorithm based on the auxiliary function approach. The proposed method can successfully be used to find the global minimum point of a function of several variables. Several test global optimization problems are used to show the ability of the recommended method. The integral function algorithm is then implemented to solve the center-based data clustering problem. The results show that the proposed algorithm can solve the problem successfully.
20

Zabotin, Vladislav I., and Pavel A. Chernyshevskij. "Extension of Strongin's Global Optimization Algorithm to a Function Continuous on a Compact Interval." Computer Research and Modeling 11, no. 6 (December 2019): 1111–19. http://dx.doi.org/10.20537/2076-7633-2019-11-6-1111-1119.
21

Horai, Mio, Hideo Kobayashi, and Takashi G. Nitta. "Global Optimization for the Sum of Certain Nonlinear Functions." Abstract and Applied Analysis 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/408918.

Abstract:
We extend the work of Pei-Ping and Gui-Xia (2007) to a global optimization problem for more general functions. Pei-Ping and Gui-Xia treat the optimization problem for the linear sum of polynomial fractional functions, using a branch and bound approach. We prove that this extension makes it possible to solve, by a branch and bound algorithm, nonconvex optimization problems that the method of Pei-Ping and Gui-Xia cannot handle, namely sums of functions with positive (or negative) first and second derivatives whose variable is defined by a sum of polynomial fractional functions.
22

Wang, Weixiang, Youlin Shang i Ying Zhang. "Finding Global Minima with a Filled Function Approach for Non-Smooth Global Optimization". Discrete Dynamics in Nature and Society 2010 (2010): 1–10. http://dx.doi.org/10.1155/2010/843609.

Pełny tekst źródła
Streszczenie:
A filled function approach is proposed for solving a non-smooth unconstrained global optimization problem. First, the definition of filled function in Zhang (2009) for smooth global optimization is extended to non-smooth case and a new one is put forwarded. Then, a novel filled function is proposed for non-smooth the global optimization and a corresponding non-smooth algorithm based on the filled function is designed. At last, a numerical test is made. The computational results demonstrate that the proposed approach is effcient and reliable.
Style APA, Harvard, Vancouver, ISO itp.
23

Wang, Wei, Yong-jian Yang, and Lian-sheng Zhang. "Unification of Filled Function and Tunnelling Function in Global Optimization." Acta Mathematicae Applicatae Sinica, English Series 23, no. 1 (January 2007): 59–66. http://dx.doi.org/10.1007/s10255-006-0349-9.
24

Tsoulos, Ioannis G., Alexandros Tzallas, Evangelos Karvounis, and Dimitrios Tsalikakis. "NeuralMinimizer: A Novel Method for Global Optimization." Information 14, no. 2 (January 25, 2023): 66. http://dx.doi.org/10.3390/info14020066.

Abstract:
The problem of finding the global minimum of multidimensional functions arises in a wide range of applications. An innovative method for finding the global minimum of multidimensional functions is presented here. The method first generates an approximation of the objective function using only a few real samples from it; these samples are used to train a machine learning model. Next, the required sampling is performed on the approximation function. Furthermore, the approximation is improved at every step by using the local minima that are found as additional training samples for the machine learning model. As a termination criterion, the proposed technique uses a widely used criterion from the relevant literature, evaluated after each execution of the local minimization. The proposed technique was applied to a number of well-known problems from the relevant literature, and the comparative results with respect to modern global minimization techniques are extremely promising.
25

Angeles, J., and A. Bernier. "The Global Least-Square Optimization of Function-Generating Linkages." Journal of Mechanisms, Transmissions, and Automation in Design 109, no. 2 (June 1, 1987): 204–9. http://dx.doi.org/10.1115/1.3267439.

Abstract:
As shown elsewhere, the approximate least-square synthesis of function-generating linkages can be formulated in terms of one single decision variable. Hence, functions representing global properties of linkages such as accuracy and transmission quality, which are terms precisely defined in this paper, as well as mobility type, are all functions of one single variable. The optimization procedure presented here is aimed at the synthesis of function-generating linkages with a minimum least-square design error and a maximum transmission quality, while meeting conditions prescribed on the type of mobility (crank or rocker) of its input and/or output links. Alternative performance indices and/or constraints can be treated likewise.
26

Zhou, Heng Jun, Ming Yan Jiang, and Xian Ye Ben. "Niche Brain Storm Optimization Algorithm for Multi-Peak Function Optimization." Advanced Materials Research 989-994 (July 2014): 1626–30. http://dx.doi.org/10.4028/www.scientific.net/amr.989-994.1626.

Abstract:
Brain Storm Optimization (BSO) is a recently proposed swarm intelligence optimization algorithm with a fast convergence speed. However, it is easily trapped in local optima. In this paper, a new model based on niche technology, named Niche Brain Storm Optimization (NBSO), is proposed to overcome this shortcoming of BSO. Niche technology effectively prevents premature convergence and maintains population diversity during the evolution process. NBSO shows excellent performance in searching for the global optimum and in finding multiple global and local optimal solutions for multi-peak problems. Several benchmark functions are introduced to evaluate its performance. Experimental results show that NBSO performs better than BSO in global search ability and is faster than the Niche Genetic Algorithm (NGA) in finding the peaks of multi-peak functions.
27

Fateen, Seif-Eddeen K., and Adrián Bonilla-Petriciolet. "Gradient-Based Cuckoo Search for Global Optimization." Mathematical Problems in Engineering 2014 (2014): 1–12. http://dx.doi.org/10.1155/2014/493740.

Abstract:
One of the major advantages of stochastic global optimization methods is that they do not require the gradient of the objective function. However, in some cases this gradient is readily available and can be used to improve the numerical performance of stochastic optimization methods, especially the quality and precision of the global optimal solution. In this study, we propose a gradient-based modification to the cuckoo search algorithm, a nature-inspired, swarm-based stochastic global optimization method. We introduce the gradient-based cuckoo search (GBCS) and evaluate its performance against the original algorithm on twenty-four benchmark functions. GBCS improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems. GBCS proved to be a strong candidate for solving difficult optimization problems for which the gradient of the objective function is readily available.
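The abstract does not spell out how the gradient enters the search, so the following is only a loose sketch in the spirit of a gradient-assisted cuckoo-style search, not the published GBCS: Gaussian steps stand in for the Lévy flights of cuckoo search, and each random step is oriented downhill coordinate-wise using the sign of the gradient; population size, step size, and abandonment fraction are illustrative.

```python
import numpy as np

def gradient_biased_search(f, grad, lower, upper, dim, n_nests=15, step=0.1,
                           p_abandon=0.25, iters=200, seed=0):
    """Loose sketch of a gradient-assisted cuckoo-style search.
    Gaussian steps replace Levy flights, and each step is oriented
    downhill coordinate-wise via the sign of the gradient (the published
    GBCS algorithm differs in its details)."""
    rng = np.random.default_rng(seed)
    nests = rng.uniform(lower, upper, size=(n_nests, dim))
    fit = np.array([f(x) for x in nests])
    for _ in range(iters):
        for i in range(n_nests):
            g = grad(nests[i])
            # random magnitude, direction forced downhill in each coordinate
            delta = step * np.abs(rng.normal(size=dim)) * -np.sign(g)
            trial = np.clip(nests[i] + delta, lower, upper)
            f_trial = f(trial)
            j = rng.integers(n_nests)        # compare against a random nest
            if f_trial < fit[j]:
                nests[j], fit[j] = trial, f_trial
        # abandon a fraction of the worst nests and re-seed them at random
        n_worst = max(1, int(p_abandon * n_nests))
        worst = np.argsort(fit)[-n_worst:]
        nests[worst] = rng.uniform(lower, upper, size=(n_worst, dim))
        fit[worst] = [f(x) for x in nests[worst]]
    best = int(np.argmin(fit))
    return nests[best], fit[best]

# usage on a 2-D multimodal test function with a known gradient:
# f = lambda x: x[0] ** 2 + x[1] ** 2 + 10 * np.sin(x[0]) * np.sin(x[1])
# g = lambda x: np.array([2 * x[0] + 10 * np.cos(x[0]) * np.sin(x[1]),
#                         2 * x[1] + 10 * np.sin(x[0]) * np.cos(x[1])])
# print(gradient_biased_search(f, g, -5.0, 5.0, dim=2))
```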
28

Zhao, Ruxin, Qifang Luo, and Yongquan Zhou. "Elite Opposition-Based Social Spider Optimization Algorithm for Global Function Optimization." Algorithms 10, no. 1 (January 8, 2017): 9. http://dx.doi.org/10.3390/a10010009.
29

Wei, Fei, and Yuping Wang. "A New Filled Function Method with One Parameter for Global Optimization." Mathematical Problems in Engineering 2013 (2013): 1–12. http://dx.doi.org/10.1155/2013/532325.

Abstract:
The filled function method is an effective approach to finding the global minimizer of multidimensional multimodal functions. Conventional filled functions are numerically unstable due to exponential or logarithmic terms and are sensitive to parameters. In this paper, a new filled function with only one parameter is proposed, which is continuously differentiable and proved to satisfy all conditions of the filled function definition. Moreover, this filled function is not sensitive to its parameter, and overflow cannot occur for this function. Based on these properties, a new filled function method is proposed that is numerically stable with respect to the initial point and the parameter value. The computer simulations indicate that the proposed filled function method is efficient and effective.
30

Wang, Weixiang, Youlin Shang i Ying Zhang. "A Filled Function Approach for Nonsmooth Constrained Global Optimization". Mathematical Problems in Engineering 2010 (2010): 1–9. http://dx.doi.org/10.1155/2010/310391.

Pełny tekst źródła
Streszczenie:
A novel filled function is given in this paper to find a global minima for a nonsmooth constrained optimization problem. First, a modified concept of the filled function for nonsmooth constrained global optimization is introduced, and a filled function, which makes use of the idea of the filled function for unconstrained optimization and penalty function for constrained optimization, is proposed. Then, a solution algorithm based on the proposed filled function is developed. At last, some preliminary numerical results are reported. The results show that the proposed approach is promising.
Style APA, Harvard, Vancouver, ISO itp.
31

Li, Jia, Yuelin Gao, Tiantian Chen, and Xiaohua Ma. "A new filled function method based on global search for solving unconstrained optimization problems." AIMS Mathematics 9, no. 7 (2024): 18475–505. http://dx.doi.org/10.3934/math.2024900.

Abstract:
The filled function method is a deterministic algorithm for finding a global minimizer of global optimization problems, and its effectiveness is closely related to the form of the constructed filled function. Current filled functions mainly have three drawbacks in form, namely, parameter adjustment and control (if any), inclusion of exponential or logarithmic functions, and discontinuity or non-differentiability. In order to overcome these limitations, this paper proposed a parameter-free filled function that does not include exponential or logarithmic functions and is continuous and differentiable. Based on the new filled function, a filled function method for solving unconstrained global optimization problems was designed. The algorithm selected points in the feasible domain that were far from the global minimum point as initial points, and improved the setting of the step size in the stage of minimizing the filled function to enhance the algorithm's global optimization capability. In addition, tests were conducted on 14 benchmark functions and compared with existing filled function algorithms. The numerical experimental results showed that the new algorithm proposed in this paper was feasible and effective.
32

Lin, Youjiang, Yongjian Yang, and Liansheng Zhang. "A NOVEL FILLED FUNCTION METHOD FOR GLOBAL OPTIMIZATION." Journal of the Korean Mathematical Society 47, no. 6 (November 1, 2010): 1253–67. http://dx.doi.org/10.4134/jkms.2010.47.6.1253.
33

Zhang, Lian-Sheng, Chi-Kong Ng, Duan Li, and Wei-Wen Tian. "A New Filled Function Method for Global Optimization." Journal of Global Optimization 28, no. 1 (January 2004): 17–43. http://dx.doi.org/10.1023/b:jogo.0000006653.60256.f6.
34

Sun, X. L., and D. Li. "Value-Estimation Function Method for Constrained Global Optimization." Journal of Optimization Theory and Applications 102, no. 2 (August 1999): 385–409. http://dx.doi.org/10.1023/a:1021736608968.
35

Liu, Xian. "The impelling function method applied to global optimization." Applied Mathematics and Computation 151, no. 3 (April 2004): 745–54. http://dx.doi.org/10.1016/s0096-3003(03)00525-3.
36

Wang, Yuncheng, Weiwu Fang, and Tianjiao Wu. "A cut-peak function method for global optimization." Journal of Computational and Applied Mathematics 230, no. 1 (August 2009): 135–42. http://dx.doi.org/10.1016/j.cam.2008.10.069.
37

Liu, Xian, and Wilsun Xu. "A new filled function applied to global optimization." Computers & Operations Research 31, no. 1 (January 2004): 61–80. http://dx.doi.org/10.1016/s0305-0548(02)00154-5.
38

Wang, Xiaoli, and Guobiao Zhou. "A new filled function for unconstrained global optimization." Applied Mathematics and Computation 174, no. 1 (March 2006): 419–29. http://dx.doi.org/10.1016/j.amc.2005.05.009.
39

Ruan, Zhao-Hui, Yuan Yuan, Qi-Xiang Chen, Chuan-Xin Zhang, Yong Shuai, and He-Ping Tan. "A new multi-function global particle swarm optimization." Applied Soft Computing 49 (December 2016): 279–91. http://dx.doi.org/10.1016/j.asoc.2016.07.034.
40

Schnabel, Robert B. "Concurrent function evaluations in local and global optimization." Computer Methods in Applied Mechanics and Engineering 64, no. 1-3 (October 1987): 537–52. http://dx.doi.org/10.1016/0045-7825(87)90055-7.
41

Gao, Yuelin, Yongjian Yang, and Mi You. "A new filled function method for global optimization." Applied Mathematics and Computation 268 (October 2015): 685–95. http://dx.doi.org/10.1016/j.amc.2015.06.090.
42

Wu, Z. Y., F. S. Bai, H. W. J. Lee, and Y. J. Yang. "A filled function method for constrained global optimization." Journal of Global Optimization 39, no. 4 (April 19, 2007): 495–507. http://dx.doi.org/10.1007/s10898-007-9152-2.
43

Ng, Chi-Kong, Lian-Sheng Zhang, Duan Li, and Wei-Wen Tian. "Discrete Filled Function Method for Discrete Global Optimization." Computational Optimization and Applications 31, no. 1 (May 2005): 87–115. http://dx.doi.org/10.1007/s10589-005-0985-7.
44

Lampariello, F., and G. Liuzzi. "A filling function method for unconstrained global optimization." Computational Optimization and Applications 61, no. 3 (February 5, 2015): 713–29. http://dx.doi.org/10.1007/s10589-015-9728-6.
45

Ge, Renpu. "The filled function transformations for constrained global optimization." Applied Mathematics and Computation 39, no. 1 (September 1990): 1–20. http://dx.doi.org/10.1016/0096-3003(90)90118-m.
46

Wu, Xiuli, Yongquan Zhou, and Yuting Lu. "Elite Opposition-Based Water Wave Optimization Algorithm for Global Optimization." Mathematical Problems in Engineering 2017 (2017): 1–25. http://dx.doi.org/10.1155/2017/3498363.

Abstract:
Water wave optimization (WWO) is a novel metaheuristic method based on shallow water wave theory, with a simple structure, easy implementation, and good performance even with a small population. To further improve convergence speed and calculation precision, an elite opposition-based water wave optimization (EOBWWO) is proposed and applied to function optimization and structural engineering design problems. There are three major strategies in the improvement: an elite opposition-based (EOB) learning strategy enhances the diversity of the population, a local neighborhood search strategy is introduced to strengthen local search in the breaking operation, and an improved propagation operator provides the algorithm with a better balance between exploration and exploitation. EOBWWO is verified on 20 benchmark functions and two structural engineering design problems, and its performance is compared against state-of-the-art algorithms. Experimental results show that the proposed algorithm has faster convergence speed, higher calculation precision, with the exact solution even being obtained on some benchmark functions, and a higher degree of stability than the comparison algorithms.
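Elite opposition-based learning, the first of the three strategies listed in the abstract, is simple enough to sketch on its own. The helper below uses one common formulation (reflecting each elite solution through the dynamic interval spanned by the elite group); the exact EOBWWO formulation and the other two strategies are not reproduced here.

```python
import numpy as np

def elite_opposition(elite, lower, upper, rng):
    """One common form of elite opposition-based learning: reflect each
    elite solution through the dynamic interval spanned by the elite
    group, repairing components that leave the search box.
    (The exact formulation used in EOBWWO may differ.)"""
    a, b = elite.min(axis=0), elite.max(axis=0)   # dynamic bounds of the elite group
    opposite = a + b - elite                      # reflected (opposite) solutions
    out = (opposite < lower) | (opposite > upper)
    low = np.broadcast_to(a, elite.shape)
    high = np.broadcast_to(b, elite.shape)
    opposite[out] = rng.uniform(low[out], high[out])
    return opposite

# usage: evaluate both the elite group and its opposites, then keep the better half
# rng = np.random.default_rng(0)
# elite = rng.uniform(-5.0, 5.0, size=(5, 2))
# opposites = elite_opposition(elite, -5.0, 5.0, rng)
```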
47

Pirot, Guillaume, Tipaluck Krityakierne, David Ginsbourger, and Philippe Renard. "Contaminant source localization via Bayesian global optimization." Hydrology and Earth System Sciences 23, no. 1 (January 21, 2019): 351–69. http://dx.doi.org/10.5194/hess-23-351-2019.

Abstract:
Contaminant source localization problems require efficient and robust methods that can account for geological heterogeneities and accommodate relatively small data sets of noisy observations. As realism commands high-fidelity simulations, computation costs call for global optimization algorithms under parsimonious evaluation budgets. Bayesian optimization approaches are well adapted to such settings as they allow the exploration of parameter spaces in a principled way so as to iteratively locate the point(s) of global optimum while maintaining an approximation of the objective function with an instrumental quantification of prediction uncertainty. Here, we adapt a Bayesian optimization approach to localize a contaminant source in a discretized spatial domain. We thus demonstrate the potential of such a method for hydrogeological applications and also provide test cases for the optimization community. The localization problem is illustrated for cases where the geology is assumed to be perfectly known. Two 2-D synthetic cases that display sharp hydraulic conductivity contrasts and specific connectivity patterns are investigated. These cases generate highly nonlinear objective functions that present multiple local minima. A derivative-free global optimization algorithm relying on a Gaussian process model and on the expected improvement criterion is used to efficiently localize the point of minimum of the objective functions, which corresponds to the contaminant source location. Even though concentration measurements contain a significant level of proportional noise, the algorithm efficiently localizes the contaminant source location. The variations of the objective function are essentially driven by the geology, followed by the design of the monitoring well network. The data and scripts used to generate objective functions are shared to favor reproducible research. This contribution is important because the functions present multiple local minima and are inspired from a practical field application. Sharing these complex objective functions provides a source of test cases for global optimization benchmarks and should help with designing new and efficient methods to solve this type of problem.
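The Gaussian-process / expected-improvement loop described in this abstract is standard enough to sketch. The code below is a generic illustration, not the authors' setup: a scikit-learn GP surrogate, the expected improvement criterion for minimization, and a random candidate pool in place of a proper inner optimization of the acquisition; the kernel choice, noise level, and budgets are assumptions.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(mu, sigma, f_best):
    """EI for minimization: E[max(f_best - F(x), 0)] under the GP posterior."""
    sigma = np.maximum(sigma, 1e-12)
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def bayes_opt(f, bounds, n_init=6, n_iter=25, pool=5000, seed=0):
    """Generic Bayesian optimization loop with a GP surrogate and EI."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(n_init, len(lo)))
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5),
                                      normalize_y=True, alpha=1e-6)
        gp.fit(X, y)
        cand = rng.uniform(lo, hi, size=(pool, len(lo)))
        mu, sigma = gp.predict(cand, return_std=True)
        x_next = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next))
    best = int(np.argmin(y))
    return X[best], y[best]

# usage on a cheap stand-in for an expensive simulator:
# f = lambda x: np.sin(3 * x[0]) + (x[0] - 0.6) ** 2 + (x[1] - 0.3) ** 2
# print(bayes_opt(f, [(0, 1), (0, 1)]))
```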
48

Degachi, Hajer, Bechir Naffeti, Wassila Chagra, and Moufida Ksouri. "Filled Function Method for Nonlinear Model Predictive Control." Mathematical Problems in Engineering 2018 (June 4, 2018): 1–8. http://dx.doi.org/10.1155/2018/9497618.

Abstract:
A new method is used to solve the nonconvex optimization problem of nonlinear model predictive control (NMPC) for the Hammerstein model. Using nonlinear models in MPC leads to a nonlinear and nonconvex optimization problem. Since control performance depends essentially on the results of the optimization method, in this work we propose to use the filled function as a global optimization method to solve the nonconvex problem. Using this method, the control law is obtained in two steps. The first step determines a local minimum of the objective function. In the second step, a new function, called the filled function, is constructed using the local minimum found in the first step; the constructed function yields an initialization near the global minimum. Once this initialization is determined, a local optimization method can be used to determine the global control sequence. The efficiency of the proposed method is demonstrated first on benchmark functions and then on the ball and beam system described by a Hammerstein model. The results obtained by the presented method are compared with those of the genetic algorithm (GA) and particle swarm optimization (PSO).
49

Regis, Rommel G., and Christine A. Shoemaker. "Parallel radial basis function methods for the global optimization of expensive functions." European Journal of Operational Research 182, no. 2 (October 2007): 514–35. http://dx.doi.org/10.1016/j.ejor.2006.08.040.
50

Schweidtmann, Artur M., Dominik Bongartz, Daniel Grothe, Tim Kerkenhoff, Xiaopeng Lin, Jaromił Najman, and Alexander Mitsos. "Deterministic global optimization with Gaussian processes embedded." Mathematical Programming Computation 13, no. 3 (June 25, 2021): 553–81. http://dx.doi.org/10.1007/s12532-021-00204-y.

Abstract:
Gaussian processes (Kriging) are interpolating data-driven models that are frequently applied in various disciplines. Often, Gaussian processes are trained on datasets and are subsequently embedded as surrogate models in optimization problems. These optimization problems are nonconvex, and global optimization is desired. However, previous literature observed computational burdens limiting deterministic global optimization to Gaussian processes trained on few data points. We propose a reduced-space formulation for deterministic global optimization with trained Gaussian processes embedded. For optimization, the branch-and-bound solver branches only on the free variables, and McCormick relaxations are propagated through explicit Gaussian process models. The approach also leads to significantly smaller and computationally cheaper subproblems for lower and upper bounding. To further accelerate convergence, we derive envelopes of common covariance functions for GPs and tight relaxations of acquisition functions used in Bayesian optimization, including expected improvement, probability of improvement, and lower confidence bound. In total, we reduce computational time by orders of magnitude compared to state-of-the-art methods, thus overcoming previous computational burdens. We demonstrate the performance and scaling of the proposed method and apply it to Bayesian optimization with global optimization of the acquisition function and chance-constrained programming. The Gaussian process models, acquisition functions, and training scripts are available open-source within the “MeLOn—MachineLearning Models for Optimization” toolbox (https://git.rwth-aachen.de/avt.svt/public/MeLOn).