Journal articles on the topic "Global function optimization"


Consult the top 50 journal articles for your research on the topic "Global function optimization".


1

Liu, Y., and K. L. Teo. "A bridging method for global optimization." Journal of the Australian Mathematical Society. Series B. Applied Mathematics 41, no. 1 (July 1999): 41–57. http://dx.doi.org/10.1017/s0334270000011024.

Full text
Abstract:
In this paper a bridging method is introduced for the numerical solution of one-dimensional global optimization problems in which a continuously differentiable function is to be minimized over a finite interval, given either explicitly or by constraints involving continuously differentiable functions. The concept of a bridged function is introduced and some of its properties are given. On this basis, several bridging algorithms are developed for the computation of global optimal solutions. The algorithms are demonstrated by solving several numerical examples.
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Zhu, Jinghao, Jiani Zhou, and David Gao. "Global optimization by canonical dual function." Journal of Computational and Applied Mathematics 234, no. 2 (May 2010): 538–44. http://dx.doi.org/10.1016/j.cam.2009.12.045.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Norkin, Vladimir. "A Stochastic Smoothing Method for Nonsmooth Global Optimization." Cybernetics and Computer Technologies, no. 1 (March 31, 2020): 5–14. http://dx.doi.org/10.34229/2707-451x.20.1.1.

Full text
Abstract:
The paper presents the results of testing the stochastic smoothing method for global optimization of a multiextremal function over a convex feasible subset of Euclidean space. First, the objective function is extended outside the admissible region so that its global minimum does not change and the function becomes coercive. The smoothing of a function at a point is carried out by averaging its values over some neighborhood of that point; the size of the neighborhood is the smoothing parameter. Smoothing eliminates small local extrema of the original function, and for a sufficiently large value of the smoothing parameter the averaged function can have only one minimum. The smoothing method consists of replacing the original function with a sequence of smoothed approximations whose smoothing parameter vanishes to zero, and optimizing these functions with contemporary stochastic optimization methods. By passing from the minimum of one smoothed function to a nearby minimum of the next, one can gradually reach the region of the global minimum of the original function. The smoothing method is also applicable to the optimization of nonsmooth nonconvex functions. It is shown that the method reliably solves small-dimensional test global optimization problems from the literature.
Styles: APA, Harvard, Vancouver, ISO, etc.
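The averaging idea in this abstract is simple enough to prototype. The sketch below is not Norkin's algorithm or test set: it estimates the smoothed function by a Monte Carlo average over a neighborhood and minimizes a sequence of such approximations with a shrinking smoothing parameter by crude random search. The radius schedule, sample counts, and test function are illustrative choices.

```python
import math
import random

def smoothed(f, x, r, n=200, rng=random):
    """Monte Carlo estimate of the averaged function: the mean of f
    over the neighborhood [x - r, x + r] (r is the smoothing parameter)."""
    return sum(f(x + rng.uniform(-r, r)) for _ in range(n)) / n

def smoothing_minimize(f, x0, radii=(1.0, 0.3, 0.1, 0.03), seed=0):
    """Minimize a sequence of smoothed approximations with a shrinking
    smoothing parameter, warm-starting each stage at the previous minimum."""
    rng = random.Random(seed)
    x = x0
    for r in radii:
        for _ in range(200):          # crude local random search per stage
            cand = x + rng.gauss(0.0, r)
            if smoothed(f, cand, r, rng=rng) < smoothed(f, x, r, rng=rng):
                x = cand
    return x

# x**2 plus a small-amplitude ripple: many local minima, global minimum at 0
f = lambda x: x * x + 0.3 * math.sin(10 * x) ** 2
x_min = smoothing_minimize(f, x0=3.0)
```

With a large radius the ripple averages out and the search follows the underlying parabola; the later, smaller radii only refine the estimate near the global basin.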
4

Satapathy, Suresh Chandra. "Improved teaching learning based optimization for global function optimization." Decision Science Letters 2, no. 1 (January 1, 2013): 23–34. http://dx.doi.org/10.5267/j.dsl.2012.10.005.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Satapathy, Suresh Chandra, Anima Naik, and K. Parvathi. "Weighted Teaching-Learning-Based Optimization for Global Function Optimization." Applied Mathematics 04, no. 03 (2013): 429–39. http://dx.doi.org/10.4236/am.2013.43064.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Wang, Wei, Xiaoshan Zhang, and Min Li. "A Filled Function Method Dominated by Filter for Nonlinearly Global Optimization." Journal of Applied Mathematics 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/245427.

Full text
Abstract:
This work presents a filled function method based on the filter technique for global optimization. The filled function method is one of the effective methods for nonlinear global optimization, since it can effectively find a better minimizer; the filter technique is applied to local optimization methods for its excellent numerical results. To improve the filled function method, the filter technique is employed here for global optimization. A new filled function is proposed first, and then the algorithm and its properties are proved. Numerical results are listed at the end.
Styles: APA, Harvard, Vancouver, ISO, etc.
7

DJALIL, BOUDJEHEM, BOUDJEHEM BADREDDINE, and BOUKAACHE ABDENOUR. "REDUCING DIMENSION IN GLOBAL OPTIMIZATION." International Journal of Computational Methods 08, no. 03 (September 2011): 535–44. http://dx.doi.org/10.1142/s0219876211002460.

Full text
Abstract:
In this paper, we propose an idea that makes global optimization an easier and lower-cost task: reduce the dimension of the optimization problem at hand to a one-dimensional one using variable coding. At this level, the algorithm looks for the global optimum of a one-dimensional cost function. The new algorithm avoids local optima, reduces the number of evaluations, and improves convergence speed. The method is suitable for functions that have many extrema. The algorithm can determine a narrow region around the global optimum in very restricted time, based on stochastic tests and an adaptive partition of the search space. Illustrative examples show the efficiency of the proposed idea; the algorithm was able to locate the global optimum even when the objective function has a large number of optima.
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Xiao, Jin-ke, Wei-min Li, Wei Li, and Xin-rong Xiao. "Optimization on Black Box Function Optimization Problem." Mathematical Problems in Engineering 2015 (2015): 1–10. http://dx.doi.org/10.1155/2015/647234.

Full text
Abstract:
There are a large number of engineering optimization problems in the real world whose input-output relationships are vague and indistinct; here, they are called black box function optimization problems (BBFOP). Inspired by the mechanism by which the neuroendocrine system regulates the immune system, a BP neural network modified immune optimization algorithm (NN-MIA) is proposed. NN-MIA consists of two phases: the first trains a BP neural network to the expected precision to capture the input-output relationship, and the second is an immune optimization phase whose aim is to search for global optima. The BP neural network fitting could be replaced with polynomial fitting or other fitting methods of the expected fitting precision. Experimental simulation confirms the global optimization capability of MIA and the practical applicability of the BBFOP optimization method.
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Zou, Mou-Yan, and Xi Zou. "Global optimization: an auxiliary cost function approach." IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans 30, no. 3 (May 2000): 347–54. http://dx.doi.org/10.1109/3468.844358.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Han, Qiaoming, and Jiye Han. "Revised filled function methods for global optimization." Applied Mathematics and Computation 119, no. 2-3 (April 2001): 217–28. http://dx.doi.org/10.1016/s0096-3003(99)00266-0.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
11

Wu, Z. Y., L. S. Zhang, K. L. Teo, and F. S. Bai. "New Modified Function Method for Global Optimization." Journal of Optimization Theory and Applications 125, no. 1 (April 2005): 181–203. http://dx.doi.org/10.1007/s10957-004-1718-2.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
12

Liang, Y. M., L. S. Zhang, M. M. Li, and B. S. Han. "A filled function method for global optimization." Journal of Computational and Applied Mathematics 205, no. 1 (August 2007): 16–31. http://dx.doi.org/10.1016/j.cam.2006.04.038.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
13

Isshiki, Masaki, Hiroki Ono, Kouichi Hiraga, Jun Ishikawa, and Suezou Nakadate. "Lens Design: Global Optimization with Escape Function." Optical Review 2, no. 6 (November 1995): 463–70. http://dx.doi.org/10.1007/s10043-995-0463-6.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
14

Pandiya, Ridwan, Widodo Widodo, Salmah, and Irwan Endrayanto. "Non parameter-filled function for global optimization." Applied Mathematics and Computation 391 (February 2021): 125642. http://dx.doi.org/10.1016/j.amc.2020.125642.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
15

Bemporad, Alberto. "Global optimization via inverse distance weighting and radial basis functions." Computational Optimization and Applications 77, no. 2 (July 27, 2020): 571–95. http://dx.doi.org/10.1007/s10589-020-00215-w.

Full text
Abstract:
Global optimization problems whose objective function is expensive to evaluate can be solved effectively by recursively fitting a surrogate function to function samples and minimizing an acquisition function to generate new samples. The acquisition step trades off between seeking a new optimization vector where the surrogate is minimum (exploitation of the surrogate) and looking for regions of the feasible space that have not yet been visited and may potentially contain better values of the objective function (exploration of the feasible space). This paper proposes a new global optimization algorithm that uses inverse distance weighting (IDW) and radial basis functions (RBF) to construct the acquisition function. Rather arbitrary constraints that are simple to evaluate can easily be taken into account. Compared to Bayesian optimization, the proposed algorithm, which we call GLIS (GLobal minimum using Inverse distance weighting and Surrogate radial basis functions), is competitive and computationally lighter, as we show on a set of benchmark global optimization and hyperparameter tuning problems. MATLAB and Python implementations of GLIS are available at http://cse.lab.imtlucca.it/~bemporad/glis.
Styles: APA, Harvard, Vancouver, ISO, etc.
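The exploitation/exploration trade-off this abstract describes can be illustrated in a few lines. The sketch below is not GLIS itself: it keeps only the inverse-distance-weighting ingredient in one dimension, using a Shepard (IDW) interpolant as the surrogate and a distance-to-samples bonus for exploration, with an illustrative weight `delta` and a random candidate pool in place of a proper acquisition optimizer.

```python
import math
import random

def idw_surrogate(X, F, x):
    """Shepard (inverse distance weighting) interpolant of the samples."""
    num = den = 0.0
    for xi, fi in zip(X, F):
        d2 = (x - xi) ** 2
        if d2 < 1e-12:
            return fi                 # exact sample: return its value
        w = 1.0 / d2
        num += w * fi
        den += w
    return num / den

def acquisition(X, F, x, delta=1.0):
    """Surrogate prediction minus a distance-based exploration bonus."""
    dist = min(abs(x - xi) for xi in X)
    return idw_surrogate(X, F, x) - delta * dist

def surrogate_minimize(f, lo, hi, n_init=5, n_iter=25, seed=0):
    rng = random.Random(seed)
    X = [lo + (hi - lo) * i / (n_init - 1) for i in range(n_init)]
    F = [f(x) for x in X]
    for _ in range(n_iter):
        # minimize the acquisition over a random candidate pool
        cands = [rng.uniform(lo, hi) for _ in range(500)]
        x_next = min(cands, key=lambda x: acquisition(X, F, x))
        X.append(x_next)
        F.append(f(x_next))
    return X[F.index(min(F))]         # best point actually sampled

# multimodal 1-D test function; its global minimum lies near x = 1.55
f = lambda x: math.sin(3 * x) + (x / 4.0) ** 2
x_best = surrogate_minimize(f, 0.0, 6.0)
```

Larger `delta` spreads samples across the interval; `delta = 0` reduces the loop to pure exploitation of the interpolant.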
16

Aaid, Djamel, and Özen Özer. "New technique for solving multivariate global optimization." Journal of Numerical Analysis and Approximation Theory 52, no. 1 (July 10, 2023): 3–16. http://dx.doi.org/10.33993/jnaat521-1287.

Full text
Abstract:
In this paper, we propose an algorithm based on the branch and bound method that underestimates the objective function, together with a reductive transformation that converts multivariable functions into univariable ones. We also propose several quadratic lower-bound functions and show that they are preferable to others well known in the literature. Our experimental results prove effective on a range of nonconvex functions.
Styles: APA, Harvard, Vancouver, ISO, etc.
17

Wang, Weixiang, Youlin Shang, and Shenhua Gui. "Application of Global Descent Function Method to Discrete Global Optimization." International Journal on Advances in Information Sciences and Service Sciences 5, no. 7 (April 15, 2013): 684–91. http://dx.doi.org/10.4156/aiss.vol5.issue7.80.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
18

Wang, Wei-xiang, You-lin Shang, and Ying Zhang. "Global Minimization of Nonsmooth Constrained Global Optimization with Filled Function." Mathematical Problems in Engineering 2014 (2014): 1–5. http://dx.doi.org/10.1155/2014/563860.

Full text
Abstract:
A novel filled function is constructed to locate a global optimizer or an approximate global optimizer of smooth or nonsmooth constrained global minimization problems. The constructed filled function contains only one parameter, which can be easily adjusted during the minimization. The theoretical properties of the filled function are discussed and a corresponding solution algorithm is proposed. The algorithm comprises two phases: local minimization and filling. The first phase minimizes the original problem and obtains one of its local optimizers, while the second phase minimizes the constructed filled function and identifies a better initial point for the first phase. Some preliminary numerical results are also reported.
Styles: APA, Harvard, Vancouver, ISO, etc.
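The two-phase loop (local minimization, then filling) can be sketched for the smooth, unconstrained 1-D case. This is only an illustration of the mechanism, not the paper's one-parameter nonsmooth construction: it uses a classical Ge-style filled function exp(-(x - x*)² / rho) / (r + f(x)), a naive step-halving local search, and illustrative values for rho, r, and the test function.

```python
import math

def local_min(g, x, step=0.1, tol=1e-6):
    """Naive 1-D descent: step toward decreasing g, halving the step."""
    while step > tol:
        if g(x + step) < g(x):
            x += step
        elif g(x - step) < g(x):
            x -= step
        else:
            step *= 0.5
    return x

def filled_function_search(f, x0, rho=1.0, r=30.0, max_rounds=5):
    """Phase 1: find a local minimizer x_star of f.
    Phase 2: minimize a filled function centred at x_star; its minimizer
    leaves the current basin and seeds the next phase-1 search."""
    x_star = local_min(f, x0)
    for _ in range(max_rounds):
        filled = lambda x: math.exp(-(x - x_star) ** 2 / rho) / (r + f(x))
        for d in (0.5, -0.5):                # try both escape directions
            z = local_min(f, local_min(filled, x_star + d))
            if f(z) < f(x_star) - 1e-9:      # found a lower basin
                x_star = z
                break
        else:
            break                            # no improvement: stop
    return x_star

# two-basin test function: local minimum near x = 1.90, global near x = -2.09
f = lambda x: x ** 4 - 8 * x ** 2 + 3 * x
x_best = filled_function_search(f, x0=2.0)
```

Started in the shallower right basin, the filled phase carries the iterate over the hill between the basins, and the next local phase lands in the deeper one. The constant r must keep r + f(x) positive over the region visited, which is why it is set generously here.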
19

Pandiya, Ridwan, Atina Ahdika, Siti Khomsah, and Rima Dias Ramadhani. "A New Integral Function Algorithm for Global Optimization and Its Application to the Data Clustering Problem." MENDEL 29, no. 2 (December 20, 2023): 162–68. http://dx.doi.org/10.13164/mendel.2023.2.162.

Full text
Abstract:
The filled function method is an approach to finding global minimum points of multidimensional unconstrained global optimization problems. Conventional parametric filled functions have computational weaknesses when applied to some benchmark optimization functions. This paper proposes a new integral function algorithm based on the auxiliary function approach. The proposed method can successfully find the global minimum point of a function of several variables, as shown on a set of test global optimization problems. The integral function algorithm is then applied to the center-based data clustering problem, and the results show that it solves the problem successfully.
Styles: APA, Harvard, Vancouver, ISO, etc.
20

Zabotin, Vladislav I., and Pavel A. Chernyshevskij. "Extension of Strongin's Global Optimization Algorithm to a Function Continuous on a Compact Interval." Computer Research and Modeling 11, no. 6 (December 2019): 1111–19. http://dx.doi.org/10.20537/2076-7633-2019-11-6-1111-1119.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
21

Horai, Mio, Hideo Kobayashi, and Takashi G. Nitta. "Global Optimization for the Sum of Certain Nonlinear Functions." Abstract and Applied Analysis 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/408918.

Full text
Abstract:
We extend work by Pei-Ping and Gui-Xia (2007) to a global optimization problem for more general functions. Pei-Ping and Gui-Xia treat the optimization problem for the linear sum of polynomial fractional functions, using a branch and bound approach. We prove that this extension makes it possible to solve, by a branch and bound algorithm, nonconvex optimization problems that the method of Pei-Ping and Gui-Xia (2007) cannot: sums of functions with positive (or negative) first and second derivatives whose variable is defined by a sum of polynomial fractional functions.
Styles: APA, Harvard, Vancouver, ISO, etc.
22

Wang, Weixiang, Youlin Shang, and Ying Zhang. "Finding Global Minima with a Filled Function Approach for Non-Smooth Global Optimization." Discrete Dynamics in Nature and Society 2010 (2010): 1–10. http://dx.doi.org/10.1155/2010/843609.

Full text
Abstract:
A filled function approach is proposed for solving a non-smooth unconstrained global optimization problem. First, the definition of the filled function in Zhang (2009) for smooth global optimization is extended to the non-smooth case and a new one is put forward. Then, a novel filled function is proposed for non-smooth global optimization and a corresponding non-smooth algorithm based on it is designed. Finally, numerical tests are performed. The computational results demonstrate that the proposed approach is efficient and reliable.
Styles: APA, Harvard, Vancouver, ISO, etc.
23

Wang, Wei, Yong-jian Yang, and Lian-sheng Zhang. "Unification of Filled Function and Tunnelling Function in Global Optimization." Acta Mathematicae Applicatae Sinica, English Series 23, no. 1 (January 2007): 59–66. http://dx.doi.org/10.1007/s10255-006-0349-9.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
24

Tsoulos, Ioannis G., Alexandros Tzallas, Evangelos Karvounis, and Dimitrios Tsalikakis. "NeuralMinimizer: A Novel Method for Global Optimization." Information 14, no. 2 (January 25, 2023): 66. http://dx.doi.org/10.3390/info14020066.

Full text
Abstract:
The problem of finding the global minimum of multidimensional functions arises in a wide range of applications. An innovative method for finding the global minimum of multidimensional functions is presented here. The method first builds an approximation of the objective function from only a few real samples, using a machine learning model. Subsequent sampling is then performed on this approximation, which is refined after each sample by using the local minima found as additional training data for the model. As a termination criterion, the technique uses a widely used criterion from the relevant literature, evaluated after each local minimization. The technique was applied to a number of well-known problems from the relevant literature, and the comparative results against modern global minimization techniques are extremely promising.
Styles: APA, Harvard, Vancouver, ISO, etc.
25

Angeles, J., and A. Bernier. "The Global Least-Square Optimization of Function-Generating Linkages." Journal of Mechanisms, Transmissions, and Automation in Design 109, no. 2 (June 1, 1987): 204–9. http://dx.doi.org/10.1115/1.3267439.

Full text
Abstract:
As shown elsewhere, the approximate least-square synthesis of function-generating linkages can be formulated in terms of one single decision variable. Hence, functions representing global properties of linkages such as accuracy and transmission quality, terms precisely defined in this paper, as well as mobility type, are all functions of one single variable. The optimization procedure presented here is aimed at the synthesis of function-generating linkages with a minimum least-square design error and a maximum transmission quality, while meeting conditions prescribed on the type of mobility (crank or rocker) of the input and/or output links. Alternative performance indices and/or constraints can be treated likewise.
Styles: APA, Harvard, Vancouver, ISO, etc.
26

Zhou, Heng Jun, Ming Yan Jiang, and Xian Ye Ben. "Niche Brain Storm Optimization Algorithm for Multi-Peak Function Optimization." Advanced Materials Research 989-994 (July 2014): 1626–30. http://dx.doi.org/10.4028/www.scientific.net/amr.989-994.1626.

Full text
Abstract:
Brain Storm Optimization (BSO) is a recently proposed swarm intelligence algorithm with fast convergence; however, it is prone to becoming trapped in local optima. In this paper, a new model based on niche technology, named Niche Brain Storm Optimization (NBSO), is proposed to overcome this shortcoming. Niche technology effectively prevents premature convergence and maintains population diversity during the evolution process. NBSO shows excellent performance in global search and in finding multiple global and local optimal solutions for multi-peak problems. Several benchmark functions are introduced to evaluate its performance. Experimental results show that NBSO outperforms BSO in global search ability and finds peaks of multi-peak functions faster than the Niche Genetic Algorithm (NGA).
Styles: APA, Harvard, Vancouver, ISO, etc.
27

Fateen, Seif-Eddeen K., and Adrián Bonilla-Petriciolet. "Gradient-Based Cuckoo Search for Global Optimization." Mathematical Problems in Engineering 2014 (2014): 1–12. http://dx.doi.org/10.1155/2014/493740.

Full text
Abstract:
One of the major advantages of stochastic global optimization methods is that they do not require the gradient of the objective function. In some cases, however, this gradient is readily available and can be used to improve the numerical performance of stochastic optimization methods, especially the quality and precision of the global optimal solution. In this study, we propose a gradient-based modification of the cuckoo search algorithm, a nature-inspired swarm-based stochastic global optimization method. We introduce gradient-based cuckoo search (GBCS) and evaluate its performance against the original algorithm on twenty-four benchmark functions. GBCS improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems. GBCS proved to be a strong candidate for solving difficult optimization problems for which the gradient of the objective function is readily available.
Styles: APA, Harvard, Vancouver, ISO, etc.
28

Zhao, Ruxin, Qifang Luo, and Yongquan Zhou. "Elite Opposition-Based Social Spider Optimization Algorithm for Global Function Optimization." Algorithms 10, no. 1 (January 8, 2017): 9. http://dx.doi.org/10.3390/a10010009.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
29

Wei, Fei, and Yuping Wang. "A New Filled Function Method with One Parameter for Global Optimization." Mathematical Problems in Engineering 2013 (2013): 1–12. http://dx.doi.org/10.1155/2013/532325.

Full text
Abstract:
The filled function method is an effective approach to finding the global minimizer of multidimensional multimodal functions. Conventional filled functions are numerically unstable due to exponential or logarithmic terms and are sensitive to parameters. In this paper, a new filled function with only one parameter is proposed; it is continuously differentiable and proved to satisfy all conditions of the filled function definition. Moreover, this filled function is not sensitive to its parameter, and overflow cannot occur for this function. On this basis, a new filled function method is proposed that is numerically stable with respect to the initial point and the parameter. Computer simulations indicate that the proposed method is efficient and effective.
Styles: APA, Harvard, Vancouver, ISO, etc.
30

Wang, Weixiang, Youlin Shang, and Ying Zhang. "A Filled Function Approach for Nonsmooth Constrained Global Optimization." Mathematical Problems in Engineering 2010 (2010): 1–9. http://dx.doi.org/10.1155/2010/310391.

Full text
Abstract:
A novel filled function is given in this paper to find a global minimum of a nonsmooth constrained optimization problem. First, a modified concept of the filled function for nonsmooth constrained global optimization is introduced, and a filled function is proposed that combines the idea of the filled function for unconstrained optimization with a penalty function for constrained optimization. Then, a solution algorithm based on the proposed filled function is developed. Finally, some preliminary numerical results are reported; they show that the proposed approach is promising.
Styles: APA, Harvard, Vancouver, ISO, etc.
31

Li, Jia, Yuelin Gao, Tiantian Chen, and Xiaohua Ma. "A new filled function method based on global search for solving unconstrained optimization problems." AIMS Mathematics 9, no. 7 (2024): 18475–505. http://dx.doi.org/10.3934/math.2024900.

Full text
Abstract:
The filled function method is a deterministic algorithm for finding a global minimizer of global optimization problems, and its effectiveness is closely related to the form of the constructed filled function. Current filled functions have three main drawbacks in form: parameter adjustment and control (if any), inclusion of exponential or logarithmic functions, and discontinuity or non-differentiability. To overcome these limitations, this paper proposed a parameter-free filled function that contains no exponential or logarithmic functions and is continuous and differentiable. Based on the new filled function, a filled function method for solving unconstrained global optimization problems was designed. The algorithm selected points in the feasible domain far from the global minimum point as initial points, and improved the setting of the step size in the stage of minimizing the filled function to enhance the algorithm's global optimization capability. In addition, tests were conducted on 14 benchmark functions and compared with existing filled function algorithms. The numerical experimental results showed that the new algorithm proposed in this paper was feasible and effective.
Styles: APA, Harvard, Vancouver, ISO, etc.
32

Lin, Youjiang, Yongjian Yang, and Liansheng Zhang. "A NOVEL FILLED FUNCTION METHOD FOR GLOBAL OPTIMIZATION." Journal of the Korean Mathematical Society 47, no. 6 (November 1, 2010): 1253–67. http://dx.doi.org/10.4134/jkms.2010.47.6.1253.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
33

Zhang, Lian-Sheng, Chi-Kong Ng, Duan Li, and Wei-Wen Tian. "A New Filled Function Method for Global Optimization." Journal of Global Optimization 28, no. 1 (January 2004): 17–43. http://dx.doi.org/10.1023/b:jogo.0000006653.60256.f6.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
34

Sun, X. L., and D. Li. "Value-Estimation Function Method for Constrained Global Optimization." Journal of Optimization Theory and Applications 102, no. 2 (August 1999): 385–409. http://dx.doi.org/10.1023/a:1021736608968.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
35

Liu, Xian. "The impelling function method applied to global optimization." Applied Mathematics and Computation 151, no. 3 (April 2004): 745–54. http://dx.doi.org/10.1016/s0096-3003(03)00525-3.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
36

Wang, Yuncheng, Weiwu Fang, and Tianjiao Wu. "A cut-peak function method for global optimization." Journal of Computational and Applied Mathematics 230, no. 1 (August 2009): 135–42. http://dx.doi.org/10.1016/j.cam.2008.10.069.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
37

Liu, Xian, and Wilsun Xu. "A new filled function applied to global optimization." Computers & Operations Research 31, no. 1 (January 2004): 61–80. http://dx.doi.org/10.1016/s0305-0548(02)00154-5.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
38

Wang, Xiaoli, and Guobiao Zhou. "A new filled function for unconstrained global optimization." Applied Mathematics and Computation 174, no. 1 (March 2006): 419–29. http://dx.doi.org/10.1016/j.amc.2005.05.009.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
39

Ruan, Zhao-Hui, Yuan Yuan, Qi-Xiang Chen, Chuan-Xin Zhang, Yong Shuai, and He-Ping Tan. "A new multi-function global particle swarm optimization." Applied Soft Computing 49 (December 2016): 279–91. http://dx.doi.org/10.1016/j.asoc.2016.07.034.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
40

Schnabel, Robert B. "Concurrent function evaluations in local and global optimization." Computer Methods in Applied Mechanics and Engineering 64, no. 1-3 (October 1987): 537–52. http://dx.doi.org/10.1016/0045-7825(87)90055-7.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
41

Gao, Yuelin, Yongjian Yang, and Mi You. "A new filled function method for global optimization." Applied Mathematics and Computation 268 (October 2015): 685–95. http://dx.doi.org/10.1016/j.amc.2015.06.090.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
42

Wu, Z. Y., F. S. Bai, H. W. J. Lee, and Y. J. Yang. "A filled function method for constrained global optimization." Journal of Global Optimization 39, no. 4 (April 19, 2007): 495–507. http://dx.doi.org/10.1007/s10898-007-9152-2.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
43

Ng, Chi-Kong, Lian-Sheng Zhang, Duan Li, and Wei-Wen Tian. "Discrete Filled Function Method for Discrete Global Optimization." Computational Optimization and Applications 31, no. 1 (May 2005): 87–115. http://dx.doi.org/10.1007/s10589-005-0985-7.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
44

Lampariello, F., and G. Liuzzi. "A filling function method for unconstrained global optimization." Computational Optimization and Applications 61, no. 3 (February 5, 2015): 713–29. http://dx.doi.org/10.1007/s10589-015-9728-6.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
45

Ge, Renpu. "The filled function transformations for constrained global optimization." Applied Mathematics and Computation 39, no. 1 (September 1990): 1–20. http://dx.doi.org/10.1016/0096-3003(90)90118-m.

Full text
Styles: APA, Harvard, Vancouver, ISO, etc.
46

Wu, Xiuli, Yongquan Zhou, and Yuting Lu. "Elite Opposition-Based Water Wave Optimization Algorithm for Global Optimization." Mathematical Problems in Engineering 2017 (2017): 1–25. http://dx.doi.org/10.1155/2017/3498363.

Full text
Abstract:
Water wave optimization (WWO) is a novel metaheuristic based on shallow water wave theory; it has a simple structure, is easy to implement, and performs well even with a small population. To further improve convergence speed and calculation precision, this paper proposes elite opposition-based water wave optimization (EOBWWO), applied to function optimization and structural engineering design problems. The improvement rests on three major strategies: an elite opposition-based (EOB) learning strategy enhances population diversity; a local neighborhood search strategy strengthens local search in the breaking operation; and an improved propagation operator gives the algorithm a better balance between exploration and exploitation. EOBWWO is verified on 20 benchmark functions and two structural engineering design problems, and its performance is compared against state-of-the-art algorithms. Experimental results show that the proposed algorithm has faster convergence, higher calculation precision (the exact solution is even obtained on some benchmark functions), and a higher degree of stability than the comparison algorithms.
Styles: APA, Harvard, Vancouver, ISO, etc.
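Of the three strategies, elite opposition-based learning is the simplest to show in code. The sketch below is generic EOB learning for a 1-D population under minimization, not the EOBWWO algorithm: the opposite of each solution is taken through the dynamic bounds [a, b] spanned by the current elites, and the better of each (solution, opposite) pair survives. The elite fraction and the clamping rule are illustrative choices.

```python
import random

def elite_opposition(population, fitness, lo, hi, rng=random):
    """One round of elite opposition-based learning (minimization)."""
    # elites: best half of the population (illustrative fraction)
    elites = sorted(population, key=fitness)[: max(1, len(population) // 2)]
    a, b = min(elites), max(elites)          # dynamic elite bounds
    survivors = []
    for x in population:
        x_op = rng.random() * (a + b) - x    # elite opposite of x
        x_op = min(max(x_op, lo), hi)        # keep inside the search space
        survivors.append(min(x, x_op, key=fitness))
    return survivors
```

Because each survivor is the better of a pair, the population's best fitness can never get worse in a round, which is what makes the operator safe to interleave with the propagation and breaking steps of the underlying metaheuristic.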
47

Pirot, Guillaume, Tipaluck Krityakierne, David Ginsbourger et Philippe Renard. « Contaminant source localization via Bayesian global optimization ». Hydrology and Earth System Sciences 23, no 1 (21 janvier 2019) : 351–69. http://dx.doi.org/10.5194/hess-23-351-2019.

Texte intégral
Résumé :
Abstract. Contaminant source localization problems require efficient and robust methods that can account for geological heterogeneities and accommodate relatively small data sets of noisy observations. As realism commands hi-fidelity simulations, computation costs call for global optimization algorithms under parsimonious evaluation budgets. Bayesian optimization approaches are well adapted to such settings as they allow the exploration of parameter spaces in a principled way so as to iteratively locate the point(s) of global optimum while maintaining an approximation of the objective function with an instrumental quantification of prediction uncertainty. Here, we adapt a Bayesian optimization approach to localize a contaminant source in a discretized spatial domain. We thus demonstrate the potential of such a method for hydrogeological applications and also provide test cases for the optimization community. The localization problem is illustrated for cases where the geology is assumed to be perfectly known. Two 2-D synthetic cases that display sharp hydraulic conductivity contrasts and specific connectivity patterns are investigated. These cases generate highly nonlinear objective functions that present multiple local minima. A derivative-free global optimization algorithm relying on a Gaussian process model and on the expected improvement criterion is used to efficiently localize the point of minimum of the objective functions, which corresponds to the contaminant source location. Even though concentration measurements contain a significant level of proportional noise, the algorithm efficiently localizes the contaminant source location. The variations of the objective function are essentially driven by the geology, followed by the design of the monitoring well network. The data and scripts used to generate objective functions are shared to favor reproducible research. 
This contribution is important because the functions present multiple local minima and are inspired from a practical field application. Sharing these complex objective functions provides a source of test cases for global optimization benchmarks and should help with designing new and efficient methods to solve this type of problem.
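The expected improvement criterion mentioned in the abstract scores each candidate point by combining the Gaussian process posterior mean and standard deviation. A minimal sketch for a minimization setting follows; the function name, candidate values, and use of a discrete candidate set are illustrative assumptions, not taken from the paper's code:

```python
import math

def expected_improvement(mu, sigma, f_best):
    """EI for minimization: E[max(f_best - Y, 0)] with Y ~ N(mu, sigma^2).

    mu, sigma: GP posterior mean and standard deviation at a candidate point;
    f_best: best objective value observed so far.
    """
    if sigma <= 0.0:
        # Degenerate case: the prediction is treated as certain.
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # standard normal cdf
    return (f_best - mu) * cdf + sigma * pdf

# The next evaluation point is the candidate with the largest EI.
candidates = [(-0.5, 0.2), (0.1, 0.8), (0.9, 0.1)]  # (posterior mean, std) pairs
best_next = max(range(len(candidates)),
                key=lambda i: expected_improvement(*candidates[i], f_best=0.0))
```

Candidates with a low predicted mean or a high predictive standard deviation both score well, which is what lets the algorithm balance exploitation against exploration while iteratively locating the global minimum.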
APA, Harvard, Vancouver, ISO, and other styles
48

Degachi, Hajer, Bechir Naffeti, Wassila Chagra et Moufida Ksouri. « Filled Function Method for Nonlinear Model Predictive Control ». Mathematical Problems in Engineering 2018 (4 juin 2018) : 1–8. http://dx.doi.org/10.1155/2018/9497618.

Full text
Abstract:
A new method is used to solve the nonconvex optimization problem of nonlinear model predictive control (NMPC) for a Hammerstein model. Using nonlinear models in MPC leads to a nonlinear, nonconvex optimization problem, and since control performance depends essentially on the result of the optimization, this work proposes the filled function method as a global optimization technique for solving it. With this method, the control law is obtained in two steps. The first step determines a local minimum of the objective function. The second step constructs a new function, the filled function, from that local minimum; minimizing it yields an initialization near the global minimum, from which a local optimization method recovers the global control sequence. The efficiency of the proposed method is demonstrated first on benchmark functions and then on a ball-and-beam system described by a Hammerstein model. The results obtained by the presented method are compared with those of the genetic algorithm (GA) and particle swarm optimization (PSO).
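The two-step procedure described in the abstract can be sketched on a 1-D toy problem. Everything below is an illustrative assumption rather than the authors' NMPC code: the two-basin test function, the Ge-type filled function with hand-picked parameters (r = 1, rho = 1), and the crude numerical-derivative descent with a single escape direction:

```python
import math

def f(x):
    # Toy objective with two basins: a local min near x ~ +0.96 (f ~ 0.29)
    # and the global min near x ~ -1.04 (f ~ -0.31).
    return (x * x - 1.0) ** 2 + 0.3 * x

def descend(g, x, step, iters, lo=-2.0, hi=2.0, stop=None):
    """Crude gradient descent using a central-difference derivative."""
    h = 1e-6
    for _ in range(iters):
        grad = (g(x + h) - g(x - h)) / (2.0 * h)
        x = min(hi, max(lo, x - step * grad))
        if stop is not None and stop(x):
            break
    return x

# Step 1: local minimization of f from an arbitrary start.
x1 = descend(f, 1.5, step=1e-3, iters=20000)   # lands in the basin near +0.96
f1 = f(x1)

# Step 2: minimize a filled function built around x1.  Descending it pushes
# the iterate out of the basin of x1; we stop as soon as a point with a lower
# objective value than f1 is found.  (Practical methods try several escape
# directions; a single direction suffices for this 1-D sketch.)
def filled(x):
    return math.exp(-(x - x1) ** 2) / (1.0 + f(x))

x2 = descend(filled, x1 - 0.05, step=0.05, iters=5000,
             stop=lambda x: f(x) < f1 - 1e-6)

# Restart local minimization of f from the escape point.
x_glob = descend(f, x2, step=1e-3, iters=20000)  # lands near the global min
```

The first descent stalls at the shallow right-hand minimum; the filled-function step carries the iterate over the intervening hill, after which the second local descent reaches the global minimum.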
APA, Harvard, Vancouver, ISO, and other styles
49

Regis, Rommel G., et Christine A. Shoemaker. « Parallel radial basis function methods for the global optimization of expensive functions ». European Journal of Operational Research 182, no 2 (octobre 2007) : 514–35. http://dx.doi.org/10.1016/j.ejor.2006.08.040.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Schweidtmann, Artur M., Dominik Bongartz, Daniel Grothe, Tim Kerkenhoff, Xiaopeng Lin, Jaromił Najman et Alexander Mitsos. « Deterministic global optimization with Gaussian processes embedded ». Mathematical Programming Computation 13, no 3 (25 juin 2021) : 553–81. http://dx.doi.org/10.1007/s12532-021-00204-y.

Full text
Abstract:
Gaussian processes (Kriging) are interpolating data-driven models that are frequently applied in various disciplines. Often, Gaussian processes are trained on datasets and are subsequently embedded as surrogate models in optimization problems. These optimization problems are nonconvex and global optimization is desired. However, previous literature observed computational burdens limiting deterministic global optimization to Gaussian processes trained on few data points. We propose a reduced-space formulation for deterministic global optimization with trained Gaussian processes embedded. For optimization, the branch-and-bound solver branches only on the free variables and McCormick relaxations are propagated through explicit Gaussian process models. The approach also leads to significantly smaller and computationally cheaper subproblems for lower and upper bounding. To further accelerate convergence, we derive envelopes of common covariance functions for GPs and tight relaxations of acquisition functions used in Bayesian optimization including expected improvement, probability of improvement, and lower confidence bound. In total, we reduce computational time by orders of magnitude compared to state-of-the-art methods, thus overcoming previous computational burdens. We demonstrate the performance and scaling of the proposed method and apply it to Bayesian optimization with global optimization of the acquisition function and chance-constrained programming. The Gaussian process models, acquisition functions, and training scripts are available open-source within the “MeLOn—MachineLearning Models for Optimization” toolbox (https://git.rwth-aachen.de/avt.svt/public/MeLOn).
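The "explicit Gaussian process models" the abstract refers to are closed-form expressions in the covariance function, which is what allows relaxations to be propagated through them. A self-contained sketch of the GP posterior mean and variance with a squared-exponential kernel; the training data, length scale, and noise level below are made up for illustration:

```python
import math

def sq_exp(a, b, length=1.0):
    """Squared-exponential covariance k(a, b)."""
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(x, X, y, noise=1e-8):
    """Posterior mean and variance of a zero-mean GP at a point x."""
    n = len(X)
    K = [[sq_exp(X[i], X[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    ks = [sq_exp(x, X[i]) for i in range(n)]
    alpha = solve(K, y)   # K^{-1} y
    v = solve(K, ks)      # K^{-1} k_*
    mean = sum(a * k for a, k in zip(alpha, ks))
    var = sq_exp(x, x) - sum(k * w for k, w in zip(ks, v))
    return mean, var

X, y = [0.0, 0.5, 1.0], [0.0, 0.25, 1.0]   # toy training data (y = x^2)
m, v = gp_predict(0.5, X, y)               # interpolates: m ~ 0.25, v ~ 0
```

Because the posterior mean and variance are explicit algebraic expressions in `x`, a branch-and-bound solver can bound them directly over a box instead of treating the GP as a black box, which is the idea behind the reduced-space formulation described above.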
APA, Harvard, Vancouver, ISO, and other styles
