Journal articles on the topic 'Global function optimization'


Consult the top 50 journal articles for your research on the topic 'Global function optimization.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Liu, Y., and K. L. Teo. "A bridging method for global optimization." Journal of the Australian Mathematical Society. Series B. Applied Mathematics 41, no. 1 (July 1999): 41–57. http://dx.doi.org/10.1017/s0334270000011024.

Abstract:
In this paper, a bridging method is introduced for the numerical solution of one-dimensional global optimization problems in which a continuously differentiable function is to be minimized over a finite interval, given either explicitly or by constraints involving continuously differentiable functions. The concept of a bridged function is introduced and some of its properties are given. On this basis, several bridging algorithms are developed for the computation of globally optimal solutions. The algorithms are demonstrated by solving several numerical examples.
2

Zhu, Jinghao, Jiani Zhou, and David Gao. "Global optimization by canonical dual function." Journal of Computational and Applied Mathematics 234, no. 2 (May 2010): 538–44. http://dx.doi.org/10.1016/j.cam.2009.12.045.

3

Norkin, Vladimir. "A Stochastic Smoothing Method for Nonsmooth Global Optimization." Cybernetics and Computer Technologies, no. 1 (March 31, 2020): 5–14. http://dx.doi.org/10.34229/2707-451x.20.1.1.

Abstract:
The paper presents the results of testing the stochastic smoothing method for global optimization of a multiextremal function over a convex feasible subset of Euclidean space. First, the objective function is extended outside the feasible region so that its global minimum does not change and the function becomes coercive. The smoothing of the function at a point is carried out by averaging its values over a neighborhood of that point, whose size is the smoothing parameter. Smoothing eliminates small local extrema of the original function, and for a sufficiently large value of the smoothing parameter the averaged function can have only one minimum. The smoothing method consists in replacing the original function with a sequence of smoothed approximations whose smoothing parameter vanishes to zero, and optimizing these functions with contemporary stochastic optimization methods. By passing from the minimum of one smoothed function to a nearby minimum of the next, one can gradually arrive at the region of the global minimum of the original function. The method is also applicable to the optimization of nonsmooth nonconvex functions. It is shown that the smoothing method reliably solves small-dimensional test global optimization problems from the literature.
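To make the mechanism above concrete, here is a minimal sketch of the smoothing idea: the smoothed value at a point is estimated by Monte Carlo averaging over a ball around it, and a sequence of local searches is run while the smoothing radius shrinks to zero. The test function, sampling scheme, and all parameter choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def smoothed(f, x, radius, n_samples=200, rng=np.random.default_rng(0)):
    """Monte Carlo estimate of f averaged over a ball of given radius around x."""
    u = rng.normal(size=(n_samples, x.size))
    u /= np.linalg.norm(u, axis=1, keepdims=True)                    # random directions
    r = radius * rng.uniform(size=(n_samples, 1)) ** (1.0 / x.size)  # uniform in the ball
    return np.mean([f(x + ri * ui) for ri, ui in zip(r, u)])

def smoothing_method(f, x0, radii=(2.0, 1.0, 0.5, 0.1, 0.0)):
    """Minimize a sequence of smoothed approximations with shrinking radius."""
    x = np.asarray(x0, dtype=float)
    for rad in radii:
        g = (lambda y, rad=rad: f(y) if rad == 0.0 else smoothed(f, y, rad))
        x = minimize(g, x, method="Nelder-Mead").x   # derivative-free local step
    return x

# Illustrative multiextremal test (Rastrigin-like): many local minima, global minimum at 0.
f = lambda x: float(np.sum(x**2) + 10 * np.sum(1 - np.cos(2 * np.pi * x)))
print(smoothing_method(f, x0=[3.7, -2.9]))
```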
4

Satapathy, Suresh Chandra. "Improved teaching learning based optimization for global function optimization." Decision Science Letters 2, no. 1 (January 1, 2013): 23–34. http://dx.doi.org/10.5267/j.dsl.2012.10.005.

5

Satapathy, Suresh Chandra, Anima Naik, and K. Parvathi. "Weighted Teaching-Learning-Based Optimization for Global Function Optimization." Applied Mathematics 04, no. 03 (2013): 429–39. http://dx.doi.org/10.4236/am.2013.43064.

6

Wang, Wei, Xiaoshan Zhang, and Min Li. "A Filled Function Method Dominated by Filter for Nonlinearly Global Optimization." Journal of Applied Mathematics 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/245427.

Abstract:
This work presents a filled function method based on the filter technique for global optimization. The filled function method is an effective approach to nonlinear global optimization, since it can reliably move from a local minimizer to a better one. The filter technique has been applied in local optimization methods because of its excellent numerical results; here it is employed within the filled function method for global optimization. A new filled function is proposed first, and then the algorithm is given and its properties are proved. Numerical results are listed at the end.
7

Djalil, Boudjehem, Boudjehem Badreddine, and Boukaache Abdenour. "Reducing Dimension in Global Optimization." International Journal of Computational Methods 08, no. 03 (September 2011): 535–44. http://dx.doi.org/10.1142/s0219876211002460.

Abstract:
In this paper, we propose an approach that makes global optimization an easier and lower-cost task. The main idea is to reduce the dimension of the optimization problem at hand to a one-dimensional one using variable coding. At that level, the algorithm looks for the global optimum of a one-dimensional cost function. The new algorithm avoids local optima, reduces the number of evaluations, and improves convergence speed. The method is suitable for functions with many extrema. Our algorithm can determine a narrow region around the global optimum in very restricted time, based on stochastic tests and an adaptive partition of the search space. Illustrative examples show the efficiency of the proposed idea; the algorithm was able to locate the global optimum even when the objective function has a large number of optima.
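The abstract does not spell out the variable-coding scheme, so the sketch below shows one common way to realize the idea: a single scalar is mapped to an n-dimensional point by distributing its binary digits over the coordinates, so that a one-dimensional scan sweeps the whole box. The decode function, grid density, and test problem are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def decode(t, dim, lo, hi, bits=20):
    """Map a scalar t in [0, 1) to a point in the box [lo, hi]^dim by
    distributing the binary digits of t over the coordinates round-robin."""
    coords = np.zeros(dim)
    scale = np.full(dim, 0.5)
    frac = t
    for i in range(bits * dim):
        frac *= 2.0
        bit = int(frac)
        frac -= bit
        coords[i % dim] += bit * scale[i % dim]
        scale[i % dim] *= 0.5
    return lo + (hi - lo) * coords

def reduced_search(f, dim, lo, hi, n_points=20_000):
    """Global search over the box via the induced one-dimensional problem."""
    ts = np.linspace(0.0, 1.0, n_points, endpoint=False)
    vals = [f(decode(t, dim, lo, hi)) for t in ts]
    best = int(np.argmin(vals))
    return decode(ts[best], dim, lo, hi), vals[best]

# Illustrative use: a 2-D Himmelblau-like function on [-5, 5]^2.
f = lambda x: (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2
print(reduced_search(f, dim=2, lo=-5.0, hi=5.0))
```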
8

Xiao, Jin-ke, Wei-min Li, Wei Li, and Xin-rong Xiao. "Optimization on Black Box Function Optimization Problem." Mathematical Problems in Engineering 2015 (2015): 1–10. http://dx.doi.org/10.1155/2015/647234.

Abstract:
Many real-world engineering optimization problems have input-output relationships that are vague and indistinct; here they are called black box function optimization problems (BBFOP). Inspired by the mechanism by which the neuroendocrine system regulates the immune system, a BP-neural-network-modified immune optimization algorithm (NN-MIA) is proposed. NN-MIA consists of two phases: the first trains a BP neural network to the expected precision to capture the input-output relationship, and the second is an immune optimization phase that searches for the global optimum. If BP neural network fitting does not reach the expected precision, it can be replaced with polynomial fitting or other fitting methods that do. Experimental simulation confirms the global optimization capability of MIA and the practical applicability of the BBFOP optimization method.
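As a rough illustration of the two-phase structure (surrogate fitting followed by a global search on the surrogate), the sketch below fits a small neural network to samples of a black-box function and then searches the surrogate with plain random sampling; the paper's immune optimization phase is replaced here by that simpler search, and every setting is an illustrative assumption.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
black_box = lambda X: np.sum(X**2, axis=1) - 5 * np.cos(3 * X[:, 0])  # stand-in objective

# Phase 1: fit a neural-network surrogate to samples of the black box.
X_train = rng.uniform(-3, 3, size=(500, 2))
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(X_train, black_box(X_train))

# Phase 2: cheap global search on the surrogate (random candidates here,
# where the paper uses an immune optimization algorithm instead).
cands = rng.uniform(-3, 3, size=(100_000, 2))
best = cands[np.argmin(net.predict(cands))]
print("surrogate optimum near:", best, "true value:", black_box(best[None])[0])
```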
9

Zou, Mou-Yan, and Xi Zou. "Global optimization: an auxiliary cost function approach." IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans 30, no. 3 (May 2000): 347–54. http://dx.doi.org/10.1109/3468.844358.

10

Han, Qiaoming, and Jiye Han. "Revised filled function methods for global optimization." Applied Mathematics and Computation 119, no. 2-3 (April 2001): 217–28. http://dx.doi.org/10.1016/s0096-3003(99)00266-0.

11

Wu, Z. Y., L. S. Zhang, K. L. Teo, and F. S. Bai. "New Modified Function Method for Global Optimization." Journal of Optimization Theory and Applications 125, no. 1 (April 2005): 181–203. http://dx.doi.org/10.1007/s10957-004-1718-2.

12

Liang, Y. M., L. S. Zhang, M. M. Li, and B. S. Han. "A filled function method for global optimization." Journal of Computational and Applied Mathematics 205, no. 1 (August 2007): 16–31. http://dx.doi.org/10.1016/j.cam.2006.04.038.

13

Isshiki, Masaki, Hiroki Ono, Kouichi Hiraga, Jun Ishikawa, and Suezou Nakadate. "Lens Design: Global Optimization with Escape Function." Optical Review 2, no. 6 (November 1995): 463–70. http://dx.doi.org/10.1007/s10043-995-0463-6.

14

Pandiya, Ridwan, Widodo Widodo, Salmah, and Irwan Endrayanto. "Non parameter-filled function for global optimization." Applied Mathematics and Computation 391 (February 2021): 125642. http://dx.doi.org/10.1016/j.amc.2020.125642.

15

Bemporad, Alberto. "Global optimization via inverse distance weighting and radial basis functions." Computational Optimization and Applications 77, no. 2 (July 27, 2020): 571–95. http://dx.doi.org/10.1007/s10589-020-00215-w.

Abstract:
Global optimization problems whose objective function is expensive to evaluate can be solved effectively by recursively fitting a surrogate function to function samples and minimizing an acquisition function to generate new samples. The acquisition step trades off between seeking a new optimization vector where the surrogate is minimal (exploitation of the surrogate) and exploring regions of the feasible space that have not yet been visited and that may contain better values of the objective function (exploration of the feasible space). This paper proposes a new global optimization algorithm that uses inverse distance weighting (IDW) and radial basis functions (RBF) to construct the acquisition function. Rather arbitrary constraints that are simple to evaluate can easily be taken into account. Compared to Bayesian optimization, the proposed algorithm, which we call GLIS (GLobal minimum using Inverse distance weighting and Surrogate radial basis functions), is competitive and computationally lighter, as we show on a set of benchmark global optimization and hyperparameter tuning problems. MATLAB and Python implementations of GLIS are available at http://cse.lab.imtlucca.it/~bemporad/glis.
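A heavily simplified sketch of the surrogate-plus-exploration idea follows: an RBF surrogate is fitted to the samples, and the acquisition subtracts a distance-based exploration bonus as a crude stand-in for the paper's IDW term. This is not the GLIS acquisition itself; the kernel, the bonus, and all parameters are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist

def rbf_fit(X, y, eps=1.0):
    """Fit an inverse-multiquadric RBF interpolant to the samples."""
    Phi = 1.0 / np.sqrt(1.0 + (eps * cdist(X, X)) ** 2)
    return np.linalg.solve(Phi + 1e-8 * np.eye(len(X)), y)

def rbf_eval(X, w, Xq, eps=1.0):
    """Evaluate the fitted RBF surrogate at query points Xq."""
    return (1.0 / np.sqrt(1.0 + (eps * cdist(Xq, X)) ** 2)) @ w

def surrogate_optimize(f, lo, hi, n_init=8, n_iter=40, delta=1.0,
                       rng=np.random.default_rng(2)):
    dim = len(lo)
    X = rng.uniform(lo, hi, size=(n_init, dim))
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        w = rbf_fit(X, y)
        cand = rng.uniform(lo, hi, size=(5000, dim))
        # Acquisition: surrogate prediction minus a distance-based exploration
        # bonus (a simple stand-in for the paper's IDW exploration term).
        acq = rbf_eval(X, w, cand) - delta * cdist(cand, X).min(axis=1) * y.std()
        x_new = cand[np.argmin(acq)]
        X = np.vstack([X, x_new]); y = np.append(y, f(x_new))
    return X[np.argmin(y)], y.min()

f = lambda x: (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2  # illustrative objective
print(surrogate_optimize(f, lo=np.array([-5., -5.]), hi=np.array([5., 5.])))
```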
16

Aaid, Djamel, and Özen Özer. "New technique for solving multivariate global optimization." Journal of Numerical Analysis and Approximation Theory 52, no. 1 (July 10, 2023): 3–16. http://dx.doi.org/10.33993/jnaat521-1287.

Abstract:
In this paper, we propose an algorithm based on the branch-and-bound method that underestimates the objective function, together with a reductive transformation that converts multivariable functions into univariable ones. Several quadratic lower-bound functions are proposed and shown to be preferable to others well known in the literature. Our experimental results are effective on a range of nonconvex functions.
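The combination of branch and bound with quadratic lower bounds can be illustrated with an αBB-style underestimator: on an interval [a, b], f(x) + α(x − a)(x − b) underestimates f and is convex whenever α is at least half an upper bound on −f″. The one-dimensional sketch below uses that device; the paper's own bounding functions and its reductive transformation are not reproduced.

```python
import heapq
import numpy as np
from scipy.optimize import minimize_scalar

def underestimate(f, a, b, alpha):
    """Minimum of the convex quadratic underestimator f(x) + alpha*(x-a)*(x-b) on [a, b]."""
    L = lambda x: f(x) + alpha * (x - a) * (x - b)
    return minimize_scalar(L, bounds=(a, b), method="bounded").fun

def branch_and_bound(f, a, b, alpha, tol=1e-6):
    """1-D branch and bound; alpha must satisfy alpha >= max(0, -f''(x)/2) on [a, b]."""
    best_x, best_val = (a + b) / 2, f((a + b) / 2)
    heap = [(underestimate(f, a, b, alpha), a, b)]
    while heap:
        lb, lo, hi = heapq.heappop(heap)
        if lb > best_val - tol:
            continue                      # interval cannot contain a better point
        mid = (lo + hi) / 2
        if f(mid) < best_val:
            best_x, best_val = mid, f(mid)
        for l, h in ((lo, mid), (mid, hi)):
            lb_child = underestimate(f, l, h, alpha)
            if lb_child < best_val - tol:
                heapq.heappush(heap, (lb_child, l, h))
    return best_x, best_val

# Illustrative objective: |f''| <= 1 + (10/3)**2 ≈ 12.1 here, so alpha = 7 is safe.
f = lambda x: np.sin(x) + np.sin(10 * x / 3)
print(branch_and_bound(f, 2.7, 7.5, alpha=7.0))
```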
17

Wang, Weixiang, Youlin Shang, and Shenhua Gui. "Application of Global Descent Function Method to Discrete Global Optimization." International Journal on Advances in Information Sciences and Service Sciences 5, no. 7 (April 15, 2013): 684–91. http://dx.doi.org/10.4156/aiss.vol5.issue7.80.

18

Wang, Wei-xiang, You-lin Shang, and Ying Zhang. "Global Minimization of Nonsmooth Constrained Global Optimization with Filled Function." Mathematical Problems in Engineering 2014 (2014): 1–5. http://dx.doi.org/10.1155/2014/563860.

Abstract:
A novel filled function is constructed to locate a global optimizer or an approximate global optimizer of smooth or nonsmooth constrained global minimization problems. The constructed filled function contains only one parameter which can be easily adjusted during the minimization. The theoretical properties of the filled function are discussed and a corresponding solution algorithm is proposed. The solution algorithm comprises two phases: local minimization and filling. The first phase minimizes the original problem and obtains one of its local optimizers, while the second phase minimizes the constructed filled function and identifies a better initial point for the first phase. Some preliminary numerical results are also reported.
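The two-phase cycle described above can be sketched as follows. Since the paper's one-parameter filled function is not given in the abstract, the sketch substitutes Ge's classical filled function P(x) = exp(−‖x − x*‖²/ρ²)/(r + f(x)); parameters and the test problem are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def filled_function_method(f, x0, r=1.0, rho=1.0, n_dirs=10, max_cycles=20,
                           rng=np.random.default_rng(3)):
    """Two-phase loop: phase 1 minimizes f locally; phase 2 minimizes a filled
    function centered at the current minimizer to escape its basin."""
    x_star = minimize(f, x0, method="BFGS").x                          # phase 1
    for _ in range(max_cycles):
        # Ge's classical filled function; assumes r + f(x) > 0 everywhere
        # (true here because the test objective is nonnegative).
        fill = lambda x: np.exp(-np.sum((x - x_star) ** 2) / rho ** 2) / (r + f(x))
        improved = False
        for _ in range(n_dirs):                                        # phase 2
            x_try = x_star + 0.5 * rng.standard_normal(x_star.size)
            x_fill = minimize(fill, x_try, method="Nelder-Mead",
                              options={"maxiter": 200}).x
            if f(x_fill) < f(x_star) - 1e-8:          # escaped to a lower basin
                x_star = minimize(f, x_fill, method="BFGS").x
                improved = True
                break
        if not improved:
            break                                     # no better basin found
    return x_star, f(x_star)

# Illustrative multimodal objective (nonnegative, global minimum 0 at the origin).
f = lambda x: float(np.sum(x ** 2) + 10 * np.sum(1 - np.cos(2 * np.pi * x)))
print(filled_function_method(f, x0=np.array([4.2, -3.3])))
```

Practical implementations tune r and ρ carefully and monitor f along the descent path rather than only at the endpoint; the sketch keeps only the bare two-phase skeleton.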
19

Pandiya, Ridwan, Atina Ahdika, Siti Khomsah, and Rima Dias Ramadhani. "A New Integral Function Algorithm for Global Optimization and Its Application to the Data Clustering Problem." MENDEL 29, no. 2 (December 20, 2023): 162–68. http://dx.doi.org/10.13164/mendel.2023.2.162.

Abstract:
The filled function method is an approach to finding global minimum points of multidimensional unconstrained global optimization problems. Conventional parametric filled functions have computational weaknesses when employed on some benchmark optimization functions. This paper proposes a new integral function algorithm based on the auxiliary function approach. The proposed method can successfully find the global minimum point of a function of several variables, as shown on several benchmark global optimization problems. The integral function algorithm is then applied to the center-based data clustering problem, and the results show that it solves the problem successfully.
20

Zabotin, Vladislav I., and Pavel A. Chernyshevskij. "Extension of Strongin's Global Optimization Algorithm to a Function Continuous on a Compact Interval." Computer Research and Modeling 11, no. 6 (December 2019): 1111–19. http://dx.doi.org/10.20537/2076-7633-2019-11-6-1111-1119.

21

Horai, Mio, Hideo Kobayashi, and Takashi G. Nitta. "Global Optimization for the Sum of Certain Nonlinear Functions." Abstract and Applied Analysis 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/408918.

Abstract:
We extend the work of Pei-Ping and Gui-Xia (2007) to a global optimization problem for more general functions. Pei-Ping and Gui-Xia treat the optimization problem for a linear sum of polynomial fractional functions using a branch-and-bound approach. We prove that this extension makes it possible to solve, with the branch-and-bound algorithm, nonconvex optimization problems that the original method cannot handle, namely sums of functions with positive (or negative) first and second derivatives whose variable is defined by a sum of polynomial fractional functions.
22

Wang, Weixiang, Youlin Shang, and Ying Zhang. "Finding Global Minima with a Filled Function Approach for Non-Smooth Global Optimization." Discrete Dynamics in Nature and Society 2010 (2010): 1–10. http://dx.doi.org/10.1155/2010/843609.

Abstract:
A filled function approach is proposed for solving a non-smooth unconstrained global optimization problem. First, the definition of the filled function in Zhang (2009) for smooth global optimization is extended to the non-smooth case and a new one is put forward. Then, a novel filled function is proposed for non-smooth global optimization and a corresponding non-smooth algorithm based on it is designed. Finally, a numerical test is made. The computational results demonstrate that the proposed approach is efficient and reliable.
23

Wang, Wei, Yong-jian Yang, and Lian-sheng Zhang. "Unification of Filled Function and Tunnelling Function in Global Optimization." Acta Mathematicae Applicatae Sinica, English Series 23, no. 1 (January 2007): 59–66. http://dx.doi.org/10.1007/s10255-006-0349-9.

24

Tsoulos, Ioannis G., Alexandros Tzallas, Evangelos Karvounis, and Dimitrios Tsalikakis. "NeuralMinimizer: A Novel Method for Global Optimization." Information 14, no. 2 (January 25, 2023): 66. http://dx.doi.org/10.3390/info14020066.

Abstract:
The problem of finding the global minimum of multidimensional functions arises in a wide range of applications. An innovative method for finding the global minimum of multidimensional functions is presented here. The method first generates an approximation of the objective function from only a few real samples, constructed with a machine learning model. The required sampling is then performed on this approximation, and the approximation is improved as local minima are found and added to the training set of the machine learning model. As a termination criterion, the proposed technique uses a widely used criterion from the relevant literature, evaluated after each execution of the local minimization. The proposed technique was applied to a number of well-known problems from the relevant literature, and the comparative results with respect to modern global minimization techniques are extremely promising.
25

Angeles, J., and A. Bernier. "The Global Least-Square Optimization of Function-Generating Linkages." Journal of Mechanisms, Transmissions, and Automation in Design 109, no. 2 (June 1, 1987): 204–9. http://dx.doi.org/10.1115/1.3267439.

Abstract:
As shown elsewhere, the approximate least-square synthesis of function-generating linkages can be formulated in terms of one single decision variable. Hence, functions representing global properties of linkages such as accuracy and transmission quality, which are terms precisely defined in this paper, as well as mobility type, are all functions of one single variable. The optimization procedure presented here is aimed at the synthesis of function-generating linkages with a minimum least-square design error and a maximum transmission quality, while meeting conditions prescribed on the type of mobility (crank or rocker) of its input and/or output links. Alternative performance indices and/or constraints can be treated likewise.
26

Zhou, Heng Jun, Ming Yan Jiang, and Xian Ye Ben. "Niche Brain Storm Optimization Algorithm for Multi-Peak Function Optimization." Advanced Materials Research 989-994 (July 2014): 1626–30. http://dx.doi.org/10.4028/www.scientific.net/amr.989-994.1626.

Abstract:
Brain Storm Optimization (BSO) is a recently proposed swarm intelligence optimization algorithm with fast convergence; however, it easily becomes trapped in local optima. In this paper, a new model based on niche technology, named Niche Brain Storm Optimization (NBSO), is proposed to overcome this shortcoming. Niche technology effectively prevents premature convergence and maintains population diversity during the evolution process. NBSO shows excellent performance in searching for the global optimum and in finding multiple global and local optima in multi-peak problems. Several benchmark functions are used to evaluate its performance. Experimental results show that NBSO has better global search ability than BSO and finds peaks of multi-peak functions faster than the Niche Genetic Algorithm (NGA).
27

Fateen, Seif-Eddeen K., and Adrián Bonilla-Petriciolet. "Gradient-Based Cuckoo Search for Global Optimization." Mathematical Problems in Engineering 2014 (2014): 1–12. http://dx.doi.org/10.1155/2014/493740.

Abstract:
One of the major advantages of stochastic global optimization methods is that they do not require the gradient of the objective function. In some cases, however, this gradient is readily available and can be used to improve the numerical performance of stochastic optimization methods, especially the quality and precision of the global optimal solution. In this study, we propose a gradient-based modification of the cuckoo search algorithm, a nature-inspired swarm-based stochastic global optimization method. We introduce the gradient-based cuckoo search (GBCS) and evaluate its performance against the original algorithm on twenty-four benchmark functions. GBCS improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems, and it proved to be a strong candidate for solving difficult optimization problems for which the gradient of the objective function is readily available.
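The gradient modification can be pictured as follows: the size of each move still comes from a Lévy flight, but its sign is taken from the gradient so the walk heads downhill. This paraphrases the GBCS idea rather than reproducing the paper's exact update rule, and the full cuckoo search machinery (population, nest abandonment) is omitted.

```python
import numpy as np
from math import gamma, pi, sin

rng = np.random.default_rng(4)

def levy(dim, beta=1.5):
    """Lévy-distributed step magnitudes via Mantegna's algorithm."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return np.abs(u) / np.abs(v) ** (1 / beta)

def gradient_levy_step(x, grad_f, scale=0.01):
    """Lévy-sized move oriented downhill by the gradient sign (GBCS-flavored;
    the paper's exact update rule is not reproduced here)."""
    return x - scale * levy(x.size) * np.sign(grad_f(x))

# Example: the walk descends a quadratic bowl while keeping heavy-tailed steps.
grad = lambda x: 2.0 * x
x = np.array([3.0, -4.0])
for _ in range(500):
    x = gradient_levy_step(x, grad)
print(x)
```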
28

Zhao, Ruxin, Qifang Luo, and Yongquan Zhou. "Elite Opposition-Based Social Spider Optimization Algorithm for Global Function Optimization." Algorithms 10, no. 1 (January 8, 2017): 9. http://dx.doi.org/10.3390/a10010009.

29

Wei, Fei, and Yuping Wang. "A New Filled Function Method with One Parameter for Global Optimization." Mathematical Problems in Engineering 2013 (2013): 1–12. http://dx.doi.org/10.1155/2013/532325.

Abstract:
The filled function method is an effective approach to finding the global minimizer of multidimensional multimodal functions. Conventional filled functions are numerically unstable owing to exponential or logarithmic terms and are sensitive to their parameters. In this paper, a new filled function with only one parameter is proposed; it is continuously differentiable and proved to satisfy all conditions of the filled function definition. Moreover, this filled function is insensitive to its parameter, and overflow cannot occur. On this basis, a new filled function method is proposed that is numerically stable with respect to the initial point and the parameter. Computer simulations indicate that the proposed method is efficient and effective.
30

Wang, Weixiang, Youlin Shang, and Ying Zhang. "A Filled Function Approach for Nonsmooth Constrained Global Optimization." Mathematical Problems in Engineering 2010 (2010): 1–9. http://dx.doi.org/10.1155/2010/310391.

Abstract:
A novel filled function is given in this paper to find a global minimum of a nonsmooth constrained optimization problem. First, a modified concept of the filled function for nonsmooth constrained global optimization is introduced, and a filled function that combines the idea of the filled function for unconstrained optimization with the penalty function for constrained optimization is proposed. Then, a solution algorithm based on the proposed filled function is developed. Finally, some preliminary numerical results are reported, showing that the proposed approach is promising.
31

Li, Jia, Yuelin Gao, Tiantian Chen, and Xiaohua Ma. "A new filled function method based on global search for solving unconstrained optimization problems." AIMS Mathematics 9, no. 7 (2024): 18475–505. http://dx.doi.org/10.3934/math.2024900.

Abstract:
The filled function method is a deterministic algorithm for finding a global minimizer of global optimization problems, and its effectiveness is closely related to the form of the constructed filled function. Current filled functions mainly have three drawbacks in form: parameter adjustment and control (if any), inclusion of exponential or logarithmic functions, and discontinuity or non-differentiability. To overcome these limitations, this paper proposes a parameter-free filled function that contains no exponential or logarithmic functions and is continuous and differentiable. Based on the new filled function, a filled function method for solving unconstrained global optimization problems is designed. The algorithm selects points in the feasible domain far from the global minimum point as initial points and improves the step-size setting in the stage of minimizing the filled function to enhance the algorithm's global optimization capability. Tests were conducted on 14 benchmark functions and compared with existing filled function algorithms; the numerical results show that the new algorithm is feasible and effective.
32

Lin, Youjiang, Yongjian Yang, and Liansheng Zhang. "A Novel Filled Function Method for Global Optimization." Journal of the Korean Mathematical Society 47, no. 6 (November 1, 2010): 1253–67. http://dx.doi.org/10.4134/jkms.2010.47.6.1253.

33

Zhang, Lian-Sheng, Chi-Kong Ng, Duan Li, and Wei-Wen Tian. "A New Filled Function Method for Global Optimization." Journal of Global Optimization 28, no. 1 (January 2004): 17–43. http://dx.doi.org/10.1023/b:jogo.0000006653.60256.f6.

34

Sun, X. L., and D. Li. "Value-Estimation Function Method for Constrained Global Optimization." Journal of Optimization Theory and Applications 102, no. 2 (August 1999): 385–409. http://dx.doi.org/10.1023/a:1021736608968.

35

Liu, Xian. "The impelling function method applied to global optimization." Applied Mathematics and Computation 151, no. 3 (April 2004): 745–54. http://dx.doi.org/10.1016/s0096-3003(03)00525-3.

36

Wang, Yuncheng, Weiwu Fang, and Tianjiao Wu. "A cut-peak function method for global optimization." Journal of Computational and Applied Mathematics 230, no. 1 (August 2009): 135–42. http://dx.doi.org/10.1016/j.cam.2008.10.069.

37

Liu, Xian, and Wilsun Xu. "A new filled function applied to global optimization." Computers & Operations Research 31, no. 1 (January 2004): 61–80. http://dx.doi.org/10.1016/s0305-0548(02)00154-5.

38

Wang, Xiaoli, and Guobiao Zhou. "A new filled function for unconstrained global optimization." Applied Mathematics and Computation 174, no. 1 (March 2006): 419–29. http://dx.doi.org/10.1016/j.amc.2005.05.009.

39

Ruan, Zhao-Hui, Yuan Yuan, Qi-Xiang Chen, Chuan-Xin Zhang, Yong Shuai, and He-Ping Tan. "A new multi-function global particle swarm optimization." Applied Soft Computing 49 (December 2016): 279–91. http://dx.doi.org/10.1016/j.asoc.2016.07.034.

40

Schnabel, Robert B. "Concurrent function evaluations in local and global optimization." Computer Methods in Applied Mechanics and Engineering 64, no. 1-3 (October 1987): 537–52. http://dx.doi.org/10.1016/0045-7825(87)90055-7.

41

Gao, Yuelin, Yongjian Yang, and Mi You. "A new filled function method for global optimization." Applied Mathematics and Computation 268 (October 2015): 685–95. http://dx.doi.org/10.1016/j.amc.2015.06.090.

42

Wu, Z. Y., F. S. Bai, H. W. J. Lee, and Y. J. Yang. "A filled function method for constrained global optimization." Journal of Global Optimization 39, no. 4 (April 19, 2007): 495–507. http://dx.doi.org/10.1007/s10898-007-9152-2.

43

Ng, Chi-Kong, Lian-Sheng Zhang, Duan Li, and Wei-Wen Tian. "Discrete Filled Function Method for Discrete Global Optimization." Computational Optimization and Applications 31, no. 1 (May 2005): 87–115. http://dx.doi.org/10.1007/s10589-005-0985-7.

44

Lampariello, F., and G. Liuzzi. "A filling function method for unconstrained global optimization." Computational Optimization and Applications 61, no. 3 (February 5, 2015): 713–29. http://dx.doi.org/10.1007/s10589-015-9728-6.

45

Ge, Renpu. "The filled function transformations for constrained global optimization." Applied Mathematics and Computation 39, no. 1 (September 1990): 1–20. http://dx.doi.org/10.1016/0096-3003(90)90118-m.

46

Wu, Xiuli, Yongquan Zhou, and Yuting Lu. "Elite Opposition-Based Water Wave Optimization Algorithm for Global Optimization." Mathematical Problems in Engineering 2017 (2017): 1–25. http://dx.doi.org/10.1155/2017/3498363.

Abstract:
Water wave optimization (WWO) is a novel metaheuristic based on shallow water wave theory; it has a simple structure, is easy to implement, and performs well even with a small population. To further improve convergence speed and calculation precision, this paper proposes an elite opposition-based water wave optimization (EOBWWO) and applies it to function optimization and structural engineering design problems. The improvement involves three major strategies: elite opposition-based (EOB) learning enhances population diversity, a local neighborhood search strategy strengthens local search in the breaking operation, and an improved propagation operator gives the algorithm a better balance between exploration and exploitation. EOBWWO is verified on 20 benchmark functions and two structural engineering design problems, and its performance is compared against state-of-the-art algorithms. Experimental results show that the proposed algorithm has faster convergence, higher calculation precision (even obtaining the exact solution on some benchmark functions), and greater stability than the comparison algorithms.
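The elite opposition-based learning strategy mentioned above is simple to state: each elite solution x is mirrored through the dynamic bounds [a_j, b_j] of the current elite population, x′_j = k(a_j + b_j) − x_j with random k in (0, 1), and the better of the pair is kept. The sketch below follows this common EOBL formulation, which may differ in detail from the paper's variant.

```python
import numpy as np

rng = np.random.default_rng(5)

def elite_opposition(pop, fitness, f, n_elite=5):
    """Generate elite-opposition candidates and keep those that improve."""
    elite_idx = np.argsort(fitness)[:n_elite]
    elite = pop[elite_idx]
    a, b = elite.min(axis=0), elite.max(axis=0)   # dynamic bounds of the elite
    k = rng.uniform(size=elite.shape)
    opposite = k * (a + b) - elite                # elite opposition-based points
    for i, idx in enumerate(elite_idx):
        val = f(opposite[i])
        if val < fitness[idx]:                    # keep the better of the pair
            pop[idx], fitness[idx] = opposite[i], val
    return pop, fitness

# Illustrative use on a sphere function with a random population.
f = lambda x: float(np.sum(x ** 2))
pop = rng.uniform(-5, 5, size=(30, 4))
fitness = np.array([f(x) for x in pop])
pop, fitness = elite_opposition(pop, fitness, f)
print(fitness.min())
```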
47

Pirot, Guillaume, Tipaluck Krityakierne, David Ginsbourger, and Philippe Renard. "Contaminant source localization via Bayesian global optimization." Hydrology and Earth System Sciences 23, no. 1 (January 21, 2019): 351–69. http://dx.doi.org/10.5194/hess-23-351-2019.

Abstract:
Contaminant source localization problems require efficient and robust methods that can account for geological heterogeneities and accommodate relatively small data sets of noisy observations. As realism demands high-fidelity simulations, computation costs call for global optimization algorithms under parsimonious evaluation budgets. Bayesian optimization approaches are well adapted to such settings, as they allow the exploration of parameter spaces in a principled way so as to iteratively locate the point(s) of global optimum while maintaining an approximation of the objective function with an instrumental quantification of prediction uncertainty. Here, we adapt a Bayesian optimization approach to localize a contaminant source in a discretized spatial domain, demonstrating the potential of such a method for hydrogeological applications and providing test cases for the optimization community. The localization problem is illustrated for cases where the geology is assumed to be perfectly known. Two 2-D synthetic cases that display sharp hydraulic conductivity contrasts and specific connectivity patterns are investigated; these cases generate highly nonlinear objective functions with multiple local minima. A derivative-free global optimization algorithm relying on a Gaussian process model and the expected improvement criterion is used to efficiently localize the minimum of the objective functions, which corresponds to the contaminant source location. Even though concentration measurements contain a significant level of proportional noise, the algorithm efficiently localizes the source. The variations of the objective function are driven essentially by the geology, followed by the design of the monitoring well network. The data and scripts used to generate the objective functions are shared to favor reproducible research. This contribution is important because the functions present multiple local minima and are inspired by a practical field application; sharing these complex objective functions provides a source of test cases for global optimization benchmarks and should help with designing new and efficient methods to solve this type of problem.
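The expected improvement criterion used here has a closed form under a Gaussian process posterior with mean μ(x) and standard deviation σ(x): with z = (f_min − μ)/σ, EI = (f_min − μ)Φ(z) + σφ(z). Below is a minimal sketch of one acquisition step; the library choice (scikit-learn) and all settings are ours, not the paper's.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(gp, X_cand, f_min):
    """Closed-form EI for minimization under a GP posterior."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)       # guard against zero predictive std
    z = (f_min - mu) / sigma
    return (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# One iteration of the loop: fit a GP to (X, y), then pick the candidate with max EI.
rng = np.random.default_rng(6)
f = lambda x: np.sin(3 * x[..., 0]) + 0.5 * x[..., 0] ** 2   # illustrative objective
X = rng.uniform(-3, 3, size=(10, 1)); y = f(X)
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
cand = rng.uniform(-3, 3, size=(2000, 1))
x_next = cand[np.argmax(expected_improvement(gp, cand, y.min()))]
print("next evaluation point:", x_next)
```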
48

Degachi, Hajer, Bechir Naffeti, Wassila Chagra, and Moufida Ksouri. "Filled Function Method for Nonlinear Model Predictive Control." Mathematical Problems in Engineering 2018 (June 4, 2018): 1–8. http://dx.doi.org/10.1155/2018/9497618.

Abstract:
A new method is used to solve the nonconvex optimization problem of nonlinear model predictive control (NMPC) for a Hammerstein model. Using nonlinear models in MPC leads to a nonlinear and nonconvex optimization problem. Since control performance depends essentially on the results of the optimization method, we propose to use the filled function method as a global optimization approach for this nonconvex problem. With this method, the control law is obtained in two steps. The first step determines a local minimum of the objective function. In the second step, a new function, the filled function, is constructed using that local minimum; it yields an initialization near the global minimum. Once this initialization is determined, a local optimization method is used to determine the global control sequence. The efficiency of the proposed method is demonstrated first on benchmark functions and then on a ball-and-beam system described by a Hammerstein model, and the results are compared with those of the genetic algorithm (GA) and particle swarm optimization (PSO).
49

Regis, Rommel G., and Christine A. Shoemaker. "Parallel radial basis function methods for the global optimization of expensive functions." European Journal of Operational Research 182, no. 2 (October 2007): 514–35. http://dx.doi.org/10.1016/j.ejor.2006.08.040.

50

Schweidtmann, Artur M., Dominik Bongartz, Daniel Grothe, Tim Kerkenhoff, Xiaopeng Lin, Jaromił Najman, and Alexander Mitsos. "Deterministic global optimization with Gaussian processes embedded." Mathematical Programming Computation 13, no. 3 (June 25, 2021): 553–81. http://dx.doi.org/10.1007/s12532-021-00204-y.

Abstract:
Gaussian processes (Kriging) are interpolating data-driven models that are frequently applied in various disciplines. Often, Gaussian processes are trained on datasets and are subsequently embedded as surrogate models in optimization problems. These optimization problems are nonconvex, and global optimization is desired. However, previous literature observed computational burdens limiting deterministic global optimization to Gaussian processes trained on few data points. We propose a reduced-space formulation for deterministic global optimization with trained Gaussian processes embedded. For optimization, the branch-and-bound solver branches only on the free variables, and McCormick relaxations are propagated through explicit Gaussian process models. The approach also leads to significantly smaller and computationally cheaper subproblems for lower and upper bounding. To further accelerate convergence, we derive envelopes of common covariance functions for GPs and tight relaxations of acquisition functions used in Bayesian optimization, including expected improvement, probability of improvement, and lower confidence bound. In total, we reduce computational time by orders of magnitude compared to state-of-the-art methods, thus overcoming previous computational burdens. We demonstrate the performance and scaling of the proposed method and apply it to Bayesian optimization with global optimization of the acquisition function and chance-constrained programming. The Gaussian process models, acquisition functions, and training scripts are available open-source within the “MeLOn—MachineLearning Models for Optimization” toolbox (https://git.rwth-aachen.de/avt.svt/public/MeLOn).
