Academic literature on the topic 'Pure adaptive search'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Pure adaptive search.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click on it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Pure adaptive search"

1. Baritompa, W. P., Zhang Baoping, R. H. Mladineo, G. R. Wood, and Z. B. Zabinsky. "Towards Pure Adaptive Search." Journal of Global Optimization 7, no. 1 (July 1995): 93–110. http://dx.doi.org/10.1007/bf01100207.

2. Zabinsky, Zelda B., and Robert L. Smith. "Pure adaptive search in global optimization." Mathematical Programming 53, no. 1-3 (January 1992): 323–38. http://dx.doi.org/10.1007/bf01585710.

3. Patel, Nitin R., Robert L. Smith, and Zelda B. Zabinsky. "Pure adaptive search in Monte Carlo optimization." Mathematical Programming 43, no. 1-3 (January 1989): 317–28. http://dx.doi.org/10.1007/bf01582296.

4. Zabinsky, Z. B., G. R. Wood, M. A. Steel, and W. P. Baritompa. "Pure adaptive search for finite global optimization." Mathematical Programming 69, no. 1-3 (July 1995): 443–48. http://dx.doi.org/10.1007/bf01585570.

5. Bulger, D., W. P. Baritompa, and G. R. Wood. "Implementing Pure Adaptive Search with Grover's Quantum Algorithm." Journal of Optimization Theory and Applications 116, no. 3 (March 2003): 517–29. http://dx.doi.org/10.1023/a:1023061218864.

6. Simić, Dragan, Vasa Svirčević, Vladimir Ilin, Svetislav D. Simić, and Svetlana Simić. "Particle Swarm Optimization and Pure Adaptive Search in Finish Goods’ Inventory Management." Cybernetics and Systems 50, no. 1 (January 2, 2019): 58–77. http://dx.doi.org/10.1080/01969722.2018.1558014.

7. Salahi, M. "A Self-Regular Newton Based Algorithm for Linear Optimization." ANZIAM Journal 51, no. 2 (October 2009): 286–301. http://dx.doi.org/10.1017/s1446181109000340.

Abstract:
In this paper, using the framework of self-regularity, we propose a hybrid adaptive algorithm for the linear optimization problem. If the current iterates are far from a central path, the algorithm employs a self-regular search direction; otherwise the classical Newton search direction is employed. This feature of the algorithm allows us to prove a worst case iteration bound. Our result matches the best iteration bound obtained by the pure self-regular approach and improves on the worst case iteration bound of the classical algorithm.

8. Dinh, Bach, Thang Nguyen, Nguyen Quynh, and Le Dai. "A Novel Method for Economic Dispatch of Combined Heat and Power Generation." Energies 11, no. 11 (November 10, 2018): 3113. http://dx.doi.org/10.3390/en11113113.

Abstract:
The paper proposes a modified Bat algorithm (MBA) for searching for optimal solutions of the economic dispatch of combined heat and power generation (CHPGED) problem, with heat and power generation from three different types of units: pure power generation units, pure heat generation units, and cogeneration units. The CHPGED problem is complicated and poses a big challenge to optimization tools since it considers both heat and power generation from cogeneration units. Thus, the MBA method is applied with the purpose of enhancing the solution-search ability and search speed of the conventional Bat algorithm (BA). The proposed approach is based on three modifications to BA: the first is adaptive frequency adjustment, the second is an optimal range for the updated velocity, and the third is a retained condition for good solutions, with the objective of improving the search performance of traditional BA. The effectiveness of the proposed approach is evaluated by testing on 7-, 24-, and 48-unit systems and the IEEE 14-bus system, and by comparing results with BA together with other existing methods. As a result, it can be concluded that the proposed MBA method is a favorable meta-heuristic algorithm for solving the CHPGED problem.

9. Chen, Fu Xing, and Xu Sheng Xie. "Application on Query of Distributed Database Based on Improved Genetic Algorithm." Applied Mechanics and Materials 556-562 (May 2014): 4617–21. http://dx.doi.org/10.4028/www.scientific.net/amm.556-562.4617.

Abstract:
The query cost usually serves as an important criterion for a distributed database. The genetic algorithm is an adaptive probabilistic search algorithm, but the crossover and mutation probabilities are usually kept fixed in the traditional genetic algorithm. If the crossover probability is large, the possibility of damaging the genetic model is greater; in turn, if it is small, the search process becomes slow or even stagnates. If the mutation probability is small, new individuals are produced only with difficulty; in turn, if it is large, the genetic algorithm degenerates into a pure random search algorithm. To solve this problem, an improved genetic algorithm is proposed with multiple crossover and mutation probabilities based on the k-means clustering algorithm. The experimental results indicate that the algorithm is effective.

10. Angora, G., M. Brescia, S. Cavuoti, M. Paolillo, G. Longo, M. Cantiello, M. Capaccioli, et al. "Astroinformatics-based search for globular clusters in the Fornax Deep Survey." Monthly Notices of the Royal Astronomical Society 490, no. 3 (October 7, 2019): 4080–106. http://dx.doi.org/10.1093/mnras/stz2801.

Abstract:
In recent years, Astroinformatics has become a well-defined paradigm for many fields of Astronomy. In this work, we demonstrate the potential of a multidisciplinary approach to identify globular clusters (GCs) in the Fornax cluster of galaxies, taking advantage of multiband photometry produced by the VLT Survey Telescope and using automatic self-adaptive methodologies. The data analysed in this work consist of deep, multiband, partially overlapping images centred on the core of the Fornax cluster. We use a Neural Gas model, a pure clustering machine learning methodology, to approach the GC detection, while a novel feature selection method (ΦLAB) is exploited to perform the parameter space analysis and optimization. We demonstrate that the use of an Astroinformatics-based methodology is able to provide GC samples that are comparable, in terms of purity and completeness, with those obtained using single-band HST data and two approaches based, respectively, on a morpho-photometric analysis and a Principal Component Analysis using the same data discussed in this work.

Dissertations / Theses on the topic "Pure adaptive search"

1. Bulger, David W. "Stochastic global optimisation algorithms." Thesis, 1996. https://figshare.com/articles/thesis/Stochastic_global_optimisation_algorithms/21377646.

Abstract:

This thesis addresses aspects of stochastic algorithms for the solution of global optimisation problems. The bulk of the research investigates algorithm models of the adaptive search variety. Performances of stochastic and deterministic algorithms are also compared.

Chapter 2 defines pure adaptive search, the prototypical improving region search scheme from the literature. Analyses from the literature of the search duration of pure adaptive search in two specialised situations are recounted and interpreted. In each case pure adaptive search is shown to require an expected search time which varies only linearly with the dimension of the feasible region.

In Chapter 3 a generalisation of pure adaptive search is introduced under the name of hesitant adaptive search. This original algorithm retains the sample point generation mechanism of pure adaptive search, but allows for hesitation, in which an algorithm iteration passes without an improving sample being located. In this way hesitant adaptive search is intended to model more closely practically implementable algorithms. The analysis of the convergence of hesitant adaptive search is more thorough than the analyses already appearing in the literature, as it describes how hesitant adaptive search behaves when given more general objective functions than in previous studies. By specialising to the case of pure adaptive search we obtain a unification of the results appearing in those papers.

Chapter 4 is the most applied of the chapters in this thesis. Here hesitant adaptive search is specialised to describe the convergence behaviour of localisation search schemes. A localisation search scheme produces a bracket of the current improving region at each iteration. The results of Chapter 3 are applied to find necessary and sufficient conditions on the 'tightness' of the brackets to ensure that the dependence of the expected search duration on the dimension of the feasible region is linear, quadratic, cubic, and so forth.

Chapter 5 describes another original generalisation of pure adaptive search, known as fenestral adaptive search. This algorithm generates sample points from a region determined not merely by the previous sample, but by the previous w samples, where w is some prespecified positive integer. The expected search duration of fenestral adaptive search is greater than that of pure adaptive search, but still varies only linearly with the dimension of the feasible region. The sequence of objective function values obtained constitutes an interesting stochastic process, and Chapter 5 is devoted to understanding this process.

Chapter 6 presents a theoretical comparison of the search durations of deterministic and stochastic global optimisation algorithms, together with some discussion of the implications. It is shown that to any stochastic algorithm, there corresponds a deterministic algorithm which requires no more iterations on average, but we discuss why stochastic algorithms may still be more efficient than their deterministic counterparts in practice.
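
The abstract above gives a compact description of pure adaptive search (PAS) and its hesitant generalisation (HAS). The following Python sketch is offered only as an illustration of that description, not as the algorithm of any cited work: it realises the "uniform draw from the improving region" that PAS assumes as a primitive by rejection sampling over a box, which reproduces the distribution of improving points but none of the complexity advantage that motivates PAS, and the objective function, bounds, and constant hesitation probability are assumptions made for this example.

```python
# Illustrative sketch of pure adaptive search (PAS) and hesitant adaptive
# search (HAS), based on the descriptions in the thesis abstract above.
# The objective, box bounds, and constant hesitation probability are
# assumptions made for this example only.
import random


def sample_improving_point(objective, bounds, best_value, max_tries=100_000):
    """Draw a point uniformly from the improving region {x : f(x) < best_value}.

    PAS treats such a draw as a primitive. Here it is realised by rejection
    from uniform sampling over the box, which reproduces the correct
    distribution of improving points but not the per-iteration cost advantage
    that makes PAS interesting as an idealised scheme.
    """
    for _ in range(max_tries):
        x = [random.uniform(lo, hi) for lo, hi in bounds]
        if objective(x) < best_value:
            return x
    return None  # improving region too small to reach by rejection


def hesitant_adaptive_search(objective, bounds, iterations=50, hesitation=0.0):
    """Run HAS; hesitation=0.0 reduces to pure adaptive search."""
    best_x = [random.uniform(lo, hi) for lo, hi in bounds]
    best_value = objective(best_x)
    for _ in range(iterations):
        if random.random() < hesitation:
            continue  # the iteration passes without an improving sample
        x = sample_improving_point(objective, bounds, best_value)
        if x is None:
            break
        best_x, best_value = x, objective(x)
    return best_x, best_value


if __name__ == "__main__":
    sphere = lambda x: sum(v * v for v in x)
    box = [(-5.0, 5.0)] * 3
    print(hesitant_adaptive_search(sphere, box, iterations=40))                  # PAS
    print(hesitant_adaptive_search(sphere, box, iterations=40, hesitation=0.3))  # HAS
```

Setting the hesitation probability to zero recovers PAS; a positive value makes some iterations pass without an improving sample, as in the hesitant generalisation of Chapter 3. The sequence of accepted objective values behaves like a record-value process, which is roughly the source of the favourable iteration counts (linear in dimension under the conditions recounted above) discussed in Chapters 2 and 5.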


Book chapters on the topic "Pure adaptive search"

1. Zabinsky, Zelda B. "Pure Random Search and Pure Adaptive Search." In Nonconvex Optimization and Its Applications, 25–54. Boston, MA: Springer US, 2003. http://dx.doi.org/10.1007/978-1-4419-9182-9_2.

2. Zabinsky, Z. B., and B. P. Kristinsdottir. "Complexity Analysis Integrating Pure Adaptive Search (PAS) and Pure Random Search (PRS)." In Nonconvex Optimization and Its Applications, 171–81. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-1-4757-2600-8_11.


Conference papers on the topic "Pure adaptive search"

1. Schmeiser, Bruce W., and Jin Wang. "On the performance of pure adaptive search." In the 27th conference. New York, New York, USA: ACM Press, 1995. http://dx.doi.org/10.1145/224401.224634.

2. Lin, T. W., Jianyi Lu, Jian Lin, and Don A. Gregory. "Adaptive learning of binary patterns by using correlation processes." In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1992. http://dx.doi.org/10.1364/oam.1992.tud4.

Abstract:
A learning process is essential in constructing synthetic reference patterns for pattern recognition. A knowledge base is generally built up or updated by correlating adaptively changed reference patterns with a given standard. The optical correlator provides an alternative in the learning process. Compared to learning by a pure digital process, an optical approach substantially reduces the necessary learning time through high-speed Fourier transforms and parallel processing. In this paper, we present an adaptive learning process for binary patterns using optical correlators. A direct search algorithm is employed in the learning process to ensure that an optimum pattern can be reached. The variation of correlation peaks obtained from a given input object and adaptively changed reference patterns is used to determine an optimum synthetic reference pattern. The approach is verified by results obtained from a full range of computer simulations. The ability of the presented algorithm to tolerate noise is also demonstrated.
