Journal articles on the topic 'Local search'

Consult the top 50 journal articles for your research on the topic 'Local search.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Hasegawa, Manabu. "Potentially Local Search in Local Search Heuristics." Proceedings of The Computational Mechanics Conference 2004.17 (2004): 403–4. http://dx.doi.org/10.1299/jsmecmd.2004.17.403.

2

Ishtaiwi, Abdelraouf, Ghassan Issa, Wael Hadi, and Nawaf Ali. "Weight Resets in Local Search for SAT." International Journal of Machine Learning and Computing 9, no. 6 (December 2019): 874–79. http://dx.doi.org/10.18178/ijmlc.2019.9.6.886.

3

Verhoeven, M. G. A., and E. H. L. Aarts. "Parallel local search." Journal of Heuristics 1, no. 1 (September 1995): 43–65. http://dx.doi.org/10.1007/bf02430365.

4

Lancia, Giuseppe, Franca Rinaldi, and Paolo Serafini. "Local search inequalities." Discrete Optimization 16 (May 2015): 76–89. http://dx.doi.org/10.1016/j.disopt.2015.02.003.

5

Choi, Changkyu, and Ju-Jang Lee. "Chaotic local search algorithm." Artificial Life and Robotics 2, no. 1 (March 1998): 41–47. http://dx.doi.org/10.1007/bf02471151.

6

Vaessens, R. J. M., E. H. L. Aarts, and J. K. Lenstra. "A local search template." Computers & Operations Research 25, no. 11 (November 1998): 969–79. http://dx.doi.org/10.1016/s0305-0548(97)00093-2.

7

Oliveira, Leonardo D., Fernando Ciriaco, Taufik Abrão, and Paul Jean E. Jeszensky. "Local search multiuser detection." AEU - International Journal of Electronics and Communications 63, no. 4 (April 2009): 259–70. http://dx.doi.org/10.1016/j.aeue.2008.01.009.

8

Whitley, Darrell, and Jonathan Rowe. "Subthreshold-seeking local search." Theoretical Computer Science 361, no. 1 (August 2006): 2–17. http://dx.doi.org/10.1016/j.tcs.2006.04.008.

9

Dubois-Lacoste, Jérémie, Manuel López-Ibáñez, and Thomas Stützle. "Anytime Pareto local search." European Journal of Operational Research 243, no. 2 (June 2015): 369–85. http://dx.doi.org/10.1016/j.ejor.2014.10.062.

10

Anderson, E. J. "Mechanisms for local search." European Journal of Operational Research 88, no. 1 (January 1996): 139–51. http://dx.doi.org/10.1016/0377-2217(94)00164-2.

11

Pirlot, Marc. "General local search methods." European Journal of Operational Research 92, no. 3 (August 1996): 493–511. http://dx.doi.org/10.1016/0377-2217(96)00007-0.

12

Porumbel, Daniel, and Jin-Kao Hao. "Distance-guided local search." Journal of Heuristics 26, no. 5 (May 26, 2020): 711–41. http://dx.doi.org/10.1007/s10732-020-09446-w.

13

Tricoire, Fabien. "Multi-directional local search." Computers & Operations Research 39, no. 12 (December 2012): 3089–101. http://dx.doi.org/10.1016/j.cor.2012.03.010.

14

Hitotsuyanagi, Yasuhiro, Yoshihiko Wakamatsu, Yusuke Nojima, and Hisao Ishibuchi. "Selection of Initial Solutions for Local Search in Multiobjective Genetic Local Search." Transactions of the Institute of Systems, Control and Information Engineers 23, no. 8 (2010): 178–87. http://dx.doi.org/10.5687/iscie.23.178.

15

Garnier, Josselin, and Leila Kallel. "Efficiency of Local Search with Multiple Local Optima." SIAM Journal on Discrete Mathematics 15, no. 1 (January 2001): 122–41. http://dx.doi.org/10.1137/s0895480199355225.

16

Schaerf, Andrea, Marco Cadoli, and Maurizio Lenzerini. "LOCAL++: A C++ framework for local search algorithms." Software: Practice and Experience 30, no. 3 (March 2000): 233–57. http://dx.doi.org/10.1002/(sici)1097-024x(200003)30:3<233::aid-spe297>3.0.co;2-k.

17

Altschul, Stephen F., Warren Gish, Webb Miller, Eugene W. Myers, and David J. Lipman. "Basic local alignment search tool." Journal of Molecular Biology 215, no. 3 (October 1990): 403–10. http://dx.doi.org/10.1016/s0022-2836(05)80360-2.

18

Halim, Felix, Panagiotis Karras, and Roland Yap. "Local Search in Histogram Construction." Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 5, 2010): 1680–85. http://dx.doi.org/10.1609/aaai.v24i1.7713.

Abstract:
The problem of dividing a sequence of values into segments occurs in database systems, information retrieval, and knowledge management. The challenge is to select a finite number of boundaries for the segments so as to optimize an objective error function defined over those segments. Although this optimization problem can be solved in polynomial time, the algorithm which achieves the minimum error does not scale well, hence it is not practical for applications with massive data sets. There is considerable research with numerous approximation and heuristic algorithms. Still, none of those approaches has resolved the quality-efficiency tradeoff in a satisfactory manner. In (Halim, Karras, and Yap 2009), we obtain near linear time algorithms which achieve both the desired scalability and near-optimal quality, thus dominating earlier approaches. In this paper, we show how two ideas from artificial intelligence, an efficient local search and recombination of multiple solutions reminiscent of genetic algorithms, are combined in a novel way to obtain state of the art histogram construction algorithms.
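The abstract above describes local search over segment boundaries. As a purely illustrative sketch (not the authors' algorithm, which pairs an efficient local search with recombination of multiple solutions), a greedy boundary-move search for a sum-of-squared-errors objective could look like this in Python; the function names and the equal-width initialisation are assumptions.

```python
import random

def sse(values):
    # error of approximating a segment by its mean (sum of squared errors)
    if not values:
        return 0.0
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values)

def total_error(data, bounds):
    # bounds are indices where new segments start, e.g. [3, 7] splits
    # data into data[0:3], data[3:7], data[7:]
    cuts = [0] + bounds + [len(data)]
    return sum(sse(data[a:b]) for a, b in zip(cuts, cuts[1:]))

def local_search_histogram(data, num_buckets, iters=5000, seed=0):
    rng = random.Random(seed)
    if num_buckets < 2 or len(data) < num_buckets:
        return [], sse(list(data))
    # start from (roughly) equal-width boundaries
    step = len(data) / num_buckets
    bounds = sorted({int(round(step * i)) for i in range(1, num_buckets)})
    best = total_error(data, bounds)
    for _ in range(iters):
        i = rng.randrange(len(bounds))
        cand = sorted(set(bounds[:i] + [bounds[i] + rng.choice([-1, 1])] + bounds[i + 1:]))
        if len(cand) != len(bounds) or cand[0] <= 0 or cand[-1] >= len(data):
            continue  # skip moves that merge boundaries or push one off the ends
        err = total_error(data, cand)
        if err < best:  # greedy: accept only improving boundary moves
            bounds, best = cand, err
    return bounds, best
```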
19

Gutiérrez-Naranjo, Miguel A., and Mario J. Pérez-Jiménez. "Local Search with P Systems." International Journal of Natural Computing Research 2, no. 2 (April 2011): 47–55. http://dx.doi.org/10.4018/jncr.2011040104.

Abstract:
Local search is currently one of the most widely used methods for finding solutions to real-life problems. It is usually considered when the researcher is interested in the final solution of the problem rather than in how the solution is reached. In this paper, the authors present an implementation of local search with Membrane Computing techniques, applied to the N-queens problem as a case study. A CLIPS program inspired by the Membrane Computing design has been implemented and several experiments have been performed. The obtained results show better average times than those obtained with other Membrane Computing implementations that solve the N-queens problem.
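For comparison with the membrane-computing approach described above (which is not reproduced here), a conventional min-conflicts local search for N-queens can be sketched in Python as follows; the names and parameters are illustrative.

```python
import random

def min_conflicts_nqueens(n, max_steps=100_000, seed=0):
    rng = random.Random(seed)
    # queens[c] = row of the queen placed in column c (one queen per column)
    queens = [rng.randrange(n) for _ in range(n)]

    def conflicts(col, row):
        # queens attack along rows and diagonals; columns are distinct by construction
        return sum(1 for c in range(n) if c != col and
                   (queens[c] == row or abs(queens[c] - row) == abs(c - col)))

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(c, queens[c]) > 0]
        if not conflicted:
            return queens  # a valid placement: no queen attacks another
        col = rng.choice(conflicted)
        # local move: put that queen on the row minimising its conflicts
        queens[col] = min(range(n), key=lambda r: conflicts(col, r))
    return None  # give up after max_steps moves
```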
20

Orman, A., E. Aarts, and J. K. Lenstra. "Local Search in Combinatorial Optimisation." Journal of the Operational Research Society 50, no. 2 (February 1999): 191. http://dx.doi.org/10.2307/3010571.

21

Kadam, Tanmay, Nikhil Saxena, Akash Kosambia, and Anita Lahane. "Local Search – The Perfect Guide." International Journal of Engineering Trends and Technology 9, no. 13 (March 25, 2014): 652–57. http://dx.doi.org/10.14445/22315381/ijett-v9p323.

22

Musliu, Nysret, Andrea Schaerf, and Wolfgang Slany. "Local search for shift design." European Journal of Operational Research 153, no. 1 (February 2004): 51–64. http://dx.doi.org/10.1016/s0377-2217(03)00098-5.

23

Laszlo, Michael, and Sumitra Mukherjee. "Iterated local search for microaggregation." Journal of Systems and Software 100 (February 2015): 15–26. http://dx.doi.org/10.1016/j.jss.2014.10.012.

24

Aarts, Emile H. L., and Peter J. M. van Laarhoven. "Local search in coding theory." Discrete Mathematics 106-107 (September 1992): 11–18. http://dx.doi.org/10.1016/0012-365x(92)90524-j.

25

Rödl, V., and C. Tovey. "Multiple optima in local search." Journal of Algorithms 8, no. 2 (June 1987): 250–59. http://dx.doi.org/10.1016/0196-6774(87)90041-1.

26

Johnson, David S., Christos H. Papadimitriou, and Mihalis Yannakakis. "How easy is local search?" Journal of Computer and System Sciences 37, no. 1 (August 1988): 79–100. http://dx.doi.org/10.1016/0022-0000(88)90046-3.

27

Jaszkiewicz, Andrzej. "Many-Objective Pareto Local Search." European Journal of Operational Research 271, no. 3 (December 2018): 1001–13. http://dx.doi.org/10.1016/j.ejor.2018.06.009.

28

Verhoeven, Yves F. "Enhanced algorithms for Local Search." Information Processing Letters 97, no. 5 (March 2006): 171–76. http://dx.doi.org/10.1016/j.ipl.2005.11.004.

29

Barbucha, Dariusz. "Agent-based guided local search." Expert Systems with Applications 39, no. 15 (November 2012): 12032–45. http://dx.doi.org/10.1016/j.eswa.2012.03.074.

30

Van Hentenryck, Pascal, and Laurent Michel. "Control Abstractions for Local Search." Constraints 10, no. 2 (April 2005): 137–57. http://dx.doi.org/10.1007/s10601-005-0553-x.

31

Grégoire, Éric, Bertrand Mazure, and Cédric Piette. "Local-search Extraction of MUSes." Constraints 12, no. 3 (June 9, 2007): 325–44. http://dx.doi.org/10.1007/s10601-007-9019-7.

32

Guo, Zhaolu, Wensheng Zhang, and Shenwen Wang. "Improved gravitational search algorithm based on chaotic local search." International Journal of Bio-Inspired Computation 17, no. 3 (2021): 154. http://dx.doi.org/10.1504/ijbic.2021.114873.

33

Guo, Zhaolu, Wensheng Zhang, and Shenwen Wang. "Improved gravitational search algorithm based on chaotic local search." International Journal of Bio-Inspired Computation 17, no. 3 (2021): 154. http://dx.doi.org/10.1504/ijbic.2021.10037528.

34

Yan, Shaoqiang, Ping Yang, Donglin Zhu, Wanli Zheng, and Fengxuan Wu. "Improved Sparrow Search Algorithm Based on Iterative Local Search." Computational Intelligence and Neuroscience 2021 (December 15, 2021): 1–31. http://dx.doi.org/10.1155/2021/6860503.

Abstract:
This paper addresses the shortcomings of the sparrow search algorithm, namely its poor utilization of the current individual and its lack of effective search, improves its search performance, achieves good results on 23 basic benchmark functions and CEC 2017, and effectively mitigates the algorithm's tendency to fall into local optima with low search accuracy. This paper proposes an improved sparrow search algorithm based on iterative local search (ISSA). In the global search phase of the followers, a variable helix factor is introduced, which makes full use of each individual's opposite solution about the origin, reduces the number of individuals beyond the boundary, and ensures the algorithm has a detailed and flexible search ability. In the local search phase of the followers, an improved iterative local search strategy is adopted to increase the search accuracy and prevent the omission of the optimal solution. By adding a dimension-by-dimension lens learning strategy to the scouters, the search range becomes more flexible and the algorithm is helped to jump out of local optima by changing the focusing ability of the lens and the dynamic boundary of each dimension. Finally, the boundary control is improved to effectively utilize individuals beyond the boundary while retaining their randomness. The ISSA is compared with PSO, SCA, GWO, WOA, MWOA, SSA, BSSA, CSSA, and LSSA on 23 basic functions to verify the optimization performance of the algorithm. In addition, in order to further verify the optimization performance when the optimal solution is not 0, the above algorithms are compared on the CEC 2017 test functions. The simulation results show that the ISSA has good universality. Finally, this paper applies the ISSA to PID parameter tuning and robot path planning, and the results show that the algorithm has good practicability and effectiveness.
35

Zhang, Maoqing, Hui Wang, Zhihua Cui, and Jinjun Chen. "Hybrid multi-objective cuckoo search with dynamical local search." Memetic Computing 10, no. 2 (July 4, 2017): 199–208. http://dx.doi.org/10.1007/s12293-017-0237-2.

36

Wu, Jun, Chu-Min Li, Lu Jiang, Junping Zhou, and Minghao Yin. "Local search for diversified Top-k clique search problem." Computers & Operations Research 116 (April 2020): 104867. http://dx.doi.org/10.1016/j.cor.2019.104867.

37

Gyllenberg, M., T. Koski, T. Lund, and O. Nevalainen. "Clustering by Adaptive Local Search with Multiple Search Operators." Pattern Analysis & Applications 3, no. 4 (December 1, 2000): 348–57. http://dx.doi.org/10.1007/s100440070006.

38

Queiroz dos Santos, João Paulo, Jorge Dantas de Melo, Adrião Dória Duarte Neto, and Daniel Aloise. "Reactive Search strategies using Reinforcement Learning, local search algorithms and Variable Neighborhood Search." Expert Systems with Applications 41, no. 10 (August 2014): 4939–49. http://dx.doi.org/10.1016/j.eswa.2014.01.040.

39

Seppälä, Jane, and Timo Olavi Vuori. "Machine Learning in Strategic Search: Alleviating and Aggravating Local Search." Academy of Management Proceedings 2021, no. 1 (August 2021): 14980. http://dx.doi.org/10.5465/ambpp.2021.14980abstract.

40

Wang, Jie, and Hongwen Li. "Particle Swarm Optimization with Enhanced Global Search and Local Search." Journal of Intelligent Systems 26, no. 3 (July 26, 2017): 421–32. http://dx.doi.org/10.1515/jisys-2015-0153.

Abstract:
In order to mitigate the problems of premature convergence and low search accuracy that exist in traditional particle swarm optimization (PSO), this paper presents PSO with enhanced global search and local search (EGLPSO). In EGLPSO, most of the particles are concentrated on global search at the beginning; as the iterations proceed, the particles gradually shift to local search. A new updating strategy is used for global search, and a partial mutation strategy is applied to the leader particle of local search to find a better position. During each iteration, the best particle of global search exchanges information with some particles of local search. EGLPSO is tested on a set of 12 benchmark functions and compared with four other PSO variants and another six well-known PSO variants. The experimental results show that EGLPSO can greatly improve the performance of traditional PSO in terms of search accuracy, search efficiency, and global optimality.
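As background for the abstract above, a textbook global-best PSO skeleton is sketched below in Python. It is not EGLPSO itself (which adds the global/local phase shift, partial mutation, and information exchange described in the abstract), and the parameter values are the usual illustrative defaults.

```python
import random

def pso(objective, dim, bounds, swarm_size=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    # Textbook global-best PSO for minimisation; variants such as EGLPSO build on this loop.
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(swarm_size), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive (personal best) pull + social (global best) pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# e.g. pso(lambda x: sum(v * v for v in x), dim=10, bounds=(-5.0, 5.0)) approaches the all-zero optimum
```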
41

Li, Chunting, Zigang Lian, and Tipeng Zhang. "An Optimized Bat Algorithm Combining Local Search and Global Search." IOP Conference Series: Earth and Environmental Science 571 (November 26, 2020): 012018. http://dx.doi.org/10.1088/1755-1315/571/1/012018.

42

Jackson, Warren G., Ender Özcan, and Robert I. John. "Move acceptance in local search metaheuristics for cross-domain search." Expert Systems with Applications 109 (November 2018): 131–51. http://dx.doi.org/10.1016/j.eswa.2018.05.006.

43

Desai, Hemang, and Birajkumar V. Patel. "Search Engine Optimization Using Local Language." International Journal of Computer Sciences and Engineering 6, no. 7 (July 31, 2018): 1238–39. http://dx.doi.org/10.26438/ijcse/v6i7.12381239.

44

Frank, J., P. Cheeseman, and J. Stutz. "When Gravity Fails: Local Search Topology." Journal of Artificial Intelligence Research 7 (December 1, 1997): 249–81. http://dx.doi.org/10.1613/jair.445.

Abstract:
Local search algorithms for combinatorial search problems frequently encounter a sequence of states in which it is impossible to improve the value of the objective function; moves through these regions, called plateau moves, dominate the time spent in local search. We analyze and characterize plateaus for three different classes of randomly generated Boolean Satisfiability problems. We identify several interesting features of plateaus that impact the performance of local search algorithms. We show that local minima tend to be small but occasionally may be very large. We also show that local minima can be escaped without unsatisfying a large number of clauses, but that systematically searching for an escape route may be computationally expensive if the local minimum is large. We show that plateaus with exits, called benches, tend to be much larger than minima, and that some benches have very few exit states which local search can use to escape. We show that the solutions (i.e., global minima) of randomly generated problem instances form clusters, which behave similarly to local minima. We revisit several enhancements of local search algorithms and explain their performance in light of our results. Finally we discuss strategies for creating the next generation of local search algorithms.
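As an editorial illustration of the plateau moves analysed above (not the algorithms studied in the paper), a basic GSAT-style local search for SAT is sketched below in Python; a chosen flip whose score equals the current number of unsatisfied clauses is exactly a plateau (sideways) move.

```python
import random

def gsat(clauses, n_vars, max_flips=10_000, seed=0):
    # clauses: list of tuples of non-zero ints; a positive int is a variable, a negative int its negation
    rng = random.Random(seed)
    assign = {v: rng.choice([False, True]) for v in range(1, n_vars + 1)}

    def satisfied(lit):
        return assign[abs(lit)] if lit > 0 else not assign[abs(lit)]

    def unsat():
        return sum(not any(satisfied(l) for l in clause) for clause in clauses)

    current = unsat()
    for _ in range(max_flips):
        if current == 0:
            return assign
        # score every 1-flip neighbour by its number of unsatisfied clauses
        scores = []
        for v in range(1, n_vars + 1):
            assign[v] = not assign[v]
            scores.append((unsat(), v))
            assign[v] = not assign[v]
        best = min(score for score, _ in scores)
        # best == current means the chosen flip is a plateau (sideways) move
        flip = rng.choice([v for score, v in scores if score == best])
        assign[flip] = not assign[flip]
        current = best
    return assign if current == 0 else None

# e.g. gsat([(1, -2), (-1, 2), (1, 2)], n_vars=2) returns {1: True, 2: True}
```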
45

Mouhoub, Malek. "Stochastic local search for incremental SAT." International Journal of Knowledge-based and Intelligent Engineering Systems 9, no. 3 (September 13, 2005): 191–95. http://dx.doi.org/10.3233/kes-2005-9303.

46

Gaspers, Serge, Eun Jung Kim, Sebastian Ordyniak, Saket Saurabh, and Stefan Szeider. "Don't Be Strict in Local Search!" Proceedings of the AAAI Conference on Artificial Intelligence 26, no. 1 (September 20, 2021): 486–92. http://dx.doi.org/10.1609/aaai.v26i1.8128.

Abstract:
Local Search is one of the fundamental approaches to combinatorial optimization and it is used throughout AI. Several local search algorithms are based on searching the k-exchange neighborhood. This is the set of solutions that can be obtained from the current solution by exchanging at most k elements. As a rule of thumb, the larger k is, the better are the chances of finding an improved solution. However, for inputs of size n, a naive brute-force search of the k-exchange neighborhood requires n^O(k) time, which is not practical even for very small values of k. Fellows et al. (IJCAI 2009) studied whether this brute-force search is avoidable and gave positive and negative answers for several combinatorial problems. They used the notion of local search in a strict sense. That is, an improved solution needs to be found in the k-exchange neighborhood even if a global optimum can be found efficiently. In this paper we consider a natural relaxation of local search, called permissive local search (Marx and Schlotter, IWPEC 2009) and investigate whether it enhances the domain of tractable inputs. We exemplify this approach on a fundamental combinatorial problem, Vertex Cover. More precisely, we show that for a class of inputs, finding an optimum is hard, strict local search is hard, but permissive local search is tractable. We carry out this investigation in the framework of parameterized complexity.
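To make the n^O(k) cost of the brute-force scan concrete, here is an illustrative Python sketch of strict local search over the k-exchange neighborhood for Vertex Cover; the permissive variant and the parameterized algorithms from the paper are not reproduced, and all names are assumptions.

```python
from itertools import combinations

def is_cover(cover, edges):
    return all(u in cover or v in cover for u, v in edges)

def k_exchange_improve(cover, vertices, edges, k):
    """Naive scan of the k-exchange neighborhood of a vertex cover:
    flip the membership of at most k vertices and return any strictly
    smaller cover. Enumerating all subsets of at most k vertices is
    what makes the brute-force scan cost roughly n^O(k) time."""
    cover = set(cover)
    for size in range(1, k + 1):
        for flips in combinations(vertices, size):
            cand = cover.symmetric_difference(flips)
            if len(cand) < len(cover) and is_cover(cand, edges):
                return cand  # improving neighbor found (one strict local search step)
    return None  # cover is locally optimal for the k-exchange neighborhood

# e.g. k_exchange_improve({0, 1, 2}, vertices=[0, 1, 2, 3], edges=[(0, 1), (1, 2)], k=2)
# returns {1, 2}: dropping vertex 0 still covers both edges.
```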
47

Liu, Defeng, Matteo Fischetti, and Andrea Lodi. "Learning to Search in Local Branching." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 4 (June 28, 2022): 3796–803. http://dx.doi.org/10.1609/aaai.v36i4.20294.

Abstract:
Finding high-quality solutions to mixed-integer linear programming problems (MILPs) is of great importance for many practical applications. In this respect, the refinement heuristic local branching (LB) has been proposed to produce improving solutions and has been highly influential for the development of local search methods in MILP. The algorithm iteratively explores a sequence of solution neighborhoods defined by the so-called local branching constraint, namely, a linear inequality limiting the distance from a reference solution. For a LB algorithm, the choice of the neighborhood size is critical to performance. Although it was initialized by a conservative value in the original LB scheme, our new observation is that the "best" size is strongly dependent on the particular MILP instance. In this work, we investigate the relation between the size of the search neighborhood and the behavior of the underlying LB algorithm, and we devise a learning-based framework for guiding the neighborhood search of the LB heuristic. The framework consists of a two-phase strategy. For the first phase, a scaled regression model is trained to predict the size of the LB neighborhood at the first iteration through a regression task. In the second phase, we leverage reinforcement learning and devise a reinforced neighborhood search strategy to dynamically adapt the size at the subsequent iterations. We computationally show that the neighborhood size can indeed be learned, leading to improved performances and that the overall algorithm generalizes well both with respect to the instance size and, remarkably, across instances.
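For reference, the local branching constraint mentioned in the abstract (Fischetti and Lodi's linear inequality bounding the Hamming distance of a binary solution x from the reference solution \bar{x}) is commonly written as follows, where \bar{S} denotes the index set of variables at value 1 in \bar{x} and k is the neighborhood size:

```latex
% Hamming-distance neighborhood of radius k around the reference 0-1 solution \bar{x},
% with \bar{S} = \{ j : \bar{x}_j = 1 \}.
\Delta(x,\bar{x}) \;=\; \sum_{j \in \bar{S}} \bigl(1 - x_j\bigr) \;+\; \sum_{j \notin \bar{S}} x_j \;\le\; k
```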
48

Jabba, Ayad. "Rule Induction with Iterated Local Search." International Journal of Intelligent Engineering and Systems 14, no. 4 (August 31, 2021): 289–98. http://dx.doi.org/10.22266/ijies2021.0831.26.

49

Moreno-Espino, Mailyn, and Alejandro Rosete-Suárez. "Proactive local search based on FDC." DYNA 81, no. 184 (April 24, 2014): 201. http://dx.doi.org/10.15446/dyna.v81n184.37303.

50

He, Jun, Pierre Flener, and Justin Pearson. "An automaton Constraint for Local Search." Fundamenta Informaticae 107, no. 2-3 (2011): 223–48. http://dx.doi.org/10.3233/fi-2011-401.
