Journal articles on the topic 'Stochastic local search'


Listed below are the top 50 journal articles on the topic 'Stochastic local search'; abstracts are included where available.


1. Mouhoub, Malek. "Stochastic local search for incremental SAT." International Journal of Knowledge-based and Intelligent Engineering Systems 9, no. 3 (September 13, 2005): 191–95. http://dx.doi.org/10.3233/kes-2005-9303.

2. Chen, Gong-gui. "Particle swarm optimization with stochastic local search." Journal of Computer Applications 28, no. 1 (June 30, 2008): 94–96. http://dx.doi.org/10.3724/sp.j.1087.2008.00094.

3. Kastrati, Muhamet, and Marenglen Biba. "Stochastic local search: a state-of-the-art review." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 1 (February 1, 2021): 716. http://dx.doi.org/10.11591/ijece.v11i1.pp716-727.
Abstract: The main objective of this paper is to review, analyze, and discuss state-of-the-art stochastic local search techniques used for solving hard combinatorial problems. It begins with a short introduction, motivation, and basic notation on combinatorial problems, search paradigms, and other relevant features of search techniques as needed for background. A brief overview of stochastic local search methods follows, along with an analysis of state-of-the-art stochastic local search algorithms. The last part of the paper presents and discusses some of the latest trends in the application of stochastic local search algorithms in machine learning, data mining, and other areas of science and engineering, and concludes with a discussion of the capabilities and limitations of stochastic local search algorithms.
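For orientation, the core loop that the review above surveys is shared by most stochastic local search methods and can be sketched in a few lines (a minimal illustrative sketch, not any specific published algorithm; the objective, neighbourhood, and parameter values here are hypothetical):

```python
import random

def stochastic_local_search(objective, neighbours, start, steps=1000,
                            p_random=0.2, seed=0):
    """Minimal SLS loop: mostly greedy moves, with occasional random
    moves to escape local optima; tracks the best solution seen."""
    rng = random.Random(seed)
    current, best = start, start
    for _ in range(steps):
        cands = neighbours(current)
        if not cands:
            break
        if rng.random() < p_random:
            current = rng.choice(cands)          # exploration: random walk step
        else:
            current = min(cands, key=objective)  # exploitation: greedy step
        if objective(current) < objective(best):
            best = current
    return best

# Toy usage: minimise f(x) = x^2 over the integers with +/-1 moves.
best = stochastic_local_search(lambda x: x * x,
                               lambda x: [x - 1, x + 1],
                               start=37)
```

The balance between the random-walk probability and the greedy step is exactly the exploration/exploitation trade-off that most of the papers in this list study.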
4. Nekkaa, Messaouda, and Dalila Boughaci. "Hybrid Harmony Search Combined with Stochastic Local Search for Feature Selection." Neural Processing Letters 44, no. 1 (June 26, 2015): 199–220. http://dx.doi.org/10.1007/s11063-015-9450-5.

5. Ha, Meesoon. "Stochastic Local Search in Random Constraint Satisfaction Problems." New Physics: Sae Mulli 63, no. 9 (September 30, 2013): 995–99. http://dx.doi.org/10.3938/npsm.63.995.

6. Al-Behadili, Hayder. "Stochastic Local Search Algorithms for Feature Selection: A Review." Iraqi Journal for Electrical and Electronic Engineering 17, no. 1 (February 2, 2021): 1–10. http://dx.doi.org/10.37917/ijeee.17.1.1.
Abstract: In today’s world, the data generated by many applications are increasing drastically, and finding an optimal subset of features from the data has become a crucial task. The main objective of this review is to analyze and comprehend different stochastic local search algorithms to find an optimal feature subset. Simulated annealing, tabu search, genetic programming, genetic algorithms, particle swarm optimization, artificial bee colony, grey wolf optimization, and the bat algorithm, all of which have been used in feature selection, are discussed. This review also highlights the filter and wrapper approaches for feature selection. Furthermore, it describes the main components of stochastic local search algorithms, categorizes these algorithms according to their type, and discusses promising directions for future research on feature selection.

7. Pérez Cáceres, Leslie, Ignacio Araya, and Guillermo Cabrera-Guerrero. "Stochastic Local Search for the Direct Aperture Optimisation Problem." Expert Systems with Applications 182 (November 2021): 115206. http://dx.doi.org/10.1016/j.eswa.2021.115206.

8. Yu, Hang, Yu Zhang, Pengxing Cai, Junyan Yi, Sheng Li, and Shi Wang. "Stochastic Multiple Chaotic Local Search-Incorporated Gradient-Based Optimizer." Discrete Dynamics in Nature and Society 2021 (December 2, 2021): 1–16. http://dx.doi.org/10.1155/2021/3353926.
Abstract: In this study, a hybrid metaheuristic algorithm, the chaotic gradient-based optimizer (CGBO), is proposed. The gradient-based optimizer (GBO) is a novel metaheuristic inspired by Newton’s method which has two search strategies to ensure excellent performance. One is the gradient search rule (GSR), and the other is the local escaping operation (LEO). GSR utilizes the gradient method to enhance exploitation ability and convergence rate, and LEO employs random operators to escape local optima. It is verified that gradient-based metaheuristic algorithms have obvious shortcomings in exploration. Meanwhile, chaotic local search (CLS) is an efficient search strategy with randomicity and ergodicity, which is usually used to improve global optimization algorithms. Accordingly, we incorporate GBO with CLS to strengthen the exploration ability and maintain high-level population diversity for the original GBO. In this study, CGBO is tested on over 30 CEC2017 benchmark functions and a parameter optimization problem of the dendritic neuron model (DNM). Experimental results indicate that CGBO performs better than other state-of-the-art algorithms in terms of effectiveness and robustness.
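The chaotic local search (CLS) strategy mentioned in the abstract above is commonly driven by a logistic map; a minimal sketch of the idea follows (the step radius, map seed, and greedy acceptance here are illustrative assumptions, not the CGBO settings):

```python
def chaotic_local_search(objective, x, steps=200, radius=0.5, c=0.7):
    """Perturb the incumbent with a logistic-map chaotic sequence,
    keeping only improving candidates (greedy acceptance)."""
    best, best_val = x, objective(x)
    for _ in range(steps):
        c = 4.0 * c * (1.0 - c)                 # logistic map at r=4 is chaotic on (0, 1)
        cand = best + radius * (2.0 * c - 1.0)  # map (0, 1) onto (-radius, radius)
        val = objective(cand)
        if val < best_val:
            best, best_val = cand, val
    return best

# Toy usage: minimise f(x) = (x - 1)^2 starting from 0.
x_star = chaotic_local_search(lambda x: (x - 1.0) ** 2, x=0.0)
```

The appeal over plain random perturbation is the ergodicity of the chaotic sequence: it eventually visits the whole perturbation range without repeating.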
9. Kaur, Harshdeep. "Local Search Based Algorithm for CVRP with Stochastic Demands." International Journal of Advanced Research in Computer Science 8, no. 7 (August 20, 2017): 1087–92. http://dx.doi.org/10.26483/ijarcs.v8i7.4468.

10. Caramia, Massimiliano, and Paolo Dell’Olmo. "Coupling Stochastic and Deterministic Local Search in Examination Timetabling." Operations Research 55, no. 2 (April 2007): 351–66. http://dx.doi.org/10.1287/opre.1060.0354.

11. Chu, Yi, Chuan Luo, Shaowei Cai, and Haihang You. "Empirical investigation of stochastic local search for maximum satisfiability." Frontiers of Computer Science 13, no. 1 (August 30, 2018): 86–98. http://dx.doi.org/10.1007/s11704-018-7107-z.

12. Tria, Francesca, Emanuele Caglioti, Vittorio Loreto, and Andrea Pagnani. "A stochastic local search approach to language tree reconstruction." Diachronica 27, no. 2 (October 11, 2010): 341–58. http://dx.doi.org/10.1075/dia.27.2.09tri.
Abstract: In this paper we introduce a novel stochastic local search algorithm to reconstruct phylogenetic trees. We focus in particular on the reconstruction of language trees based on the comparison of the Swadesh lists of the recently compiled ASJP database. Starting from a generic tree configuration, our scheme stochastically explores the space of possible trees, driven by the minimization of a pseudo-functional quantifying the violations of additivity of the distance matrix. As a consequence, the resulting tree can be annotated with the values of the violations on each internal branch. The values of the deviations are strongly correlated with the stability of the internal edges; they are measured with a novel bootstrap procedure and displayed on the tree as an additional annotation. As a case study we considered the reconstruction of the Indo-European language tree. The results are quite encouraging, highlighting a potential new avenue to investigate the role of the deviations from additivity and to check the reliability and consistency of the reconstructed trees.

13. Ribeiro, Celso C., and Mauricio G. C. Resende. "Path-relinking intensification methods for stochastic local search algorithms." Journal of Heuristics 18, no. 2 (May 3, 2011): 193–214. http://dx.doi.org/10.1007/s10732-011-9167-1.

14. Chen, Wenxiang, Darrell Whitley, Adele Howe, and Brian Goldman. "Stochastic Local Search over Minterms on Structured SAT Instances." Proceedings of the International Symposium on Combinatorial Search 7, no. 1 (September 1, 2021): 125–26. http://dx.doi.org/10.1609/socs.v7i1.18403.
Abstract: We observed that Conjunctive Normal Form (CNF) encodings of structured SAT instances often have a set of consecutive clauses defined over a small number of Boolean variables. To exploit this pattern, we propose a transformation of CNF to an alternative representation, Conjunctive Minterm Canonical Form (CMCF). The transformation is a two-step process: CNF clauses are first partitioned into disjoint subsets such that each subset contains CNF clauses with shared Boolean variables. CNF clauses in each subset are then replaced by their Minterm Canonical Form (i.e., partial solutions), which is found by enumeration. We show empirically that a simple Stochastic Local Search (SLS) solver based on CMCF can consistently achieve a higher success rate using fewer evaluations than the SLS solver WalkSAT on two representative classes of structured SAT problems.
15. He, Jun. "Constraints for membership in formal languages under systematic search and stochastic local search." Constraints 20, no. 4 (September 10, 2015): 477–78. http://dx.doi.org/10.1007/s10601-015-9212-z.

16. Tria, F., E. Caglioti, V. Loreto, and A. Pagnani. "A Stochastic Local Search Algorithm for Distance-Based Phylogeny Reconstruction." Molecular Biology and Evolution 27, no. 11 (June 18, 2010): 2587–95. http://dx.doi.org/10.1093/molbev/msq154.

17. Melo, Wendel A. X., Marcia H. C. Fampa, and Fernanda M. P. Raupp. "A stochastic local search algorithm for constrained continuous global optimization." International Transactions in Operational Research 19, no. 6 (June 1, 2012): 825–46. http://dx.doi.org/10.1111/j.1475-3995.2012.00854.x.

18. Hoos, Holger, Laetitia Jourdan, Marie-Eléonore Kessaci, Thomas Stützle, and Nadarajen Veerapen. "Special issue on “Stochastic Local Search: Recent developments and trends”." International Transactions in Operational Research 26, no. 6 (June 14, 2019): 2580–81. http://dx.doi.org/10.1111/itor.12679.

19. Hoos, Holger, Laetitia Jourdan, Marie-Eléonore Kessaci, Thomas Stützle, and Nadarajen Veerapen. "Special issue on “Stochastic Local Search: Recent developments and trends”." International Transactions in Operational Research 27, no. 1 (August 5, 2019): 697–98. http://dx.doi.org/10.1111/itor.12695.

20. Hoos, Holger, Laetitia Jourdan, Marie-Eléonore Kessaci, Thomas Stützle, and Nadarajen Veerapen. "Special issue on “Stochastic Local Search: Recent developments and trends”." International Transactions in Operational Research 27, no. 2 (October 13, 2019): 1263–64. http://dx.doi.org/10.1111/itor.12728.

21. Hoos, Holger, Laetitia Jourdan, Marie-Eléonore Kessaci, Thomas Stützle, and Nadarajen Veerapen. "Special issue on “Stochastic Local Search: Recent developments and trends”." International Transactions in Operational Research 27, no. 3 (December 19, 2019): 1806–7. http://dx.doi.org/10.1111/itor.12752.

22. Hoos, Holger, Laetitia Jourdan, Marie-Eléonore Kessaci, Thomas Stützle, and Nadarajen Veerapen. "Special issue on “Stochastic Local Search: Recent developments and trends”." International Transactions in Operational Research 27, no. 4 (February 14, 2020): 2253–54. http://dx.doi.org/10.1111/itor.12772.

23. Hoos, Holger, Laetitia Jourdan, Marie-Eléonore Kessaci, Thomas Stützle, and Nadarajen Veerapen. "Special issue on “Stochastic Local Search: Recent developments and trends”." International Transactions in Operational Research 27, no. 5 (September 2020): 2685–86. http://dx.doi.org/10.1111/itor.12791.
24. Fournier, Nicolas G. "Modelling the dynamics of stochastic local search on k-sat." Journal of Heuristics 13, no. 6 (May 1, 2007): 587–639. http://dx.doi.org/10.1007/s10732-007-9023-5.

25. Drugan, Mădălina M., and Dirk Thierens. "Stochastic Pareto local search: Pareto neighbourhood exploration and perturbation strategies." Journal of Heuristics 18, no. 5 (July 3, 2012): 727–66. http://dx.doi.org/10.1007/s10732-012-9205-7.

26. Chetty, Sivashan, and Aderemi Oluyinka Adewumi. "Three new stochastic local search algorithms for continuous optimization problems." Computational Optimization and Applications 56, no. 3 (May 9, 2013): 675–721. http://dx.doi.org/10.1007/s10589-013-9566-3.

27. Milchgrub, Alon, and Rina Dechter. "STLS: Cycle-Cutset-Driven Local Search For MPE." Proceedings of the International Symposium on Combinatorial Search 5, no. 1 (September 1, 2021): 204–5. http://dx.doi.org/10.1609/socs.v5i1.18336.
Abstract: In this paper we present Stochastic Tree-based Local Search, or STLS, a local search algorithm combining the notion of cycle-cutsets with the well-known Belief Propagation to approximate the optimum of sums of unary and binary potentials. This is done via the previously unexplored concept of traversal from one cutset to another and updating the induced forest, thus creating a local search algorithm whose update phase spans all the forest variables. We study empirically two pure variants of STLS against the state-of-the-art GLS+ scheme and against a hybrid.

28. Lančinskas, Algirdas, Pilar Martinez Ortigosa, and Julius Žilinskas. "Multi-objective single agent stochastic search in non-dominated sorting genetic algorithm." Nonlinear Analysis: Modelling and Control 18, no. 3 (July 25, 2013): 293–313. http://dx.doi.org/10.15388/na.18.3.14011.
Abstract: A hybrid multi-objective optimization algorithm based on a genetic algorithm and stochastic local search is developed and evaluated. The single agent stochastic search local optimization algorithm has been modified in order to be suitable for multi-objective optimization, where the local optimization is performed towards non-dominated points. The presented algorithm has been experimentally investigated by solving a set of well-known test problems and evaluated according to several metrics for measuring the performance of algorithms for multi-objective optimization. Results of the experimental investigation are presented and discussed.

29. Alava, Mikko, John Ardelius, Erik Aurell, Petteri Kaski, Supriya Krishnamurthy, Pekka Orponen, and Sakari Seitz. "Circumspect descent prevails in solving random constraint satisfaction problems." Proceedings of the National Academy of Sciences 105, no. 40 (October 1, 2008): 15253–57. http://dx.doi.org/10.1073/pnas.0712263105.
Abstract: We study the performance of stochastic local search algorithms for random instances of the K-satisfiability (K-SAT) problem. We present a stochastic local search algorithm, ChainSAT, which moves in the energy landscape of a problem instance by never going upwards in energy. ChainSAT is a focused algorithm in the sense that it focuses on variables occurring in unsatisfied clauses. We show by extensive numerical investigations that ChainSAT and other focused algorithms solve large K-SAT instances almost surely in linear time, up to high clause-to-variable ratios α; for example, for K = 4 we observe linear-time performance well beyond the recently postulated clustering and condensation transitions in the solution space. The performance of ChainSAT is a surprise given that by design the algorithm gets trapped into the first local energy minimum it encounters, yet no such minima are encountered. We also study the geometry of the solution space as accessed by stochastic local search algorithms.
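The "focused" search described in the abstract above, always selecting a variable from an unsatisfied clause, is also the common core of WalkSAT-family solvers; a minimal sketch follows (the clause representation and the purely random flip choice are simplifying assumptions, not the ChainSAT heuristic):

```python
import random

def focused_sls(clauses, n_vars, max_flips=10000, seed=1):
    """Focused stochastic local search for SAT: repeatedly pick a random
    unsatisfied clause and flip a random variable occurring in it.
    A clause is a list of signed ints: 3 means x3 True, -3 means x3 False."""
    rng = random.Random(seed)
    assign = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}

    def is_sat(clause):
        return any((lit > 0) == assign[abs(lit)] for lit in clause)

    for _ in range(max_flips):
        unsat = [c for c in clauses if not is_sat(c)]
        if not unsat:
            return assign                        # all clauses satisfied: model found
        lit = rng.choice(rng.choice(unsat))      # focus on an unsatisfied clause
        assign[abs(lit)] = not assign[abs(lit)]  # flip the chosen variable
    return None

# Toy formula: (x1 or x2) and (not x1 or x3) and (not x2 or not x3).
model = focused_sls([[1, 2], [-1, 3], [-2, -3]], n_vars=3)
```

Focusing guarantees that every flip touches a violated constraint, which is what makes the linear-time behaviour reported above possible at all.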
30. Gerevini, A., A. Saetti, and I. Serina. "Planning Through Stochastic Local Search and Temporal Action Graphs in LPG." Journal of Artificial Intelligence Research 20 (December 1, 2003): 239–90. http://dx.doi.org/10.1613/jair.1183.
Abstract: We present some techniques for planning in domains specified with the recent standard language PDDL2.1, supporting 'durative actions' and numerical quantities. These techniques are implemented in LPG, a domain-independent planner that took part in the 3rd International Planning Competition (IPC). LPG is an incremental, anytime system producing multi-criteria quality plans. The core of the system is based on a stochastic local search method and on a graph-based representation called 'Temporal Action Graphs' (TA-graphs). This paper focuses on temporal planning, introducing TA-graphs and proposing some techniques to guide the search in LPG using this representation. The experimental results of the 3rd IPC, as well as further results presented in this paper, show that our techniques can be very effective. Often LPG outperforms all other fully-automated planners of the 3rd IPC in terms of speed to derive a solution, or quality of the solutions that can be produced.

31. Pongchairerks, Pisut, and Jutathip Putponpai. "Deterministic, stochastic and local search algorithms for real-world ATSP instances." International Journal of Logistics Systems and Management 24, no. 2 (2016): 265. http://dx.doi.org/10.1504/ijlsm.2016.076474.

32. Farhi, Sidali Hocine, and Dalila Boughaci. "Graph based model for information retrieval using a stochastic local search." Pattern Recognition Letters 105 (April 2018): 234–39. http://dx.doi.org/10.1016/j.patrec.2017.09.019.

33. Hedar, Abdel-Rahman, and Rashad Ismail. "Simulated annealing with stochastic local search for minimum dominating set problem." International Journal of Machine Learning and Cybernetics 3, no. 2 (August 12, 2011): 97–109. http://dx.doi.org/10.1007/s13042-011-0043-y.

34. Luo, Chuan, Kaile Su, and Shaowei Cai. "More efficient two-mode stochastic local search for random 3-satisfiability." Applied Intelligence 41, no. 3 (June 22, 2014): 665–80. http://dx.doi.org/10.1007/s10489-014-0556-7.

35. Dong, Yumin, Shiqi Zhao, and Wanbin Hu. "Quantum firefly algorithm with stochastic search strategies." Journal of Applied Physics 132, no. 7 (August 21, 2022): 074401. http://dx.doi.org/10.1063/5.0102339.
Abstract: The firefly algorithm (FA) is a popular swarm intelligence optimization algorithm. The FA is used to solve various optimization problems, but it still has some deficiencies, such as high complexity, slow convergence rate, and low accuracy of the solution. This paper proposes a highly efficient quantum firefly algorithm with stochastic search strategies (QSSFA). In QSSFA, individuals are generated by way of quantum angle coding, introducing the laws of quantum physics and quantum gates, combined with a random neighborhood attraction model; an adaptive step size strategy is also introduced in the optimization. The complexity of the algorithm is greatly reduced, and its global search ability is improved, as are the convergence speed, the ability to jump out of local optima, and the solution accuracy. The proposed QSSFA’s performance is tested on ten mathematical test functions. The obtained results show that the QSSFA algorithm is very competitive compared with the firefly algorithm and three other FA variants.

36. Pullan, W., and H. H. Hoos. "Dynamic Local Search for the Maximum Clique Problem." Journal of Artificial Intelligence Research 25 (February 14, 2006): 159–85. http://dx.doi.org/10.1613/jair.1815.
Abstract: In this paper, we introduce DLS-MC, a new stochastic local search algorithm for the maximum clique problem. DLS-MC alternates between phases of iterative improvement, during which suitable vertices are added to the current clique, and plateau search, during which vertices of the current clique are swapped with vertices not contained in the current clique. The selection of vertices is solely based on vertex penalties that are dynamically adjusted during the search, and a perturbation mechanism is used to overcome search stagnation. The behaviour of DLS-MC is controlled by a single parameter, penalty delay, which controls the frequency at which vertex penalties are reduced. We show empirically that DLS-MC achieves substantial performance improvements over state-of-the-art algorithms for the maximum clique problem over a large range of the commonly used DIMACS benchmark instances.
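The penalty idea described in the abstract above, steering vertex selection with dynamically adjusted penalties, can be shown in miniature (a simplified illustration of penalty-guided restarts, not the DLS-MC algorithm itself; the graph and restart count are hypothetical):

```python
def penalty_guided_cliques(adj, restarts=20):
    """Grow a clique greedily in each restart, preferring low-penalty
    vertices; penalise used vertices so later restarts explore elsewhere."""
    penalty = {v: 0 for v in adj}
    best = set()
    for _ in range(restarts):
        clique = set()
        cands = set(adj)                 # vertices still adjacent to the whole clique
        while cands:
            v = min(cands, key=lambda u: (penalty[u], u))  # lowest penalty first
            clique.add(v)
            cands &= adj[v]              # keep only common neighbours
        for v in clique:
            penalty[v] += 1              # discourage immediate reuse
        if len(clique) > len(best):
            best = clique
    return best

# Toy graph: a triangle {0, 1, 2} plus a disjoint edge {3, 4}.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}, 3: {4}, 4: {3}}
clique = penalty_guided_cliques(adj)
```

Because used vertices accumulate penalties, successive restarts are pushed into different regions of the graph, the same diversification role the penalties play in DLS-MC.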
37. Nikolic, Kostantin P. "Stochastic Search Algorithms for Identification, Optimization, and Training of Artificial Neural Networks." Advances in Artificial Neural Systems 2015 (February 28, 2015): 1–16. http://dx.doi.org/10.1155/2015/931379.
Abstract: This paper presents certain stochastic search algorithms (SSA) suitable for effective identification, optimization, and training of artificial neural networks (ANN). The author introduces a modified algorithm of nonlinear stochastic search (MN-SDS) whose basic objective is to improve the convergence of the original nonlinear stochastic search (N-SDS) method of Rastrigin. Given the vast range of possible algorithms and procedures, a so-called stochastic direct search (SDS) method has been used (in the literature it is called stochastic local search, SLS). The convergence of MN-SDS is considerably better than that of N-SDS, and even better than that of a range of gradient-based optimization procedures. SDS (i.e., SLS) has not been applied widely enough to the identification, optimization, and training of ANNs; its efficiency in some purely nonlinear systems makes it suitable for these tasks. The presented examples only partially illustrate the operation and efficiency of SDS and MN-SDS. The backpropagation of error (BPE) method was used for comparison.
38. Niu, Dangdang, Lei Liu, and Shuai Lü. "New stochastic local search approaches for computing preferred extensions of abstract argumentation." AI Communications 31, no. 4 (June 11, 2018): 369–82. http://dx.doi.org/10.3233/aic-180769.

39. Sassa, Shohei, Kenji Kanazawa, Shaowei Cai, and Moritoshi Yasunaga. "An FPGA Solver for Partial MaxSAT Problems Based on Stochastic Local Search." ACM SIGARCH Computer Architecture News 44, no. 4 (January 11, 2017): 32–37. http://dx.doi.org/10.1145/3039902.3039909.

40. Scheid, S., and R. Spang. "A stochastic downhill search algorithm for estimating the local false discovery rate." IEEE/ACM Transactions on Computational Biology and Bioinformatics 1, no. 3 (July 2004): 98–108. http://dx.doi.org/10.1109/tcbb.2004.24.

41. Pagnozzi, Federico, and Thomas Stützle. "Automatic design of hybrid stochastic local search algorithms for permutation flowshop problems." European Journal of Operational Research 276, no. 2 (July 2019): 409–21. http://dx.doi.org/10.1016/j.ejor.2019.01.018.

42. Cao, Q., S. Gao, J. Zhang, Z. Tang, and H. Kimura. "A Stochastic Dynamic Local Search Method for Learning Multiple-Valued Logic Networks." IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences E90-A, no. 5 (May 1, 2007): 1085–92. http://dx.doi.org/10.1093/ietfec/e90-a.5.1085.

43. Mengshoel, Ole J. "Understanding the role of noise in stochastic local search: Analysis and experiments." Artificial Intelligence 172, no. 8-9 (May 2008): 955–90. http://dx.doi.org/10.1016/j.artint.2007.09.010.
44. Kheiri, Ahmed, Ender Özcan, and Andrew J. Parkes. "A stochastic local search algorithm with adaptive acceptance for high-school timetabling." Annals of Operations Research 239, no. 1 (June 22, 2014): 135–51. http://dx.doi.org/10.1007/s10479-014-1660-0.

45. Chiarandini, Marco, and Thomas Stützle. "Stochastic Local Search Algorithms for Graph Set T-colouring and Frequency Assignment." Constraints 12, no. 3 (June 21, 2007): 371–403. http://dx.doi.org/10.1007/s10601-007-9023-y.

46. Cai, Shaowei, and Kaile Su. "Configuration Checking with Aspiration in Local Search for SAT." Proceedings of the AAAI Conference on Artificial Intelligence 26, no. 1 (September 20, 2021): 434–40. http://dx.doi.org/10.1609/aaai.v26i1.8133.
Abstract: An interesting strategy called configuration checking (CC) was recently proposed to handle the cycling problem in local search for Minimum Vertex Cover. A natural question is whether this CC strategy also works for SAT. The direct application of CC did not result in stochastic local search (SLS) algorithms that can compete with the current best SLS algorithms for SAT. In this paper, we propose a new heuristic based on CC for SLS algorithms for SAT, which is called configuration checking with aspiration (CCA). It is used to develop a new SLS algorithm called Swcca. The experiments on random 3-SAT instances show that Swcca significantly outperforms Sparrow2011, the winner of the random satisfiable category of the SAT Competition 2011, which is considered to be the best local search solver for random 3-SAT instances. Moreover, the experiments on structured instances show that Swcca is competitive with Sattime, the best local search solver for the crafted benchmark in the SAT Competition 2011.
47. Guillet, Marianne, Gerhard Hiermann, Alexander Kröller, and Maximilian Schiffer. "Electric Vehicle Charging Station Search in Stochastic Environments." Transportation Science 56, no. 2 (March 2022): 483–500. http://dx.doi.org/10.1287/trsc.2021.1102.
Abstract: Electric vehicles are a central component of future mobility systems as they promise to reduce local noxious and fine dust emissions, as well as CO2 emissions, if fed by clean energy sources. However, the adoption of electric vehicles so far fell short of expectations despite significant governmental incentives. One reason for this slow adoption is the drivers’ perceived range anxiety, especially for individually owned vehicles. Here, bad user experiences (e.g., conventional cars blocking charging stations or inconsistent real-time availability data) manifest the drivers’ range anxiety. Against this background, we study stochastic search algorithms that can be readily deployed in today’s navigation systems in order to minimize detours to reach an available charging station. We model such a search as a finite-horizon Markov decision process and present a comprehensive framework that considers different problem variants, speedup techniques, and three solution algorithms: an exact labeling algorithm, a heuristic labeling algorithm, and a rollout algorithm. Extensive numerical studies show that our algorithms significantly decrease the expected time to find a free charging station while increasing the solution-quality robustness and the likelihood that a search is successful compared with myopic approaches.

48. Rachmut, Ben, Roie Zivan, and William Yeoh. "Communication-Aware Local Search for Distributed Constraint Optimization." Journal of Artificial Intelligence Research 75 (October 28, 2022): 637–75. http://dx.doi.org/10.1613/jair.1.13826.
Abstract: Most studies investigating models and algorithms for distributed constraint optimization problems (DCOPs) assume that messages arrive instantaneously and are never lost. Specifically, distributed local search DCOP algorithms have been designed as synchronous algorithms (i.e., they perform in synchronous iterations in which each agent exchanges messages with all its neighbors), despite running in asynchronous environments. This is true also for an anytime mechanism that reports the best solution explored during the run of synchronous distributed local search algorithms. Thus, when the assumption of perfect communication is relaxed, the properties that were established for the state-of-the-art local search algorithms and the anytime mechanism may not necessarily apply. In this work, we address this limitation by: (1) Proposing a Communication-Aware DCOP model (CA-DCOP) that can represent scenarios with different communication disturbances; (2) Investigating the performance of existing local search DCOP algorithms, specifically Distributed Stochastic Algorithm (DSA) and Maximum Gain Messages (MGM), in the presence of message latency and message loss; (3) Proposing a latency-aware monotonic distributed local search DCOP algorithm; and (4) Proposing an asynchronous anytime framework for reporting the best solution explored by non-monotonic asynchronous local search DCOP algorithms. Our empirical results demonstrate that imperfect communication has a positive effect on distributed local search algorithms due to increased exploration. Furthermore, the asynchronous anytime framework we proposed allows one to benefit from algorithms with inherent explorative heuristics.

49. Whitley, Darrell, Adele Howe, and Doug Hains. "Greedy or Not? Best Improving versus First Improving Stochastic Local Search for MAXSAT." Proceedings of the AAAI Conference on Artificial Intelligence 27, no. 1 (June 30, 2013): 940–46. http://dx.doi.org/10.1609/aaai.v27i1.8668.
Abstract: Stochastic local search (SLS) is the dominant paradigm for incomplete SAT and MAXSAT solvers. Early studies on small 3SAT instances found that the use of “best improving” moves did not improve search compared to using an arbitrary “first improving” move. Yet SLS algorithms continue to use best improving moves. We revisit this issue by studying very large random and industrial MAXSAT problems. Because locating best improving moves is more expensive than first improving moves, we designed an “approximate best” improving move algorithm and prove that it is as efficient as first improving move SLS. For industrial problems the first local optima found using best improving moves are statistically significantly better than local optima found using first improving moves. However, this advantage reverses as search continues and algorithms must explore equal moves on plateaus. This reversal appears to be associated with critical variables that are in many clauses and that also yield large improving moves.
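The distinction the paper above studies can be made concrete: a best-improving move scans the entire neighbourhood for the largest gain, while a first-improving move accepts the first gain encountered (a generic illustration with a hypothetical objective and neighbourhood, not the authors' MAXSAT implementation):

```python
def best_improving(objective, x, neighbours):
    """Scan the whole neighbourhood; return the neighbour with the
    largest improvement, or None if no neighbour improves on x."""
    best, best_val = None, objective(x)
    for y in neighbours(x):
        if objective(y) < best_val:
            best, best_val = y, objective(y)
    return best

def first_improving(objective, x, neighbours):
    """Return the first neighbour that improves on x, or None."""
    for y in neighbours(x):
        if objective(y) < objective(x):
            return y
    return None

# On f(x) = x^2 at x = 5, with neighbours visited in the order [6, 4, 0]:
nbrs = lambda x: [x + 1, x - 1, 0]
move_best = best_improving(lambda v: v * v, 5, nbrs)    # scans all three, picks 0
move_first = first_improving(lambda v: v * v, 5, nbrs)  # stops at the first gain, 4
```

The trade-off the paper measures is visible even here: the best-improving variant evaluates every neighbour per move, while the first-improving variant can stop early.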
50. Tan, Xujie, Seong-Yoon Shin, Kwang-Seong Shin, and Guangxing Wang. "Multi-Population Differential Evolution Algorithm with Uniform Local Search." Applied Sciences 12, no. 16 (August 12, 2022): 8087. http://dx.doi.org/10.3390/app12168087.
Abstract: Differential evolution (DE) is a very effective population-based stochastic optimization algorithm for solving various real-world problems. The quality of solutions to these problems is mainly determined by the combination of mutation strategies and their parameters in DE. However, in the process of solving these problems, the population diversity and local search ability gradually deteriorate. Therefore, we propose a multi-population differential evolution (MUDE) algorithm with a uniform local search to balance exploitation and exploration. With MUDE, the population is divided into multiple subpopulations with different population sizes, which perform different mutation strategies according to the evolution ratio, i.e., DE/rand/1, DE/current-to-rand/1, and DE/current-to-pbest/1. To improve the diversity of the population, information is migrated between subpopulations via the soft-island model. Furthermore, the local search ability is improved by way of the uniform local search. As a result, the proposed MUDE maintains exploitation and exploration capabilities throughout the process. MUDE is extensively evaluated on 25 functions of the CEC 2005 benchmark. The comparison results show that the MUDE algorithm is very competitive with other DE variants and optimization algorithms in generating efficient solutions.
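The DE/rand/1 mutation named in the abstract above combines three distinct, randomly chosen population members; a minimal sketch follows (the population, scale factor F, and fixed seed are illustrative assumptions, not the MUDE configuration):

```python
import random

def de_rand_1(pop, f=0.5, rng=random):
    """DE/rand/1 mutation: v = x_r1 + F * (x_r2 - x_r3),
    with r1, r2, r3 distinct random population indices."""
    r1, r2, r3 = rng.sample(range(len(pop)), 3)
    return [a + f * (b - c) for a, b, c in zip(pop[r1], pop[r2], pop[r3])]

# Toy usage: one mutant vector from a small random 2-D population.
rng = random.Random(0)
pop = [[rng.uniform(-5.0, 5.0) for _ in range(2)] for _ in range(6)]
mutant = de_rand_1(pop, rng=rng)
```

The scaled difference vector F * (x_r2 - x_r3) is what adapts the mutation step to the population's current spread; the other strategies named in the abstract (DE/current-to-rand/1, DE/current-to-pbest/1) vary only in which vectors enter this combination.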