Journal articles on the topic 'PSO (PARTICLE SWARM OPTIMIZATION)'

To see the other types of publications on this topic, follow the link: PSO (PARTICLE SWARM OPTIMIZATION).

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 journal articles for your research on the topic 'PSO (PARTICLE SWARM OPTIMIZATION).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Aziz, Nor Azlina Ab, Zuwairie Ibrahim, Marizan Mubin, Sophan Wahyudi Nawawi, and Nor Hidayati Abdul Aziz. "Transitional Particle Swarm Optimization." International Journal of Electrical and Computer Engineering (IJECE) 7, no. 3 (June 1, 2017): 1611. http://dx.doi.org/10.11591/ijece.v7i3.pp1611-1619.

Full text
Abstract:
A new variation of particle swarm optimization (PSO), termed transitional PSO (T-PSO), is proposed here. T-PSO attempts to improve PSO via its iteration strategy. Traditionally, PSO adopts either a synchronous or an asynchronous iteration strategy, and each has its own strengths and weaknesses: the synchronous strategy has a reputation for better exploitation, while the asynchronous strategy is stronger in exploration. The particles of T-PSO start with asynchronous updates to encourage more exploration at the start of the search. If no better solution is found for a number of iterations, the iteration strategy is changed to synchronous updates to allow fine-tuning by the particles. The results show that T-PSO ranks better than the traditional PSOs.
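As a rough illustration of the synchronous/asynchronous distinction described above, the sketch below switches from asynchronous to synchronous updates after a stagnation threshold. It is not the paper's implementation: the function and parameter names (`t_pso`, `stall_limit`), the coefficient values, and the sphere test function are all illustrative assumptions.

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def t_pso(f, dim=5, n=20, iters=200, stall_limit=10, seed=1):
    """Sketch of a transitional iteration strategy: start asynchronous,
    switch to synchronous after `stall_limit` iterations with no
    improvement of the global best."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    synchronous, stall = False, 0
    for _ in range(iters):
        improved = False
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
            # asynchronous mode: gbest is refreshed right after each particle
            if not synchronous and pbest_f[i] < gbest_f:
                gbest, gbest_f, improved = pbest[i][:], pbest_f[i], True
        if synchronous:
            # synchronous mode: gbest is refreshed once per full iteration
            g = min(range(n), key=lambda i: pbest_f[i])
            if pbest_f[g] < gbest_f:
                gbest, gbest_f, improved = pbest[g][:], pbest_f[g], True
        stall = 0 if improved else stall + 1
        if stall >= stall_limit:
            synchronous = True  # transition to exploitation
    return gbest_f
```

In asynchronous mode, later particles in the same iteration already see improvements found by earlier ones, which is what drives the extra exploration the abstract mentions.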
APA, Harvard, Vancouver, ISO, and other styles
2

Golubovic, Ruzica, and Dragan Olcan. "Antenna optimization using Particle Swarm Optimization algorithm." Journal of Automatic Control 16, no. 1 (2006): 21–24. http://dx.doi.org/10.2298/jac0601021g.

Full text
Abstract:
We present results for two different antenna optimization problems solved using the Particle Swarm Optimization (PSO) algorithm. The first problem is finding the maximal forward gain of a Yagi antenna; the second is finding the optimal feeding of a broadside antenna array. The optimization problems have 6 and 20 optimization variables, respectively. Preferred values of the PSO parameters are found for the presented problems. The results show that the preferred parameters of PSO differ somewhat for optimization problems with different numbers of dimensions in the optimization space. The results found using the PSO algorithm are compared with those found using other optimization algorithms in order to estimate the efficiency of PSO.
APA, Harvard, Vancouver, ISO, and other styles
3

Jiang, Chang Yuan, Shu Guang Zhao, Li Zheng Guo, and Chuan Ji. "An Improved Particle Swarm Optimization Algorithm." Applied Mechanics and Materials 195-196 (August 2012): 1060–65. http://dx.doi.org/10.4028/www.scientific.net/amm.195-196.1060.

Full text
Abstract:
Based on an analysis of the inertia weight of the standard particle swarm optimization (PSO) algorithm, an improved PSO algorithm is presented. The convergence condition of PSO is obtained by solving and analyzing the corresponding differential equation. Experiments on four benchmark functions show that S-PSO clearly outperforms both the standard PSO and random-inertia-weight PSO. Theoretical analysis and simulation experiments show that S-PSO is efficient and feasible.
APA, Harvard, Vancouver, ISO, and other styles
4

Shen, Yuanxia, Linna Wei, Chuanhua Zeng, and Jian Chen. "Particle Swarm Optimization with Double Learning Patterns." Computational Intelligence and Neuroscience 2016 (2016): 1–19. http://dx.doi.org/10.1155/2016/6510303.

Full text
Abstract:
Particle Swarm Optimization (PSO) is an effective tool for solving optimization problems. However, PSO usually suffers from premature convergence due to the rapid loss of swarm diversity. In this paper, we first analyze the motion behavior of the swarm based on the probability characteristics of the learning parameters. Then a PSO with double learning patterns (PSO-DLP) is developed, which employs a master swarm and a slave swarm with different learning patterns to achieve a trade-off between convergence speed and swarm diversity. The particles in the master swarm are encouraged to explore in order to preserve swarm diversity, while those in the slave swarm learn from the global best particle to refine a promising solution. When the evolutionary states of the two swarms interact, an interaction mechanism is enabled. This mechanism helps the slave swarm jump out of local optima and improves the convergence precision of the master swarm. The proposed PSO-DLP is evaluated on 20 benchmark functions, including rotated multimodal and complex shifted problems. The simulation results and statistical analysis show that PSO-DLP obtains a promising performance and outperforms eight PSO variants.
APA, Harvard, Vancouver, ISO, and other styles
5

Xu, Yu Fa, Jie Gao, Guo Chu Chen, and Jin Shou Yu. "Quantum Particle Swarm Optimization Algorithm." Applied Mechanics and Materials 63-64 (June 2011): 106–10. http://dx.doi.org/10.4028/www.scientific.net/amm.63-64.106.

Full text
Abstract:
To address the tendency of traditional particle swarm optimization (PSO) to become trapped in local optima, quantum theory is introduced into PSO to strengthen particle diversity and effectively avoid premature convergence. Experimental results show that the proposed method has stronger optimization ability and better global search capability than PSO.
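The abstract does not give the update rule. As a point of reference, the widely cited quantum-behaved PSO update of Sun et al. (which this line of work builds on, though this paper's variant may differ in detail) replaces the velocity equation with a sampling rule around an attractor; the coefficient schedule and test function below are illustrative assumptions.

```python
import math
import random

def sphere(x):
    return sum(v * v for v in x)

def qpso(f, dim=5, n=20, iters=200, seed=1):
    """Sketch of quantum-behaved PSO: each coordinate is sampled around
    an attractor p (a random mix of pbest and gbest), with a step
    proportional to the distance from the mean best position."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for t in range(iters):
        beta = 1.0 - 0.5 * t / iters  # contraction-expansion coefficient
        mbest = [sum(pbest[i][d] for i in range(n)) / n for d in range(dim)]
        for i in range(n):
            for d in range(dim):
                phi = rng.random()
                p = phi * pbest[i][d] + (1 - phi) * gbest[d]
                u = 1.0 - rng.random()  # u in (0, 1], keeps log finite
                step = beta * abs(mbest[d] - pos[i][d]) * math.log(1 / u)
                pos[i][d] = p + step if rng.random() < 0.5 else p - step
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest_f
```

The heavy-tailed `log(1/u)` step is what gives quantum-style PSO its occasional long jumps, which is the diversity-preserving mechanism the abstract refers to.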
APA, Harvard, Vancouver, ISO, and other styles
6

Moraglio, Alberto, Cecilia Di Chio, Julian Togelius, and Riccardo Poli. "Geometric Particle Swarm Optimization." Journal of Artificial Evolution and Applications 2008 (February 21, 2008): 1–14. http://dx.doi.org/10.1155/2008/143624.

Full text
Abstract:
Using a recently introduced geometric framework for the interpretation of crossover, we show an intimate connection between particle swarm optimisation (PSO) and evolutionary algorithms. This connection enables us to generalise PSO to virtually any solution representation in a natural and straightforward way. The new Geometric PSO (GPSO) applies naturally to both continuous and combinatorial spaces. We demonstrate this for the cases of Euclidean, Manhattan and Hamming spaces and report extensive experimental results. We also demonstrate the applicability of GPSO to more challenging combinatorial spaces. The Sudoku puzzle is a perfect candidate for testing new algorithmic ideas because it is entertaining and instructive as well as a nontrivial constrained combinatorial problem. We apply GPSO to solve the Sudoku puzzle.
APA, Harvard, Vancouver, ISO, and other styles
7

Zhang, Guan Yu, Xiao Ming Wang, Rui Guo, and Guo Qiang Wang. "An Improved Particle Swarm Optimization Algorithm." Applied Mechanics and Materials 394 (September 2013): 505–8. http://dx.doi.org/10.4028/www.scientific.net/amm.394.505.

Full text
Abstract:
This paper presents an improved particle swarm optimization (PSO) algorithm based on the genetic algorithm (GA) and tabu search. The improved PSO algorithm adds genetic crossover, mutation, and tabu search characteristics to the standard PSO to help it overcome the weakness of falling into local optima and to avoid revisiting the optimum path. By contrasting the improved and standard PSO algorithms on classic test functions, the improved PSO is found to have better global search characteristics.
APA, Harvard, Vancouver, ISO, and other styles
8

Hudaib, Amjad A., and Ahmad Kamel AL Hwaitat. "Movement Particle Swarm Optimization Algorithm." Modern Applied Science 12, no. 1 (December 31, 2017): 148. http://dx.doi.org/10.5539/mas.v12n1p148.

Full text
Abstract:
Particle Swarm Optimization (PSO) is a well-known meta-heuristic that has been used in many applications for solving optimization problems, but it suffers from problems such as entrapment in local minima. This paper proposes an optimization algorithm called Movement Particle Swarm Optimization (MPSO) that enhances the behavior of PSO by using a random movement function to search more points in the search space. The meta-heuristic was evaluated over 23 benchmark functions and compared with state-of-the-art algorithms: the Multi-Verse Optimizer (MVO), Sine Cosine Algorithm (SCA), Grey Wolf Optimizer (GWO), and Particle Swarm Optimization (PSO). The results show that the proposed algorithm improves on PSO over the tested benchmark functions.
APA, Harvard, Vancouver, ISO, and other styles
9

Gonsalves, Tad, and Akira Egashira. "Parallel Swarms Oriented Particle Swarm Optimization." Applied Computational Intelligence and Soft Computing 2013 (2013): 1–7. http://dx.doi.org/10.1155/2013/756719.

Full text
Abstract:
The particle swarm optimization (PSO) is a recently invented evolutionary computation technique which is gaining popularity owing to its simplicity of implementation and rapid convergence. In the case of single-peak functions, PSO rapidly converges to the peak; however, for multimodal functions, the PSO particles are known to get trapped in local optima. In this paper, we propose a variation of the algorithm called parallel swarms oriented particle swarm optimization (PSO-PSO), which consists of a multi-swarm stage and a single-swarm stage of evolution. In the multi-swarm stage, individual sub-swarms evolve independently in parallel; in the single-swarm stage, the sub-swarms exchange information to search for the global best. The two interwoven stages of evolution demonstrate better performance on test functions, especially in higher dimensions. An attractive feature of the PSO-PSO version of the algorithm is that it does not introduce any new parameters to improve its convergence performance. The strategy maintains the simple and intuitive structure as well as the implementation and computational advantages of the basic PSO.
APA, Harvard, Vancouver, ISO, and other styles
10

Ma, Zi Rui. "Particle Swarm Optimization Based on Multiobjective Optimization." Applied Mechanics and Materials 263-266 (December 2012): 2146–49. http://dx.doi.org/10.4028/www.scientific.net/amm.263-266.2146.

Full text
Abstract:
PSO treats each individual in the population as a particle in the search space, with neither volume nor mass. These particles fly through the search space at a certain velocity, which is dynamically adjusted according to the particle's own flight experience and that of the entire population. We describe the standard PSO, multi-objective optimization, and MOPSO. The main focus of this paper is several PSO algorithms, which are introduced in detail and studied. The MOPSO algorithm introduces an adaptive grid mechanism for the external population and applies mutation not only to the particles but also to their value ranges, with the mutation scale proportional to the number of evolution generations.
APA, Harvard, Vancouver, ISO, and other styles
11

Spears, William M., Derek T. Green, and Diana F. Spears. "Biases in Particle Swarm Optimization." International Journal of Swarm Intelligence Research 1, no. 2 (April 2010): 34–57. http://dx.doi.org/10.4018/jsir.2010040103.

Full text
Abstract:
The most common versions of particle swarm optimization (PSO) algorithms are rotationally variant. It has also been pointed out that PSO algorithms can concentrate particles along paths parallel to the coordinate axes. In this paper, the authors explicitly connect these two observations by showing that the rotational variance is related to the concentration along lines parallel to the coordinate axes. Based on this explicit connection, the authors create fitness functions that are easy or hard for PSO to solve, depending on the rotation of the function.
APA, Harvard, Vancouver, ISO, and other styles
12

Liu, Hao, Gang Xu, Gui-yan Ding, and Yu-bo Sun. "Human Behavior-Based Particle Swarm Optimization." Scientific World Journal 2014 (2014): 1–14. http://dx.doi.org/10.1155/2014/194706.

Full text
Abstract:
Particle swarm optimization (PSO) has attracted many researchers working on various optimization problems, owing to its easy implementation, few tuned parameters, and acceptable performance. However, the algorithm easily becomes trapped in local optima because of the rapid loss of population diversity. Therefore, improving the performance of PSO and decreasing its dependence on parameters are two important research topics. In this paper, we present a human behavior-based PSO, called HPSO. There are two remarkable differences between PSO and HPSO. First, the global worst particle is introduced into the velocity equation of PSO and endowed with a random weight that obeys the standard normal distribution; this strategy helps trade off the exploration and exploitation abilities of PSO. Second, we eliminate the two acceleration coefficients c1 and c2 of the standard PSO (SPSO) to reduce the parameter sensitivity of solved problems. Experimental results on 28 benchmark functions, consisting of unimodal, multimodal, rotated, and shifted high-dimensional functions, demonstrate the high performance of the proposed algorithm in terms of convergence accuracy and speed at lower computational cost.
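A velocity update of the kind the abstract describes can be sketched as follows. This is one plausible reading, not the paper's exact equation: the damping factor on the repulsion term, the inertia weight, and the sphere test function are assumptions made for the sake of a stable, self-contained example.

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def hpso(f, dim=5, n=20, iters=200, seed=1):
    """Sketch of a human-behavior-style update: particles are attracted
    to pbest/gbest with plain uniform random weights (no c1, c2) and
    pushed away from the global worst with a standard-normal random
    weight (damped by 0.1 here, an assumption, to keep the sketch stable)."""
    rng = random.Random(seed)
    w = 0.7
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    order = sorted(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[order[0]][:], pbest_f[order[0]]
    gworst = pbest[order[-1]][:]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + rng.random() * (pbest[i][d] - pos[i][d])
                             + rng.random() * (gbest[d] - pos[i][d])
                             + rng.gauss(0.0, 1.0) * 0.1 * (pos[i][d] - gworst[d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
        worst = max(range(n), key=lambda i: pbest_f[i])
        gworst = pbest[worst][:]  # refresh the global worst each iteration
    return gbest_f
```

Because the normal weight can be negative, the worst-particle term sometimes attracts rather than repels, which is one way such a scheme balances exploration and exploitation.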
APA, Harvard, Vancouver, ISO, and other styles
13

Sun, Tao, and Ming-hai Xu. "A Swarm Optimization Genetic Algorithm Based on Quantum-Behaved Particle Swarm Optimization." Computational Intelligence and Neuroscience 2017 (2017): 1–15. http://dx.doi.org/10.1155/2017/2782679.

Full text
Abstract:
Quantum-behaved particle swarm optimization (QPSO) is a variant of the traditional particle swarm optimization (PSO). QPSO, originally developed for continuous search spaces, outperforms the traditional PSO in search ability. This paper analyzes the main factors that affect the search ability of QPSO and converts the particle movement formula into a mutation condition by introducing a rejection region, thus proposing a new binary algorithm, named the swarm optimization genetic algorithm (SOGA) because in form it resembles a genetic algorithm (GA) more than PSO. SOGA has crossover and mutation operators like a GA but does not need crossover and mutation probabilities to be set, so it has fewer parameters to control. The proposed algorithm was tested on several nonlinear high-dimensional functions in the binary search space, and the results were compared with those of BPSO, BQPSO, and GA. The experimental results show that SOGA is distinctly superior to the other three algorithms in terms of solution accuracy and convergence.
APA, Harvard, Vancouver, ISO, and other styles
14

Wu, Daqing, and Jianguo Zheng. "A Dynamic Multistage Hybrid Swarm Intelligence Optimization Algorithm for Function Optimization." Discrete Dynamics in Nature and Society 2012 (2012): 1–22. http://dx.doi.org/10.1155/2012/578064.

Full text
Abstract:
A novel dynamic multistage hybrid swarm intelligence optimization algorithm, abbreviated DM-PSO-ABC, is introduced. DM-PSO-ABC combines the exploration capability of the dynamic multi-swarm particle swarm optimizer (PSO) and the stochastic exploitation of the cooperative artificial bee colony algorithm (CABC) for function optimization. In the proposed hybrid algorithm, the whole process is divided into three stages. In the first stage, a dynamic multi-swarm PSO is constructed to maintain population diversity. In the second stage, the parallel positive feedback of CABC is implemented in each small swarm. In the third stage, the global-model particle swarm optimization, which has a faster convergence speed, is used to enhance global convergence on the whole problem. To verify the effectiveness and efficiency of the proposed hybrid algorithm, benchmark problems of various scales are tested. The results show that DM-PSO-ABC achieves better search precision and convergence properties and has a strong ability to escape from local suboptima when compared with several peer algorithms.
APA, Harvard, Vancouver, ISO, and other styles
15

Ul Hassan, Nafees, Waqas Haider Bangyal, M. Sadiq Ali Khan, Kashif Nisar, Ag Asri Ag. Ibrahim, and Danda B. Rawat. "Improved Opposition-Based Particle Swarm Optimization Algorithm for Global Optimization." Symmetry 13, no. 12 (December 1, 2021): 2280. http://dx.doi.org/10.3390/sym13122280.

Full text
Abstract:
Particle Swarm Optimization (PSO) has been widely used to solve various types of optimization problems. An efficient algorithm must have symmetry of information between participating entities, and enhancing algorithm efficiency relative to this symmetric concept is a critical challenge in the field of information security. Like other nature-inspired algorithms, PSO also becomes trapped in local optima. The literature shows that, to address premature convergence in PSO, researchers have adopted various measures such as population initialization and inertia weight that can provide excellent results on real-world problems. This study proposes two improved variants of PSO: Threefry with opposition-based PSO ranked inertia weight (ORIW-PSO-TF) and Philox with opposition-based PSO ranked inertia weight (ORIW-PSO-P). The proposed variants incorporate three novel modifications: (1) the pseudo-random sequences Threefry and Philox are used for population initialization; (2) opposition-based learning is used to increase population diversity; and (3) an opposition-based, rank-based inertia weight is introduced to accelerate the convergence of standard PSO. The proposed variants are examined on sixteen benchmark test functions and compared with conventional approaches, and statistical tests are applied to the simulation results to obtain an accurate level of significance. Both proposed variants show the highest performance on the stated benchmark functions over the standard approaches. In addition, ORIW-PSO-TF and ORIW-PSO-P have been examined for the training of artificial neural networks (ANNs), using fifteen benchmark datasets from the UCI repository.
Simulation results show that training an ANN with the ORIW-PSO-TF and ORIW-PSO-P algorithms provides better results than traditional methodologies. All observations from our simulations indicate that the proposed variants are superior to conventional optimizers. In addition, the results of our study show how the proposed opposition-based method profoundly impacts diversity and convergence.
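The opposition-based learning step used for population initialization has a standard form: for a point x in [lo, hi], its opposite is lo + hi - x, and the fitter of the two is kept. The sketch below illustrates that idea; as a simplifying assumption it uses Python's stdlib PRNG in place of the Threefry/Philox counter-based generators the paper uses, and `opposition_init` and the sphere function are illustrative names.

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def opposition_init(f, dim, n, lo=-5.0, hi=5.0, seed=1):
    """Opposition-based initialization: generate a random population,
    form each point's opposite (lo + hi - x per coordinate), and keep
    the n fittest individuals of the combined 2n candidates."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    opp = [[lo + hi - x for x in p] for p in pop]
    return sorted(pop + opp, key=f)[:n]
```

Keeping the best of the 2n candidates tends to seed the swarm closer to the optimum than plain random initialization, which is the diversity/convergence benefit the abstract attributes to opposition-based learning.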
APA, Harvard, Vancouver, ISO, and other styles
16

Beheshti, Zahra, Siti Mariyam Shamsuddin, and Sarina Sulaiman. "Fusion Global-Local-Topology Particle Swarm Optimization for Global Optimization Problems." Mathematical Problems in Engineering 2014 (2014): 1–19. http://dx.doi.org/10.1155/2014/907386.

Full text
Abstract:
In recent years, particle swarm optimization (PSO) has been extensively applied in various optimization problems because of its structural and implementation simplicity. However, the PSO can sometimes find local optima or exhibit slow convergence speed when solving complex multimodal problems. To address these issues, an improved PSO scheme called fusion global-local-topology particle swarm optimization (FGLT-PSO) is proposed in this study. The algorithm employs both global and local topologies in PSO to jump out of the local optima. FGLT-PSO is evaluated using twenty (20) unimodal and multimodal nonlinear benchmark functions and its performance is compared with several well-known PSO algorithms. The experimental results showed that the proposed method improves the performance of PSO algorithm in terms of solution accuracy and convergence speed.
APA, Harvard, Vancouver, ISO, and other styles
17

Ni, Hong Mei, and Wei Gang Wang. "An Improved Particle Swarm Optimization Algorithm." Advanced Materials Research 850-851 (December 2013): 809–12. http://dx.doi.org/10.4028/www.scientific.net/amr.850-851.809.

Full text
Abstract:
Niching is an important technique for multi-peak function optimization. When the particle swarm optimization (PSO) algorithm is used for multi-peak function optimization, problems arise such as premature convergence and a slow convergence rate. To solve these problems, an improved PSO algorithm based on the niche technique is proposed. The PSO algorithm exploits the properties of swarm behavior to solve optimization problems rapidly, while niche techniques can locate multiple solutions in multimodal domains. The improved PSO algorithm not only retains efficient parallelism but also increases population diversity thanks to the niche technique. Simulation results show that the new algorithm outperforms the traditional PSO algorithm, with stronger adaptability and convergence, and better solves the multi-peak function optimization problem.
APA, Harvard, Vancouver, ISO, and other styles
18

Abdul-Adheem, Wameedh Riyadh. "An enhanced particle swarm optimization algorithm." International Journal of Electrical and Computer Engineering (IJECE) 9, no. 6 (December 1, 2019): 4904. http://dx.doi.org/10.11591/ijece.v9i6.pp4904-4907.

Full text
Abstract:
In this paper, an enhanced stochastic optimization algorithm based on the basic Particle Swarm Optimization (PSO) algorithm is proposed. The basic PSO algorithm is modeled on the social feeding behavior of some animals. Its parameters can influence the solution considerably, and it has a couple of weaknesses, such as slow convergence speed and premature convergence. To overcome these shortcomings of the basic PSO, several enhanced methods for updating the velocity, such as an Exponential Decay Inertia Weight (EDIW), are proposed in this work to construct an Enhanced PSO (EPSO) algorithm. The suggested algorithm is numerically simulated on five benchmark functions against the basic PSO approaches. The performance of the EPSO algorithm is analyzed and discussed based on the test results.
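An exponential-decay inertia-weight schedule of the kind the abstract names can be sketched as below. The endpoint values (0.9 and 0.4 are common choices in the PSO literature) and the decay-rate constant `k` are illustrative assumptions; the paper's exact schedule may differ.

```python
import math

def ediw(t, t_max, w_start=0.9, w_end=0.4, k=4.0):
    """Exponential Decay Inertia Weight sketch: the inertia weight
    decays exponentially from w_start toward w_end over the run,
    favoring exploration early and exploitation late."""
    return w_end + (w_start - w_end) * math.exp(-k * t / t_max)
```

Each iteration t of the PSO loop would then use `w = ediw(t, t_max)` in the velocity update instead of a fixed inertia weight.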
APA, Harvard, Vancouver, ISO, and other styles
19

Li, Ming, and Xue Ling Ji. "Bacterial Particle Swarm Optimization Algorithm." Advanced Materials Research 211-212 (February 2011): 968–72. http://dx.doi.org/10.4028/www.scientific.net/amr.211-212.968.

Full text
Abstract:
The loss of population diversity leads to premature convergence in the existing particle swarm optimization (PSO) algorithm. To solve this problem, a novel version of PSO called bacterial PSO (BacPSO) is proposed in this paper. In the new algorithm, individuals are replaced by bacteria, and a new evolutionary mechanism is designed according to the basic laws of bacterial colony evolution. This evolutionary mechanism also generates a natural termination criterion. Propagation and death operators are used to maintain the population diversity of BacPSO. Simulation results show that BacPSO not only significantly improves convergence speed but also converges to the global optimum.
APA, Harvard, Vancouver, ISO, and other styles
20

Fang, Xu. "Research of Construction Schedule Optimization using Particle Swarm Optimization." Advanced Materials Research 452-453 (January 2012): 441–45. http://dx.doi.org/10.4028/www.scientific.net/amr.452-453.441.

Full text
Abstract:
A particle swarm optimization (PSO)-based approach to the resource-constrained project scheduling problem, with the objective of minimizing project duration, is introduced in this paper. Computational analyses are provided to investigate the performance of the PSO-based approach on this problem. The results show that it is feasible to apply PSO to construction schedule optimization.
APA, Harvard, Vancouver, ISO, and other styles
21

Wang, Bei Zhan, Xiang Deng, Wei Chuan Ye, and Hai Fang Wei. "Study on Discrete Particle Swarm Optimization Algorithm." Applied Mechanics and Materials 220-223 (November 2012): 1787–94. http://dx.doi.org/10.4028/www.scientific.net/amm.220-223.1787.

Full text
Abstract:
The particle swarm optimization (PSO) algorithm is a new type of global search method, most work on which has focused on continuous variables, with little on discrete variables. Discrete forms and discretization methods have received more attention in recent years. This paper first introduces the basic principles and mechanisms of the PSO algorithm, then outlines its procedure and describes the operation rules of discrete PSO. Various improvements and applications of discrete PSO algorithms are reviewed, the mechanisms and characteristics of two different discretization strategies are presented, and some development trends and future research directions for discrete PSO are proposed.
APA, Harvard, Vancouver, ISO, and other styles
22

Zhou, Fei Hong, and Zi Zhen Liao. "A Particle Swarm Optimization Algorithm." Applied Mechanics and Materials 303-306 (February 2013): 1369–72. http://dx.doi.org/10.4028/www.scientific.net/amm.303-306.1369.

Full text
Abstract:
The basic and improved PSO algorithms focus on how to effectively search the solution space for the optimal solution using a single particle swarm. However, the particles constantly chase the global optimal point and the best points found so far along their search paths, so their speed rapidly drops to zero and they become trapped in a local minimum; the result is premature convergence. The improved PSO presented here is inspired by BP neural networks; the improvement is similar to smoothing the weight through a low-pass filter. Tests on classical functions show that this PSO improves convergence precision and computation speed to a certain extent.
APA, Harvard, Vancouver, ISO, and other styles
23

Kien, Chong Woon, and Neoh Siew Chin. "Improved Particle Swarm Optimization (PSO) for Performance Optimization of Electronic Filter Circuit Designs." Applied Mechanics and Materials 229-231 (November 2012): 1643–50. http://dx.doi.org/10.4028/www.scientific.net/amm.229-231.1643.

Full text
Abstract:
This article discusses and analyzes the particle swarm optimization (PSO) approach to the design and performance optimization of a 4th-order Sallen-Key high-pass filter. Three particle swarm variants are studied: basic PSO, PSO with regrouped particles (PSO-RP), and PSO with diversity-embedded regrouped particles (PSO-DRP). PSO-RP and PSO-DRP are proposed to solve the stagnation problem of basic PSO. Based on the developed PSO approaches, LTspice is employed as the circuit simulator for the performance investigation of the designed filter. In this paper, 12 design parameters of the Sallen-Key high-pass filter are optimized to satisfy the required constraints and specifications on gain, cut-off frequency, and pass-band ripple. Overall results show that PSO with diversity-embedded regrouped particles improves on the conventional search of basic PSO and achieves the design objectives.
APA, Harvard, Vancouver, ISO, and other styles
24

Wu, Hai Ying. "Optimization Problem of Maintenance Resources Allocation Based on Partical Swarm Optimization." Advanced Materials Research 383-390 (November 2011): 5844–50. http://dx.doi.org/10.4028/www.scientific.net/amr.383-390.5844.

Full text
Abstract:
In communications maintenance support systems, the optimal allocation of maintenance resources is key to system design. For this optimization problem, this paper analyzes the phenomenon of premature convergence in particle swarm optimization (PSO) and proposes an improved PSO. Two examples study how to use the improved PSO to solve the maintenance resource allocation problem and validate the improved algorithm.
APA, Harvard, Vancouver, ISO, and other styles
25

El-Shorbagy, M. A., and Aboul Ella Hassanien. "Particle Swarm Optimization from Theory to Applications." International Journal of Rough Sets and Data Analysis 5, no. 2 (April 2018): 1–24. http://dx.doi.org/10.4018/ijrsda.2018040101.

Full text
Abstract:
Particle swarm optimization (PSO) is considered one of the most important methods in swarm intelligence. PSO is related to the study of swarms, being a simulation of bird flocks. It can be used to solve a wide variety of optimization problems, such as unconstrained optimization, constrained optimization, nonlinear programming, multi-objective optimization, stochastic programming, and combinatorial optimization problems. PSO has been presented in the literature and applied successfully in real-life applications. This paper gives a comprehensive review of PSO as a well-known population-based optimization technique. The review starts with a brief introduction to the behavior of PSO; then basic concepts and the development of PSO are discussed, followed by discussion of the PSO inertia weight and constriction factor as well as issues related to parameter setting, selection and tuning, dynamic environments, and hybridization. We also introduce other representations, convergence properties, and applications of PSO. Finally, conclusions and discussion are presented; limitations to be addressed and directions for future research are identified, and an extensive bibliography is included.
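For readers new to the method reviewed above, the canonical constricted PSO (the Clerc-Kennedy form, with chi ≈ 0.7298 for c1 = c2 = 2.05) can be sketched in a few lines; swarm size, iteration count, and the sphere test function are illustrative choices.

```python
import math
import random

def sphere(x):
    return sum(v * v for v in x)

def pso(f, dim=5, n=20, iters=200, seed=1):
    """Canonical PSO with the Clerc-Kennedy constriction factor, the
    textbook alternative to an inertia-weight schedule."""
    rng = random.Random(seed)
    c1 = c2 = 2.05
    phi = c1 + c2  # phi > 4 guarantees a real constriction factor
    chi = 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = chi * (vel[i][d]
                                   + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                                   + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest_f
```

The constriction factor plays the same damping role as the inertia weight discussed in the review, but with stability guaranteed analytically rather than tuned by hand.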
APA, Harvard, Vancouver, ISO, and other styles
26

Wang, Xiao Lei, Yu Yang, Qiang Zeng, and Jin Qiang Wang. "Particle Swarm Optimization with Adaptive Inertia Weight and its Application in Optimization Design." Advanced Materials Research 97-101 (March 2010): 3484–88. http://dx.doi.org/10.4028/www.scientific.net/amr.97-101.3484.

Full text
Abstract:
To avoid the premature convergence of basic particle swarm optimization (PSO) in engineering optimization design with highly complex and nonlinear constraints, a new particle swarm optimization algorithm with adaptive inertia weight (AIW-PSO) is proposed. In this algorithm, the inertia weight is adaptively changed according to the current evolution speed and aggregation degree of the swarm, which gives the algorithm dynamic adaptability and enhances its search ability and convergence performance. Moreover, a penalty function is used to handle the constraints. Finally, the validity of AIW-PSO is verified through an optimization example.
APA, Harvard, Vancouver, ISO, and other styles
27

Sousa-Ferreira, Ivo, and Duarte Sousa. "A review of velocity-type PSO variants." Journal of Algorithms & Computational Technology 11, no. 1 (September 18, 2016): 23–30. http://dx.doi.org/10.1177/1748301816665021.

Full text
Abstract:
This paper presents a review of particle swarm optimization variants in the velocity-type class. The original particle swarm optimization algorithm was developed as an unconstrained optimization technique and lacks a mechanism for handling constrained optimization problems. This limitation is addressed by the dynamic-objective constraint-handling method, originally developed for two variants of the basic particle swarm optimization, namely restricted velocity particle swarm optimization and self-adaptive velocity particle swarm optimization. Also within the velocity-type class, three other variants are reviewed: (1) vertical particle swarm optimization; (2) velocity-limited particle swarm optimization; and (3) particle swarm optimization with escape velocity. These velocity-type variants all have in common a velocity parameter that determines the direction and movement of the particles.
28

Li, Jian Guang, Ying Xue Yao, Dong Gao, Chang Qing Liu, and Zhe Jun Yuan. "Cutting Parameters Optimization by Using Particle Swarm Optimization (PSO)." Applied Mechanics and Materials 10-12 (December 2007): 879–83. http://dx.doi.org/10.4028/www.scientific.net/amm.10-12.879.

Full text
Abstract:
Cutting parameters play an essential role in the economics of machining. In this paper, particle swarm optimization (PSO), a novel optimization algorithm, was discussed comprehensively for cutting parameters optimization (CPO). First, the fundamental principle of PSO was introduced; then, the algorithm for applying PSO to cutting parameters optimization was developed; thirdly, cutting experiments without and with optimized cutting parameters were conducted to demonstrate the effectiveness of the optimization. The results show that the machining process was improved markedly.
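The fundamental PSO principle this entry refers to, velocity updates combining inertia with cognitive and social attraction, can be sketched minimally as below. Parameter values are common defaults, not the paper's settings:

```python
import random

def pso(f, dim, n=20, iters=200, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0, seed=0):
    """Minimal canonical PSO: each velocity blends inertia, a cognitive
    pull toward the particle's personal best, and a social pull toward
    the global best; positions are clamped to [lo, hi]."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                       # personal best positions
    pbest = [f(x) for x in X]                   # personal best values
    g = min(range(n), key=lambda i: pbest[i])
    G, gbest = P[g][:], pbest[g]                # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (G[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            fx = f(X[i])
            if fx < pbest[i]:
                pbest[i], P[i] = fx, X[i][:]
                if fx < gbest:
                    gbest, G = fx, X[i][:]
    return G, gbest
```

For a CPO application, `f` would encode the machining cost model over feed, speed, and depth of cut; here it is left generic.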
29

Fan, Shu-Kai S., and Chih-Hung Jen. "An Enhanced Partial Search to Particle Swarm Optimization for Unconstrained Optimization." Mathematics 7, no. 4 (April 17, 2019): 357. http://dx.doi.org/10.3390/math7040357.

Full text
Abstract:
Particle swarm optimization (PSO) is a population-based optimization technique that has been applied extensively to a wide range of engineering problems. This paper proposes a variation of the original PSO algorithm for unconstrained optimization, dubbed the enhanced partial search particle swarm optimizer (EPS-PSO), using the idea of cooperative multiple swarms in an attempt to improve the convergence and efficiency of the original PSO algorithm. The cooperative searching strategy is particularly devised to prevent the particles from being trapped in local optimal solutions and tries to locate the global optimal solution efficiently. The effectiveness of the proposed algorithm is verified through a simulation study in which the EPS-PSO algorithm is compared to a variety of existing "cooperative" PSO algorithms on noted benchmark functions.
30

Казакова, Е. М. "A Concise Overview of Particle Swarm Optimization Methods." Вестник КРАУНЦ. Физико-математические науки, no. 2 (September 25, 2022): 150–74. http://dx.doi.org/10.26117/2079-6641-2022-39-2-150-174.

Full text
Abstract:
Particle swarm optimization (PSO) is a meta-heuristic global optimization method originally proposed by Kennedy and Eberhart in 1995. It is currently one of the most commonly used optimization methods. This review provides a brief overview of PSO research in recent years: swarm and velocity initialization methods in PSO, modifications, neighborhood topologies, hybridizations, and an overview of various PSO applications.
31

Zhu, Yong Bin. "Overview of Particle Swarm Optimization." Applied Mechanics and Materials 543-547 (March 2014): 1597–600. http://dx.doi.org/10.4028/www.scientific.net/amm.543-547.1597.

Full text
Abstract:
Particle swarm optimization (PSO) is a new optimization algorithm based on swarm intelligence. Firstly, the paper briefly introduces the origin of PSO, the basic algorithm and the basic model, and provides an overview of the basic principle of the algorithm and its improved variants. Then, the research status, current applications, and future development directions of the algorithm are reviewed.
32

Huang, Ronggeng. "A Hybrid Particle Swarm Optimization and Single-Objective Slap Swarm Optimization Algorithm Based MPPT Strategy." Journal of Physics: Conference Series 2290, no. 1 (June 1, 2022): 012125. http://dx.doi.org/10.1088/1742-6596/2290/1/012125.

Full text
Abstract:
Under inhomogeneous irradiance conditions, the multiple-peak characteristic of the P-V curve reduces the efficiency of conventional trackers such as incremental conductance control. A maximum power point tracking (MPPT) strategy based on a hybrid particle swarm optimization and single-objective slap swarm optimization algorithm (PSO-SOSSOA) is presented. The PSO-SOSSOA adopts the particle swarm optimization algorithm in the early period of the run and a modified single-objective slap swarm optimization algorithm (SOSSOA) in the remaining period. The simulation results reveal that the PSO-SOSSOA can successfully handle global MPPT and exhibits performance advantages in efficiency, search speed, and power oscillation compared with the SOSSOA.
33

Yuriy, Romasevych, and Loveikin Viatcheslav. "A Novel Multi-Epoch Particle Swarm Optimization Technique." Cybernetics and Information Technologies 18, no. 3 (September 1, 2018): 62–74. http://dx.doi.org/10.2478/cait-2018-0039.

Full text
Abstract:
Since the canonical PSO method has many disadvantages that prevent it from effectively reaching the global minima of various functions, it needs to be improved. The article presents a novel Multi-Epoch Particle Swarm Optimization (ME-PSO) technique developed by the authors. The ME-PSO algorithm is based on reinitializing a stagnant swarm with low exploration efficiency. This approach provides a high rate of global-best improvement. As a result, ME-PSO has a high probability of finding a good local (or even the global) optimum and does not get trapped in poor local optima. To prove the advantages of the ME-PSO technique, numerical experiments were carried out with ten uni- and multimodal benchmark functions. Analysis of the obtained results convincingly showed a significant superiority of ME-PSO over the PSO and IA-PSO algorithms. It has also been shown that canonical PSO is a special case of ME-PSO.
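The reinitialization of a stagnant swarm described above can be sketched as a small helper called once per iteration. The function name, the stall-counter mechanism, and the uniform reinitialization are illustrative assumptions, not the paper's exact scheme:

```python
import random

def reinit_if_stagnant(swarm, stall_count, max_stall, lo, hi, rng):
    """If the global best has not improved for max_stall iterations,
    reinitialize every particle uniformly in [lo, hi] (start a new
    'epoch') to restore exploration; otherwise leave the swarm alone."""
    if stall_count >= max_stall:
        for p in swarm:
            for d in range(len(p)):
                p[d] = rng.uniform(lo, hi)
        return 0  # reset the stall counter for the new epoch
    return stall_count
```

The caller would increment `stall_count` whenever an iteration fails to improve the global best, so each epoch ends only after sustained stagnation.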
34

Benuwa, Ben Bright, Benjamin Ghansah, Dickson Keddy Wornyo, and Sefakor Awurama Adabunu. "A Comprehensive Review of Particle Swarm Optimization." International Journal of Engineering Research in Africa 23 (April 2016): 141–61. http://dx.doi.org/10.4028/www.scientific.net/jera.23.141.

Full text
Abstract:
Particle swarm optimization (PSO) is a heuristic global optimization method. PSO was motivated by the social behavior of organisms, such as bird flocking, fish schooling and human social relations. Its low constraint on the continuity of the objective function and its ability to adapt to various dynamic environments make PSO one of the most important swarm intelligence algorithms and ostensibly the most commonly used optimization technique. This survey presents a comprehensive investigation of PSO and, in particular, a proposed theoretical framework to improve its implementation. We hope that this survey will be beneficial to researchers studying PSO algorithms and will also serve as the substratum for future research in the study area, particularly for those pursuing a career in artificial intelligence. In the end, some important conclusions and possible future research directions for PSO are proposed.
35

Qin, Hai Sheng, Deng Yue Wei, Jun Hui Li, Lei Zhang, and Yan Qiang Feng. "A Particle Swarm Optimization with Variety Factor." Applied Mechanics and Materials 239-240 (December 2012): 1291–97. http://dx.doi.org/10.4028/www.scientific.net/amm.239-240.1291.

Full text
Abstract:
A new particle swarm optimization (PSO) algorithm, a PSO with Variety Factor (VFPSO), based on standard PSO is proposed. Compared with the previous algorithm, the proposed algorithm updates the variety factor and improves the inertia weight of the PSO. The goal of the improvement is that the new algorithm retains the robustness of the original while reducing the risk of premature convergence. The simulation experiments show that it has great advantages in convergence over some other modified PSO algorithms and effectively avoids the algorithm being trapped in local minima, thereby avoiding premature convergence.
36

Bangyal, Waqas Haider, Abdul Hameed, Wael Alosaimi, and Hashem Alyami. "A New Initialization Approach in Particle Swarm Optimization for Global Optimization Problems." Computational Intelligence and Neuroscience 2021 (May 17, 2021): 1–17. http://dx.doi.org/10.1155/2021/6628889.

Full text
Abstract:
Particle swarm optimization (PSO) is a population-based intelligent stochastic search technique modeled on the intrinsic manner of a swarm searching for food. PSO is widely used to solve diverse optimization problems. Initialization of the population is a critical factor in the PSO algorithm, which considerably influences diversity and convergence during the PSO process. Quasi-random sequences are useful for initializing the population to improve diversity and convergence, rather than applying a random distribution for initialization. The performance of PSO is extended in this paper by introducing a new initialization technique named WELL with the help of a low-discrepancy sequence. To solve optimization problems in large-dimensional search spaces, the proposed solution is termed WE-PSO. The suggested solution has been verified on fifteen well-known unimodal and multimodal benchmark test problems extensively used in the literature. Moreover, the performance of WE-PSO is compared with the standard PSO and two other initialization approaches, Sobol-based PSO (SO-PSO) and Halton-based PSO (H-PSO). The findings indicate that WE-PSO is better than the standard multimodal problem-solving techniques, and the results validate the efficacy and effectiveness of our approach. In addition, the proposed approach is applied to artificial neural network (ANN) learning and contrasted with the standard backpropagation algorithm, standard PSO, H-PSO, and SO-PSO, respectively. The results of our technique have higher accuracy scores and outperform the traditional methods. The outcome of our work also provides insight into how the proposed initialization technique affects the quality of the cost function, integration, and diversity aspects.
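Low-discrepancy initialization of the kind compared here (the H-PSO baseline) can be illustrated with a Halton sequence, built from radical-inverse values in coprime bases. This is a sketch of the general idea, not the paper's WELL-based generator:

```python
def halton(index, base):
    """Radical-inverse (van der Corput) value of `index` in `base`;
    one coordinate of a Halton point in [0, 1)."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def halton_init(n, dim, lo, hi, bases=(2, 3, 5, 7, 11)):
    """Initialize n particles in [lo, hi]^dim from a Halton sequence
    instead of a uniform random draw; coordinates use distinct prime
    bases so the points spread evenly across the space."""
    return [[lo + (hi - lo) * halton(i + 1, bases[d]) for d in range(dim)]
            for i in range(n)]
```

Swapping this in for the random draw in a PSO loop changes only the initialization step; the velocity and position updates stay untouched.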
37

Borowska, Bożena. "Learning Competitive Swarm Optimization." Entropy 24, no. 2 (February 16, 2022): 283. http://dx.doi.org/10.3390/e24020283.

Full text
Abstract:
Particle swarm optimization (PSO) is a popular method widely used in solving different optimization problems. Unfortunately, in the case of complex multidimensional problems, PSO encounters troubles associated with excessive loss of population diversity and exploration ability. This leads to a deterioration in the effectiveness of the method and premature convergence. In order to prevent these inconveniences, in this paper, a learning competitive swarm optimization algorithm (LCSO) based on the particle swarm optimization method and a competition mechanism is proposed. In the first phase of LCSO, the swarm is divided into sub-swarms, each of which can work in parallel. In each sub-swarm, particles participate in a tournament, and the participants update their knowledge by learning from their competitors. In the second phase, information is exchanged between sub-swarms. The new algorithm was examined on a set of test functions. To evaluate the effectiveness of the proposed LCSO, the test results were compared with those achieved by the competitive swarm optimizer (CSO), comprehensive learning particle swarm optimizer (CLPSO), PSO, fully informed particle swarm (FIPS), covariance matrix adaptation evolution strategy (CMA-ES) and heterogeneous comprehensive learning particle swarm optimization (HCLPSO). The experimental results indicate that the proposed approach enhances the entropy of the particle swarm and improves the search process. Moreover, the LCSO algorithm is statistically significantly more efficient than the other tested methods.
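The tournament learning described above can be sketched as a round of pairwise competitions in which each loser moves toward its winner. The pairing scheme, learning rate, and update rule are illustrative assumptions, not the paper's exact operators:

```python
import random

def tournament_step(swarm, fitness, f, lr=0.5, rng=None):
    """One round of pairwise competition: particles are paired at
    random; in each pair the worse-fitness particle (loser) learns by
    moving a random fraction of the way toward the winner, and its
    fitness is re-evaluated with objective f (minimization)."""
    rng = rng or random.Random()
    idx = list(range(len(swarm)))
    rng.shuffle(idx)
    for a, b in zip(idx[::2], idx[1::2]):
        winner, loser = (a, b) if fitness[a] <= fitness[b] else (b, a)
        for d in range(len(swarm[loser])):
            swarm[loser][d] += lr * rng.random() * (swarm[winner][d] - swarm[loser][d])
        fitness[loser] = f(swarm[loser])
    return swarm, fitness
```

Running this inside each sub-swarm, then exchanging bests between sub-swarms, mirrors the two-phase structure the abstract describes.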
38

Chuang, Li-Yeh, Sheng-Wei Tsai, and Cheng-Hong Yang. "Fuzzy adaptive catfish particle swarm optimization." Artificial Intelligence Research 1, no. 2 (September 26, 2012): 149. http://dx.doi.org/10.5430/air.v1n2p149.

Full text
Abstract:
The catfish particle swarm optimization (CatfishPSO) algorithm is a novel swarm intelligence optimization technique. This algorithm was inspired by the interactive behavior of sardines and catfish. The observed catfish effect is applied to improve the performance of particle swarm optimization (PSO). In this paper, we propose fuzzy CatfishPSO (F-CatfishPSO), which uses fuzzy logic to dynamically change the inertia weight of CatfishPSO. Ten benchmark functions with 10, 20, and 30 dimensions were selected as the test functions. Statistical analysis of the experimental results indicates that F-CatfishPSO outperformed PSO, F-PSO and CatfishPSO.
39

Leong, Wen Fung, and Gary G. Yen. "Constraint Handling in Particle Swarm Optimization." International Journal of Swarm Intelligence Research 1, no. 1 (January 2010): 42–63. http://dx.doi.org/10.4018/jsir.2010010103.

Full text
Abstract:
In this article, the authors propose a particle swarm optimization (PSO) for constrained optimization. The proposed PSO adopts a multiobjective approach to constraint handling. Procedures to update the feasible and infeasible personal best are designed to encourage finding feasible regions and convergence toward the Pareto front. In addition, the infeasible nondominated solutions are stored in the global best archive to exploit the hidden information for guiding the particles toward feasible regions. Furthermore, the number of feasible personal best in the personal best memory and the scalar constraint violations of personal best and global best are used to adapt the acceleration constants in the PSO flight equations. The purpose is to find more feasible particles and search for better solutions during the process. The mutation procedure is applied to encourage global and fine-tune local searches. The simulation results indicate that the proposed constrained PSO is highly competitive, achieving promising performance.
40

Li, Jing, Yifei Sun, and Sicheng Hou. "Particle Swarm Optimization Algorithm with Multiple Phases for Solving Continuous Optimization Problems." Discrete Dynamics in Nature and Society 2021 (June 26, 2021): 1–13. http://dx.doi.org/10.1155/2021/8378579.

Full text
Abstract:
An algorithm with different parameter settings often performs differently on the same problem, and the parameter settings are difficult to determine before the optimization process. Variants of the particle swarm optimization (PSO) algorithm are studied as exemplars of swarm intelligence algorithms. Based on the concept of the building-block thesis, a PSO algorithm with multiple phases is proposed to analyze the relation between search strategies and the problems being solved. Two variants of the PSO algorithm, termed the PSO with fixed phase (PSOFP) algorithm and the PSO with dynamic phase (PSODP) algorithm, were compared with six variants of the standard PSO algorithm in the experimental study. The benchmark functions for single-objective numerical optimization, which include 12 functions in 50 and 100 dimensions, were used in the experimental study. The experimental results verified the generalization ability of the proposed PSO variants.
41

Elhossini, Ahmed, Shawki Areibi, and Robert Dony. "Strength Pareto Particle Swarm Optimization and Hybrid EA-PSO for Multi-Objective Optimization." Evolutionary Computation 18, no. 1 (March 2010): 127–56. http://dx.doi.org/10.1162/evco.2010.18.1.18105.

Full text
Abstract:
This paper proposes an efficient particle swarm optimization (PSO) technique that can handle multi-objective optimization problems. It is based on the strength Pareto approach originally used in evolutionary algorithms (EA). The proposed modified particle swarm algorithm is used to build three hybrid EA-PSO algorithms to solve different multi-objective optimization problems. This algorithm and its hybrid forms are tested using seven benchmarks from the literature and the results are compared to the strength Pareto evolutionary algorithm (SPEA2) and a competitive multi-objective PSO using several metrics. The proposed algorithm shows a slower convergence, compared to the other algorithms, but requires less CPU time. Combining PSO and evolutionary algorithms leads to superior hybrid algorithms that outperform SPEA2, the competitive multi-objective PSO (MO-PSO), and the proposed strength Pareto PSO based on different metrics.
42

Zhang, Zhaojun, Xuanyu Li, Shengyang Luan, and Zhaoxiong Xu. "An efficient particle swarm optimization with homotopy strategy for global numerical optimization." Journal of Intelligent & Fuzzy Systems 40, no. 3 (March 2, 2021): 4301–15. http://dx.doi.org/10.3233/jifs-200979.

Full text
Abstract:
Particle swarm optimization (PSO), as a successful optimization algorithm, is widely used in many practical applications due to its fast convergence speed and convenient implementation. As a population-based optimization algorithm, the quality of the initial population plays an important role in the performance of PSO; however, PSO normally uses random population initialization. Using the solution of a previously solved problem as prior knowledge helps to improve the quality of the initial population. In this paper, we use the homotopy analysis method (HAM) to build a bridge between already-solved problems and the problems to be solved. An improved PSO framework based on HAM, called HAM-PSO, is therefore proposed. The framework of HAM-PSO includes four main processes: obtaining the prior knowledge, constructing the homotopy function, generating the initial solution, and solving the target problem with PSO. In fact, the framework does not change PSO itself, but replaces its random population initialization. The basic PSO algorithm and three other typical PSO algorithms are used to verify the feasibility and effectiveness of this framework. The experimental results show that the four PSO variants using this framework perform better than those without it.
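A homotopy function of the kind used to bridge a solved problem and a target problem can be illustrated with the simplest convex deformation. This is an illustrative sketch; HAM itself constructs a more general series deformation than the linear blend shown here:

```python
def homotopy(f_solved, f_target, t):
    """Convex homotopy H(x, t) = (1 - t) * f_solved(x) + t * f_target(x).
    At t = 0 this is the already-solved problem, at t = 1 the problem
    to be solved; intermediate t values trace a path between them that
    can seed the initial population."""
    return lambda x: (1.0 - t) * f_solved(x) + t * f_target(x)
```

Minimizers found at successive t values (0, 0.25, ..., 1) would serve as prior knowledge for initializing the swarm on the target problem.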
43

Ludwig, Simone A. "Repulsive Self-Adaptive Acceleration Particle Swarm Optimization Approach." Journal of Artificial Intelligence and Soft Computing Research 4, no. 3 (July 1, 2014): 189–204. http://dx.doi.org/10.1515/jaiscr-2015-0008.

Full text
Abstract:
Adaptive Particle Swarm Optimization (PSO) variants have become popular in recent years. The main idea of these adaptive PSO variants is that they adaptively change their search behavior during the optimization process based on information gathered during the run. Adaptive PSO variants have been shown to solve a wide range of difficult optimization problems efficiently and effectively. In this paper we propose a Repulsive Self-adaptive Acceleration PSO (RSAPSO) variant that adaptively optimizes the velocity weights of every particle at every iteration. The velocity weights include the acceleration constants as well as the inertia weight, which are responsible for the balance between exploration and exploitation. Our proposed RSAPSO variant optimizes the velocity weights that are then used to search for the optimal solution of the problem (e.g., a benchmark function). We compare RSAPSO to four known adaptive PSO variants (decreasing-weight PSO, time-varying acceleration coefficients PSO, guaranteed convergence PSO, and attractive and repulsive PSO) on twenty benchmark problems. The results show that RSAPSO achieves better results than the known PSO variants on difficult optimization problems that require large numbers of function evaluations.
44

Marini, Federico, and Beata Walczak. "Particle swarm optimization (PSO). A tutorial." Chemometrics and Intelligent Laboratory Systems 149 (December 2015): 153–65. http://dx.doi.org/10.1016/j.chemolab.2015.08.020.

Full text
45

Jakubcová, Michala, Petr Máca, and Pavel Pech. "A Comparison of Selected Modifications of the Particle Swarm Optimization Algorithm." Journal of Applied Mathematics 2014 (2014): 1–10. http://dx.doi.org/10.1155/2014/293087.

Full text
Abstract:
We compare 27 modifications of the original particle swarm optimization (PSO) algorithm. The analysis evaluated nine basic PSO types, which differ according to the swarm evolution as controlled by various inertia weights and constriction factor. Each of the basic PSO modifications was analyzed using three different distributed strategies. In the first strategy, the entire swarm population is considered as one unit (OC-PSO), the second strategy periodically partitions the population into equally large complexes according to the particle’s functional value (SCE-PSO), and the final strategy periodically splits the swarm population into complexes using random permutation (SCERand-PSO). All variants are tested using 11 benchmark functions that were prepared for the special session on real-parameter optimization of CEC 2005. It was found that the best modification of the PSO algorithm is a variant with adaptive inertia weight. The best distribution strategy is SCE-PSO, which gives better results than do OC-PSO and SCERand-PSO for seven functions. The sphere function showed no significant difference between SCE-PSO and SCERand-PSO. It follows that a shuffling mechanism improves the optimization process.
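The constriction factor compared alongside the various inertia weights above is, in the Clerc-Kennedy formulation, computed directly from the acceleration constants:

```python
import math

def constriction(c1=2.05, c2=2.05):
    """Clerc-Kennedy constriction factor
    chi = 2 / |2 - phi - sqrt(phi^2 - 4*phi)| with phi = c1 + c2 > 4.
    Multiplying the whole velocity update by chi damps oscillation and
    guarantees convergence without a separate inertia weight."""
    phi = c1 + c2
    if phi <= 4:
        raise ValueError("constriction requires c1 + c2 > 4")
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))
```

With the standard c1 = c2 = 2.05, this yields chi of roughly 0.7298, the value commonly quoted in constricted-PSO studies.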
46

Zhang, Bi, Jia Yang Wang, and Zuo Yong Li. "Parameter Optimization of Street-Phelps Model Based on Particle Swarm Optimization." Advanced Materials Research 710 (June 2013): 647–50. http://dx.doi.org/10.4028/www.scientific.net/amr.710.647.

Full text
Abstract:
Particle swarm optimization (PSO) is introduced and applied to optimize the parameters of the Street-Phelps model. Here, an inertia adjustment method was used to adapt the inertia weight of PSO. The parameters of the Street-Phelps model were optimized by PSO, and the performance was compared with other methods. The results show that PSO plays an important role in solving global optimization problems and demonstrates greater effectiveness and higher accuracy than the other methods.
47

Reddy T, Chandrasekhara, Srivani V, A. Mallikarjuna Reddy, and G. Vishnu Murthy. "Test Case Optimization and Prioritization Using Improved Cuckoo Search and Particle Swarm Optimization Algorithm." International Journal of Engineering & Technology 7, no. 4.6 (September 25, 2018): 275. http://dx.doi.org/10.14419/ijet.v7i4.6.20489.

Full text
Abstract:
For minimized t-way test suite generation (where t indicates the interaction strength), many meta-heuristic, hybrid and hyper-heuristic algorithms have recently been proposed, including Artificial Bee Colony (ABC), Ant Colony Optimization (ACO), Genetic Algorithms (GA), Simulated Annealing (SA), Cuckoo Search (CS), Harmony Elements Algorithm (HE), Exponential Monte Carlo with counter (EMCQ), Particle Swarm Optimization (PSO), and Choice Function (CF). Although useful, these strategies require specific domain knowledge to allow effective tuning before good-quality solutions can be obtained. In our proposed technique, test cases are optimized using an Improved Cuckoo Search Algorithm (ICSA). The optimized test cases are then prioritized using the Particle Swarm Optimization algorithm (PSO). The PSOICSA algorithm is a combination of the Improved Cuckoo Search Algorithm (ICSA) and Particle Swarm Optimization (PSO). PSOICSA can be utilized to optimize the test suite, coordinating ICSA and PSO for a better outcome, in terms of test case improvement, than either achieves individually.
48

Qian, Wei Yi, and Guang Lei Liu. "Particle Swarm Optimization with Crossover Operator for Global Optimization Problems." Applied Mechanics and Materials 373-375 (August 2013): 1131–34. http://dx.doi.org/10.4028/www.scientific.net/amm.373-375.1131.

Full text
Abstract:
We propose a modified particle swarm optimization (PSO) algorithm named SPSO for global optimization problems. In SPSO, we introduce a crossover operator in order to increase the diversity of the swarm. The crossover operator is constructed by forming a simplex. The crossover operator is applied when the diversity of the swarm falls below a threshold (denoted h_low) and continues until the diversity reaches the required value (h_high). Six test problems are used for the numerical study. Numerical results indicate that the proposed algorithm is better than some existing PSO variants.
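The diversity-triggered mechanism described above can be sketched as a measure plus a threshold check. The mean-distance-to-centroid measure and the helper names are illustrative assumptions; the simplex crossover itself is passed in as a user-supplied operator:

```python
import math

def swarm_diversity(X):
    """Mean Euclidean distance of the particles to the swarm centroid,
    a common scalar measure of population diversity."""
    n, dim = len(X), len(X[0])
    centroid = [sum(x[d] for x in X) / n for d in range(dim)]
    return sum(math.dist(x, centroid) for x in X) / n

def maybe_crossover(X, h_low, crossover):
    """Apply the crossover operator only while diversity is below the
    threshold h_low; returns the (possibly updated) swarm and whether
    crossover fired."""
    if swarm_diversity(X) < h_low:
        return crossover(X), True
    return X, False
```

In the full algorithm the operator would keep firing each iteration until diversity climbs back to h_high, restoring exploration.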
49

Jain, Meetu, Vibha Saihjpal, Narinder Singh, and Satya Bir Singh. "An Overview of Variants and Advancements of PSO Algorithm." Applied Sciences 12, no. 17 (August 23, 2022): 8392. http://dx.doi.org/10.3390/app12178392.

Full text
Abstract:
Particle swarm optimization (PSO) is one of the most famous swarm-based optimization techniques inspired by nature. Due to its flexibility and easy implementation, the popularity of this nature-inspired technique has increased enormously, and PSO has gained prompt attention from researchers in every field. Since its origin in 1995, researchers have improved the original PSO in various ways and derived new versions of it: published theoretical studies on various parameters of PSO, proposed many variants of the algorithm, and made numerous other advances. In the present paper, an overview of the PSO algorithm is presented. On the one hand, the basic concepts and parameters of PSO are explained; on the other hand, various advances in relation to PSO, including its modifications, extensions, hybridization, and theoretical analysis, are covered.
50

Muhammed Neda, Omar, and Alfian Ma'arif. "Chaotic Particle Swarm Optimization for Solving Reactive Power Optimization Problem." International Journal of Robotics and Control Systems 1, no. 4 (January 28, 2022): 523–33. http://dx.doi.org/10.31763/ijrcs.v1i4.539.

Full text
Abstract:
Losses in electrical power systems are a serious problem, and multiple methods have been used to decrease power losses in transmission lines. Proper adjustment of reactive power resources is one way to minimize the losses in any power system. The Reactive Power Optimization (RPO) problem is a nonlinear, complex optimization problem containing equality and inequality constraints, and it is essential to the operation and control of power systems. This study therefore concentrates on optimal load flow calculation for solving the RPO problem. Simple Particle Swarm Optimization (PSO) often falls into local optima. To avoid this limitation and speed up convergence, this study employed an improved hybrid algorithm combining chaos theory with PSO, called the Chaotic PSO (CPSO) algorithm. Merging chaos theory into the PSO algorithm is an efficient way to escape local optima compared with simple PSO, owing to the remarkable behavior and high search ability of chaos. In this study, the CPSO algorithm was used as an optimization tool for solving the RPO problem; the main objective is to decrease power loss and enhance the voltage profile of the power system. The presented algorithm was tested on the IEEE Node-14 system. The simulation results for this system reveal that the CPSO algorithm provides the best results: it had a high ability to minimize transmission line losses and improve the system's voltage profile compared with simple PSO and other approaches in the literature.
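Chaotic PSO variants of the kind described here typically replace uniform random draws with a chaotic sequence; the logistic map is the classic choice. Using the logistic map here is an assumption for illustration, since the abstract does not name the specific chaotic map:

```python
def logistic_map(z, r=4.0):
    """One step of the logistic map z <- r*z*(1-z); with r = 4 and
    z in (0, 1) (excluding fixed points such as 0.5 -> 1 -> 0) the
    iterates behave chaotically and can substitute for uniform
    random numbers in the PSO velocity update."""
    return r * z * (1.0 - z)

def chaotic_sequence(z0, n):
    """Generate n chaotic values in [0, 1] from seed z0."""
    seq, z = [], z0
    for _ in range(n):
        z = logistic_map(z)
        seq.append(z)
    return seq
```

In a CPSO-style loop, the two `rng.random()` calls in the velocity update would be fed from such a sequence, giving the ergodic, non-repeating perturbations that help the swarm escape local optima.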
