Scientific literature on the topic "Optimization algorithms"

Create a correct reference in APA, MLA, Chicago, Harvard, and several other styles

Choose a source:

Consult topical lists of journal articles, books, theses, conference reports, and other academic sources on the topic "Optimization algorithms".

Next to each source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication as a PDF and read its abstract online whenever this information is included in the metadata.

Journal articles on the topic "Optimization algorithms"

1

Celik, Yuksel, and Erkan Ulker. "An Improved Marriage in Honey Bees Optimization Algorithm for Single Objective Unconstrained Optimization". Scientific World Journal 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/370172.

Full text
Abstract:
Marriage in honey bees optimization (MBO) is a metaheuristic optimization algorithm inspired by the mating and fertilization process of honey bees, and is a kind of swarm-intelligence optimization. In this study we propose an improved marriage in honey bees optimization (IMBO) algorithm that adds a Levy flight algorithm for the queen's mating flight and a neighborhood mechanism for improving the worker drones. The IMBO algorithm's performance is tested on six well-known unconstrained test functions and compared with other metaheuristic optimization algorithms.
APA, Harvard, Vancouver, ISO, and other styles
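The Levy-flight ingredient mentioned in this abstract is a standard building block in swarm algorithms; a minimal sketch of a Levy-distributed step via Mantegna's algorithm (not the authors' exact IMBO update; `beta` and the 0.01 step scale are illustrative defaults) might look like:

```python
import math
import numpy as np

def levy_step(dim, beta=1.5, rng=None):
    """One Levy-distributed step drawn with Mantegna's algorithm."""
    rng = np.random.default_rng(0) if rng is None else rng
    # Scale of the numerator so that u / |v|^(1/beta) is Levy-stable
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, size=dim)   # heavy-tailed numerator
    v = rng.normal(0.0, 1.0, size=dim)
    return u / np.abs(v) ** (1 / beta)

# Hypothetical "mating flight": perturb the current queen with a Levy step
queen = np.zeros(6)
candidate = queen + 0.01 * levy_step(6)
```

Occasional very large steps are the point of the Levy distribution: they let the search escape local optima while most steps stay small.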
2

Luan, Yuxuan, Junjiang He, Jingmin Yang, Xiaolong Lan, and Geying Yang. "Uniformity-Comprehensive Multiobjective Optimization Evolutionary Algorithm Based on Machine Learning". International Journal of Intelligent Systems 2023 (November 10, 2023): 1–21. http://dx.doi.org/10.1155/2023/1666735.

Full text
Abstract:
When solving real-world optimization problems, the uniformity of Pareto fronts is an essential concern in multiobjective optimization problems (MOPs). However, it is a common challenge for many existing multiobjective optimization algorithms because of the skewed distribution of solutions and biases towards specific objective functions. This paper proposes a uniformity-comprehensive multiobjective optimization evolutionary algorithm based on machine learning to address this limitation. Our algorithm uses uniform initialization and a self-organizing map (SOM) to enhance population diversity and uniformity. We track the IGD value and use K-means and CNN refinement with crossover and mutation techniques during the evolutionary stages. The superiority of our algorithm in uniformity and objective-function balance was verified through comparative analysis with 13 other algorithms: eight traditional multiobjective optimization algorithms, three machine-learning-based enhanced multiobjective optimization algorithms, and two algorithms with objective initialization improvements. These comprehensive experiments show that our algorithm outperforms the existing algorithms in these areas.
3

Wen, Xiaodong, Xiangdong Liu, Cunhui Yu, Haoning Gao, Jing Wang, Yongji Liang, Jiangli Yu, and Yan Bai. "IOOA: A multi-strategy fusion improved Osprey Optimization Algorithm for global optimization". Electronic Research Archive 32, no. 3 (2024): 2033–74. http://dx.doi.org/10.3934/era.2024093.

Full text
Abstract:
With the widespread application of metaheuristic algorithms in engineering and scientific research, finding algorithms with efficient global search capabilities and precise local search performance has become a hot research topic. The osprey optimization algorithm (OOA), first proposed in 2023, is characterized by its simple structure and strong optimization capability. However, practical tests have revealed that the OOA inevitably encounters issues common to metaheuristic algorithms, such as a tendency to fall into local optima and reduced population diversity in the later iterations. To address these issues, a multi-strategy fusion improved osprey optimization algorithm (IOOA) is proposed. First, the characteristics of various chaotic mappings were explored, and Circle chaotic mapping was adopted in place of pseudo-random numbers for population initialization, increasing initial population diversity and improving the quality of initial solutions. Second, a dynamically adjustable elite guidance mechanism was proposed to adapt the position-updating method to the different stages of the iteration, ensuring that the algorithm maintains good global search capability while significantly increasing its convergence speed. Lastly, a dynamic chaotic weight factor was designed and applied in the exploitation stage of the original algorithm to enhance local search capability and improve convergence accuracy. To verify the effectiveness and practical engineering applicability of the IOOA, simulation experiments were conducted using 21 benchmark test functions and the CEC-2022 benchmark functions, and the IOOA was applied to an LSTM power-load forecasting problem as well as two engineering design problems. The experimental results show that the IOOA possesses outstanding global optimization performance on complex optimization problems and broad applicability in practical engineering applications.
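Chaotic-map initialization, as described in this abstract, replaces pseudo-random draws with a deterministic chaotic sequence; a minimal sketch using the Circle map (the parameters `a = 0.5`, `b = 0.2` and the seed vector are common choices in the literature, not necessarily the IOOA paper's settings):

```python
import numpy as np

def circle_map_init(pop_size, dim, lb, ub, a=0.5, b=0.2):
    """Population initialization from the Circle chaotic map, scaled to [lb, ub]."""
    x = np.linspace(0.12, 0.88, dim)          # one chaotic seed per dimension
    pop = np.empty((pop_size, dim))
    for i in range(pop_size):
        # Circle map iteration keeps x in [0, 1) while covering it densely
        x = np.mod(x + b - (a / (2 * np.pi)) * np.sin(2 * np.pi * x), 1.0)
        pop[i] = lb + x * (ub - lb)
    return pop

pop = circle_map_init(30, 5, lb=-10.0, ub=10.0)
```

The appeal of chaotic initialization is that the sequence is ergodic over [0, 1), so the initial population tends to spread more evenly over the search box than independent uniform draws of the same size.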
4

Kim, Minsu, Areum Han, Jaewon Lee, Sunghyun Cho, Il Moon, and Jonggeol Na. "Comparison of Derivative-Free Optimization: Energy Optimization of Steam Methane Reforming Process". International Journal of Energy Research 2023 (June 3, 2023): 1–20. http://dx.doi.org/10.1155/2023/8868540.

Full text
Abstract:
In modern chemical engineering, various derivative-free optimization (DFO) studies have been conducted to identify operating conditions that maximize energy efficiency. Although DFO algorithm selection is essential to a successful design, it is nonintuitive because of the uncertain performance of the algorithms. In particular, when the system evaluation cost or computational load is high (e.g., density functional theory and computational fluid dynamics), selecting an algorithm that quickly converges to the near-global optimum in the early stage of optimization is all the more important. In this study, we compare the early-stage optimization performance of 12 algorithms. The performance of deterministic global search algorithms, global model-based search algorithms, metaheuristic algorithms, and Bayesian optimization is compared on benchmark problems and analyzed by problem type and number of variables. Furthermore, we apply all algorithms to an energy process optimization that maximizes the thermal efficiency of the steam methane reforming (SMR) process for hydrogen production. In this application, we identified a hidden constraint based on real-world operations and address it using a penalty function. Bayesian optimization explores the design space most efficiently by learning the infeasible regions. As a result, we observed a 12.9% improvement in thermal efficiency compared to the base case and a 7% improvement compared to the lowest-performing algorithm.
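The penalty-function treatment of a hidden constraint mentioned in this abstract can be sketched generically (an illustrative pattern, not the authors' SMR implementation; `feasible` stands in for whatever check detects a failed or invalid simulation):

```python
def penalized(objective, feasible, penalty=1e6):
    """Return a version of `objective` that assigns a large constant
    to points violating a hidden constraint, so any DFO method can run."""
    def wrapped(x):
        if not feasible(x):
            return penalty
        return objective(x)
    return wrapped

# Toy example: minimize x^2, but pretend the simulator fails for x < 0
f = penalized(lambda x: x * x, lambda x: x >= 0)
```

Any derivative-free optimizer can then minimize `f` directly; infeasible points simply look very bad instead of crashing the run, which is how model-based methods can "learn" the infeasible region.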
5

Priyadarshini, Ishaani. "Dendritic Growth Optimization: A Novel Nature-Inspired Algorithm for Real-World Optimization Problems". Biomimetics 9, no. 3 (February 21, 2024): 130. http://dx.doi.org/10.3390/biomimetics9030130.

Full text
Abstract:
In numerous scientific disciplines and practical applications, addressing optimization challenges is a common imperative. Nature-inspired optimization algorithms represent a highly valuable and pragmatic approach to tackling these complexities. This paper introduces Dendritic Growth Optimization (DGO), a novel algorithm inspired by natural branching patterns. DGO offers a novel solution for intricate optimization problems and demonstrates its efficiency in exploring diverse solution spaces. The algorithm has been extensively tested with a suite of machine learning algorithms, deep learning algorithms, and metaheuristic algorithms, and the results, both before and after optimization, unequivocally support the proposed algorithm’s feasibility, effectiveness, and generalizability. Through empirical validation using established datasets like diabetes and breast cancer, the algorithm consistently enhances model performance across various domains. Beyond its working and experimental analysis, DGO’s wide-ranging applications in machine learning, logistics, and engineering for solving real-world problems have been highlighted. The study also considers the challenges and practical implications of implementing DGO in multiple scenarios. As optimization remains crucial in research and industry, DGO emerges as a promising avenue for innovation and problem solving.
6

Rao, Xiong, Run Du, Wenming Cheng, and Yi Yang. "Modified proportional topology optimization algorithm for multiple optimization problems". Mechanics 30, no. 1 (February 23, 2024): 36–45. http://dx.doi.org/10.5755/j02.mech.34367.

Full text
Abstract:
Three modified proportional topology optimization (MPTO) algorithms are presented in this paper, named MPTOc, MPTOs, and MPTOm. MPTOc addresses the minimum compliance problem with a volume constraint, MPTOs solves the minimum volume fraction problem under a stress constraint, and MPTOm tackles the minimum volume fraction problem under compliance and stress constraints. To overcome the shortcomings of the original proportional topology optimization (PTO) algorithm and improve its overall performance, the proposed algorithms modify the material interpolation scheme and introduce the Heaviside threshold function into the PTO algorithm. To confirm the effectiveness and superiority of the presented algorithms, multiple optimization problems for the classical MBB beam are solved, and the original PTO algorithm is compared with the new algorithms. Numerical examples show that MPTOc, MPTOs, and MPTOm enjoy distinct advantages over PTO in terms of convergence efficiency and the ability to obtain a distinct topology without redundancy. Moreover, MPTOc has the fastest convergence speed among these algorithms and acquires the smallest (best) compliance value. In addition, the new algorithms are also superior to PTO in suppressing gray-scale elements.
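The Heaviside threshold function mentioned in this abstract is commonly implemented in topology optimization as a smoothed tanh projection; a generic sketch (the sharpness `beta` and threshold `eta` values below are illustrative, not the MPTO settings):

```python
import numpy as np

def heaviside_projection(x, beta=8.0, eta=0.5):
    """Smoothed Heaviside threshold: pushes element densities x in [0, 1]
    toward 0 or 1, suppressing intermediate (gray-scale) values."""
    return (np.tanh(beta * eta) + np.tanh(beta * (x - eta))) / (
            np.tanh(beta * eta) + np.tanh(beta * (1.0 - eta)))

densities = np.array([0.0, 0.3, 0.5, 0.7, 1.0])
projected = heaviside_projection(densities)
```

The projection maps 0 to 0 and 1 to 1 exactly, while densities below `eta` are pushed toward 0 and those above it toward 1; in practice `beta` is often increased gradually over the iterations so the problem stays differentiable early on.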
7

Gireesha, B. "A Literature Survey on Artificial Swarm Intelligence based Optimization Techniques". International Journal of Engineering & Technology 7, no. 4.5 (September 22, 2018): 455. http://dx.doi.org/10.14419/ijet.v7i4.5.20205.

Full text
Abstract:
For a few decades, optimization techniques have played a key role in engineering and technological applications; they are known for their behaviour patterns in solving modern engineering problems. Among the various optimization techniques, heuristic and meta-heuristic algorithms have proved to be efficient. In this paper, an effort is made to address techniques that are commonly used in engineering applications. A basic overview of such optimization algorithms, namely the Artificial Bee Colony (ABC) Algorithm, Ant Colony Optimization (ACO) Algorithm, Firefly Algorithm (FFA), and Particle Swarm Optimization (PSO), is presented, and the most suitable fitness functions and their numerical expressions are discussed.
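Of the algorithms surveyed here, PSO has the most compact update rule; a self-contained sketch minimizing the 5-D sphere function (the parameters `w = 0.7`, `c1 = c2 = 1.5` are common textbook defaults, not values taken from the survey):

```python
import numpy as np

# Minimal particle swarm optimization on the 5-D sphere function.
rng = np.random.default_rng(42)
sphere = lambda x: float(np.sum(x * x))

n, dim, iters = 20, 5, 50
w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration weights
x = rng.uniform(-5.0, 5.0, (n, dim))           # particle positions
v = np.zeros((n, dim))                         # particle velocities
pbest = x.copy()                               # personal best positions
pbest_f = np.array([sphere(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()         # global best position
initial_best = float(pbest_f.min())

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    # Velocity update: inertia + cognitive pull + social pull
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    f = np.array([sphere(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

final_best = float(pbest_f.min())
```

The fitness function here (the sphere) is a stand-in; in the survey's setting it would be whichever problem-specific fitness expression is being optimized.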
8

Acherjee, Bappa, Debanjan Maity, and Arunanshu S. Kuar. "Ultrasonic Machining Process Optimization by Cuckoo Search and Chicken Swarm Optimization Algorithms". International Journal of Applied Metaheuristic Computing 11, no. 2 (April 2020): 1–26. http://dx.doi.org/10.4018/ijamc.2020040101.

Full text
Abstract:
The ultrasonic machining (USM) process is analyzed in the present study to obtain the desired process responses by optimizing machining parameters using cuckoo search (CS) and chicken swarm optimization (CSO), two powerful nature-inspired, population- and swarm-intelligence-based metaheuristic algorithms. The CS and CSO algorithms are compared with other non-conventional optimization techniques in terms of optimal results, convergence, accuracy, and computational time. It is found that the CS and CSO algorithms predict superior single- and multi-objective optimization results compared with gravitational search algorithms (GSAs), genetic algorithms (GAs), particle swarm optimization (PSO), ant colony optimization (ACO), and artificial bee colony (ABC) algorithms, and give exactly the same results as the fireworks algorithm (FWA). The CS algorithm outperforms all the other algorithms, namely the CSO, FWA, GSA, GA, PSO, ACO, and ABC algorithms, in terms of mean computational time, whereas the CSO algorithm outperforms all algorithms except CS and GSA.
9

Alfarhisi, Zikrie Pramudia, Hadi Suyono, and Fakhriy Hario Partiansyah. "4G LTE Network Coverage Optimization Using Metaheuristic Approach". International Journal of Computer Applications Technology and Research 10, no. 01 (January 1, 2021): 010–13. http://dx.doi.org/10.7753/ijcatr1001.1003.

Full text
Abstract:
The main focus of this paper is to optimize the coverage of each 4G LTE network cell within the service area. Many algorithms can be implemented to determine the optimal 4G LTE coverage area, including deterministic and heuristic approaches. The deterministic approach can solve the optimization problem accurately but needs more resources and is time-consuming in determining the convergence parameters. Heuristic approaches were therefore introduced to overcome this drawback. The methods used here are the Differential Evolution Algorithm (DEA) and the Adaptive Mutation Genetic Algorithm (AMGA), both metaheuristic approaches. The DEA and AMGA algorithms have been widely used to solve combinatorial problems, including network optimization. In network optimization, coverage is strongly related to two objectives: reducing the black-spot area and decreasing the overlapping coverage areas. Coverage overlap is a condition in which some cell sites in an area overlap, which leads to unnecessary handoffs and inefficient network management. This research aims to obtain optimal 4G LTE network coverage and reduce the overlapping coverage areas through effective e-Node B arrangements using the DEA and AMGA algorithms. The simulation results showed that the DEA algorithm's coverage effectiveness was 23.4%, and the AMGA algorithm's was 16.32%.
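The Differential Evolution Algorithm (DEA) used above is conventionally built around the DE/rand/1/bin scheme; a minimal sketch on a toy objective (`F = 0.8`, `CR = 0.9` are conventional defaults, not the paper's settings, and the sphere function stands in for the coverage objective):

```python
import numpy as np

def de_step(pop, fit, f_obj, F=0.8, CR=0.9, rng=None):
    """One generation of DE/rand/1/bin with greedy selection."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = pop.shape
    new_pop, new_fit = pop.copy(), fit.copy()
    for i in range(n):
        a, b, c = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])          # differential mutation
        cross = rng.random(d) < CR                       # binomial crossover mask
        cross[rng.integers(d)] = True                    # keep at least one mutant gene
        trial = np.where(cross, mutant, pop[i])
        f_trial = f_obj(trial)
        if f_trial <= fit[i]:                            # greedy selection
            new_pop[i], new_fit[i] = trial, f_trial
    return new_pop, new_fit

rng = np.random.default_rng(0)
sphere = lambda x: float(np.sum(x * x))
pop = rng.uniform(-5.0, 5.0, (15, 4))
fit = np.array([sphere(p) for p in pop])
start_best = fit.min()
for _ in range(30):
    pop, fit = de_step(pop, fit, sphere, rng=rng)
```

Because selection is greedy, the best fitness in the population can only improve or stay the same from one generation to the next.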
10

Si, Binghui, Feng Liu, and Yanxia Li. "Metamodel-Based Hyperparameter Optimization of Optimization Algorithms in Building Energy Optimization". Buildings 13, no. 1 (January 9, 2023): 167. http://dx.doi.org/10.3390/buildings13010167.

Full text
Abstract:
Building energy optimization (BEO) is a promising technique to achieve energy efficient designs. The efficacy of optimization algorithms is imperative for the BEO technique and is significantly dependent on the algorithm hyperparameters. Currently, studies focusing on algorithm hyperparameters are scarce, and common agreement on how to set their values, especially for BEO problems, is still lacking. This study proposes a metamodel-based methodology for hyperparameter optimization of optimization algorithms applied in BEO. The aim is to maximize the algorithmic efficacy and avoid the failure of the BEO technique because of improper algorithm hyperparameter settings. The method consists of three consecutive steps: constructing the specific BEO problem, developing an ANN-trained metamodel of the problem, and optimizing algorithm hyperparameters with nondominated sorting genetic algorithm II (NSGA-II). To verify the validity, 15 benchmark BEO problems with different properties, i.e., five building models and three design variable categories, were constructed for numerical experiments. For each problem, the hyperparameters of four commonly used algorithms, i.e., the genetic algorithm (GA), the particle swarm optimization (PSO) algorithm, simulated annealing (SA), and the multi-objective genetic algorithm (MOGA), were optimized. Results demonstrated that the MOGA benefited the most from hyperparameter optimization in terms of the quality of the obtained optimum, while PSO benefited the most in terms of the computing time.

Theses on the topic "Optimization algorithms"

1

Astete Morales, Sandra. "Contributions to Convergence Analysis of Noisy Optimization Algorithms". Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS327/document.

Full text
Abstract:
This thesis presents contributions to the analysis of algorithms for the optimization of noisy functions. Convergence rates (simple regret and cumulative regret) are analyzed for line-search algorithms as well as for random-search algorithms. We prove, in terms of simple regret and cumulative regret, that a Hessian-based algorithm can reach the same results as some optimal algorithms in the literature when its parameters are tuned correctly. We also analyze the convergence order of evolution strategies on noisy functions, deducing log-log convergence, and prove a lower bound for the convergence rate of evolution strategies. We extend previous work on resampling mechanisms by applying it to a discrete setting. Finally, we analyze the performance measure itself and prove that using an erroneous performance measure can lead to misleading results when different optimization methods are evaluated.
2

Reimann, Axel. "Evolutionary algorithms and optimization". Doctoral thesis, [S.l.: s.n.], 2002. http://deposit.ddb.de/cgi-bin/dokserv?idn=969093497.

Full text
3

Parpas, Panayiotis. "Algorithms for stochastic optimization". Thesis, Imperial College London, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.434980.

Full text
4

Johnson, Jared. "Algorithms for Rendering Optimization". Doctoral diss., University of Central Florida, 2012. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5329.

Full text
Abstract:
This dissertation explores algorithms for rendering optimization realizable within a modern, complex rendering engine. The first part contains optimized rendering algorithms for ray tracing. Ray tracing algorithms typically provide properties of simplicity and robustness that are highly desirable in computer graphics. We offer several novel contributions to the problem of interactive ray tracing of complex lighting environments. We focus on the problem of maintaining interactivity as both geometric and lighting complexity grows without affecting the simplicity or robustness of ray tracing. First, we present a new algorithm called occlusion caching for accelerating the calculation of direct lighting from many light sources. We cache light visibility information sparsely across a scene. When rendering direct lighting for all pixels in a frame, we combine cached lighting information to determine whether or not shadow rays are needed. Since light visibility and scene location are highly correlated, our approach precludes the need for most shadow rays. Second, we present improvements to the irradiance caching algorithm. Here we demonstrate a new elliptical cache point spacing heuristic that reduces the number of cache points required by taking into account the direction of irradiance gradients. We also accelerate irradiance caching by efficiently and intuitively coupling it with occlusion caching. In the second part of this dissertation, we present optimizations to rendering algorithms for participating media. Specifically, we explore the implementation and use of photon beams as an efficient, intuitive artistic primitive. We detail our implementation of the photon beams algorithm into PhotoRealistic RenderMan (PRMan). We show how our implementation maintains the benefits of the industry standard Reyes rendering pipeline, with proper motion blur and depth of field. We detail an automatic photon beam generation algorithm, utilizing PRMan shadow maps.
We accelerate the rendering of camera-facing photon beams by utilizing Gaussian quadrature for path integrals in place of ray marching. Our optimized implementation allows for incredible versatility and intuitiveness in artistic control of volumetric lighting effects. Finally, we demonstrate the usefulness of photon beams as artistic primitives by detailing their use in a feature-length animated film.
Ph.D., Computer Science, Engineering and Computer Science
5

Cesari, Tommaso Renato. "Algorithms, Learning, and Optimization". Doctoral thesis, Università degli Studi di Milano, 2020. http://hdl.handle.net/2434/699354.

Full text
Abstract:
This thesis covers some algorithmic aspects of online machine learning and optimization. In Chapter 1 we design algorithms with state-of-the-art regret guarantees for the problem of dynamic pricing. In Chapter 2 we move on to an asynchronous online learning setting in which only some of the agents in the network are active at each time step. We show that when information is shared among neighbors, knowledge about the graph structure may have a significantly different impact on learning rates depending on how agents are activated. In Chapter 3 we investigate the online problem of multivariate non-concave maximization under weak assumptions on the regularity of the objective function. In Chapter 4 we introduce a new performance measure and design an efficient algorithm to learn optimal policies in repeated A/B testing.
6

Stults, Ian Collier. "A multi-fidelity analysis selection method using a constrained discrete optimization formulation". Diss., Atlanta, Ga.: Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/31706.

Full text
Abstract:
Thesis (Ph.D.)--Aerospace Engineering, Georgia Institute of Technology, 2010.
Committee Chair: Mavris, Dimitri; Committee Member: Beeson, Don; Committee Member: Duncan, Scott; Committee Member: German, Brian; Committee Member: Kumar, Viren. Part of the SMARTech Electronic Thesis and Dissertation Collection.
7

Rafique, Abid. "Communication optimization in iterative numerical algorithms: an algorithm-architecture interaction". Thesis, Imperial College London, 2014. http://hdl.handle.net/10044/1/17837.

Full text
Abstract:
Trading communication with redundant computation can increase the silicon efficiency of common hardware accelerators like FPGA and GPU in accelerating sparse iterative numerical algorithms. While iterative numerical algorithms are extensively used in solving large-scale sparse linear system of equations and eigenvalue problems, they are challenging to accelerate as they spend most of their time in communication-bound operations, like sparse matrix-vector multiply (SpMV) and vector-vector operations. Communication is used in a general sense to mean moving the matrix and the vectors within the custom memory hierarchy of the FPGA and between processors in the GPU; the cost of which is much higher than performing the actual computation due to technological reasons. Additionally, the dependency between the operations hinders overlapping computation with communication. As a result, although GPU and FPGA are offering large peak floating-point performance, their sustained performance is nonetheless very low due to high communication costs leading to poor silicon efficiency. In this thesis, we provide a systematic study to minimize the communication cost thereby increase the silicon efficiency. For small-to-medium datasets, we exploit large on-chip memory of the FPGA to load the matrix only once and then use explicit blocking to perform all iterations at the communication cost of a single iteration. For large sparse datasets, it is now a well-known idea to unroll k iterations using a matrix powers kernel which replaces SpMV and two additional kernels, TSQR and BGS, which replace vector-vector operations. While this approach can provide a Θ(k) reduction in the communication cost, the extent of the unrolling depends on the growth in redundant computation, the underlying architecture and the memory model. In this work, we show how to select the unroll factor k in an architecture-agnostic manner to provide communication-computation tradeoff on FPGA and GPU. 
To this end, we exploit the inverse memory hierarchy of the GPUs to map the matrix powers kernel and present a new algorithm for the FPGAs which plays to their strengths by reducing redundant computation, allowing large k and hence higher speedups. We provide predictive models of the matrix powers kernel to understand the communication-computation tradeoff on GPU and FPGA. We highlight the extremely low efficiency of the GPU in TSQR due to off-chip sharing of data across different building blocks and show how we can use on-chip memory of the FPGA to eliminate this off-chip access and hence achieve better efficiency. Finally, we demonstrate how to compose all the kernels using a unified architecture and exploit on-chip memory of the FPGA to share data across these kernels. Using the Lanczos Iteration as a case study to solve the symmetric extremal eigenvalue problem, we show that the efficiency of FPGAs can be increased from 1.8% to 38% for small- to medium-scale dense matrices, and up to 7.8% for large-scale structured banded matrices. We show that although the GPU shows better efficiency for certain kernels like the matrix powers kernel, its overall efficiency is even lower due to the increase in communication cost when sharing data across different kernels through off-chip memory. As the Lanczos Iteration is at the heart of all modern iterative numerical algorithms, our results are applicable to a broad class of iterative numerical algorithms.
8

Dost, Banu. "Optimization algorithms for biological data". Diss., [La Jolla]: University of California, San Diego, 2010. http://wwwlib.umi.com/cr/ucsd/fullcit?p3397170.

Full text
Abstract:
Thesis (Ph. D.)--University of California, San Diego, 2010.
Title from first page of PDF file (viewed March 23, 2010). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 149-159).
9

Xiong, Xiaoping. "Stochastic optimization algorithms and convergence". College Park, Md.: University of Maryland, 2005. http://hdl.handle.net/1903/2360.

Full text
Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2005.
Thesis research directed by: Business and Management. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
10

Quttineh, Nils-Hassan. "Algorithms for Costly Global Optimization". Licentiate thesis, Mälardalen University, School of Education, Culture and Communication, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-5970.

Full text
Abstract:

Many applications involve so-called costly problems, where the objective function to be maximized or minimized cannot be described using standard functions and expressions. Instead, the objective function is treated as a "black box" into which parameter values are sent and from which a function value is returned. This implies, in particular, that no derivative information is available.

These problems are called expensive because a single function value may take a long time to compute. The black box could, for example, solve a large system of differential equations or carry out a heavy simulation, which can take anywhere from several minutes to several hours.

These very special conditions therefore require customized algorithms. Common optimization algorithms rely on frequent function evaluations, which can usually be done instantly, but for an expensive problem a single evaluation may take hours. Our main objective is therefore to create algorithms that exploit all available information to the limit before a new function value is calculated; in other words, we want to find the optimal solution using as few function evaluations as possible.

A good example of a real-life application comes from the automotive industry, where the development of new engines relies on advanced models governed by a dozen key parameters. The goal is to optimize the model by changing the parameters so that the engine becomes as energy efficient as possible while still meeting all kinds of strength requirements and external constraints.

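Costly "black box" optimization of this kind is typically attacked with surrogate models that are cheap to evaluate between expensive runs; a minimal sketch of a Gaussian radial-basis-function surrogate (illustrative only; the thesis covers more elaborate methods, and the shape parameter `eps` is an assumed default):

```python
import numpy as np

def rbf_surrogate(X, y, eps=1.0):
    """Fit a Gaussian RBF interpolant to expensive evaluations (X, y).
    Returns a cheap predictor that exactly interpolates the samples."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    w = np.linalg.solve(np.exp(-(eps * d) ** 2), y)             # interpolation weights
    def predict(x):
        r = np.linalg.norm(X - x, axis=-1)
        return float(np.exp(-(eps * r) ** 2) @ w)
    return predict

# Pretend f(x) = x^2 is expensive and we have only three evaluations
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 4.0])
s = rbf_surrogate(X, y)
```

An algorithm for costly optimization would then search the cheap surrogate `s` for a promising candidate, spend one expensive evaluation there, add the result to `(X, y)`, and refit, keeping the number of expensive calls to a minimum.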

Books on the topic "Optimization algorithms"

1

Hartmann, Alexander K. Optimization algorithms in physics. Berlin: Wiley-VCH, 2002.

Find full text
2

Spedicato, Emilio, ed. Algorithms for Continuous Optimization. Dordrecht: Springer Netherlands, 1994. http://dx.doi.org/10.1007/978-94-009-0369-2.

Full text
3

Cheng, Shi, and Yuhui Shi, eds. Brain Storm Optimization Algorithms. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-15070-9.

Full text
4

Dennis, J. E. (1939-), and Institute for Computer Applications in Science and Engineering, eds. Algorithms for bilevel optimization. Hampton, VA: Institute for Computer Applications in Science and Engineering, NASA Langley Research Center, 1994.

Find full text
5

Arora, Rajesh Kumar. Optimization: Algorithms and applications. Boca Raton: Taylor & Francis Group, 2015.

Find full text
6

Kocay, William. Graphs, algorithms, and optimization. Boca Raton: Chapman & Hall/CRC, 2005.

7

Pereira, Ana I., Florbela P. Fernandes, João P. Coelho, João P. Teixeira, Maria F. Pacheco, Paulo Alves, and Rui P. Lopes, eds. Optimization, Learning Algorithms and Applications. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-91885-9.

8

Grötschel, Martin, László Lovász, and Alexander Schrijver. Geometric Algorithms and Combinatorial Optimization. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/978-3-642-78240-4.

9

Uryasev, Stanislav, and Panos M. Pardalos, eds. Stochastic Optimization: Algorithms and Applications. Boston, MA: Springer US, 2001. http://dx.doi.org/10.1007/978-1-4757-6594-6.

10

Jansen, Klaus, and José Rolim, eds. Approximation Algorithms for Combinatorial Optimization. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0053958.


Book chapters on the topic "Optimization algorithms"

1

Löhne, Andreas. "Algorithms." In Vector Optimization, 161–95. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-18351-5_6.

2

Khamehchi, Ehsan, and Mohammad Reza Mahdiani. "Optimization Algorithms." In SpringerBriefs in Petroleum Geoscience & Engineering, 35–46. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-51451-2_4.

3

Chen, Po, and En-Jui Lee. "Optimization Algorithms." In Full-3D Seismic Waveform Inversion, 311–43. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-16604-9_5.

4

Yang, Xin-She. "Optimization Algorithms." In Computational Optimization, Methods and Algorithms, 13–31. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-20859-1_2.

5

Buljak, Vladimir. "Optimization Algorithms." In Computational Fluid and Solid Mechanics, 19–83. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-22703-5_2.

6

Stefanov, Stefan M. "The Algorithms." In Applied Optimization, 159–74. Boston, MA: Springer US, 2001. http://dx.doi.org/10.1007/978-1-4757-3417-1_7.

7

Virant, Jernej. "Fuzzy Algorithms." In Applied Optimization, 65–78. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/978-1-4615-4673-3_4.

8

Stefanov, Stefan M. "The Algorithms." In Separable Optimization, 149–62. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-78401-0_7.

9

Bonnans, J. Frédéric. "Algorithms." In Convex and Stochastic Optimization, 267–81. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-14977-2_8.

10

Dostál, Zdenek. "Optimization." In Optimal Quadratic Programming Algorithms, 1–44. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-84806-8_2.


Conference papers on the topic "Optimization algorithms"

1

Liu, Jihong, and Sen Zeng. "A Survey of Assembly Planning Based on Intelligent Optimization Algorithms." In ASME 2008 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/detc2008-49445.

Abstract:
Assembly planning is an NP-complete problem, and it becomes even harder to solve for complex products. Intelligent optimization algorithms have clear advantages in dealing with such combinatorial problems, and over the last decade a variety of them have been applied to assembly sequence planning and optimization. This paper surveys the state of the art of assembly planning methods based on intelligent optimization algorithms. Five such algorithms, namely the genetic algorithm (GA), artificial neural networks (ANN), simulated annealing (SA), the ant colony algorithm (ACO), and the artificial immune algorithm (AIA), and their applications in assembly planning and optimization are introduced in turn, and the application characteristics of each algorithm are summarized. Finally, future research directions for assembly planning based on intelligent optimization algorithms are discussed.
2

Mekhilef, Mounib, and Mohamed B. Trabia. "Successive Twinkling Simplex Search Optimization Algorithms." In ASME 2001 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/detc2001/dac-21132.

Abstract:
The simplex method has proven to be a reliable nonlinear-programming pattern-search algorithm. Its effectiveness, however, declines when the problem has a large number of variables or several local minima, or when no initial guess is readily available. Recent results obtained by randomly selecting variables during optimization motivated studying the effect of this idea on the Nelder-Mead version of the simplex algorithm in order to improve its semi-global behavior. This paper proposes several enhancements to the simplex method. The algorithm is run for several attempts, each using the final answer of the previous attempt as its initial guess. At each attempt, the search starts by generating a simplex with n+1 vertices, and a random subset of the variables is involved in each movement (reflection, expansion, contraction, shrinking) of the simplex. This process is called twinkling. The paper presents several variations of twinkling the simplex. The random nature of the algorithm lends itself to problems with large dimensionality or complex topography. The algorithm is applied successfully to several engineering design problems, and the results compare favorably with both the regular simplex method and genetic algorithms.
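The "twinkling" idea described in the abstract, moving only a random subset of coordinates during a simplex step, can be illustrated with a minimal sketch. The function name, the probability parameter `p`, and the example vertices are invented for illustration; this is not the paper's exact operator.

```python
import random

def twinkled_reflection(worst, centroid, alpha=1.0, p=0.5, rng=random):
    """Reflect the worst simplex vertex through the centroid, but move
    only a random subset of coordinates ('twinkling'); the rest stay fixed."""
    new = list(worst)
    moved = [i for i in range(len(worst)) if rng.random() < p]
    if not moved:                      # always move at least one coordinate
        moved = [rng.randrange(len(worst))]
    for i in moved:
        new[i] = centroid[i] + alpha * (centroid[i] - worst[i])
    return new

random.seed(0)
v = twinkled_reflection([3.0, -1.0, 4.0], [0.0, 0.0, 0.0])
```

Each coordinate of the result is either unchanged or fully reflected; restricting the move to a random subset is what gives the search its semi-global, exploratory character.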
3

Hamann, Hendrik F. "Optimization Algorithms for Energy-Efficient Data Centers." In ASME 2013 International Technical Conference and Exhibition on Packaging and Integration of Electronic and Photonic Microsystems. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/ipack2013-73066.

Abstract:
Real-time optimization algorithms for managing energy efficiency in data centers have been developed and implemented. For example, for a given cooling configuration (measured and modeled in real time using IBM's Measurement and Management Technologies), one optimization algorithm identifies the optimal placement of new servers (or workloads), or alternatively where to remove them, under different constraints. Another optimization algorithm improves data-center performance without creating hotspots. The optimization algorithms use a physical model in conjunction with linear programming as well as linear least-squares methods.
4

Müller, Nils, and Tobias Glasmachers. "Non-local optimization." In FOGA '21: Foundations of Genetic Algorithms XVI. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3450218.3477307.

5

Khamisov, O. V. "Optimization with quadratic support functions in nonconvex smooth optimization." In NUMERICAL COMPUTATIONS: THEORY AND ALGORITHMS (NUMTA–2016): Proceedings of the 2nd International Conference "Numerical Computations: Theory and Algorithms". Author(s), 2016. http://dx.doi.org/10.1063/1.4965331.

6

Ladkany, George S., and Mohamed B. Trabia. "Incorporating Twinkling in Genetic Algorithms for Global Optimization." In ASME 2008 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/detc2008-49256.

Abstract:
Genetic algorithms have been used extensively as a reliable tool for global optimization, but they suffer from slow convergence. This paper proposes a two-fold approach to address this limitation. The first part introduces a twinkling process within the crossover phase of a genetic algorithm; twinkling can be incorporated into any standard algorithm by introducing a controlled random deviation from its standard progression to avoid being trapped at a local minimum. The second part introduces a new crossover technique, the weighted-average normally distributed arithmetic crossover, which is shown to enhance the rate of convergence. Two possible twinkling genetic algorithms are proposed. Their performance is compared successfully against simple genetic algorithms on various standard mathematical and engineering design problems: the twinkling genetic algorithms consistently reach the known global minima, rather than nearby sub-optimal points, at a competitive rate of convergence.
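A crossover that blends parents with a normally distributed weight, in the spirit of the operator named in the abstract, can be sketched as follows. The function name, the sigma value, and the clamping to [0, 1] are illustrative assumptions, not the paper's exact formulation.

```python
import random

def weighted_normal_crossover(parent1, parent2, sigma=0.15, rng=random):
    """Blend each gene of two parents with a weight drawn from a normal
    distribution centered at 0.5, clamped to [0, 1] so the child gene
    always lies between the parent genes."""
    child = []
    for a, b in zip(parent1, parent2):
        w = min(1.0, max(0.0, rng.gauss(0.5, sigma)))
        child.append(w * a + (1.0 - w) * b)
    return child

random.seed(42)
child = weighted_normal_crossover([0.0, 10.0], [1.0, 20.0])
```

Concentrating the weight near 0.5 biases children toward the average of the parents while still allowing occasional draws near either parent, which is one plausible way such a crossover can speed up convergence.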
7

Service, Travis C., and Daniel R. Tauritz. "Co-optimization algorithms." In the 10th annual conference. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1389095.1389166.

8

Corne, David, and Alan Reynolds. "Evaluating optimization algorithms." In the 13th annual conference companion. New York, New York, USA: ACM Press, 2011. http://dx.doi.org/10.1145/2001858.2002073.

9

Alexandrov, Natalia, and J. Dennis, Jr. "Algorithms for bilevel optimization." In 5th Symposium on Multidisciplinary Analysis and Optimization. Reston, Virginia: American Institute of Aeronautics and Astronautics, 1994. http://dx.doi.org/10.2514/6.1994-4334.

10

Mukesh, R., K. Lingadurai, and S. Karthick. "Aerodynamic optimization using proficient optimization algorithms." In 2012 International Conference on Computing, Communication and Applications (ICCCA). IEEE, 2012. http://dx.doi.org/10.1109/iccca.2012.6179183.


Organization reports on the topic "Optimization algorithms"

1

Parekh, Ojas D., Ciaran Ryan-Anderson, and Sevag Gharibian. Quantum Optimization and Approximation Algorithms. Office of Scientific and Technical Information (OSTI), January 2019. http://dx.doi.org/10.2172/1492737.

2

Mifflin, Robert. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, July 1988. http://dx.doi.org/10.21236/ada204389.

3

Prieto, Francisco J. Sequential Quadratic Programming Algorithms for Optimization. Fort Belvoir, VA: Defense Technical Information Center, August 1989. http://dx.doi.org/10.21236/ada212800.

4

Mifflin, Robert. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, July 1986. http://dx.doi.org/10.21236/ada182531.

5

Mifflin, R. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, July 1985. http://dx.doi.org/10.21236/ada159168.

6

Mifflin, Robert. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, December 1990. http://dx.doi.org/10.21236/ada231110.

7

Prieto, F. Sequential quadratic programming algorithms for optimization. Office of Scientific and Technical Information (OSTI), August 1989. http://dx.doi.org/10.2172/5325989.

8

Apostolatos, A., B. Keith, C. Soriano, and R. Rossi. D6.1 Deterministic optimization software. Scipedia, 2021. http://dx.doi.org/10.23967/exaqute.2021.2.018.

Abstract:
This deliverable focuses on the implementation of deterministic optimization algorithms and problem solvers within the KRATOS open-source software. One of the main challenges for optimization algorithms in finite-element-based optimization is obtaining the gradient of the response functions used as objectives and constraints when it is not available in explicit form. The idea is to use local sensitivity analysis to obtain the gradient of the response function(s).
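When a response-function gradient is not available in explicit form, the simplest (if evaluation-hungry) fallback is a central finite difference. This is a generic stand-in for the idea of numerically recovering gradients, not the adjoint/sensitivity machinery the KRATOS deliverable actually describes; the function name and the test response function are invented for illustration.

```python
def fd_gradient(f, x, h=1e-6):
    """Central-difference approximation to the gradient of a scalar
    response function f at the point x."""
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h       # perturb coordinate i upward
        xm[i] -= h       # and downward
        grad.append((f(xp) - f(xm)) / (2.0 * h))
    return grad

# Example response: f(x0, x1) = x0^2 + 3*x1, whose gradient at (1, 2) is (2, 3).
g = fd_gradient(lambda v: v[0] ** 2 + 3.0 * v[1], [1.0, 2.0])
```

The cost is two response evaluations per design variable, which is exactly why analytic sensitivity analysis is preferred in finite-element settings with many variables.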
9

Plotkin, Serge. Research in Graph Algorithms and Combinatorial Optimization. Fort Belvoir, VA: Defense Technical Information Center, March 1995. http://dx.doi.org/10.21236/ada292630.

10

Nocedal, J. Algorithms and software for large scale optimization. Office of Scientific and Technical Information (OSTI), May 1990. http://dx.doi.org/10.2172/5688791.
