Selected scientific literature on the topic "Optimization algorithms"

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the list of current articles, books, theses, conference proceedings, and other scholarly sources relevant to the topic "Optimization algorithms".

Next to each source in the reference list there is an "Add to bibliography" button. Click it and the bibliographic citation of the chosen work will be generated automatically in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication as a .pdf and read the abstract online, when it is available in the metadata.

Journal articles on the topic "Optimization algorithms"

1

Celik, Yuksel, and Erkan Ulker. "An Improved Marriage in Honey Bees Optimization Algorithm for Single Objective Unconstrained Optimization". Scientific World Journal 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/370172.

Full text of the source
Abstract:
Marriage in honey bees optimization (MBO) is a metaheuristic optimization algorithm inspired by the mating and fertilization process of honey bees, and belongs to the family of swarm-intelligence optimizations. In this study we propose an improved marriage in honey bees optimization (IMBO) that adds a Levy flight to the queen's mating flight and a neighborhood search to improve the worker drones. The IMBO algorithm's performance is tested on six well-known unconstrained test functions and compared with other metaheuristic optimization algorithms.
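The Levy-flight modification described in this abstract can be illustrated with a short sketch. This is a generic Mantegna-style Levy step, not the authors' exact implementation; the `beta` and `scale` parameters and the function names are illustrative:

```python
import math
import random


def levy_step(beta: float = 1.5) -> float:
    """Draw one Levy-distributed step length via Mantegna's algorithm."""
    sigma_u = (
        math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
        / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
    ) ** (1 / beta)
    u = random.gauss(0, sigma_u)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)


def levy_flight(position, best, scale=0.01):
    """Perturb a candidate solution with a Levy step relative to the best."""
    return [x + scale * levy_step() * (x - b) for x, b in zip(position, best)]
```

The heavy-tailed step distribution produces occasional long jumps, which is the usual motivation for using Levy flights in a mating-flight operator.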
2

Luan, Yuxuan, Junjiang He, Jingmin Yang, Xiaolong Lan, and Geying Yang. "Uniformity-Comprehensive Multiobjective Optimization Evolutionary Algorithm Based on Machine Learning". International Journal of Intelligent Systems 2023 (November 10, 2023): 1–21. http://dx.doi.org/10.1155/2023/1666735.

Full text of the source
Abstract:
When solving real-world optimization problems, the uniformity of Pareto fronts is an essential concern in multiobjective optimization problems (MOPs). However, it remains a common challenge for many existing multiobjective optimization algorithms due to the skewed distribution of solutions and biases towards specific objective functions. This paper proposes a uniformity-comprehensive multiobjective optimization evolutionary algorithm based on machine learning to address this limitation. Our algorithm uses uniform initialization and a self-organizing map (SOM) to enhance population diversity and uniformity. We track the IGD value and use K-means and CNN refinement with crossover and mutation techniques during the evolutionary stages. The algorithm's superiority in uniformity and objective-function balance was verified through comparative analysis with 13 other algorithms: eight traditional multiobjective optimization algorithms, three machine-learning-enhanced multiobjective optimization algorithms, and two algorithms with objective initialization improvements. These comprehensive experiments show that our algorithm outperforms the existing algorithms in these areas.
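The IGD value tracked by the authors is a standard multiobjective quality metric (inverted generational distance). A minimal sketch, with generic reference-front notation not taken from the paper:

```python
import math


def igd(reference_front, obtained_front):
    """Inverted generational distance: mean distance from each point of a
    reference Pareto front to its nearest obtained solution (lower is better)."""
    total = 0.0
    for r in reference_front:
        total += min(math.dist(r, s) for s in obtained_front)
    return total / len(reference_front)
```

A front that covers the reference set uniformly scores low on IGD, which is why it is a natural monitor for uniformity-oriented algorithms.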
3

Wen, Xiaodong, Xiangdong Liu, Cunhui Yu, Haoning Gao, Jing Wang, Yongji Liang, Jiangli Yu, and Yan Bai. "IOOA: A multi-strategy fusion improved Osprey Optimization Algorithm for global optimization". Electronic Research Archive 32, no. 3 (2024): 2033–74. http://dx.doi.org/10.3934/era.2024093.

Full text of the source
Abstract:
With the widespread application of metaheuristic algorithms in engineering and scientific research, finding algorithms with efficient global search capabilities and precise local search performance has become a hot research topic. The osprey optimization algorithm (OOA), first proposed in 2023, is characterized by its simple structure and strong optimization capability. However, practical tests have revealed that the OOA inevitably encounters issues common to metaheuristic algorithms, such as the tendency to fall into local optima and reduced population diversity in the later iterations. To address these issues, a multi-strategy fusion improved osprey optimization algorithm (IOOA) is proposed. First, the characteristics of various chaotic mappings were explored, and Circle chaotic mapping was adopted in place of pseudo-random numbers for population initialization, increasing initial population diversity and improving the quality of initial solutions. Second, a dynamically adjustable elite guidance mechanism was proposed to adapt the position-updating method to the different stages of the iteration, ensuring the algorithm maintains good global search capability while significantly increasing its convergence speed. Lastly, a dynamic chaotic weight factor was designed and applied in the exploitation stage of the original algorithm to enhance local search capability and improve convergence accuracy. To verify the effectiveness and practical engineering applicability of IOOA, simulation experiments were conducted using 21 benchmark test functions and the CEC-2022 benchmark functions, and IOOA was applied to an LSTM power-load forecasting problem as well as two engineering design problems. The experimental results show that IOOA possesses outstanding global optimization performance on complex optimization problems and broad applicability in practical engineering applications.
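Chaotic-map initialization of the kind this abstract describes can be sketched briefly. The Circle-map parameterization below is one common choice in the metaheuristics literature, not necessarily the paper's exact settings, and the function names are illustrative:

```python
import math


def circle_map_sequence(n: int, x0: float = 0.7, omega: float = 0.2, k: float = 0.5):
    """Generate n chaotic values in [0, 1) from the Circle map."""
    seq, x = [], x0
    for _ in range(n):
        x = (x + omega - (k / (2 * math.pi)) * math.sin(2 * math.pi * x)) % 1.0
        seq.append(x)
    return seq


def init_population(pop_size, dim, lower, upper):
    """Map chaotic values into the search box [lower, upper]^dim."""
    flat = circle_map_sequence(pop_size * dim)
    return [
        [lower + v * (upper - lower) for v in flat[i * dim:(i + 1) * dim]]
        for i in range(pop_size)
    ]
```

The deterministic but well-spread chaotic sequence replaces the pseudo-random draws of a standard initializer, which is the diversity argument made in the abstract.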
4

Kim, Minsu, Areum Han, Jaewon Lee, Sunghyun Cho, Il Moon, and Jonggeol Na. "Comparison of Derivative-Free Optimization: Energy Optimization of Steam Methane Reforming Process". International Journal of Energy Research 2023 (June 3, 2023): 1–20. http://dx.doi.org/10.1155/2023/8868540.

Full text of the source
Abstract:
In modern chemical engineering, various derivative-free optimization (DFO) studies have been conducted to identify operating conditions that maximize energy efficiency. Although DFO algorithm selection is essential to a successful design, it is a nonintuitive task because of the uncertain performance of the algorithms. In particular, when the system evaluation cost or computational load is high (e.g., density functional theory and computational fluid dynamics), selecting an algorithm that quickly converges to the near-global optimum in the early stage of optimization is all the more important. In this study, we compare the early-stage optimization performance of 12 algorithms. The performance of deterministic global search algorithms, global model-based search algorithms, metaheuristic algorithms, and Bayesian optimization is compared on benchmark problems and analyzed by problem type and number of variables. Furthermore, we apply all algorithms to an energy process optimization that maximizes the thermal efficiency of the steam methane reforming (SMR) process for hydrogen production. In this application, we identified a hidden constraint based on real-world operations and addressed it using a penalty function. Bayesian optimization explores the design space most efficiently by learning the infeasible regions. As a result, we observed a substantial 12.9% improvement in thermal efficiency compared to the base case, and a 7% improvement compared to the lowest-performing algorithm.
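The penalty-function treatment of a hidden constraint mentioned in this abstract amounts to wrapping the black-box objective. A minimal sketch, assuming minimization; the wrapper name, feasibility predicate, and penalty value are all illustrative, not from the paper:

```python
def penalized(objective, is_feasible, penalty=1e6):
    """Wrap a black-box objective so that hidden-constraint violations
    return a large constant penalty instead of failing the optimizer."""
    def wrapped(x):
        if not is_feasible(x):
            return penalty  # steer the search away from infeasible points
        return objective(x)
    return wrapped


# toy usage: minimize x^2 but treat x < 0 as a hidden infeasible region
f = penalized(lambda x: x * x, lambda x: x >= 0)
```

Any DFO or Bayesian optimizer can then be run on `wrapped` unchanged, which is what makes the penalty approach attractive for hidden constraints discovered at run time.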
5

Priyadarshini, Ishaani. "Dendritic Growth Optimization: A Novel Nature-Inspired Algorithm for Real-World Optimization Problems". Biomimetics 9, no. 3 (February 21, 2024): 130. http://dx.doi.org/10.3390/biomimetics9030130.

Full text of the source
Abstract:
In numerous scientific disciplines and practical applications, addressing optimization challenges is a common imperative. Nature-inspired optimization algorithms represent a highly valuable and pragmatic approach to tackling these complexities. This paper introduces Dendritic Growth Optimization (DGO), a novel algorithm inspired by natural branching patterns. DGO offers a novel solution for intricate optimization problems and demonstrates its efficiency in exploring diverse solution spaces. The algorithm has been extensively tested with a suite of machine learning algorithms, deep learning algorithms, and metaheuristic algorithms, and the results, both before and after optimization, unequivocally support the proposed algorithm’s feasibility, effectiveness, and generalizability. Through empirical validation using established datasets like diabetes and breast cancer, the algorithm consistently enhances model performance across various domains. Beyond its working and experimental analysis, DGO’s wide-ranging applications in machine learning, logistics, and engineering for solving real-world problems have been highlighted. The study also considers the challenges and practical implications of implementing DGO in multiple scenarios. As optimization remains crucial in research and industry, DGO emerges as a promising avenue for innovation and problem solving.
6

RAO, Xiong, Run DU, Wenming CHENG, and Yi YANG. "Modified proportional topology optimization algorithm for multiple optimization problems". Mechanics 30, no. 1 (February 23, 2024): 36–45. http://dx.doi.org/10.5755/j02.mech.34367.

Full text of the source
Abstract:
Three modified proportional topology optimization (MPTO) algorithms are presented in this paper, named MPTOc, MPTOs, and MPTOm. MPTOc addresses the minimum compliance problem with a volume constraint, MPTOs solves the minimum volume fraction problem under a stress constraint, and MPTOm tackles the minimum volume fraction problem under compliance and stress constraints. To overcome the shortcomings of the original proportional topology optimization (PTO) algorithm and improve its overall performance, the proposed algorithms modify the material interpolation scheme and introduce the Heaviside threshold function into the PTO algorithm. To confirm the effectiveness and superiority of the presented algorithms, multiple optimization problems for the classical MBB beam are solved and the original PTO algorithm is compared with the new algorithms. Numerical examples show that MPTOc, MPTOs, and MPTOm enjoy distinct advantages over the PTO algorithm in terms of convergence efficiency and the ability to obtain distinct topology structures without redundancy. Moreover, MPTOc has the fastest convergence speed among these algorithms and acquires the smallest (best) compliance value. In addition, the new algorithms are also superior to PTO in suppressing gray-scale elements.
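The Heaviside threshold function mentioned in this abstract is commonly implemented in topology optimization as a smoothed tanh projection. The form below follows the widely used projection of Wang, Lazarov, and Sigmund; the paper's exact variant may differ, and the `beta` and `eta` values are illustrative:

```python
import math


def heaviside_projection(rho: float, beta: float = 8.0, eta: float = 0.5) -> float:
    """Smoothed Heaviside threshold: pushes an intermediate element density
    rho in [0, 1] toward 0 or 1 as beta grows, suppressing gray-scale
    elements."""
    num = math.tanh(beta * eta) + math.tanh(beta * (rho - eta))
    den = math.tanh(beta * eta) + math.tanh(beta * (1.0 - eta))
    return num / den
```

In practice `beta` is increased over the iterations (a continuation scheme), so the design converges toward a crisp black-and-white topology.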
7

Gireesha, B. "A Literature Survey on Artificial Swarm Intelligence based Optimization Techniques". International Journal of Engineering & Technology 7, no. 4.5 (September 22, 2018): 455. http://dx.doi.org/10.14419/ijet.v7i4.5.20205.

Full text of the source
Abstract:
For a few decades, optimization techniques have played a key role in engineering and technological applications, and they are known for their behaviour patterns in solving modern engineering problems. Among the various optimization techniques, heuristic and meta-heuristic algorithms have proved to be efficient. This paper addresses techniques that are commonly used in engineering applications: it presents a basic overview of the Artificial Bee Colony (ABC) algorithm, Ant Colony Optimization (ACO), the Firefly Algorithm (FFA), and Particle Swarm Optimization (PSO), and also discusses the most suitable fitness functions and their numerical expressions.
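Of the algorithms this survey covers, PSO has the most compact core update. A textbook sketch of one canonical step; the inertia and acceleration coefficients are conventional illustrative values, not taken from the survey:

```python
import random


def pso_update(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One canonical PSO step: inertia term plus cognitive (personal-best)
    and social (global-best) attraction terms."""
    new_v = [
        w * vi
        + c1 * random.random() * (pi - xi)
        + c2 * random.random() * (gi - xi)
        for xi, vi, pi, gi in zip(x, v, pbest, gbest)
    ]
    new_x = [xi + vi for xi, vi in zip(x, new_v)]
    return new_x, new_v
```

The balance between `w` (exploration) and `c1`/`c2` (exploitation) is exactly the kind of tuning question such surveys discuss.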
8

Acherjee, Bappa, Debanjan Maity, and Arunanshu S. Kuar. "Ultrasonic Machining Process Optimization by Cuckoo Search and Chicken Swarm Optimization Algorithms". International Journal of Applied Metaheuristic Computing 11, no. 2 (April 2020): 1–26. http://dx.doi.org/10.4018/ijamc.2020040101.

Full text of the source
Abstract:
The ultrasonic machining (USM) process is analyzed in the present study to obtain the desired process responses by optimizing machining parameters using cuckoo search (CS) and chicken swarm optimization (CSO), two powerful nature-inspired, population- and swarm-intelligence-based metaheuristic algorithms. The CS and CSO algorithms are compared with other non-conventional optimization techniques in terms of optimal results, convergence, accuracy, and computational time. It is found that the CS and CSO algorithms predict superior single- and multi-objective optimization results to gravitational search algorithms (GSAs), genetic algorithms (GAs), particle swarm optimization (PSO), ant colony optimization (ACO), and artificial bee colony (ABC) algorithms, and give exactly the same results as predicted by the fireworks algorithm (FWA). The CS algorithm outperforms all the other algorithms, namely the CSO, FWA, GSA, GA, PSO, ACO, and ABC algorithms, in terms of mean computational time, whereas the CSO algorithm outperforms all the others except the CS and GSA algorithms.
9

Alfarhisi, Zikrie Pramudia, Hadi Suyono, and Fakhriy Hario Partiansyah. "4G LTE Network Coverage Optimization Using Metaheuristic Approach". International Journal of Computer Applications Technology and Research 10, no. 01 (January 1, 2021): 010–13. http://dx.doi.org/10.7753/ijcatr1001.1003.

Full text of the source
Abstract:
The main focus of this paper is to optimize the coverage of each 4G LTE network cell within the service area. Many algorithms can be implemented to determine the optimal 4G LTE coverage area, including deterministic and heuristic approaches. A deterministic approach can solve the optimization problem accurately but needs more resources and is time-consuming in determining the convergence parameters; heuristic approaches were therefore introduced to overcome this drawback. The methods used here are the Differential Evolution Algorithm (DEA) and the Adaptive Mutation Genetic Algorithm (AMGA), both categorized as metaheuristic approaches. The DEA and AMGA algorithms have been widely used to solve combinatorial problems, including network optimizations. In network optimization, coverage is strongly related to two objectives: reducing the black-spot area and decreasing the overlapping coverage areas. Coverage overlap is a condition in which some cell sites in an area overlap, which leads to handoff events and inefficient network management. This research aims to obtain optimal 4G LTE network coverage and reduce the overlapping coverage areas through effective e-Node B arrangements using the DEA and AMGA algorithms. The simulation results showed that the DEA's coverage effectiveness was 23.4%, and the AMGA's was 16.32%.
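The DEA mentioned here builds each trial solution from three other population members. A textbook sketch of the classic DE/rand/1/bin trial-vector construction, with the paper's tuning unknown and the `f`/`cr` values illustrative:

```python
import random


def de_trial_vector(population, i, f=0.8, cr=0.9):
    """DE/rand/1/bin: mutate three distinct random vectors, then binomially
    cross the mutant with the target vector population[i]."""
    idx = [j for j in range(len(population)) if j != i]
    a, b, c = (population[j] for j in random.sample(idx, 3))
    target = population[i]
    dim = len(target)
    j_rand = random.randrange(dim)  # guarantee at least one mutated component
    return [
        a[d] + f * (b[d] - c[d]) if (random.random() < cr or d == j_rand) else target[d]
        for d in range(dim)
    ]
```

The trial vector replaces the target only if it scores better on the coverage objective, which is the selection step of DE.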
10

Si, Binghui, Feng Liu, and Yanxia Li. "Metamodel-Based Hyperparameter Optimization of Optimization Algorithms in Building Energy Optimization". Buildings 13, no. 1 (January 9, 2023): 167. http://dx.doi.org/10.3390/buildings13010167.

Full text of the source
Abstract:
Building energy optimization (BEO) is a promising technique to achieve energy efficient designs. The efficacy of optimization algorithms is imperative for the BEO technique and is significantly dependent on the algorithm hyperparameters. Currently, studies focusing on algorithm hyperparameters are scarce, and common agreement on how to set their values, especially for BEO problems, is still lacking. This study proposes a metamodel-based methodology for hyperparameter optimization of optimization algorithms applied in BEO. The aim is to maximize the algorithmic efficacy and avoid the failure of the BEO technique because of improper algorithm hyperparameter settings. The method consists of three consecutive steps: constructing the specific BEO problem, developing an ANN-trained metamodel of the problem, and optimizing algorithm hyperparameters with nondominated sorting genetic algorithm II (NSGA-II). To verify the validity, 15 benchmark BEO problems with different properties, i.e., five building models and three design variable categories, were constructed for numerical experiments. For each problem, the hyperparameters of four commonly used algorithms, i.e., the genetic algorithm (GA), the particle swarm optimization (PSO) algorithm, simulated annealing (SA), and the multi-objective genetic algorithm (MOGA), were optimized. Results demonstrated that the MOGA benefited the most from hyperparameter optimization in terms of the quality of the obtained optimum, while PSO benefited the most in terms of the computing time.

Theses / dissertations on the topic "Optimization algorithms"

1

Astete Morales, Sandra. "Contributions to Convergence Analysis of Noisy Optimization Algorithms". Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS327/document.

Full text of the source
Abstract:
This thesis presents contributions to the analysis of algorithms for the optimization of noisy functions. It establishes convergence rates (simple regret and cumulative regret) for line-search algorithms as well as for random-search algorithms. We prove, in terms of simple regret and cumulative regret, that a Hessian-based algorithm can reach the same results as some optimal algorithms in the literature when its parameters are tuned correctly. We also analyze the convergence order of evolution strategies when solving noisy functions, deducing log-log convergence, and we prove a lower bound for the convergence rate of evolution strategies. We extend the work on reevaluation mechanisms by applying them to a discrete setting. Finally, we analyze the performance measure itself and prove that the use of an erroneous performance measure can lead to misleading results when different optimization methods are evaluated.
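The two performance measures analyzed in the thesis have compact definitions. A generic sketch for minimization; the helper name and the assumption that noise-free objective values and the optimum `f_opt` are available for evaluation are ours, not the thesis's notation:

```python
def regrets(f_values, f_opt):
    """Given the sequence of (noise-free) objective values of the points an
    optimizer evaluated, return (simple regret, cumulative regret) for a
    minimization problem with optimum value f_opt."""
    simple = min(f_values) - f_opt                    # gap of the best point found
    cumulative = sum(fv - f_opt for fv in f_values)   # total cost of exploration
    return simple, cumulative
```

Simple regret rewards finding one good point eventually, while cumulative regret also charges for every poor point evaluated along the way; the thesis's warning about erroneous performance measures is precisely that rankings of methods can differ between the two.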
2

Reimann, Axel. "Evolutionary algorithms and optimization". Doctoral thesis, [S.l. : s.n.], 2002. http://deposit.ddb.de/cgi-bin/dokserv?idn=969093497.

Full text of the source
3

Parpas, Panayiotis. "Algorithms for stochastic optimization". Thesis, Imperial College London, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.434980.

Full text of the source
4

Johnson, Jared. "Algorithms for Rendering Optimization". Doctoral diss., University of Central Florida, 2012. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5329.

Full text of the source
Abstract:
This dissertation explores algorithms for rendering optimization realizable within a modern, complex rendering engine. The first part contains optimized rendering algorithms for ray tracing. Ray tracing algorithms typically provide properties of simplicity and robustness that are highly desirable in computer graphics. We offer several novel contributions to the problem of interactive ray tracing of complex lighting environments. We focus on the problem of maintaining interactivity as both geometric and lighting complexity grows without affecting the simplicity or robustness of ray tracing. First, we present a new algorithm called occlusion caching for accelerating the calculation of direct lighting from many light sources. We cache light visibility information sparsely across a scene. When rendering direct lighting for all pixels in a frame, we combine cached lighting information to determine whether or not shadow rays are needed. Since light visibility and scene location are highly correlated, our approach precludes the need for most shadow rays. Second, we present improvements to the irradiance caching algorithm. Here we demonstrate a new elliptical cache point spacing heuristic that reduces the number of cache points required by taking into account the direction of irradiance gradients. We also accelerate irradiance caching by efficiently and intuitively coupling it with occlusion caching. In the second part of this dissertation, we present optimizations to rendering algorithms for participating media. Specifically, we explore the implementation and use of photon beams as an efficient, intuitive artistic primitive. We detail our implementation of the photon beams algorithm into PhotoRealistic RenderMan (PRMan). We show how our implementation maintains the benefits of the industry standard Reyes rendering pipeline, with proper motion blur and depth of field. We detail an automatic photon beam generation algorithm, utilizing PRMan shadow maps.
We accelerate the rendering of camera-facing photon beams by utilizing Gaussian quadrature for path integrals in place of ray marching. Our optimized implementation allows for incredible versatility and intuitiveness in artistic control of volumetric lighting effects. Finally, we demonstrate the usefulness of photon beams as artistic primitives by detailing their use in a feature-length animated film.
Ph.D., Computer Science, Engineering and Computer Science
5

Cesari, Tommaso Renato. "Algorithms, Learning, and Optimization". Doctoral thesis, Università degli Studi di Milano, 2020. http://hdl.handle.net/2434/699354.

Full text of the source
Abstract:
This thesis covers some algorithmic aspects of online machine learning and optimization. In Chapter 1 we design algorithms with state-of-the-art regret guarantees for the problem of dynamic pricing. In Chapter 2 we move on to an asynchronous online learning setting in which only some of the agents in the network are active at each time step. We show that when information is shared among neighbors, knowledge about the graph structure might have a significantly different impact on learning rates depending on how agents are activated. In Chapter 3 we investigate the online problem of multivariate non-concave maximization under weak assumptions on the regularity of the objective function. In Chapter 4 we introduce a new performance measure and design an efficient algorithm to learn optimal policies in repeated A/B testing.
6

Stults, Ian Collier. "A multi-fidelity analysis selection method using a constrained discrete optimization formulation". Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/31706.

Full text of the source
Abstract:
Thesis (Ph.D)--Aerospace Engineering, Georgia Institute of Technology, 2010.
Committee Chair: Mavris, Dimitri; Committee Member: Beeson, Don; Committee Member: Duncan, Scott; Committee Member: German, Brian; Committee Member: Kumar, Viren. Part of the SMARTech Electronic Thesis and Dissertation Collection.
7

Rafique, Abid. "Communication optimization in iterative numerical algorithms : an algorithm-architecture interaction". Thesis, Imperial College London, 2014. http://hdl.handle.net/10044/1/17837.

Full text of the source
Abstract:
Trading communication with redundant computation can increase the silicon efficiency of common hardware accelerators like FPGA and GPU in accelerating sparse iterative numerical algorithms. While iterative numerical algorithms are extensively used in solving large-scale sparse linear system of equations and eigenvalue problems, they are challenging to accelerate as they spend most of their time in communication-bound operations, like sparse matrix-vector multiply (SpMV) and vector-vector operations. Communication is used in a general sense to mean moving the matrix and the vectors within the custom memory hierarchy of the FPGA and between processors in the GPU; the cost of which is much higher than performing the actual computation due to technological reasons. Additionally, the dependency between the operations hinders overlapping computation with communication. As a result, although GPU and FPGA are offering large peak floating-point performance, their sustained performance is nonetheless very low due to high communication costs leading to poor silicon efficiency. In this thesis, we provide a systematic study to minimize the communication cost thereby increase the silicon efficiency. For small-to-medium datasets, we exploit large on-chip memory of the FPGA to load the matrix only once and then use explicit blocking to perform all iterations at the communication cost of a single iteration. For large sparse datasets, it is now a well-known idea to unroll k iterations using a matrix powers kernel which replaces SpMV and two additional kernels, TSQR and BGS, which replace vector-vector operations. While this approach can provide a Θ(k) reduction in the communication cost, the extent of the unrolling depends on the growth in redundant computation, the underlying architecture and the memory model. In this work, we show how to select the unroll factor k in an architecture-agnostic manner to provide communication-computation tradeoff on FPGA and GPU. 
To this end, we exploit inverse-memory hierarchy of the GPUs to map matrix power kernel and present a new algorithm for the FPGAs which matches with their strength to reduce redundant computation to allow large k and hence higher speedups. We provide predictive models of the matrix powers kernel to understand the communication-computation tradeoff on GPU and FPGA. We highlight extremely low efficiency of the GPU in TSQR due to off-chip sharing of data across different building blocks and show how we can use on-chip memory of the FPGA to eliminate this off-chip access and hence achieve better efficiency. Finally, we demonstrate how to compose all the kernels by using a unified architecture and exploit on-chip memory of the FPGA to share data across these kernels. Using the Lanczos Iteration as a case study to solve symmetric extremal eigenvalue problem, we show that the efficiency of FPGAs can be increased from 1.8% to 38% for small- to-medium scale dense matrices whereas up to 7.8% for large-scale structured banded matrices. We show that although GPU shows better efficiency for certain kernels like the matrix powers kernel, the overall efficiency is even lower due to increase in communication cost while sharing data across different kernels through off-chip memory. As the Lanczos Iteration is at the heart of all modern iterative numerical algorithms, our results are applicable to a broad class of iterative numerical algorithms.
8

Dost, Banu. "Optimization algorithms for biological data". Diss., [La Jolla] : University of California, San Diego, 2010. http://wwwlib.umi.com/cr/ucsd/fullcit?p3397170.

Full text of the source
Abstract:
Thesis (Ph. D.)--University of California, San Diego, 2010.
Title from first page of PDF file (viewed March 23, 2010). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 149-159).
9

Xiong, Xiaoping. "Stochastic optimization algorithms and convergence /". College Park, Md. : University of Maryland, 2005. http://hdl.handle.net/1903/2360.

Full text of the source
Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2005.
Thesis research directed by: Business and Management. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
10

Quttineh, Nils-Hassan. "Algorithms for Costly Global Optimization". Licentiate thesis, Mälardalen University, School of Education, Culture and Communication, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-5970.

Full text of the source
Abstract:

There exist many applications with so-called costly problems, which means that the objective function you want to maximize or minimize cannot be described using standard functions and expressions. Instead one considers these objective functions as a "black box" into which parameter values are sent and from which a function value is returned. This implies in particular that no derivative information is available. The reason for describing these problems as expensive is that it may take a long time to calculate a single function value: the black box could, for example, solve a large system of differential equations or carry out a heavy simulation, which can take anywhere from several minutes to several hours. These very special conditions therefore require customized algorithms. Common optimization algorithms are based on calculating function values every now and then, which usually can be done instantly, but with an expensive problem it may take several hours to compute a single function value. Our main objective is therefore to create algorithms that exploit all available information to the limit before a new function value is calculated; in other words, we want to find the optimal solution using as few function evaluations as possible. A good example of a real-life application comes from the automotive industry, where the development of new engines utilizes advanced models governed by a dozen key parameters. The goal is to optimize the model by changing the parameters in such a way that the engine becomes as energy-efficient as possible while still meeting all sorts of demands on strength and external constraints.


Books on the topic "Optimization algorithms"

1

Spedicato, Emilio, ed. Algorithms for Continuous Optimization. Dordrecht: Springer Netherlands, 1994. http://dx.doi.org/10.1007/978-94-009-0369-2.

Full text of the source
2

Cheng, Shi, and Yuhui Shi, eds. Brain Storm Optimization Algorithms. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-15070-9.

Full text of the source
3

Kreher, Donald L., ed. Graphs, algorithms, and optimization. Boca Raton: Chapman & Hall/CRC, 2005.

Find the full text of the source
4

Dennis, J. E., and Institute for Computer Applications in Science and Engineering, eds. Algorithms for bilevel optimization. Hampton, VA: Institute for Computer Applications in Science and Engineering, NASA Langley Research Center, 1994.

Find the full text of the source
5

Rieger, Heiko, ed. Optimization algorithms in physics. Berlin: Wiley-VCH, 2002.

6

Pereira, Ana I., Florbela P. Fernandes, João P. Coelho, João P. Teixeira, Maria F. Pacheco, Paulo Alves e Rui P. Lopes, eds. Optimization, Learning Algorithms and Applications. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-91885-9.

7

Grötschel, Martin, László Lovász e Alexander Schrijver. Geometric Algorithms and Combinatorial Optimization. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/978-3-642-78240-4.

8

Uryasev, Stanislav, e Panos M. Pardalos, eds. Stochastic Optimization: Algorithms and Applications. Boston, MA: Springer US, 2001. http://dx.doi.org/10.1007/978-1-4757-6594-6.

9

Jansen, Klaus, e José Rolim, eds. Approximation Algorithms for Combinatorial Optimization. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0053958.

10

Migdalas, Athanasios, Panos M. Pardalos e Peter Värbrand, eds. Multilevel Optimization: Algorithms and Applications. Boston, MA: Springer US, 1998. http://dx.doi.org/10.1007/978-1-4613-0307-7.


Book chapters on the topic "Optimization algorithms"

1

Löhne, Andreas. "Algorithms". In Vector Optimization, 161–95. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-18351-5_6.

2

Khamehchi, Ehsan, e Mohammad Reza Mahdiani. "Optimization Algorithms". In SpringerBriefs in Petroleum Geoscience & Engineering, 35–46. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-51451-2_4.

3

Chen, Po, e En-Jui Lee. "Optimization Algorithms". In Full-3D Seismic Waveform Inversion, 311–43. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-16604-9_5.

4

Yang, Xin-She. "Optimization Algorithms". In Computational Optimization, Methods and Algorithms, 13–31. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-20859-1_2.

5

Buljak, Vladimir. "Optimization Algorithms". In Computational Fluid and Solid Mechanics, 19–83. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-22703-5_2.

6

Dolzhenko, Viktoria. "Optimization Algorithms". In Algorithmic Trading Systems and Strategies: A New Approach, 215–64. Berkeley, CA: Apress, 2024. http://dx.doi.org/10.1007/979-8-8688-0357-4_6.

7

Rong, Hai-Jun, e Zhao-Xu Yang. "Optimization Algorithms". In Sequential Intelligent Dynamic System Modeling and Control, 29–44. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-1541-1_3.

8

Stefanov, Stefan M. "The Algorithms". In Applied Optimization, 159–74. Boston, MA: Springer US, 2001. http://dx.doi.org/10.1007/978-1-4757-3417-1_7.

9

Virant, Jernej. "Fuzzy Algorithms". In Applied Optimization, 65–78. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/978-1-4615-4673-3_4.

10

Stefanov, Stefan M. "The Algorithms". In Separable Optimization, 149–62. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-78401-0_7.


Conference papers on the topic "Optimization algorithms"

1

Liu, Jihong, e Sen Zeng. "A Survey of Assembly Planning Based on Intelligent Optimization Algorithms". In ASME 2008 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/detc2008-49445.

Abstract:
Assembly planning is an NP-complete problem, which becomes even more difficult to solve for complex products. Intelligent optimization algorithms have clear advantages for such combinatorial problems, and various ones have been applied to assembly sequence planning and optimization in the last decade. This paper surveys the state of the art of assembly planning methods based on intelligent optimization algorithms. Five intelligent optimization algorithms, i.e. genetic algorithms (GA), artificial neural networks (ANN), simulated annealing (SA), ant colony optimization (ACO), and artificial immune algorithms (AIA), and their applications in assembly planning and optimization are introduced in turn. The application features of each algorithm are summarized. Finally, future research directions for assembly planning based on intelligent optimization algorithms are discussed.
2

Mekhilef, Mounib, e Mohamed B. Trabia. "Successive Twinkling Simplex Search Optimization Algorithms". In ASME 2001 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/detc2001/dac-21132.

Abstract:
The simplex algorithm has proven to be a reliable nonlinear-programming pattern-search method. Its effectiveness is reduced, however, when the problem has a large number of variables, several local minima, or no readily available initial guess. Recent results obtained by randomly selecting variables during optimization encouraged studying the effect of this idea on the Nelder-Mead version of the simplex algorithm to improve its semi-global behavior. This paper proposes several enhancements to the simplex. The algorithm is run for several attempts, each using the final answer of the previous attempt as its initial guess. At each attempt, the search starts by generating a simplex with n+1 vertices. A random subset of the variables is involved in each movement (reflection, expansion, contraction, shrinking) of the simplex; this process is called twinkling. The paper presents several variations of twinkling the simplex. The random nature of the algorithm lends itself to problems with large dimensionality or complex topography. The algorithm is applied successfully to several engineering design problems, and the results compare favorably with both the regular simplex and genetic algorithms.
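As a rough sketch of the twinkling idea described in this abstract (not the authors' implementation), a reflection step that moves only a random subset of coordinates might look like the following; the subset-size rule and `alpha` are assumptions:

```python
import random

def twinkling_reflect(worst, centroid, rng, alpha=1.0):
    # Reflect the worst simplex vertex through the centroid, but only
    # along a randomly chosen subset of the coordinates ("twinkling").
    n = len(worst)
    subset = rng.sample(range(n), rng.randint(1, n))  # assumed subset rule
    new = list(worst)
    for i in subset:
        new[i] = centroid[i] + alpha * (centroid[i] - worst[i])
    return new

rng = random.Random(0)
vertex = twinkling_reflect([1.0, 2.0, 3.0], [0.0, 0.0, 0.0], rng)
```

Untouched coordinates keep their old values, so the simplex explores along random coordinate subspaces rather than always moving all n variables at once.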
3

Hamann, Hendrik F. "Optimization Algorithms for Energy-Efficient Data Centers". In ASME 2013 International Technical Conference and Exhibition on Packaging and Integration of Electronic and Photonic Microsystems. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/ipack2013-73066.

Abstract:
Real-time optimization algorithms for managing energy efficiency in data centers have been developed and implemented. For example, for a given cooling configuration (measured and modeled in real time using IBM's Measurement and Management Technologies), an optimization algorithm identifies the optimum placement of new servers (or workloads), or alternatively where to remove servers (or workloads), under different constraints. Another optimization algorithm improves data center performance without creating hotspots. The optimization algorithms use a physical model in conjunction with linear programming as well as linear least-squares methods.
4

Müller, Nils, e Tobias Glasmachers. "Non-local optimization". In FOGA '21: Foundations of Genetic Algorithms XVI. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3450218.3477307.

5

Ladkany, George S., e Mohamed B. Trabia. "Incorporating Twinkling in Genetic Algorithms for Global Optimization". In ASME 2008 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/detc2008-49256.

Abstract:
Genetic algorithms have been used extensively as a reliable tool for global optimization; however, they suffer from slow convergence. This paper proposes a two-fold approach to address this limitation. The first part is to introduce a twinkling process within the crossover phase of a genetic algorithm. Twinkling can be incorporated into any standard algorithm by introducing a controlled random deviation from its standard progression, to avoid being trapped at a local minimum. The second part is to introduce a new crossover technique, the weighted-average normally-distributed arithmetic crossover, which is shown to enhance the rate of convergence. Two possible twinkling genetic algorithms are proposed. Their performance is successfully compared to that of simple genetic algorithms on various standard mathematical and engineering design problems. The twinkling genetic algorithms consistently reach known global minima, rather than nearby sub-optimal points, with a competitive rate of convergence.
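The crossover named in the abstract is not specified in detail here; the following is a hedged sketch of what a weighted-average crossover with a normally distributed blend weight could look like (the per-gene weight and `sigma = 0.25` are assumptions, not the paper's exact parameterization):

```python
import random

def wan_crossover(p1, p2, rng, sigma=0.25):
    # Blend each gene with a weight drawn from N(0.5, sigma^2), so the
    # child usually lies between the parents but can overshoot slightly.
    child = []
    for a, b in zip(p1, p2):
        w = rng.gauss(0.5, sigma)
        child.append(w * a + (1.0 - w) * b)
    return child

rng = random.Random(42)
child = wan_crossover([0.0, 2.0, -1.0], [1.0, 0.0, 3.0], rng)
```

When both parents agree on a gene, the child inherits that value regardless of the drawn weight, which keeps already-converged genes stable across generations.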
6

Khamisov, O. V. "Optimization with quadratic support functions in nonconvex smooth optimization". In NUMERICAL COMPUTATIONS: THEORY AND ALGORITHMS (NUMTA–2016): Proceedings of the 2nd International Conference “Numerical Computations: Theory and Algorithms”. Author(s), 2016. http://dx.doi.org/10.1063/1.4965331.

7

Service, Travis C., e Daniel R. Tauritz. "Co-optimization algorithms". In the 10th annual conference. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1389095.1389166.

8

Corne, David, e Alan Reynolds. "Evaluating optimization algorithms". In the 13th annual conference companion. New York, New York, USA: ACM Press, 2011. http://dx.doi.org/10.1145/2001858.2002073.

9

Alexandrov, Natalia, e J. Dennis, Jr. "Algorithms for bilevel optimization". In 5th Symposium on Multidisciplinary Analysis and Optimization. Reston, Virginia: American Institute of Aeronautics and Astronautics, 1994. http://dx.doi.org/10.2514/6.1994-4334.

10

Mukesh, R., K. Lingadurai e S. Karthick. "Aerodynamic optimization using proficient optimization algorithms". In 2012 International Conference on Computing, Communication and Applications (ICCCA). IEEE, 2012. http://dx.doi.org/10.1109/iccca.2012.6179183.


Organization reports on the topic "Optimization algorithms"

1

Parekh, Ojas D., Ciaran Ryan-Anderson e Sevag Gharibian. Quantum Optimization and Approximation Algorithms. Office of Scientific and Technical Information (OSTI), janeiro de 2019. http://dx.doi.org/10.2172/1492737.

2

Mifflin, Robert. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, julho de 1988. http://dx.doi.org/10.21236/ada204389.

3

Prieto, Francisco J. Sequential Quadratic Programming Algorithms for Optimization. Fort Belvoir, VA: Defense Technical Information Center, agosto de 1989. http://dx.doi.org/10.21236/ada212800.

4

Mifflin, Robert. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, julho de 1986. http://dx.doi.org/10.21236/ada182531.

5

Mifflin, R. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, julho de 1985. http://dx.doi.org/10.21236/ada159168.

6

Mifflin, Robert. Rapidly Convergent Algorithms for Nonsmooth Optimization. Fort Belvoir, VA: Defense Technical Information Center, dezembro de 1990. http://dx.doi.org/10.21236/ada231110.

7

Prieto, F. Sequential quadratic programming algorithms for optimization. Office of Scientific and Technical Information (OSTI), agosto de 1989. http://dx.doi.org/10.2172/5325989.

8

Apostolatos, A., B. Keith, C. Soriano e R. Rossi. D6.1 Deterministic optimization software. Scipedia, 2021. http://dx.doi.org/10.23967/exaqute.2021.2.018.

Abstract:
This deliverable focuses on the implementation of deterministic optimization algorithms and problem solvers within the KRATOS open-source software. One of the main challenges in finite-element-based optimization is obtaining the gradient of the response functions used as objectives and constraints when it is not available in explicit form. The idea is to use local sensitivity analysis to get the gradient of the response function(s).
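The deliverable's actual approach is local sensitivity analysis; purely as an illustration of obtaining a gradient when none is available in explicit form, a central finite-difference fallback looks like this (the response function and step size below are hypothetical):

```python
def fd_gradient(f, x, h=1e-6):
    # Approximate each partial derivative of a response function by
    # central finite differences, using two evaluations per coordinate.
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2.0 * h))
    return grad

# Hypothetical response function: f(x) = x0^2 + 3*x1
grad = fd_gradient(lambda v: v[0] ** 2 + 3.0 * v[1], [2.0, 1.0])
```

The cost grows as twice the number of design variables per gradient, which is exactly why adjoint-based sensitivity analysis is preferred in finite-element settings.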
9

Plotkin, Serge. Research in Graph Algorithms and Combinatorial Optimization. Fort Belvoir, VA: Defense Technical Information Center, março de 1995. http://dx.doi.org/10.21236/ada292630.

10

Nocedal, J. Algorithms and software for large scale optimization. Office of Scientific and Technical Information (OSTI), maio de 1990. http://dx.doi.org/10.2172/5688791.

