Academic literature on the topic 'Blackbox optimization'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Blackbox optimization.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Blackbox optimization":

1

Audet, Charles, Sébastien Le Digabel, and Mathilde Peyrega. "Linear equalities in blackbox optimization." Computational Optimization and Applications 61, no. 1 (October 19, 2014): 1–23. http://dx.doi.org/10.1007/s10589-014-9708-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Audet, Charles, J. E. Dennis, and Sébastien Le Digabel. "Trade-off studies in blackbox optimization." Optimization Methods and Software 27, no. 4-5 (October 2012): 613–24. http://dx.doi.org/10.1080/10556788.2011.571687.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Audet, Charles, Alain Batailly, and Solène Kojtych. "Escaping Unknown Discontinuous Regions in Blackbox Optimization." SIAM Journal on Optimization 32, no. 3 (August 4, 2022): 1843–70. http://dx.doi.org/10.1137/21m1420915.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Gramacy, Robert B., Genetha A. Gray, Sébastien Le Digabel, Herbert K. H. Lee, Pritam Ranjan, Garth Wells, and Stefan M. Wild. "Modeling an Augmented Lagrangian for Blackbox Constrained Optimization." Technometrics 58, no. 1 (January 2, 2016): 1–11. http://dx.doi.org/10.1080/00401706.2015.1014065.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Chen, Hao, and William J. Welch. "Comment: Expected Improvement for Efficient Blackbox Constrained Optimization." Technometrics 58, no. 1 (January 2, 2016): 12–15. http://dx.doi.org/10.1080/00401706.2015.1044119.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Audet, Charles, Gilles Caporossi, and Stéphane Jacquet. "Binary, unrelaxable and hidden constraints in blackbox optimization." Operations Research Letters 48, no. 4 (July 2020): 467–71. http://dx.doi.org/10.1016/j.orl.2020.05.011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Audet, Charles, Jean Bigeon, Romain Couderc, and Michael Kokkolaras. "Sequential stochastic blackbox optimization with zeroth-order gradient estimators." AIMS Mathematics 8, no. 11 (2023): 25922–56. http://dx.doi.org/10.3934/math.20231321.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This work considers stochastic optimization problems in which the objective function values can only be computed by a blackbox corrupted by some random noise following an unknown distribution. The proposed method is based on sequential stochastic optimization (SSO), i.e., the original problem is decomposed into a sequence of subproblems. Each subproblem is solved by using a zeroth-order version of a sign stochastic gradient descent with momentum algorithm (i.e., ZO-signum) and with increasingly fine precision. This decomposition allows a good exploration of the space while maintaining the efficiency of the algorithm once it gets close to the solution. Under the Lipschitz continuity assumption on the blackbox, a convergence rate in mean is derived for the ZO-signum algorithm. Moreover, if the blackbox is smooth and convex or locally convex around its minima, the rate of convergence to an $\epsilon$-optimal point of the problem may be obtained for the SSO algorithm. Numerical experiments are conducted to compare the SSO algorithm with other state-of-the-art algorithms and to demonstrate its competitiveness.
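For readers unfamiliar with zeroth-order methods, the following minimal Python sketch illustrates the general idea of a sign-based stochastic descent driven by finite-difference gradient estimates of a noisy blackbox. It is an illustration of the technique only, not the authors' ZO-signum or SSO implementation; the function names, step sizes and budgets are assumptions made for the example.

```python
import numpy as np

def zo_sign_sgd(f, x0, iters=500, mu=0.05, lr=0.01, beta=0.9, seed=0):
    """Minimize a noisy blackbox f with a zeroth-order (two-point) gradient
    estimate and a sign-of-momentum update. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)                          # momentum buffer
    for _ in range(iters):
        u = rng.standard_normal(x.shape)          # random probing direction
        # two-point estimate of the directional derivative along u
        g = (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
        m = beta * m + (1.0 - beta) * g           # smooth the noisy estimate
        x = x - lr * np.sign(m)                   # sign-based step
    return x

# Example on a noisy quadratic blackbox
if __name__ == "__main__":
    noise = np.random.default_rng(1)
    blackbox = lambda x: float(np.sum(x ** 2)) + 0.01 * noise.standard_normal()
    print(zo_sign_sgd(blackbox, x0=np.ones(5)))
```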
8

Audet, Charles, and Michael Kokkolaras. "Blackbox and derivative-free optimization: theory, algorithms and applications." Optimization and Engineering 17, no. 1 (February 1, 2016): 1–2. http://dx.doi.org/10.1007/s11081-016-9307-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Herraz, Mahfoud, Jean-Max Redonnet, Mohammed Sbihi, and Marcel Mongeau. "Blackbox optimization and surrogate models for machining free-form surfaces." Computers & Industrial Engineering 177 (March 2023): 109029. http://dx.doi.org/10.1016/j.cie.2023.109029.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Sankaran, Anush, Olivier Mastropietro, Ehsan Saboori, Yasser Idris, Davis Sawyer, MohammadHossein AskariHemmat, and Ghouthi Boukli Hacene. "Deeplite NeutrinoTM: A BlackBox Framework for Constrained Deep Learning Model Optimization." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 17 (May 18, 2021): 15166–74. http://dx.doi.org/10.1609/aaai.v35i17.17780.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Designing deep learning-based solutions is becoming a race for training deeper models with a greater number of layers. While a large, deep model can provide competitive accuracy, it creates a lot of logistical challenges and unreasonable resource requirements during development and deployment. This has been one of the key reasons for deep learning models not being extensively used in various production environments, especially in edge devices. There is an immediate requirement for optimizing and compressing these deep learning models to enable on-device intelligence. In this research, we introduce a black-box framework, Deeplite Neutrino™, for production-ready optimization of deep learning models. The framework provides an easy mechanism for end-users to provide constraints, such as a tolerable drop in accuracy or a target size for the optimized models, to guide the whole optimization process. The framework is easy to include in an existing production pipeline and is available as a Python package, supporting the PyTorch and TensorFlow libraries. The optimization performance of the framework is shown across multiple benchmark datasets and popular deep learning models. Further, the framework is currently used in production, and results and testimonials from several clients are summarized.

Dissertations / Theses on the topic "Blackbox optimization":

1

Dahito, Marie-Ange. "Constrained mixed-variable blackbox optimization with applications in the automotive industry." Electronic Thesis or Diss., Institut polytechnique de Paris, 2022. http://www.theses.fr/2022IPPAS017.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Numerous industrial optimization problems involve complex systems and have no explicit analytical formulation: they are blackbox optimization problems. They may be mixed, that is, involve variables of different types (continuous and discrete), and comprise many constraints that must be satisfied. In addition, the objective and constraint blackbox functions may be computationally expensive to evaluate. In this thesis, we investigate solution methods for such challenging problems, i.e., constrained mixed-variable blackbox optimization problems involving computationally expensive functions. As the use of derivatives is impractical, problems of this form are commonly tackled with derivative-free approaches such as evolutionary algorithms, direct search and surrogate-based methods. We investigate the performance of such deterministic and stochastic methods in the context of blackbox optimization, including on a finite element test case designed for our research purposes. In particular, the performance of the ORTHOMADS instantiation of the direct search algorithm MADS is analyzed on continuous and mixed-integer optimization problems from the literature. We also propose a new blackbox optimization algorithm, called BOA, based on surrogate approximations. It proceeds in two phases, the first of which focuses on finding a feasible solution, while the second iteratively improves the objective value of the best feasible solution found. Experiments on instances stemming from the literature and on applications from the automotive industry are reported. They include, in particular, results of our algorithm with different types of surrogates and comparisons with ORTHOMADS.
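The two-phase structure described in the abstract (first reach feasibility, then improve the objective among feasible points) can be conveyed by the following minimal Python sketch. It uses plain random search in place of the surrogate models that BOA actually relies on, so it should be read as an illustration of the phase structure under assumed names and budgets, not as the thesis algorithm.

```python
import numpy as np

def two_phase_search(f, g, lb, ub, budget=2000, seed=0):
    """Generic two-phase scheme for: min f(x) s.t. g(x) <= 0 (componentwise).
    Phase 1 minimizes the total constraint violation until a feasible point
    is found; phase 2 improves f over feasible samples near the incumbent.
    Random search stands in for the surrogate-driven steps of BOA."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    violation = lambda x: float(np.sum(np.maximum(g(x), 0.0)))

    # Phase 1: find a feasible incumbent
    best_x, best_v = rng.uniform(lb, ub), np.inf
    for _ in range(budget // 2):
        x = rng.uniform(lb, ub)
        v = violation(x)
        if v < best_v:
            best_x, best_v = x, v
        if best_v == 0.0:
            break
    if best_v > 0.0:
        return best_x, None                      # no feasible point found

    # Phase 2: improve the objective among feasible points
    best_f = f(best_x)
    for _ in range(budget // 2):
        x = np.clip(best_x + 0.1 * (ub - lb) * rng.standard_normal(lb.shape),
                    lb, ub)
        if violation(x) == 0.0 and f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f
```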
2

Anil, Gautham. "A Fitness Function Elimination Theory for Blackbox Optimization and Problem Class Learning." Doctoral diss., University of Central Florida, 2012. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5106.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The modern view of optimization is that optimization algorithms are not designed in a vacuum, but can make use of information regarding the broad class of objective functions from which a problem instance is drawn. Using this knowledge, we want to design optimization algorithms that execute quickly (efficiency), solve the objective function with minimal samples (performance), and are applicable over a wide range of problems (abstraction). However, we present a new theory for blackbox optimization from which we conclude that, of these three desired characteristics, only two can be maximized by any algorithm. We put forward an alternate view of optimization where we use knowledge about the problem class and samples from the problem instance to identify which problem instances from the class are being solved. From this Elimination of Fitness Functions approach, an idealized optimization algorithm that minimizes sample counts over any problem class, given complete knowledge about the class, is designed. This theory allows us to learn more about the difficulty of various problems, and we are able to use it to develop problem complexity bounds. We present general methods to model this algorithm over a particular problem class and gain efficiency at the cost of specifically targeting that class. This is demonstrated over the Generalized Leading-Ones problem and a generalization called LO'', and efficient algorithms with optimal performance are derived and analyzed. We also tighten existing bounds for LO'''. Additionally, we present a probabilistic framework based on our Elimination of Fitness Functions approach that clarifies how one can ideally learn about the problem class we face from the objective functions. This problem learning increases the performance of an optimization algorithm at the cost of abstraction. In the context of this theory, we re-examine the blackbox framework as an algorithm design framework and suggest several improvements to existing methods, including incorporating problem learning, not being restricted to the blackbox framework, and building parametrized algorithms. We feel that this theory and our recommendations will help a practitioner make substantially better use of all that is available in typical practical optimization algorithm design scenarios.
Ph.D. (Doctorate), Computer Science, Engineering and Computer Science
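The elimination idea at the heart of the abstract above can be made concrete with a small sketch: given a finite class of candidate objective functions, every blackbox evaluation discards the candidates whose values disagree with the observation, until the instance being solved is identified. The code below is a toy Python illustration under assumed names and a fixed query schedule, not the idealized algorithm analyzed in the dissertation.

```python
import numpy as np

def eliminate_fitness_functions(problem_class, oracle, queries):
    """Discard every candidate objective whose value disagrees with the
    observed blackbox response, until (ideally) one instance remains."""
    candidates = dict(problem_class)                  # name -> callable
    history = []
    for x in queries:                                 # simple fixed query schedule
        y = oracle(x)
        history.append((x, y))
        candidates = {name: fn for name, fn in candidates.items()
                      if np.isclose(fn(x), y)}
        if len(candidates) <= 1:
            break
    return candidates, history

# Toy example: a class of shifted 1-D quadratics, one of which is the hidden instance
if __name__ == "__main__":
    problem_class = {f"shift_{s}": (lambda x, s=s: (x - s) ** 2) for s in range(5)}
    hidden = problem_class["shift_3"]
    remaining, hist = eliminate_fitness_functions(
        problem_class, hidden, queries=np.linspace(-1.0, 1.0, 7))
    print(sorted(remaining), "after", len(hist), "evaluations")
```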
3

Bittar, Thomas. "Stochastic optimization of maintenance scheduling : blackbox methods, decomposition approaches - Theoretical and numerical aspects." Thesis, Marne-la-vallée, ENPC, 2021. http://www.theses.fr/2021ENPC2004.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The aim of the thesis is to develop algorithms for optimal maintenance scheduling. We focus on the specific case of large systems that consist of several components linked by a common stock of spare parts. The numerical experiments are carried out on systems of components from a single hydroelectric power plant. The first part is devoted to blackbox methods, which are commonly used in maintenance scheduling. We focus on a kriging-based algorithm, Efficient Global Optimization (EGO), and on a direct search method, Mesh Adaptive Direct Search (MADS). We present a theoretical and practical review of the algorithms as well as some improvements to the implementation of EGO. MADS and EGO are compared on an academic benchmark and on small industrial maintenance problems, showing the superiority of MADS but also the limitations of the blackbox approach when tackling large-scale problems. In a second part, we take into account the fact that the system is composed of several components linked by a common stock in order to address large-scale maintenance optimization problems. For that purpose, we develop a model of the dynamics of the studied system and formulate an explicit stochastic optimal control problem. We set up a scheme of decomposition by prediction, based on the Auxiliary Problem Principle (APP), that turns the resolution of the large-scale problem into the iterative resolution of a sequence of smaller subproblems. The decomposition is first applied to synthetic test cases, where it proves to be very efficient. For the industrial case, a "relaxation" of the system is needed and developed to apply the decomposition methodology. In the numerical experiments, we solve a Sample Average Approximation (SAA) of the problem and show that the decomposition leads to substantial gains over the reference algorithm. As we use an SAA method, we have considered the APP in a deterministic setting. In the third part, we study the APP in the stochastic approximation framework in a Banach space. We prove the measurability of the iterates of the algorithm, extend convergence results from Hilbert spaces to Banach spaces, and give efficiency estimates.
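As a concrete anchor for the kriging-based method mentioned above, the sketch below shows the generic EGO loop: fit a Gaussian process surrogate to the points evaluated so far, then spend the next evaluation on the candidate with the largest expected improvement. It relies on scikit-learn and SciPy, and the kernel, budgets and candidate-sampling inner maximization are illustrative assumptions rather than the configuration used in the thesis.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def ego(f, lb, ub, n_init=10, n_iter=30, n_cand=2000, seed=0):
    """Generic Efficient Global Optimization loop (minimization)."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X = rng.uniform(lb, ub, size=(n_init, lb.size))          # initial design
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)
        cand = rng.uniform(lb, ub, size=(n_cand, lb.size))   # inner maximization by sampling
        mu, sd = gp.predict(cand, return_std=True)
        best = y.min()
        z = (best - mu) / np.maximum(sd, 1e-12)
        ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)    # expected improvement
        x_next = cand[int(np.argmax(ei))]
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next))
    return X[int(np.argmin(y))], float(y.min())
```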
4

Hemker, Thomas. "Derivative free surrogate optimization for mixed integer nonlinear black box problems in engineering." Düsseldorf VDI-Verl, 2009. http://d-nb.info/995156654/04.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Draheim, Patrick [Verfasser], Gabriel [Akademischer Betreuer] Zachmann, Gabriel [Gutachter] Zachmann, and Marc-Erich [Gutachter] Latoschik. "New Concepts for Virtual Testbeds : Data Mining Algorithms for Blackbox Optimization based on Wait-Free Concurrency and Generative Simulation / Patrick Draheim ; Gutachter: Gabriel Zachmann, Marc-Erich Latoschik ; Betreuer: Gabriel Zachmann." Bremen : Staats- und Universitätsbibliothek Bremen, 2018. http://d-nb.info/1176103636/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Atamna, Asma. "Analysis of Randomized Adaptive Algorithms for Black-Box Continuous Constrained Optimization." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS010/document.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
We investigate various aspects of adaptive randomized (or stochastic) algorithms for both constrained and unconstrained black-box continuous optimization. The first part of this thesis focuses on step-size adaptation in unconstrained optimization. We first present a methodology for efficiently assessing a step-size adaptation mechanism that consists in testing a given algorithm on a minimal set of functions, each reflecting a particular difficulty that an efficient step-size adaptation algorithm should overcome. We then benchmark two step-size adaptation mechanisms on the well-known BBOB noiseless testbed of the COCO (comparing continuous optimizers) platform and compare their performance to that of the state-of-the-art evolution strategy (ES), CMA-ES, with cumulative step-size adaptation. In the second part of this thesis, we investigate linear convergence of a (1 + 1)-ES and of a general step-size adaptive randomized algorithm on a linearly constrained optimization problem, where an adaptive augmented Lagrangian approach is used to handle the constraints. To that end, we extend the Markov chain approach used to analyze randomized algorithms for unconstrained optimization to the constrained case. We prove that when the augmented Lagrangian associated to the problem, centered at the optimum and the corresponding Lagrange multipliers, is positive homogeneous of degree 2, then, for algorithms enjoying some invariance properties, there exists an underlying homogeneous Markov chain whose stability (typically positivity and Harris recurrence) leads to linear convergence to both the optimum and the corresponding Lagrange multipliers. We deduce linear convergence under the aforementioned stability assumptions by applying a law of large numbers for Markov chains. We also present a general framework to design an augmented-Lagrangian-based adaptive randomized algorithm for constrained optimization from an adaptive randomized algorithm for unconstrained optimization.
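To make the notion of step-size adaptation concrete, here is a minimal (1 + 1)-ES with the classical one-fifth success rule in Python. It illustrates the family of step-size adaptive algorithms discussed in the abstract in the unconstrained case only; the damping constant and update factors are textbook defaults chosen for the example, and the augmented-Lagrangian constraint handling analyzed in the thesis is not included.

```python
import numpy as np

def one_plus_one_es(f, x0, sigma0=1.0, iters=2000, seed=0):
    """(1+1)-ES with one-fifth-success-rule step-size adaptation (minimization)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    sigma, fx = float(sigma0), f(x)
    d = 1.0 + x.size                                   # damping for the step-size update
    for _ in range(iters):
        y = x + sigma * rng.standard_normal(x.shape)   # mutate the parent
        fy = f(y)
        if fy <= fx:                                   # success: accept and enlarge the step
            x, fx = y, fy
            sigma *= np.exp(0.8 / d)
        else:                                          # failure: shrink the step
            sigma *= np.exp(-0.2 / d)
    return x, fx

# Example: minimize the sphere function in dimension 10
if __name__ == "__main__":
    print(one_plus_one_es(lambda x: float(np.sum(x ** 2)), x0=np.ones(10)))
```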
7

Wang, Pei-Qi, and 王姵淇. "Apply Ant Colony Optimization to Test Case Prioritization for Blackbox Testing." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/5uz6rz.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Master's thesis, Chung Yuan Christian University, Graduate Institute of Industrial and Systems Engineering, academic year 99 (2010–2011).
A mature software product has to pass two testing processes before being released, white-box testing and black-box testing, implementing testing in as many different scenarios as possible to identify potential faults and defects. In order to improve software quality, testing engineers should perform regression testing for each software release. However, regression testing is a time-consuming and expensive procedure that requires executing numerous test cases. Due to time, cost and labor constraints, how to perform regression testing effectively has become an important issue, and test case prioritization is one of the effective techniques for addressing it. Test case prioritization techniques schedule test cases in an order that attempts to maximize effectiveness with respect to some performance goals. Previous literature considered prioritization factors based on costs, requirements, number of faults, fault severities, and so on; it has not considered the inter-dependency between test cases. Unlike past research on the test case prioritization problem, we propose test case prioritization techniques based on severity, complexity and inter-dependency in black-box testing. Based on ant colony optimization, combined with the Maximum Partial Order/Arbitrary Insertion (MPO/AI) method, we generate test case execution orders with precedence constraints. In this study, the proposed scheduling method is compared with other algorithms. For the SOP problem, experimental results indicate that our technique yields values comparable to those of other algorithms while significantly shortening execution time. For the TCP problem, we obtain better results than the comparison algorithms. In a practical application to graphical user interface (GUI) testing, the testing effectiveness is superior to that of the comparison algorithms. The experimental results therefore show that our technique is useful for prioritizing test cases.
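The following short Python sketch conveys how an ant colony could construct test-case orderings that respect precedence constraints while favoring severe or complex cases early; it is a simplified illustration under assumed scoring and pheromone-update rules, not the MPO/AI-based method developed in the thesis.

```python
import numpy as np

def aco_prioritize(scores, deps, n_ants=20, n_iter=50, rho=0.1, seed=0):
    """Order test cases so that high-score cases (e.g., severity * complexity,
    assumed strictly positive) come early while respecting deps[j] = set of
    test cases that must run before j (assumed to form a DAG)."""
    rng = np.random.default_rng(seed)
    weights = np.asarray(scores, dtype=float)
    n = len(weights)
    tau = np.ones(n)                                   # pheromone per test case
    best_order, best_val = None, -np.inf

    def build_order():                                 # one ant builds a feasible order
        remaining, done, order = set(range(n)), set(), []
        while remaining:
            ready = [j for j in remaining if deps.get(j, set()) <= done]
            p = np.array([tau[j] * weights[j] for j in ready])
            j = ready[int(rng.choice(len(ready), p=p / p.sum()))]
            order.append(j); done.add(j); remaining.remove(j)
        return order

    def value(order):                                  # reward early positions of heavy cases
        return sum(weights[t] / (pos + 1) for pos, t in enumerate(order))

    for _ in range(n_iter):
        orders = [build_order() for _ in range(n_ants)]
        vals = [value(o) for o in orders]
        i = int(np.argmax(vals))
        if vals[i] > best_val:
            best_order, best_val = orders[i], vals[i]
        tau *= (1.0 - rho)                             # evaporation
        for pos, t in enumerate(best_order):           # reinforce the best-so-far order
            tau[t] += 1.0 / (pos + 1)
    return best_order

# Example: four test cases, test 2 depends on test 0
if __name__ == "__main__":
    print(aco_prioritize(scores=[3.0, 1.0, 5.0, 2.0], deps={2: {0}}))
```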

Books on the topic "Blackbox optimization":

1

Audet, Charles, and Warren Hare. Derivative-Free and Blackbox Optimization. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68913-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Audet, Charles, and Warren Hare. Derivative-Free and Blackbox Optimization. Springer, 2018.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Audet, Charles, and Warren Hare. Derivative-Free and Blackbox Optimization. Springer, 2017.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Blackbox optimization":

1

Audet, Charles. "Blackbox Optimization." In Encyclopedia of Optimization, 1–6. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-54621-2_723-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Audet, Charles, and Warren Hare. "Biobjective Optimization." In Derivative-Free and Blackbox Optimization, 247–62. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68913-5_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Audet, Charles, and Warren Hare. "Optimization Using Surrogates and Models." In Derivative-Free and Blackbox Optimization, 235–46. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68913-5_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Audet, Charles, and Warren Hare. "Positive Bases and Nonsmooth Optimization." In Derivative-Free and Blackbox Optimization, 95–114. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68913-5_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Audet, Charles, and Warren Hare. "Introduction: Tools and Challenges in Derivative-Free and Blackbox Optimization." In Derivative-Free and Blackbox Optimization, 3–14. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68913-5_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Audet, Charles, and Warren Hare. "Model-Based Descent." In Derivative-Free and Blackbox Optimization, 183–200. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68913-5_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Audet, Charles, and Warren Hare. "Model-Based Trust Region." In Derivative-Free and Blackbox Optimization, 201–18. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68913-5_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Audet, Charles, and Warren Hare. "Variables and Constraints." In Derivative-Free and Blackbox Optimization, 221–34. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68913-5_12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Audet, Charles, and Warren Hare. "Mathematical Background." In Derivative-Free and Blackbox Optimization, 15–31. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68913-5_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Audet, Charles, and Warren Hare. "The Beginnings of DFO Algorithms." In Derivative-Free and Blackbox Optimization, 33–54. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68913-5_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Blackbox optimization":

1

Hawkins, Byron, Brian Demsky, and Michael B. Taylor. "BlackBox: lightweight security monitoring for COTS binaries." In CGO '16: 14th Annual IEEE/ACM International Symposium on Code Generation and Optimization. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2854038.2854062.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Dev, Rahul, Krishanu Kundu, Amrita Rai, Shiv Narain Gupta, Abhishek Kaushik, and Reshu Agarwal. "Design and Implementation of Blackbox in Vehicles." In 2024 11th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO). IEEE, 2024. http://dx.doi.org/10.1109/icrito61523.2024.10522349.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Kurhe, Vaibhav Kiran, Pratik Karia, Shubhani Gupta, Abhishek Rose, and Sorav Bansal. "Automatic Generation of Debug Headers through BlackBox Equivalence Checking." In 2022 IEEE/ACM International Symposium on Code Generation and Optimization (CGO). IEEE, 2022. http://dx.doi.org/10.1109/cgo53902.2022.9741273.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

"Managing Computationally Expensive Blackbox Multiobjective Optimization Problems with Libensemble." In 2020 Spring Simulation Conference. Society for Modeling and Simulation International (SCS), 2020. http://dx.doi.org/10.22360/springsim.2020.hpc.001.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Hutter, Frank, Holger Hoos, and Kevin Leyton-Brown. "An evaluation of sequential model-based optimization for expensive blackbox functions." In Proceedings of the fifteenth annual conference companion. New York, New York, USA: ACM Press, 2013. http://dx.doi.org/10.1145/2464576.2501592.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Audet, Charles, Sébastien Le Digabel, Ludovic Salomon, and Christophe Tribes. "Constrained blackbox optimization with the NOMAD solver on the COCO constrained test suite." In GECCO '22: Genetic and Evolutionary Computation Conference. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3520304.3534019.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Tehrani, H. Mazaheri, A. Frances, R. Asensi, and J. Uceda. "Blackbox Equivalent Switching Model Identification of DC-DC Power Electronic Converters Using Optimization Algorithms." In 2021 IEEE Fourth International Conference on DC Microgrids (ICDCM). IEEE, 2021. http://dx.doi.org/10.1109/icdcm50975.2021.9504611.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Hartpence, Bruce, and Andres Kwasinski. "Considering the Blackbox: An Investigation of Optimization Techniques with Completely Balanced Datasets of Packet Traffic." In 2019 IEEE International Conference on Big Data (Big Data). IEEE, 2019. http://dx.doi.org/10.1109/bigdata47090.2019.9006508.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Chamseddine, Ibrahim M., and Michael Kokkolaras. "Bio-Inspired Heuristic for Decoupling Network Configuration in Air Transportation System-of-Systems Design Optimization." In ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/detc2016-59424.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Previous work in air transportation system-of-systems design optimization considered integrated aircraft sizing, fleet allocation and route network configuration. The associated nested multidisciplinary formulation posed a numerically challenging optimization problem; therefore, direct search methods with convergence properties were used to solve it. However, the complexity of the blackbox greatly impedes the solution of larger-scale problems, where the number of considered nodes in the route network is high. The research presented here adopts a rule-based route network design inspired by biological transfer principles. This bio-inspired approach decouples the network configuration problem from the optimization loop, leading to significant numerical simplifications. The usefulness of the bio-inspired approach is demonstrated by comparing its results to those obtained using the nested formulation for a 15-city network. We then consider the introduction of new aircraft as well as a larger problem with 20 cities.
10

Wang, Lv, Teng Long, Lei Peng, and Li Liu. "Optimized Radial Basis Function Metamodel for Expensive Engineering Design Optimization." In ASME 2012 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/imece2012-87489.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
With the aim of alleviating the computational burden of complicated engineering optimization problems, metamodels have been widely employed to approximate expensive blackbox functions. Among the popular metamodeling methods, the RBF metamodel balances global approximation accuracy, computational cost and implementation difficulty well. However, the approximation accuracy of an RBF metamodel is heavily influenced by the width factors of its kernel functions, which are hard to determine and depend on the numerical behavior of the expensive functions and the distribution of samples. The main contribution of this paper is an optimized RBF (ORBF) metamodel intended to improve global approximation capability at an affordable extra computational cost. Several numerical problems are used to compare the global approximation performance of the proposed ORBF metamodeling methods and to determine the most promising optimization approach. The proposed ORBF is also adopted in an adaptive metamodel-based optimization method. Two numerical benchmark examples and an I-beam design optimization problem are used to validate the adaptive metamodel-based optimization method using the ORBF metamodel. It is demonstrated that ORBF metamodeling is beneficial for improving the optimization efficiency and global convergence capability for expensive engineering optimization problems.
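As a complement to the abstract above, the sketch below shows a plain Gaussian RBF metamodel in Python and a simple leave-one-out way of picking the width factor; it only illustrates why the width factor matters and is an assumption-laden stand-in, not the ORBF method proposed in the paper.

```python
import numpy as np
from scipy.spatial.distance import cdist

def fit_rbf(X, y, width):
    """Fit a Gaussian RBF interpolant s(z) = sum_i w_i * exp(-||z - x_i||^2 / width^2)."""
    Phi = np.exp(-cdist(X, X) ** 2 / width ** 2)
    w = np.linalg.solve(Phi + 1e-10 * np.eye(len(X)), y)     # small ridge for stability
    return lambda Z: np.exp(-cdist(np.atleast_2d(Z), X) ** 2 / width ** 2) @ w

def tune_width(X, y, widths):
    """Pick the width factor by leave-one-out cross-validation error (illustrative)."""
    def loo_error(width):
        err = 0.0
        for i in range(len(X)):
            mask = np.arange(len(X)) != i
            surrogate = fit_rbf(X[mask], y[mask], width)
            err += float((surrogate(X[i])[0] - y[i]) ** 2)
        return err
    return min(widths, key=loo_error)

# Example: approximate a 2-D quadratic from 30 samples
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(30, 2))
    y = np.sum(X ** 2, axis=1)
    best_width = tune_width(X, y, widths=[0.25, 0.5, 1.0, 2.0])
    model = fit_rbf(X, y, best_width)
    print(best_width, float(model([0.3, -0.2])[0]))
```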
