Scientific literature on the topic "Machine learning, Global Optimization"
Create an accurate reference in APA, MLA, Chicago, Harvard, and several other citation styles
Contents
Browse thematic lists of journal articles, books, theses, conference reports, and other academic sources on the topic "Machine learning, Global Optimization".
Next to every source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference for the selected source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, when this information is included in the metadata.
Journal articles on the topic "Machine learning, Global Optimization"
Cassioli, A., D. Di Lorenzo, M. Locatelli, F. Schoen, and M. Sciandrone. "Machine learning for global optimization." Computational Optimization and Applications 51, no. 1 (May 5, 2010): 279–303. http://dx.doi.org/10.1007/s10589-010-9330-x.
Kudyshev, Zhaxylyk A., Alexander V. Kildishev, Vladimir M. Shalaev, and Alexandra Boltasseva. "Machine learning–assisted global optimization of photonic devices." Nanophotonics 10, no. 1 (October 28, 2020): 371–83. http://dx.doi.org/10.1515/nanoph-2020-0376.
Abdul Salam, Mustafa, Ahmad Taher Azar, and Rana Hussien. "Swarm-Based Extreme Learning Machine Models for Global Optimization." Computers, Materials & Continua 70, no. 3 (2022): 6339–63. http://dx.doi.org/10.32604/cmc.2022.020583.
TAKAMATSU, Ryosuke, and Wataru YAMAZAKI. "Global topology optimization of supersonic airfoil using machine learning technologies." Proceedings of The Computational Mechanics Conference 2021.34 (2021): 112. http://dx.doi.org/10.1299/jsmecmd.2021.34.112.
Tsoulos, Ioannis G., Alexandros Tzallas, Evangelos Karvounis, and Dimitrios Tsalikakis. "NeuralMinimizer: A Novel Method for Global Optimization." Information 14, no. 2 (January 25, 2023): 66. http://dx.doi.org/10.3390/info14020066.
Honda, M., and E. Narita. "Machine-learning assisted steady-state profile predictions using global optimization techniques." Physics of Plasmas 26, no. 10 (October 2019): 102307. http://dx.doi.org/10.1063/1.5117846.
Wu, Shaohua, Yong Hu, Wei Wang, Xinyong Feng, and Wanneng Shu. "Application of Global Optimization Methods for Feature Selection and Machine Learning." Mathematical Problems in Engineering 2013 (2013): 1–8. http://dx.doi.org/10.1155/2013/241517.
Ma, Sicong, Cheng Shang, Chuan-Ming Wang, and Zhi-Pan Liu. "Thermodynamic rules for zeolite formation from machine learning based global optimization." Chemical Science 11, no. 37 (2020): 10113–18. http://dx.doi.org/10.1039/d0sc03918g.
Huang, Si-Da, Cheng Shang, Pei-Lin Kang, and Zhi-Pan Liu. "Atomic structure of boron resolved using machine learning and global sampling." Chemical Science 9, no. 46 (2018): 8644–55. http://dx.doi.org/10.1039/c8sc03427c.
Barkalov, Konstantin, Ilya Lebedev, and Evgeny Kozinov. "Acceleration of Global Optimization Algorithm by Detecting Local Extrema Based on Machine Learning." Entropy 23, no. 10 (September 28, 2021): 1272. http://dx.doi.org/10.3390/e23101272.
Theses on the topic "Machine learning, Global Optimization"
Nowak, Hans, II (Hans Antoon). "Strategic capacity planning using data science, optimization, and machine learning." Thesis, Massachusetts Institute of Technology, 2020. https://hdl.handle.net/1721.1/126914.
Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, in conjunction with the Leaders for Global Operations Program at MIT, May 2020
Cataloged from the official PDF of thesis.
Includes bibliographical references (pages 101-104).
Raytheon's Circuit Card Assembly (CCA) factory in Andover, MA is Raytheon's largest factory and the largest Department of Defense (DOD) CCA manufacturer in the world. With over 500 operations, it manufactures over 7000 unique parts with a high degree of complexity and varying levels of demand. Recently, the factory has seen an increase in demand, making it essential to continuously analyze factory capacity and plan strategically for future operations. This study develops a sustainable strategic capacity optimization model and a capacity visualization tool that integrate demand data with historical manufacturing data. Through automated data mining of factory data sources, capacity utilization and overall equipment effectiveness (OEE) for factory operations are evaluated. Machine learning methods are then assessed to obtain an accurate estimate of cycle time (CT) throughout the factory. Finally, a mixed-integer nonlinear program (MINLP) integrates the capacity utilization framework and the machine learning predictions to compute optimal strategic capacity planning decisions. Capacity utilization and OEE models are shown to be derivable through automated data mining of factory data sources. The machine learning models achieve a mean absolute error (MAE) of 1.55 on predictions for new data, which is 76.3% lower than the current CT prediction error. Finally, the MINLP is solved to optimality within a tolerance of 1.00e-04 and generates resource and production decisions that can be acted upon.
by Hans Nowak II.
M.B.A.
S.M.
M.B.A. Massachusetts Institute of Technology, Sloan School of Management
S.M. Massachusetts Institute of Technology, Department of Mechanical Engineering
Veluscek, Marco. "Global supply chain optimization: a machine learning perspective to improve Caterpillar's logistics operations." Thesis, Brunel University, 2016. http://bura.brunel.ac.uk/handle/2438/13050.
Schweidtmann, Artur M. [Verfasser], Alexander [Akademischer Betreuer] Mitsos, and Andreas [Akademischer Betreuer] Schuppert. "Global optimization of processes through machine learning / Artur M. Schweidtmann; Alexander Mitsos, Andreas Schuppert." Aachen: Universitätsbibliothek der RWTH Aachen, 2021. http://d-nb.info/1240690924/34.
Taheri, Mehdi. "Machine Learning from Computer Simulations with Applications in Rail Vehicle Dynamics and System Identification." Diss., Virginia Tech, 2016. http://hdl.handle.net/10919/81417.
Texte intégralPh. D.
Gabere, Musa Nur. "Prediction of antimicrobial peptides using hyperparameter optimized support vector machines." Thesis, University of the Western Cape, 2011. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_7345_1330684697.
Antimicrobial peptides (AMPs) play a key role in the innate immune response. They can be found ubiquitously in a wide range of eukaryotes, including mammals, amphibians, insects, plants, and protozoa. In lower organisms, AMPs function merely as antibiotics by permeabilizing cell membranes and lysing invading microbes. Prediction of antimicrobial peptides is important because the experimental methods used to characterize AMPs are costly, time-consuming, and resource-intensive, and the identification of AMPs in insects can serve as a template for the design of novel antibiotics. To this end, data on antimicrobial peptides are first extracted from UniProt, manually curated, and stored in a centralized database called the dragon antimicrobial peptide database (DAMPD). Secondly, based on the curated data, models to predict antimicrobial peptides are created using support vector machines with optimized hyperparameters. In particular, global optimization methods such as grid search, pattern search, and derivative-free methods are used to optimize the SVM hyperparameters. These models are useful for characterizing unknown antimicrobial peptides. Finally, a webserver is created to predict antimicrobial peptides in haematophagous insects such as Glossina morsitans and Anopheles gambiae.
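The grid-search strategy mentioned in this abstract is straightforward to sketch. The snippet below is purely illustrative and not from the thesis: the `validation_loss` surrogate and its optimum are invented stand-ins for an SVM's cross-validation score over `C` and `gamma`. The idea it demonstrates is the generic one: exhaustively evaluate every hyperparameter combination on a grid and keep the best.

```python
import itertools
import math

def grid_search(evaluate, grid):
    """Exhaustively evaluate every hyperparameter combination and keep the best.

    `evaluate` maps a dict of hyperparameters to a validation loss (lower is
    better); `grid` maps each hyperparameter name to the values to try.
    """
    names = sorted(grid)
    best_params, best_loss = None, math.inf
    for values in itertools.product(*(grid[n] for n in names)):
        params = dict(zip(names, values))
        loss = evaluate(params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

# Stand-in for an SVM cross-validation loss: a smooth bowl whose minimum
# sits at log10(C) = 1 and log10(gamma) = -2 (invented for the example).
def validation_loss(p):
    return (math.log10(p["C"]) - 1) ** 2 + (math.log10(p["gamma"]) + 2) ** 2

grid = {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1, 1]}
params, loss = grid_search(validation_loss, grid)  # -> {"C": 10, "gamma": 0.01}
```

Pattern search and other derivative-free methods cited in the abstract differ only in how they pick the next configuration to evaluate; the evaluate-and-compare loop stays the same.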
Belkhir, Nacim. "Per Instance Algorithm Configuration for Continuous Black Box Optimization." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS455/document.
This PhD thesis focuses on automated algorithm configuration, which aims at finding the best parameter setting for a given problem or class of problems. The algorithm configuration problem thus amounts to a meta-optimization problem in the space of parameters, whose meta-objective is the performance measure of the algorithm at hand under a given parameter configuration. In the continuous domain, however, such a method can only be assessed empirically, at the cost of running the algorithm on a set of problem instances. More recent approaches rely on a description of problems in some feature space, and try to learn a mapping from this feature space onto the space of parameter configurations of the algorithm at hand. Along these lines, this PhD thesis focuses on Per Instance Algorithm Configuration (PIAC) for solving continuous black box optimization problems, where only a limited computational budget is available. We first survey evolutionary algorithms for continuous optimization, with a focus on the two algorithms that we use as target algorithms for PIAC: DE and CMA-ES. Next, we review the state of the art of algorithm configuration approaches, and the different features that have been proposed in the literature to describe continuous black box optimization problems. We then introduce a general methodology to empirically study PIAC for the continuous domain, so that all the components of PIAC can be explored in real-world conditions. To this end, we also introduce a new continuous black box test bench, distinct from the well-known BBOB benchmark, composed of several multi-dimensional test functions with different problem properties, gathered from the literature.
The methodology is finally applied to two EAs. First we use Differential Evolution as target algorithm, and explore all the components of PIAC so as to empirically assess the best ones. Second, based on the results for DE, we empirically investigate PIAC with the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) as target algorithm. Both use cases empirically validate the proposed methodology on the new black box test bench for dimensions up to 100.
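Differential Evolution, one of the two target algorithms configured in this thesis, is compact enough to sketch in full. The following is a minimal, self-contained DE/rand/1/bin implementation in Python — an illustrative sketch, not the author's code; the population size and the `F` and `CR` defaults are arbitrary textbook choices, and are exactly the kind of parameters that per-instance algorithm configuration would tune.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimize f over box `bounds` with classic DE/rand/1/bin."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fitness = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals, none equal to the current one.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # Binomial crossover with one guaranteed mutant coordinate,
            # clipped back into the feasible box.
            j_rand = rng.randrange(dim)
            trial = [
                min(max(mutant[d], bounds[d][0]), bounds[d][1])
                if (rng.random() < CR or d == j_rand) else pop[i][d]
                for d in range(dim)
            ]
            ft = f(trial)
            if ft <= fitness[i]:  # greedy selection
                pop[i], fitness[i] = trial, ft
    best = min(range(pop_size), key=fitness.__getitem__)
    return pop[best], fitness[best]

# Sphere function: global minimum 0 at the origin.
sphere = lambda x: sum(v * v for v in x)
x_best, f_best = differential_evolution(sphere, [(-5.0, 5.0)] * 3)
```

On this easy separable test function DE converges quickly; the harder, multi-dimensional functions in the thesis's test bench are precisely where the choice of `F`, `CR`, and population size starts to matter.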
Liu, Liu. "Stochastic Optimization in Machine Learning." Thesis, The University of Sydney, 2019. http://hdl.handle.net/2123/19982.
Leblond, Rémi. "Asynchronous optimization for machine learning." Thesis, Paris Sciences et Lettres (ComUE), 2018. http://www.theses.fr/2018PSLEE057/document.
The impressive breakthroughs of the last two decades in the field of machine learning can be in large part attributed to the explosion of computing power and available data. These two limiting factors have been replaced by a new bottleneck: algorithms. The focus of this thesis is thus on introducing novel methods that can take advantage of high data quantity and computing power. We present two independent contributions. First, we develop and analyze novel fast optimization algorithms which take advantage of advances in parallel computing architecture and can handle vast amounts of data. We introduce a new framework of analysis for asynchronous parallel incremental algorithms, which enables correct and simple proofs. We then demonstrate its usefulness by performing the convergence analysis for several methods, including two novel algorithms. Asaga is a sparse asynchronous parallel variant of the variance-reduced algorithm Saga which enjoys fast linear convergence rates on smooth and strongly convex objectives. We prove that it can be linearly faster than its sequential counterpart, even without sparsity assumptions. ProxAsaga is an extension of Asaga to the more general setting where the regularizer can be non-smooth. We prove that it can also achieve a linear speedup. We provide extensive experiments comparing our new algorithms to the current state of the art. Second, we introduce new methods for complex structured prediction tasks. We focus on recurrent neural networks (RNNs), whose traditional training algorithm – based on maximum likelihood estimation (MLE) – suffers from several issues. The associated surrogate training loss notably ignores the information contained in structured losses and introduces discrepancies between train and test times that may hurt performance. To alleviate these problems, we propose SeaRNN, a novel training algorithm for RNNs inspired by the "learning to search" approach to structured prediction.
SeaRNN leverages test-alike search space exploration to introduce global-local losses that are closer to the test error than the MLE objective. We demonstrate improved performance over MLE on three challenging tasks, and provide several subsampling strategies that enable SeaRNN to scale to large-scale tasks such as machine translation. Finally, after contrasting the behavior of SeaRNN models with that of MLE models, we conduct an in-depth comparison of our new approach to the related work.
Bai, Hao. "Machine learning assisted probabilistic prediction of long-term fatigue damage and vibration reduction of wind turbine tower using active damping system." Thesis, Normandie, 2021. http://www.theses.fr/2021NORMIR01.
This dissertation is devoted to the development of an active damping system for reducing the vibration of wind turbine towers under gusty and turbulent wind. Vibrations often lead either to an ultimate deflection at the top of the wind tower or to a failure due to material fatigue near the bottom of the tower. Furthermore, given the random nature of wind conditions, it is indispensable to look at this problem from a probabilistic point of view. In this work, a probabilistic framework for fatigue analysis is developed and improved by using a residual neural network. A damping system employing an active damper, the Twin Rotor Damper, is designed for the NREL 5MW reference wind turbine. The design is optimized by an evolutionary algorithm with an automatic parameter tuning method based on exploitation and exploration.
Chang, Allison An. "Integer optimization methods for machine learning." Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/72643.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (p. 129-137).
In this thesis, we propose new mixed integer optimization (MIO) methods to address problems in machine learning. The first part develops methods for supervised bipartite ranking, which arises in prioritization tasks in diverse domains such as information retrieval, recommender systems, natural language processing, bioinformatics, and preventative maintenance. The primary advantage of using MIO for ranking is that it allows for direct optimization of ranking quality measures, as opposed to current state-of-the-art algorithms that use heuristic loss functions. We demonstrate using a number of datasets that our approach can outperform other ranking methods. The second part of the thesis focuses on reverse-engineering ranking models. This is an application of a more general ranking problem than the bipartite case. Quality rankings affect business for many organizations, and knowing the ranking models would allow these organizations to better understand the standards by which their products are judged and help them to create higher quality products. We introduce an MIO method for reverse-engineering such models and demonstrate its performance in a case study with real data from a major ratings company. We also devise an approach to find the most cost-effective way to increase the rank of a certain product. In the final part of the thesis, we develop MIO methods to first generate association rules and then use the rules to build an interpretable classifier in the form of a decision list, which is an ordered list of rules. These are both combinatorially challenging problems because even a small dataset may yield a large number of rules and a small set of rules may correspond to many different orderings. We show how to use MIO to mine useful rules, as well as to construct a classifier from them. We present results in terms of both classification accuracy and interpretability for a variety of datasets.
by Allison An Chang.
Ph.D.
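The rule-mining-plus-ordering idea in the final part of the abstract above can be illustrated without any MIO machinery. The sketch below is purely illustrative and is not the thesis's method: it mines only single-feature rules and orders them greedily by confidence rather than solving an integer program, but it shows the decision-list structure — an ordered list of rules where the first matching rule decides the label.

```python
from collections import Counter

def mine_rules(data, labels, min_confidence=0.7):
    """Mine single-feature rules (feature == value -> label), keeping those
    whose confidence on the training data clears the threshold."""
    rules = []
    n_features = len(data[0])
    for f in range(n_features):
        for v in set(row[f] for row in data):
            matched = [y for row, y in zip(data, labels) if row[f] == v]
            label, count = Counter(matched).most_common(1)[0]
            confidence = count / len(matched)
            if confidence >= min_confidence:
                rules.append((confidence, f, v, label))
    # Order the decision list by confidence, most reliable rules first.
    return sorted(rules, reverse=True)

def classify(rules, row, default):
    """Apply the first matching rule; fall back to the default label."""
    for _, f, v, label in rules:
        if row[f] == v:
            return label
    return default

# Toy dataset (invented): [weather, temperature] -> play outside?
data = [["sun", "hot"], ["sun", "mild"], ["rain", "mild"], ["rain", "cold"]]
labels = ["yes", "yes", "no", "no"]
rules = mine_rules(data, labels)
pred = classify(rules, ["sun", "mild"], default="no")  # -> "yes"
```

The MIO formulation in the thesis replaces the greedy confidence ordering with an exact optimization over rule subsets and orderings, trading this sketch's simplicity for accuracy and interpretability guarantees.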
Books on the topic "Machine learning, Global Optimization"
Optimization for machine learning. Cambridge, Mass.: MIT Press, 2012.
Lin, Zhouchen, Huan Li, and Cong Fang. Accelerated Optimization for Machine Learning. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-2910-8.
Agrawal, Tanay. Hyperparameter Optimization in Machine Learning. Berkeley, CA: Apress, 2021. http://dx.doi.org/10.1007/978-1-4842-6579-6.
Fazelnia, Ghazal. Optimization for Probabilistic Machine Learning. [New York, N.Y.?]: [publisher not identified], 2019.
Nicosia, Giuseppe, Varun Ojha, Emanuele La Malfa, Gabriele La Malfa, Giorgio Jansen, Panos M. Pardalos, Giovanni Giuffrida, and Renato Umeton, eds. Machine Learning, Optimization, and Data Science. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-95470-3.
Nicosia, Giuseppe, Varun Ojha, Emanuele La Malfa, Gabriele La Malfa, Giorgio Jansen, Panos M. Pardalos, Giovanni Giuffrida, and Renato Umeton, eds. Machine Learning, Optimization, and Data Science. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-95467-3.
Jiang, Jiawei, Bin Cui, and Ce Zhang. Distributed Machine Learning and Gradient Optimization. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-3420-8.
Pardalos, Panos, Mario Pavone, Giovanni Maria Farinella, and Vincenzo Cutello, eds. Machine Learning, Optimization, and Big Data. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27926-8.
Nicosia, Giuseppe, Panos Pardalos, Renato Umeton, Giovanni Giuffrida, and Vincenzo Sciacca, eds. Machine Learning, Optimization, and Data Science. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-37599-7.
Kulkarni, Anand J., and Suresh Chandra Satapathy, eds. Optimization in Machine Learning and Applications. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-0994-0.
Texte intégralChapitres de livres sur le sujet "Machine learning, Global Optimization"
Kearfott, Ralph Baker. « Mathematically Rigorous Global Optimization and Fuzzy Optimization ». Dans Black Box Optimization, Machine Learning, and No-Free Lunch Theorems, 169–94. Cham : Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-66515-9_7.
Texte intégralde Winter, Roy, Bas van Stein, Matthys Dijkman et Thomas Bäck. « Designing Ships Using Constrained Multi-objective Efficient Global Optimization ». Dans Machine Learning, Optimization, and Data Science, 191–203. Cham : Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-13709-0_16.
Texte intégralCocola, Jorio, et Paul Hand. « Global Convergence of Sobolev Training for Overparameterized Neural Networks ». Dans Machine Learning, Optimization, and Data Science, 574–86. Cham : Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-64583-0_51.
Texte intégralZabinsky, Zelda B., Giulia Pedrielli et Hao Huang. « A Framework for Multi-fidelity Modeling in Global Optimization Approaches ». Dans Machine Learning, Optimization, and Data Science, 335–46. Cham : Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-37599-7_28.
Texte intégralGriewank, Andreas, et Ángel Rojas. « Treating Artificial Neural Net Training as a Nonsmooth Global Optimization Problem ». Dans Machine Learning, Optimization, and Data Science, 759–70. Cham : Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-37599-7_64.
Texte intégralIssa, Mohamed, Aboul Ella Hassanien et Ibrahim Ziedan. « Performance Evaluation of Sine-Cosine Optimization Versus Particle Swarm Optimization for Global Sequence Alignment Problem ». Dans Machine Learning Paradigms : Theory and Application, 375–91. Cham : Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-02357-7_18.
Texte intégralWang, Yong-Jun, Jiang-She Zhang et Yu-Fen Zhang. « An Effective and Efficient Two Stage Algorithm for Global Optimization ». Dans Advances in Machine Learning and Cybernetics, 487–96. Berlin, Heidelberg : Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11739685_51.
Texte intégralKiranyaz, Serkan, Turker Ince et Moncef Gabbouj. « Improving Global Convergence ». Dans Multidimensional Particle Swarm Optimization for Machine Learning and Pattern Recognition, 101–49. Berlin, Heidelberg : Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-37846-1_5.
Texte intégralConsoli, Sergio, Luca Tiozzo Pezzoli et Elisa Tosetti. « Using the GDELT Dataset to Analyse the Italian Sovereign Bond Market ». Dans Machine Learning, Optimization, and Data Science, 190–202. Cham : Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-64583-0_18.
Texte intégralRodrigues, Douglas, Gustavo Henrique de Rosa, Leandro Aparecido Passos et João Paulo Papa. « Adaptive Improved Flower Pollination Algorithm for Global Optimization ». Dans Nature-Inspired Computation in Data Mining and Machine Learning, 1–21. Cham : Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-28553-1_1.
Conference papers on the topic "Machine learning, Global Optimization"
He, Yi-chao, and Kun-qi Liu. "A Modified Particle Swarm Optimization for Solving Global Optimization Problems." In 2006 International Conference on Machine Learning and Cybernetics. IEEE, 2006. http://dx.doi.org/10.1109/icmlc.2006.258615.
Tamura, Kenichi, and Keiichiro Yasuda. "Spiral Multipoint Search for Global Optimization." In 2011 Tenth International Conference on Machine Learning and Applications (ICMLA). IEEE, 2011. http://dx.doi.org/10.1109/icmla.2011.131.
Wang, Yong-Jun, Jiang-She Zhang, and Yu-Fen Zhang. "A fast hybrid algorithm for global optimization." In Proceedings of 2005 International Conference on Machine Learning and Cybernetics. IEEE, 2005. http://dx.doi.org/10.1109/icmlc.2005.1527462.
Sun, Gao-Ji. "A new evolutionary algorithm for global numerical optimization." In 2010 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2010. http://dx.doi.org/10.1109/icmlc.2010.5580961.
Nacef, Abdelhakim, Miloud Bagaa, Youcef Aklouf, Abdellah Kaci, Diego Leonel Cadette Dutra, and Adlen Ksentini. "Self-optimized network: When Machine Learning Meets Optimization." In GLOBECOM 2021 - 2021 IEEE Global Communications Conference. IEEE, 2021. http://dx.doi.org/10.1109/globecom46510.2021.9685681.
Li, Xue-Qiang, Zhi-Feng Hao, and Han Huang. "An evolutionary algorithm with sorted race mechanism for global optimization." In 2010 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2010. http://dx.doi.org/10.1109/icmlc.2010.5580810.
Injadat, MohammadNoor, Fadi Salo, Ali Bou Nassif, Aleksander Essex, and Abdallah Shami. "Bayesian Optimization with Machine Learning Algorithms Towards Anomaly Detection." In GLOBECOM 2018 - 2018 IEEE Global Communications Conference. IEEE, 2018. http://dx.doi.org/10.1109/glocom.2018.8647714.
Candelieri, Antonio, and Francesco Archetti. "Sequential model based optimization with black-box constraints: Feasibility determination via machine learning." In Proceedings LEGO – 14th International Global Optimization Workshop. Author(s), 2019. http://dx.doi.org/10.1063/1.5089977.
Chen, Chang-Huang. "Bare bone particle swarm optimization with integration of global and local learning strategies." In 2011 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2011. http://dx.doi.org/10.1109/icmlc.2011.6016781.
Soroush, H. M. "Bicriteria single machine scheduling with setup times and learning effects." In Proceedings of the Sixth Global Conference on Power Control and Optimization. AIP, 2012. http://dx.doi.org/10.1063/1.4769005.
Texte intégralRapports d'organisations sur le sujet "Machine learning, Global Optimization"
Saenz, Juan Antonio, Ismael Djibrilla Boureima, Vitaliy Gyrya et Susan Kurien. Machine-Learning for Rapid Optimization of Turbulence Models. Office of Scientific and Technical Information (OSTI), juillet 2020. http://dx.doi.org/10.2172/1638623.
Texte intégralGu, Xiaofeng, A. Fedotov et D. Kayran. Application of a machine learning algorithm (XGBoost) to offline RHIC luminosity optimization. Office of Scientific and Technical Information (OSTI), avril 2021. http://dx.doi.org/10.2172/1777441.
Texte intégralRolf, Esther, Jonathan Proctor, Tamma Carleton, Ian Bolliger, Vaishaal Shankar, Miyabi Ishihara, Benjamin Recht et Solomon Hsiang. A Generalizable and Accessible Approach to Machine Learning with Global Satellite Imagery. Cambridge, MA : National Bureau of Economic Research, novembre 2020. http://dx.doi.org/10.3386/w28045.
Texte intégralScheinberg, Katya. Derivative Free Optimization of Complex Systems with the Use of Statistical Machine Learning Models. Fort Belvoir, VA : Defense Technical Information Center, septembre 2015. http://dx.doi.org/10.21236/ada622645.
Pilania, Ghanshyam, Kenneth James McClellan, Christopher Richard Stanek, and Blas P. Uberuaga. Physics-Informed Machine Learning for Discovery and Optimization of Materials: A Case Study of Scintillators. Office of Scientific and Technical Information (OSTI), August 2018. http://dx.doi.org/10.2172/1463529.
Bao, Jie, Chao Wang, Zhijie Xu, and Brian J. Koeppel. Physics-Informed Machine Learning with Application to Solid Oxide Fuel Cell System Modeling and Optimization. Office of Scientific and Technical Information (OSTI), September 2019. http://dx.doi.org/10.2172/1569289.
Gabelmann, Jeffrey, and Eduardo Gildin. A Machine Learning-Based Geothermal Drilling Optimization System Using EM Short-Hop Bit Dynamics Measurements. Office of Scientific and Technical Information (OSTI), April 2020. http://dx.doi.org/10.2172/1842454.
Qi, Fei, Zhaohui Xia, Gaoyang Tang, Hang Yang, Yu Song, Guangrui Qian, Xiong An, Chunhuan Lin, and Guangming Shi. A Graph-based Evolutionary Algorithm for Automated Machine Learning. Web of Open Science, December 2020. http://dx.doi.org/10.37686/ser.v1i2.77.
Vittorio, Alan, and Kate Calvin. Using machine learning to improve land use/cover characterization and projection for scenario-based global modeling. Office of Scientific and Technical Information (OSTI), April 2021. http://dx.doi.org/10.2172/1769796.
Wu, S. Boiler Optimization Using Advance Machine Learning Techniques. Final Report for period September 30, 1995 – September 29, 2000. Office of Scientific and Technical Information (OSTI), August 2005. http://dx.doi.org/10.2172/877237.