Academic literature on the topic 'Machine learning, Global Optimization'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Machine learning, Global Optimization.'
Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Machine learning, Global Optimization"
Cassioli, A., D. Di Lorenzo, M. Locatelli, F. Schoen, and M. Sciandrone. "Machine learning for global optimization." Computational Optimization and Applications 51, no. 1 (May 5, 2010): 279–303. http://dx.doi.org/10.1007/s10589-010-9330-x.
Kudyshev, Zhaxylyk A., Alexander V. Kildishev, Vladimir M. Shalaev, and Alexandra Boltasseva. "Machine learning–assisted global optimization of photonic devices." Nanophotonics 10, no. 1 (October 28, 2020): 371–83. http://dx.doi.org/10.1515/nanoph-2020-0376.
Abdul Salam, Mustafa, Ahmad Taher Azar, and Rana Hussien. "Swarm-Based Extreme Learning Machine Models for Global Optimization." Computers, Materials & Continua 70, no. 3 (2022): 6339–63. http://dx.doi.org/10.32604/cmc.2022.020583.
Takamatsu, Ryosuke, and Wataru Yamazaki. "Global topology optimization of supersonic airfoil using machine learning technologies." Proceedings of The Computational Mechanics Conference 2021.34 (2021): 112. http://dx.doi.org/10.1299/jsmecmd.2021.34.112.
Tsoulos, Ioannis G., Alexandros Tzallas, Evangelos Karvounis, and Dimitrios Tsalikakis. "NeuralMinimizer: A Novel Method for Global Optimization." Information 14, no. 2 (January 25, 2023): 66. http://dx.doi.org/10.3390/info14020066.
Honda, M., and E. Narita. "Machine-learning assisted steady-state profile predictions using global optimization techniques." Physics of Plasmas 26, no. 10 (October 2019): 102307. http://dx.doi.org/10.1063/1.5117846.
Wu, Shaohua, Yong Hu, Wei Wang, Xinyong Feng, and Wanneng Shu. "Application of Global Optimization Methods for Feature Selection and Machine Learning." Mathematical Problems in Engineering 2013 (2013): 1–8. http://dx.doi.org/10.1155/2013/241517.
Ma, Sicong, Cheng Shang, Chuan-Ming Wang, and Zhi-Pan Liu. "Thermodynamic rules for zeolite formation from machine learning based global optimization." Chemical Science 11, no. 37 (2020): 10113–18. http://dx.doi.org/10.1039/d0sc03918g.
Huang, Si-Da, Cheng Shang, Pei-Lin Kang, and Zhi-Pan Liu. "Atomic structure of boron resolved using machine learning and global sampling." Chemical Science 9, no. 46 (2018): 8644–55. http://dx.doi.org/10.1039/c8sc03427c.
Barkalov, Konstantin, Ilya Lebedev, and Evgeny Kozinov. "Acceleration of Global Optimization Algorithm by Detecting Local Extrema Based on Machine Learning." Entropy 23, no. 10 (September 28, 2021): 1272. http://dx.doi.org/10.3390/e23101272.
Dissertations / Theses on the topic "Machine learning, Global Optimization"
Nowak, Hans II (Hans Antoon). "Strategic capacity planning using data science, optimization, and machine learning." Thesis, Massachusetts Institute of Technology, 2020. https://hdl.handle.net/1721.1/126914.
Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, in conjunction with the Leaders for Global Operations Program at MIT, May 2020
Cataloged from the official PDF of thesis.
Includes bibliographical references (pages 101-104).
Raytheon's Circuit Card Assembly (CCA) factory in Andover, MA is Raytheon's largest factory and the largest Department of Defense (DOD) CCA manufacturer in the world. With over 500 operations, it manufactures over 7,000 unique parts with a high degree of complexity and varying levels of demand. Recently, the factory has seen an increase in demand, making the ability to continuously analyze factory capacity and to plan strategically for future operations essential. This study develops a sustainable strategic capacity optimization model and a capacity visualization tool that integrate demand data with historical manufacturing data. Through automated data mining of factory data sources, capacity utilization and overall equipment effectiveness (OEE) for factory operations are evaluated. Machine learning methods are then assessed to obtain an accurate estimate of cycle time (CT) throughout the factory. Finally, a mixed-integer nonlinear program (MINLP) integrates the capacity utilization framework and the machine learning predictions to compute optimal strategic capacity planning decisions. The capacity utilization and OEE models are shown to be generable through automated data mining algorithms. The machine learning models achieve a mean absolute error (MAE) of 1.55 on predictions for new data, which is 76.3% lower than the current CT prediction error. Finally, the MINLP is solved to optimality within a tolerance of 1.00e-04 and generates resource and production decisions that can be acted upon.
by Hans Nowak II.
M.B.A. Massachusetts Institute of Technology, Sloan School of Management
S.M. Massachusetts Institute of Technology, Department of Mechanical Engineering
Veluscek, Marco. "Global supply chain optimization : a machine learning perspective to improve Caterpillar's logistics operations." Thesis, Brunel University, 2016. http://bura.brunel.ac.uk/handle/2438/13050.
Schweidtmann, Artur M. [author], Alexander Mitsos, and Andreas Schuppert [academic supervisors]. "Global optimization of processes through machine learning." Aachen: Universitätsbibliothek der RWTH Aachen, 2021. http://d-nb.info/1240690924/34.
Taheri, Mehdi. "Machine Learning from Computer Simulations with Applications in Rail Vehicle Dynamics and System Identification." Diss., Virginia Tech, 2016. http://hdl.handle.net/10919/81417.
Ph.D.
Gabere, Musa Nur. "Prediction of antimicrobial peptides using hyperparameter optimized support vector machines." Thesis, University of the Western Cape, 2011. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_7345_1330684697.
Antimicrobial peptides (AMPs) play a key role in the innate immune response. They are found ubiquitously in a wide range of eukaryotes, including mammals, amphibians, insects, plants, and protozoa. In lower organisms, AMPs function merely as antibiotics by permeabilizing cell membranes and lysing invading microbes. Prediction of antimicrobial peptides is important because the experimental methods used in characterizing AMPs are costly, time-consuming, and resource-intensive, and identification of AMPs in insects can serve as a template for the design of novel antibiotics. To this end, data on antimicrobial peptides are first extracted from UniProt, manually curated, and stored in a centralized database called the Dragon Antimicrobial Peptide Database (DAMPD). Secondly, based on the curated data, models to predict antimicrobial peptides are created using support vector machines with optimized hyperparameters. In particular, global optimization methods such as grid search, pattern search, and derivative-free methods are utilised to optimize the SVM hyperparameters. These models are useful in characterizing unknown antimicrobial peptides. Finally, a webserver is created for predicting antimicrobial peptides in haematophagous insects such as Glossina morsitans and Anopheles gambiae.
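The thesis above tunes SVM hyperparameters with global methods such as grid search. A minimal sketch of exhaustive grid search over a hypothetical validation loss (in a real run, `validation_loss` would be the cross-validated SVM error for the given C and gamma; the grid values and loss surface here are illustrative, not the thesis's):

```python
import itertools

def validation_loss(C, gamma):
    # Stand-in for cross-validated SVM error; a real run would train an SVM
    # with these hyperparameters and return its cross-validation error.
    return (C - 10.0) ** 2 / 100.0 + (gamma - 0.1) ** 2 * 50.0

grid = {
    "C": [0.1, 1.0, 10.0, 100.0],
    "gamma": [0.001, 0.01, 0.1, 1.0],
}

best_params, best_loss = None, float("inf")
# Exhaustively evaluate every combination in the Cartesian product of the grid.
for C, gamma in itertools.product(grid["C"], grid["gamma"]):
    loss = validation_loss(C, gamma)
    if loss < best_loss:
        best_params, best_loss = {"C": C, "gamma": gamma}, loss

print(best_params)  # {'C': 10.0, 'gamma': 0.1}
```

Grid search is trivially parallelizable but scales exponentially with the number of hyperparameters, which is why the thesis also considers pattern search and other derivative-free methods.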
Belkhir, Nacim. "Per Instance Algorithm Configuration for Continuous Black Box Optimization." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS455/document.
This PhD thesis focuses on automated algorithm configuration, which aims at finding the best parameter setting for a given problem or class of problems. The algorithm configuration problem thus amounts to a meta-optimization problem in the space of parameters, whose meta-objective is the performance measure of the algorithm at hand with a given parameter configuration. In the continuous domain, however, such a method can only be empirically assessed at the cost of running the algorithm on some problem instances. More recent approaches rely on a description of problems in some feature space and try to learn a mapping from this feature space onto the space of parameter configurations of the algorithm at hand. Along these lines, this PhD thesis focuses on Per Instance Algorithm Configuration (PIAC) for solving continuous black-box optimization problems when only a limited computational budget is available. We first survey evolutionary algorithms for continuous optimization, with a focus on the two algorithms we have used as target algorithms for PIAC: DE and CMA-ES. Next, we review the state of the art of algorithm configuration approaches and the different features that have been proposed in the literature to describe continuous black-box optimization problems. We then introduce a general methodology to empirically study PIAC for the continuous domain, so that all the components of PIAC can be explored in real-world conditions. To this end, we also introduce a new continuous black-box test bench, distinct from the well-known BBOB benchmark, composed of several multi-dimensional test functions with different problem properties gathered from the literature. The methodology is finally applied to two EAs. First, we use Differential Evolution as the target algorithm and explore all the components of PIAC in order to empirically assess the best ones.
Second, based on the results on DE, we empirically investigate PIAC with the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) as the target algorithm. Both use cases empirically validate the proposed methodology on the new black-box test bench for dimensions up to 100.
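Differential Evolution, one of the two target algorithms above, is simple enough to sketch in full. A minimal pure-Python DE/rand/1/bin on the 2-D sphere function; the parameter values (F, CR, population size) are illustrative defaults, not the configurations studied in the thesis:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.7, CR=0.9,
                           generations=200, seed=42):
    """Minimal DE/rand/1/bin minimizer over box constraints."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fitness = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals, all different from i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    # Mutation: v = a + F * (b - c), then clamp to the bounds.
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                lo, hi = bounds[j]
                trial.append(min(max(v, lo), hi))
            ft = f(trial)
            if ft <= fitness[i]:  # greedy one-to-one selection
                pop[i], fitness[i] = trial, ft
    best = min(range(pop_size), key=fitness.__getitem__)
    return pop[best], fitness[best]

sphere = lambda x: sum(v * v for v in x)
x_best, f_best = differential_evolution(sphere, [(-5.0, 5.0)] * 2)
```

The sensitivity of DE to exactly these parameters (F, CR, population size) is what makes it a natural target for per-instance algorithm configuration.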
Liu, Liu. "Stochastic Optimization in Machine Learning." Thesis, The University of Sydney, 2019. http://hdl.handle.net/2123/19982.
Leblond, Rémi. "Asynchronous optimization for machine learning." Thesis, Paris Sciences et Lettres (ComUE), 2018. http://www.theses.fr/2018PSLEE057/document.
The impressive breakthroughs of the last two decades in the field of machine learning can be attributed in large part to the explosion of computing power and available data. These two limiting factors have been replaced by a new bottleneck: algorithms. The focus of this thesis is thus on introducing novel methods that can take advantage of high data quantity and computing power. We present two independent contributions. First, we develop and analyze novel fast optimization algorithms which take advantage of advances in parallel computing architecture and can handle vast amounts of data. We introduce a new framework of analysis for asynchronous parallel incremental algorithms, which enables correct and simple proofs. We then demonstrate its usefulness by performing the convergence analysis for several methods, including two novel algorithms. Asaga is a sparse asynchronous parallel variant of the variance-reduced algorithm Saga which enjoys fast linear convergence rates on smooth and strongly convex objectives. We prove that it can be linearly faster than its sequential counterpart, even without sparsity assumptions. ProxAsaga is an extension of Asaga to the more general setting where the regularizer can be non-smooth. We prove that it can also achieve a linear speedup. We provide extensive experiments comparing our new algorithms to the current state of the art. Second, we introduce new methods for complex structured prediction tasks. We focus on recurrent neural networks (RNNs), whose traditional training algorithm, based on maximum likelihood estimation (MLE), suffers from several issues. The associated surrogate training loss notably ignores the information contained in structured losses and introduces discrepancies between train and test times that may hurt performance. To alleviate these problems, we propose SeaRNN, a novel training algorithm for RNNs inspired by the "learning to search" approach to structured prediction.
SeaRNN leverages test-alike search space exploration to introduce global-local losses that are closer to the test error than the MLE objective. We demonstrate improved performance over MLE on three challenging tasks, and provide several subsampling strategies to enable SeaRNN to scale to large-scale tasks, such as machine translation. Finally, after contrasting the behavior of SeaRNN models to MLE models, we conduct an in-depth comparison of our new approach to the related work.
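Asaga builds on the sequential variance-reduced algorithm SAGA. A minimal pure-Python sketch of sequential SAGA on a toy least-squares problem (the thesis's contribution is running these same updates asynchronously in parallel; the step size, data, and iteration budget below are illustrative assumptions):

```python
import random

def saga(A, b, step=0.1, epochs=200, seed=0):
    """Sequential SAGA on least squares, f_i(x) = 0.5 * (a_i . x - b_i)^2."""
    n, d = len(A), len(A[0])
    rng = random.Random(seed)
    x = [0.0] * d
    grads = [[0.0] * d for _ in range(n)]  # last gradient seen per sample
    avg = [0.0] * d                        # running mean of the gradient table

    def grad(i, x):
        r = sum(a * v for a, v in zip(A[i], x)) - b[i]
        return [r * a for a in A[i]]

    for _ in range(epochs * n):
        i = rng.randrange(n)
        g = grad(i, x)
        # Variance-reduced, unbiased gradient estimate: g - old_i + mean.
        x = [xj - step * (gj - oj + aj)
             for xj, gj, oj, aj in zip(x, g, grads[i], avg)]
        # Update the running mean, then overwrite the stored gradient.
        avg = [aj + (gj - oj) / n for aj, gj, oj in zip(avg, g, grads[i])]
        grads[i] = g
    return x

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 2.0, 3.0]
x = saga(A, b)  # the least-squares solution of this system is [1.0, 2.0]
```

Because each update touches only one sample's gradient and the running mean, sparse updates from different workers rarely conflict, which is the property Asaga exploits for lock-free parallelism.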
Bai, Hao. "Machine learning assisted probabilistic prediction of long-term fatigue damage and vibration reduction of wind turbine tower using active damping system." Thesis, Normandie, 2021. http://www.theses.fr/2021NORMIR01.
This dissertation is devoted to the development of an active damping system for vibration reduction of wind turbine towers under gusty and turbulent wind. These vibrations often lead either to an ultimate deflection at the top of the wind tower or to a failure due to material fatigue near its bottom. Furthermore, given the random nature of wind conditions, it is indispensable to look at this problem from a probabilistic point of view. In this work, a probabilistic framework for fatigue analysis is developed and improved by using a residual neural network. A damping system employing an active damper, the Twin Rotor Damper, is designed for the NREL 5MW reference wind turbine. The design is optimized by an evolutionary algorithm with an automatic parameter tuning method based on exploitation and exploration.
Chang, Allison An. "Integer optimization methods for machine learning." Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/72643.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (p. 129-137).
In this thesis, we propose new mixed integer optimization (MIO) methods to address problems in machine learning. The first part develops methods for supervised bipartite ranking, which arises in prioritization tasks in diverse domains such as information retrieval, recommender systems, natural language processing, bioinformatics, and preventative maintenance. The primary advantage of using MIO for ranking is that it allows for direct optimization of ranking quality measures, as opposed to current state-of-the-art algorithms that use heuristic loss functions. We demonstrate using a number of datasets that our approach can outperform other ranking methods. The second part of the thesis focuses on reverse-engineering ranking models. This is an application of a more general ranking problem than the bipartite case. Quality rankings affect business for many organizations, and knowing the ranking models would allow these organizations to better understand the standards by which their products are judged and help them to create higher quality products. We introduce an MIO method for reverse-engineering such models and demonstrate its performance in a case study with real data from a major ratings company. We also devise an approach to find the most cost-effective way to increase the rank of a certain product. In the final part of the thesis, we develop MIO methods to first generate association rules and then use the rules to build an interpretable classifier in the form of a decision list, which is an ordered list of rules. These are both combinatorially challenging problems because even a small dataset may yield a large number of rules and a small set of rules may correspond to many different orderings. We show how to use MIO to mine useful rules, as well as to construct a classifier from them. We present results in terms of both classification accuracy and interpretability for a variety of datasets.
by Allison An Chang.
Ph.D.
Books on the topic "Machine learning, Global Optimization"
Sra, Suvrit, Sebastian Nowozin, and Stephen J. Wright, eds. Optimization for Machine Learning. Cambridge, Mass.: MIT Press, 2012.
Lin, Zhouchen, Huan Li, and Cong Fang. Accelerated Optimization for Machine Learning. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-2910-8.
Agrawal, Tanay. Hyperparameter Optimization in Machine Learning. Berkeley, CA: Apress, 2021. http://dx.doi.org/10.1007/978-1-4842-6579-6.
Fazelnia, Ghazal. Optimization for Probabilistic Machine Learning. [New York, N.Y.?]: [publisher not identified], 2019.
Nicosia, Giuseppe, Varun Ojha, Emanuele La Malfa, Gabriele La Malfa, Giorgio Jansen, Panos M. Pardalos, Giovanni Giuffrida, and Renato Umeton, eds. Machine Learning, Optimization, and Data Science. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-95470-3.
Nicosia, Giuseppe, Varun Ojha, Emanuele La Malfa, Gabriele La Malfa, Giorgio Jansen, Panos M. Pardalos, Giovanni Giuffrida, and Renato Umeton, eds. Machine Learning, Optimization, and Data Science. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-95467-3.
Jiang, Jiawei, Bin Cui, and Ce Zhang. Distributed Machine Learning and Gradient Optimization. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-3420-8.
Pardalos, Panos, Mario Pavone, Giovanni Maria Farinella, and Vincenzo Cutello, eds. Machine Learning, Optimization, and Big Data. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27926-8.
Nicosia, Giuseppe, Panos Pardalos, Renato Umeton, Giovanni Giuffrida, and Vincenzo Sciacca, eds. Machine Learning, Optimization, and Data Science. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-37599-7.
Kulkarni, Anand J., and Suresh Chandra Satapathy, eds. Optimization in Machine Learning and Applications. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-0994-0.
Full textBook chapters on the topic "Machine learning, Global Optimization"
Kearfott, Ralph Baker. "Mathematically Rigorous Global Optimization and Fuzzy Optimization." In Black Box Optimization, Machine Learning, and No-Free Lunch Theorems, 169–94. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-66515-9_7.
de Winter, Roy, Bas van Stein, Matthys Dijkman, and Thomas Bäck. "Designing Ships Using Constrained Multi-objective Efficient Global Optimization." In Machine Learning, Optimization, and Data Science, 191–203. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-13709-0_16.
Cocola, Jorio, and Paul Hand. "Global Convergence of Sobolev Training for Overparameterized Neural Networks." In Machine Learning, Optimization, and Data Science, 574–86. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-64583-0_51.
Zabinsky, Zelda B., Giulia Pedrielli, and Hao Huang. "A Framework for Multi-fidelity Modeling in Global Optimization Approaches." In Machine Learning, Optimization, and Data Science, 335–46. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-37599-7_28.
Griewank, Andreas, and Ángel Rojas. "Treating Artificial Neural Net Training as a Nonsmooth Global Optimization Problem." In Machine Learning, Optimization, and Data Science, 759–70. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-37599-7_64.
Issa, Mohamed, Aboul Ella Hassanien, and Ibrahim Ziedan. "Performance Evaluation of Sine-Cosine Optimization Versus Particle Swarm Optimization for Global Sequence Alignment Problem." In Machine Learning Paradigms: Theory and Application, 375–91. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-02357-7_18.
Wang, Yong-Jun, Jiang-She Zhang, and Yu-Fen Zhang. "An Effective and Efficient Two Stage Algorithm for Global Optimization." In Advances in Machine Learning and Cybernetics, 487–96. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11739685_51.
Kiranyaz, Serkan, Turker Ince, and Moncef Gabbouj. "Improving Global Convergence." In Multidimensional Particle Swarm Optimization for Machine Learning and Pattern Recognition, 101–49. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-37846-1_5.
Consoli, Sergio, Luca Tiozzo Pezzoli, and Elisa Tosetti. "Using the GDELT Dataset to Analyse the Italian Sovereign Bond Market." In Machine Learning, Optimization, and Data Science, 190–202. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-64583-0_18.
Rodrigues, Douglas, Gustavo Henrique de Rosa, Leandro Aparecido Passos, and João Paulo Papa. "Adaptive Improved Flower Pollination Algorithm for Global Optimization." In Nature-Inspired Computation in Data Mining and Machine Learning, 1–21. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-28553-1_1.
Full textConference papers on the topic "Machine learning, Global Optimization"
He, Yi-chao, and Kun-qi Liu. "A Modified Particle Swarm Optimization for Solving Global Optimization Problems." In 2006 International Conference on Machine Learning and Cybernetics. IEEE, 2006. http://dx.doi.org/10.1109/icmlc.2006.258615.
Tamura, Kenichi, and Keiichiro Yasuda. "Spiral Multipoint Search for Global Optimization." In 2011 Tenth International Conference on Machine Learning and Applications (ICMLA). IEEE, 2011. http://dx.doi.org/10.1109/icmla.2011.131.
Wang, Yong-Jun, Jiang-She Zhang, and Yu-Fen Zhang. "A fast hybrid algorithm for global optimization." In Proceedings of 2005 International Conference on Machine Learning and Cybernetics. IEEE, 2005. http://dx.doi.org/10.1109/icmlc.2005.1527462.
Sun, Gao-Ji. "A new evolutionary algorithm for global numerical optimization." In 2010 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2010. http://dx.doi.org/10.1109/icmlc.2010.5580961.
Nacef, Abdelhakim, Miloud Bagaa, Youcef Aklouf, Abdellah Kaci, Diego Leonel Cadette Dutra, and Adlen Ksentini. "Self-optimized network: When Machine Learning Meets Optimization." In GLOBECOM 2021 - 2021 IEEE Global Communications Conference. IEEE, 2021. http://dx.doi.org/10.1109/globecom46510.2021.9685681.
Li, Xue-Qiang, Zhi-Feng Hao, and Han Huang. "An evolutionary algorithm with sorted race mechanism for global optimization." In 2010 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2010. http://dx.doi.org/10.1109/icmlc.2010.5580810.
Injadat, MohammadNoor, Fadi Salo, Ali Bou Nassif, Aleksander Essex, and Abdallah Shami. "Bayesian Optimization with Machine Learning Algorithms Towards Anomaly Detection." In GLOBECOM 2018 - 2018 IEEE Global Communications Conference. IEEE, 2018. http://dx.doi.org/10.1109/glocom.2018.8647714.
Candelieri, Antonio, and Francesco Archetti. "Sequential model based optimization with black-box constraints: Feasibility determination via machine learning." In Proceedings LEGO – 14th International Global Optimization Workshop. Author(s), 2019. http://dx.doi.org/10.1063/1.5089977.
Chen, Chang-Huang. "Bare bone particle swarm optimization with integration of global and local learning strategies." In 2011 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2011. http://dx.doi.org/10.1109/icmlc.2011.6016781.
Soroush, H. M. "Bicriteria single machine scheduling with setup times and learning effects." In Proceedings of the Sixth Global Conference on Power Control and Optimization. AIP, 2012. http://dx.doi.org/10.1063/1.4769005.
Full textReports on the topic "Machine learning, Global Optimization"
Saenz, Juan Antonio, Ismael Djibrilla Boureima, Vitaliy Gyrya, and Susan Kurien. Machine-Learning for Rapid Optimization of Turbulence Models. Office of Scientific and Technical Information (OSTI), July 2020. http://dx.doi.org/10.2172/1638623.
Gu, Xiaofeng, A. Fedotov, and D. Kayran. Application of a machine learning algorithm (XGBoost) to offline RHIC luminosity optimization. Office of Scientific and Technical Information (OSTI), April 2021. http://dx.doi.org/10.2172/1777441.
Rolf, Esther, Jonathan Proctor, Tamma Carleton, Ian Bolliger, Vaishaal Shankar, Miyabi Ishihara, Benjamin Recht, and Solomon Hsiang. A Generalizable and Accessible Approach to Machine Learning with Global Satellite Imagery. Cambridge, MA: National Bureau of Economic Research, November 2020. http://dx.doi.org/10.3386/w28045.
Scheinberg, Katya. Derivative Free Optimization of Complex Systems with the Use of Statistical Machine Learning Models. Fort Belvoir, VA: Defense Technical Information Center, September 2015. http://dx.doi.org/10.21236/ada622645.
Pilania, Ghanshyam, Kenneth James McClellan, Christopher Richard Stanek, and Blas P. Uberuaga. Physics-Informed Machine Learning for Discovery and Optimization of Materials: A Case Study of Scintillators. Office of Scientific and Technical Information (OSTI), August 2018. http://dx.doi.org/10.2172/1463529.
Bao, Jie, Chao Wang, Zhijie Xu, and Brian J. Koeppel. Physics-Informed Machine Learning with Application to Solid Oxide Fuel Cell System Modeling and Optimization. Office of Scientific and Technical Information (OSTI), September 2019. http://dx.doi.org/10.2172/1569289.
Gabelmann, Jeffrey, and Eduardo Gildin. A Machine Learning-Based Geothermal Drilling Optimization System Using EM Short-Hop Bit Dynamics Measurements. Office of Scientific and Technical Information (OSTI), April 2020. http://dx.doi.org/10.2172/1842454.
Qi, Fei, Zhaohui Xia, Gaoyang Tang, Hang Yang, Yu Song, Guangrui Qian, Xiong An, Chunhuan Lin, and Guangming Shi. A Graph-based Evolutionary Algorithm for Automated Machine Learning. Web of Open Science, December 2020. http://dx.doi.org/10.37686/ser.v1i2.77.
Vittorio, Alan, and Kate Calvin. Using machine learning to improve land use/cover characterization and projection for scenario-based global modeling. Office of Scientific and Technical Information (OSTI), April 2021. http://dx.doi.org/10.2172/1769796.
Wu, S. Boiler Optimization Using Advance Machine Learning Techniques. Final Report for period September 30, 1995 - September 29, 2000. Office of Scientific and Technical Information (OSTI), August 2005. http://dx.doi.org/10.2172/877237.