Selected scholarly literature on the topic "Continuous and distributed machine learning"
Cite a source in APA, MLA, Chicago, Harvard, and many other citation styles
Browse the list of current articles, books, theses, conference proceedings, and other scholarly sources on the topic "Continuous and distributed machine learning".
Next to every source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication as a .pdf file and read the abstract of the work online, where one is available in the metadata.
Journal articles on the topic "Continuous and distributed machine learning"
Stan, Ioan-Mihail, Siarhei Padolski, and Christopher Jon Lee. "Exploring the self-service model to visualize the results of the ATLAS Machine Learning analysis jobs in BigPanDA with Openshift OKD3". EPJ Web of Conferences 251 (2021): 02009. http://dx.doi.org/10.1051/epjconf/202125102009.
Yin, Zhongdong, Jingjing Tu, and Yonghai Xu. "Development of a Kernel Extreme Learning Machine Model for Capacity Selection of Distributed Generation Considering the Characteristics of Electric Vehicles". Applied Sciences 9, no. 12 (13 June 2019): 2401. http://dx.doi.org/10.3390/app9122401.
Brophy, Eoin, Maarten De Vos, Geraldine Boylan, and Tomás Ward. "Estimation of Continuous Blood Pressure from PPG via a Federated Learning Approach". Sensors 21, no. 18 (21 September 2021): 6311. http://dx.doi.org/10.3390/s21186311.
Vrachimis, Andreas, Stella Gkegka, and Kostas Kolomvatsos. "Resilient edge machine learning in smart city environments". Journal of Smart Cities and Society 2, no. 1 (7 July 2023): 3–24. http://dx.doi.org/10.3233/scs-230005.
Musa, M. O., and E. E. Odokuma. "A framework for the detection of distributed denial of service attacks on network logs using ML and DL classifiers". Scientia Africana 22, no. 3 (25 January 2024): 153–64. http://dx.doi.org/10.4314/sa.v22i3.14.
Oliveri, Giorgio, Lucas C. van Laake, Cesare Carissimo, Clara Miette, and Johannes T. B. Overvelde. "Continuous learning of emergent behavior in robotic matter". Proceedings of the National Academy of Sciences 118, no. 21 (10 May 2021): e2017015118. http://dx.doi.org/10.1073/pnas.2017015118.
Kodaira, Daisuke, Kazuki Tsukazaki, Taiki Kure, and Junji Kondoh. "Improving Forecast Reliability for Geographically Distributed Photovoltaic Generations". Energies 14, no. 21 (4 November 2021): 7340. http://dx.doi.org/10.3390/en14217340.
Hua, Xia, and Lei Han. "Design and Practical Application of Sports Visualization Platform Based on Tracking Algorithm". Computational Intelligence and Neuroscience 2022 (16 August 2022): 1–9. http://dx.doi.org/10.1155/2022/4744939.
Rustam, Furqan, Muhammad Faheem Mushtaq, Ameer Hamza, Muhammad Shoaib Farooq, Anca Delia Jurcut, and Imran Ashraf. "Denial of Service Attack Classification Using Machine Learning with Multi-Features". Electronics 11, no. 22 (20 November 2022): 3817. http://dx.doi.org/10.3390/electronics11223817.
Huang, Leqi. "Problems, solutions and improvements on federated learning model". Applied and Computational Engineering 22, no. 1 (23 October 2023): 183–86. http://dx.doi.org/10.54254/2755-2721/22/20231215.
Testo completoTesi sul tema "Continuous and distributed machine learning"
Armond, Kenneth C. Jr. "Distributed Support Vector Machine Learning". ScholarWorks@UNO, 2008. http://scholarworks.uno.edu/td/711.
Testo completoAddanki, Ravichandra. "Learning generalizable device placement algorithms for distributed machine learning". Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122746.
Testo completoCataloged from PDF version of thesis.
Includes bibliographical references (pages 47-50).
We present Placeto, a reinforcement learning (RL) approach to efficiently find device placements for distributed neural network training. Unlike prior approaches that only find a device placement for a specific computation graph, Placeto can learn generalizable device placement policies that can be applied to any graph. We propose two key ideas in our approach: (1) we represent the policy as performing iterative placement improvements, rather than outputting a placement in one shot; (2) we use graph embeddings to capture relevant information about the structure of the computation graph, without relying on node labels for indexing. These ideas allow Placeto to train efficiently and generalize to unseen graphs. Our experiments show that Placeto requires up to 6.1× fewer training steps to find placements that are on par with or better than the best placements found by prior approaches. Moreover, Placeto is able to learn a generalizable placement policy for any given family of graphs, which can then be used without any retraining to predict optimized placements for unseen graphs from the same family. This eliminates the large overhead incurred by prior RL approaches whose lack of generalizability necessitates re-training from scratch every time a new graph is to be placed. (A toy illustration of the iterative-improvement idea appears after this entry.)
S.M. thesis, Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science.
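The central idea in the abstract above, refining a device placement step by step rather than emitting it in one shot, can be illustrated with a small toy example. The sketch below is not Placeto itself: the computation graph, the cost model, and the greedy improvement rule are invented for illustration, whereas Placeto learns its improvement policy with reinforcement learning over graph embeddings.

# Toy sketch of iterative placement improvement (not Placeto's actual method).
# A placement maps each operator of a small computation graph to a device; we
# repeatedly move single operators wherever the total cost drops.

compute_cost = {"a": 4.0, "b": 2.0, "c": 3.0, "d": 5.0, "e": 1.0}   # op -> compute time
edges = [("a", "b", 1.0), ("a", "c", 2.0), ("b", "d", 1.5),
         ("c", "d", 0.5), ("d", "e", 1.0)]                          # (src, dst, transfer cost)
devices = ["gpu0", "gpu1"]

def placement_cost(placement):
    """Load on the busiest device plus traffic crossing device boundaries."""
    load = {d: 0.0 for d in devices}
    for op, cost in compute_cost.items():
        load[placement[op]] += cost
    comm = sum(w for u, v, w in edges if placement[u] != placement[v])
    return max(load.values()) + comm

def improve(placement):
    """One sweep of iterative improvement: move each op to its cheapest device."""
    moved = False
    for op in compute_cost:
        best = min(devices, key=lambda d: placement_cost({**placement, op: d}))
        if best != placement[op]:
            placement[op] = best
            moved = True
    return moved

placement = {op: "gpu0" for op in compute_cost}   # one-shot baseline: everything on gpu0
print("initial cost:", placement_cost(placement))
while improve(placement):                         # keep sweeping until no move helps
    pass
print("final placement:", placement)
print("final cost:", placement_cost(placement))

Running the script prints the cost of the naive single-device placement and the lower cost reached after a few improvement sweeps; Placeto's contribution is learning which improvement to apply, so that the same policy transfers to unseen graphs.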
Johansson, Samuel, and Karol Wojtulewicz. "Machine learning algorithms in a distributed context". Thesis, Linköpings universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-148920.
Karimi, Ahmad Maroof. "Distributed Machine Learning Based Intrusion Detection System". University of Toledo / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1470401374.
Zam, Anton. "Evaluating Distributed Machine Learning using IoT Devices". Thesis, Mittuniversitetet, Institutionen för informationssystem och –teknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-42388.
Testo completoInternet of things is growing every year with new devices being added all the time. Although some of the devices are continuously in use a large amount of them are mostly idle and sitting on untapped processing power that could be used to compute machine learning computations. There currently exist a lot of different methods to combine the processing power of multiple devices to compute machine learning task these are often called distributed machine learning methods. The main focus of this thesis is to evaluate these distributed machine learning methods to see if they could be implemented on IoT devices and if so, measure how efficient and scalable these methods are. The method chosen for implementation was called “MultiWorkerMirrorStrategy” and this method was evaluated by comparing the training time, training accuracy and evaluation accuracy of 2,3 and 4 Raspberry pi:s with a nondistributed machine learning method with 1 Raspberry pi. The results showed that although the computational power increased with every added device the training time increased while the rest of the measurements stayed the same. After the results were analyzed and discussed the conclusion of this were that the overhead added for communicating between devices were to high resulting in this method being very inefficient and wouldn’t scale without some sort of optimization being added.
Thompson, Simon Giles. "Distributed boosting algorithms". Thesis, University of Portsmouth, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285529.
Dahlberg, Leslie. "Evolutionary Computation in Continuous Optimization and Machine Learning". Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35674.
Ouyang, Hua. "Optimal stochastic and distributed algorithms for machine learning". Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49091.
Prueller, Hans. "Distributed online machine learning for mobile care systems". Thesis, De Montfort University, 2014. http://hdl.handle.net/2086/10875.
Konečný, Jakub. "Stochastic, distributed and federated optimization for machine learning". Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/31478.
Testo completoLibri sul tema "Continuous and distributed machine learning"
Weiss, Gerhard. Distributed machine learning. Sankt Augustin: Infix, 1995.
Testas, Abdelaziz. Distributed Machine Learning with PySpark. Berkeley, CA: Apress, 2023. http://dx.doi.org/10.1007/978-1-4842-9751-3.
Amini, M. Hadi, ed. Distributed Machine Learning and Computing. Cham: Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-57567-9.
Jiang, Jiawei, Bin Cui, and Ce Zhang. Distributed Machine Learning and Gradient Optimization. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-3420-8.
Joshi, Gauri. Optimization Algorithms for Distributed Machine Learning. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-19067-4.
Sahoo, Jyoti Prakash, Asis Kumar Tripathy, Manoranjan Mohanty, Kuan-Ching Li, and Ajit Kumar Nayak, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-4807-6.
Rout, Rashmi Ranjan, Soumya Kanti Ghosh, Prasanta K. Jana, Asis Kumar Tripathy, Jyoti Prakash Sahoo, and Kuan-Ching Li, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-1018-0.
Tripathy, Asis Kumar, Mahasweta Sarkar, Jyoti Prakash Sahoo, Kuan-Ching Li, and Suchismita Chinara, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-15-4218-3.
Nanda, Umakanta, Asis Kumar Tripathy, Jyoti Prakash Sahoo, Mahasweta Sarkar, and Kuan-Ching Li, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-1841-2.
Chinara, Suchismita, Asis Kumar Tripathy, Kuan-Ching Li, Jyoti Prakash Sahoo, and Alekha Kumar Mishra, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-1203-2.
Book chapters on the topic "Continuous and distributed machine learning"
Carter, Eric, and Matthew Hurst. "Continuous Delivery". In Agile Machine Learning, 59–69. Berkeley, CA: Apress, 2019. http://dx.doi.org/10.1007/978-1-4842-5107-2_3.
Yang, Qiang, Yang Liu, Yong Cheng, Yan Kang, Tianjian Chen, and Han Yu. "Distributed Machine Learning". In Federated Learning, 33–48. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-031-01585-4_3.
Galakatos, Alex, Andrew Crotty, and Tim Kraska. "Distributed Machine Learning". In Encyclopedia of Database Systems, 1–6. New York, NY: Springer New York, 2017. http://dx.doi.org/10.1007/978-1-4899-7993-3_80647-1.
Galakatos, Alex, Andrew Crotty, and Tim Kraska. "Distributed Machine Learning". In Encyclopedia of Database Systems, 1196–201. New York, NY: Springer New York, 2018. http://dx.doi.org/10.1007/978-1-4614-8265-9_80647.
Shultz, Thomas R., Scott E. Fahlman, Susan Craw, Periklis Andritsos, Panayiotis Tsaparas, Ricardo Silva, Chris Drummond et al. "Continuous Attribute". In Encyclopedia of Machine Learning, 226. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_172.
Li, Jin, Ping Li, Zheli Liu, Xiaofeng Chen, and Tong Li. "Secure Distributed Learning". In Privacy-Preserving Machine Learning, 47–56. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-9139-3_4.
Ducoulombier, Antoine, and Michèle Sebag. "Continuous mimetic evolution". In Machine Learning: ECML-98, 334–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0026704.
Cleophas, Ton J., and Aeilko H. Zwinderman. "Continuous Sequential Techniques". In Machine Learning in Medicine, 187–94. Dordrecht: Springer Netherlands, 2013. http://dx.doi.org/10.1007/978-94-007-6886-4_18.
Chen, Zhiyuan, and Bing Liu. "Continuous Knowledge Learning in Chatbots". In Lifelong Machine Learning, 131–38. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-031-01581-6_8.
Liu, Mark. "Q-Learning with Continuous States". In Machine Learning, Animated, 285–300. Boca Raton: Chapman and Hall/CRC, 2023. http://dx.doi.org/10.1201/b23383-15.
Conference papers on the topic "Continuous and distributed machine learning"
Belcastro, Loris, Fabrizio Marozzo, Aleandro Presta, and Domenico Talia. "A Spark-based Task Allocation Solution for Machine Learning in the Edge-Cloud Continuum". In 2024 20th International Conference on Distributed Computing in Smart Systems and the Internet of Things (DCOSS-IoT), 576–82. IEEE, 2024. http://dx.doi.org/10.1109/dcoss-iot61029.2024.00090.
Barros, Claudio D. T., Daniel N. R. da Silva, and Fabio A. M. Porto. "Machine Learning on Graph-Structured Data". In Anais Estendidos do Simpósio Brasileiro de Banco de Dados. Sociedade Brasileira de Computação - SBC, 2021. http://dx.doi.org/10.5753/sbbd_estendido.2021.18179.
Chepurnov, A., and N. Ershov. "Application of Machine Learning Methods for Cross-Classification of Algorithms and Problems of Multivariate Continuous Optimization". In 9th International Conference "Distributed Computing and Grid Technologies in Science and Education". Crossref, 2021. http://dx.doi.org/10.54546/mlit.2021.67.50.001.
Sartzetakis, Ippokratis, Polyzois Soumplis, Panagiotis Pantazopoulos, Konstantinos V. Katsaros, Vasilis Sourlas, and Emmanouel Manos Varvarigos. "Resource Allocation for Distributed Machine Learning at the Edge-Cloud Continuum". In ICC 2022 - IEEE International Conference on Communications. IEEE, 2022. http://dx.doi.org/10.1109/icc45855.2022.9838647.
Tiezzi, Matteo, Simone Marullo, Lapo Faggi, Enrico Meloni, Alessandro Betti, and Stefano Melacci. "Stochastic Coherence Over Attention Trajectory For Continuous Learning In Video Streams". In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/483.
Gupta, Sujasha, Srivatsava Krishnan, and Vishnubaba Sundaresan. "Structural Health Monitoring of Composite Structures via Machine Learning of Mechanoluminescence". In ASME 2019 Conference on Smart Materials, Adaptive Structures and Intelligent Systems. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/smasis2019-5697.
Mendoza, Alberto, Çağrı Cerrahoğlu, Alessandro Delfino, and Martin Sundin. "Signal Processing and Machine Learning for Effective Integration of Distributed Fiber Optic Sensing Data in Production Petrophysics". In 2022 SPWLA 63rd Annual Symposium. Society of Petrophysicists and Well Log Analysts, 2022. http://dx.doi.org/10.30632/spwla-2022-0016.
Sadigov, Teymur, Cagri Cerrahoglu, James Ramsay, Laurence Burchell, Sean Cavalero, Thomas Watson, Pradyumna Thiruvenkatanathan, and Martin Sundin. "Real-Time Water Injection Monitoring with Distributed Fiber Optics Using Physics-Informed Machine Learning". In Offshore Technology Conference. OTC, 2021. http://dx.doi.org/10.4043/30982-ms.
"Session details: 1st Workshop on Distributed Machine Learning for the Intelligent Computing Continuum (DML-ICC)". In UCC '21: 2021 IEEE/ACM 14th International Conference on Utility and Cloud Computing, edited by Luiz F. Bittencourt, Ian Foster, and Filip De Turck. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3517186.
Gao, Hongchang, Hanzi Xu, and Slobodan Vucetic. "Sample Efficient Decentralized Stochastic Frank-Wolfe Methods for Continuous DR-Submodular Maximization". In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/482.
Reports by organizations on the topic "Continuous and distributed machine learning"
Shead, Timothy, Jonathan Berry, Cynthia Phillips, and Jared Saia. Information-Theoretically Secure Distributed Machine Learning. Office of Scientific and Technical Information (OSTI), November 2019. http://dx.doi.org/10.2172/1763277.
Lee, Ying-Ying, and Kyle Colangelo. Double debiased machine learning nonparametric inference with continuous treatments. The IFS, October 2019. http://dx.doi.org/10.1920/wp.cem.2019.5419.
Lee, Ying-Ying, and Kyle Colangelo. Double debiased machine learning nonparametric inference with continuous treatments. The IFS, December 2019. http://dx.doi.org/10.1920/wp.cem.2019.7219.
Huang, Amy, Katelyn Barnes, Joseph Bearer, Evan Chrisinger, and Christopher Stone. Integrating Distributed-Memory Machine Learning into Large-Scale HPC Simulations. Office of Scientific and Technical Information (OSTI), May 2018. http://dx.doi.org/10.2172/1460078.
Varastehpour, Soheil, Hamid Sharifzadeh, and Iman Ardekani. A Comprehensive Review of Deep Learning Algorithms. Unitec ePress, 2021. http://dx.doi.org/10.34074/ocds.092.
Liu, Xiaopei, Dan Liu, and Cong’e Tan. Gut microbiome-based machine learning for diagnostic prediction of liver fibrosis and cirrhosis: a systematic review and meta-analysis. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, May 2022. http://dx.doi.org/10.37766/inplasy2022.5.0133.
Choquette, Gary. PR-000-16209-WEB Data Management Best Practices Learned from CEPM. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), April 2019. http://dx.doi.org/10.55274/r0011568.
Visser, R., H. Kao, R. M. H. Dokht, A. B. Mahani, and S. Venables. A comprehensive earthquake catalogue for northeastern British Columbia: the northern Montney trend from 2017 to 2020 and the Kiskatinaw Seismic Monitoring and Mitigation Area from 2019 to 2020. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/329078.
Harris, L. B., P. Adiban, and E. Gloaguen. The role of enigmatic deep crustal and upper mantle structures on Au and magmatic Ni-Cu-PGE-Cr mineralization in the Superior Province. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/328984.