Ready-made bibliography on the topic "Continuous and distributed machine learning"
Create accurate references in APA, MLA, Chicago, Harvard, and many other citation styles
Consult the lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic "Continuous and distributed machine learning".
Journal articles on the topic "Continuous and distributed machine learning"
Stan, Ioan-Mihail, Siarhei Padolski, and Christopher Jon Lee. "Exploring the self-service model to visualize the results of the ATLAS Machine Learning analysis jobs in BigPanDA with Openshift OKD3". EPJ Web of Conferences 251 (2021): 02009. http://dx.doi.org/10.1051/epjconf/202125102009.
Yin, Zhongdong, Jingjing Tu, and Yonghai Xu. "Development of a Kernel Extreme Learning Machine Model for Capacity Selection of Distributed Generation Considering the Characteristics of Electric Vehicles". Applied Sciences 9, no. 12 (June 13, 2019): 2401. http://dx.doi.org/10.3390/app9122401.
Brophy, Eoin, Maarten De Vos, Geraldine Boylan, and Tomás Ward. "Estimation of Continuous Blood Pressure from PPG via a Federated Learning Approach". Sensors 21, no. 18 (September 21, 2021): 6311. http://dx.doi.org/10.3390/s21186311.
Vrachimis, Andreas, Stella Gkegka, and Kostas Kolomvatsos. "Resilient edge machine learning in smart city environments". Journal of Smart Cities and Society 2, no. 1 (July 7, 2023): 3–24. http://dx.doi.org/10.3233/scs-230005.
Musa, M. O., and E. E. Odokuma. "A framework for the detection of distributed denial of service attacks on network logs using ML and DL classifiers". Scientia Africana 22, no. 3 (January 25, 2024): 153–64. http://dx.doi.org/10.4314/sa.v22i3.14.
Oliveri, Giorgio, Lucas C. van Laake, Cesare Carissimo, Clara Miette, and Johannes T. B. Overvelde. "Continuous learning of emergent behavior in robotic matter". Proceedings of the National Academy of Sciences 118, no. 21 (May 10, 2021): e2017015118. http://dx.doi.org/10.1073/pnas.2017015118.
Kodaira, Daisuke, Kazuki Tsukazaki, Taiki Kure, and Junji Kondoh. "Improving Forecast Reliability for Geographically Distributed Photovoltaic Generations". Energies 14, no. 21 (November 4, 2021): 7340. http://dx.doi.org/10.3390/en14217340.
Hua, Xia, and Lei Han. "Design and Practical Application of Sports Visualization Platform Based on Tracking Algorithm". Computational Intelligence and Neuroscience 2022 (August 16, 2022): 1–9. http://dx.doi.org/10.1155/2022/4744939.
Rustam, Furqan, Muhammad Faheem Mushtaq, Ameer Hamza, Muhammad Shoaib Farooq, Anca Delia Jurcut, and Imran Ashraf. "Denial of Service Attack Classification Using Machine Learning with Multi-Features". Electronics 11, no. 22 (November 20, 2022): 3817. http://dx.doi.org/10.3390/electronics11223817.
Huang, Leqi. "Problems, solutions and improvements on federated learning model". Applied and Computational Engineering 22, no. 1 (October 23, 2023): 183–86. http://dx.doi.org/10.54254/2755-2721/22/20231215.
Pełny tekst źródłaRozprawy doktorskie na temat "Continuous and distributed machine learning"
Armond, Kenneth C. Jr. "Distributed Support Vector Machine Learning". ScholarWorks@UNO, 2008. http://scholarworks.uno.edu/td/711.
Addanki, Ravichandra. "Learning generalizable device placement algorithms for distributed machine learning". Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122746.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 47-50).
We present Placeto, a reinforcement learning (RL) approach to efficiently find device placements for distributed neural network training. Unlike prior approaches that only find a device placement for a specific computation graph, Placeto can learn generalizable device placement policies that can be applied to any graph. We propose two key ideas in our approach: (1) we represent the policy as performing iterative placement improvements, rather than outputting a placement in one shot; (2) we use graph embeddings to capture relevant information about the structure of the computation graph, without relying on node labels for indexing. These ideas allow Placeto to train efficiently and generalize to unseen graphs. Our experiments show that Placeto requires up to 6.1× fewer training steps to find placements that are on par with or better than the best placements found by prior approaches. Moreover, Placeto is able to learn a generalizable placement policy for any given family of graphs, which can then be used without any retraining to predict optimized placements for unseen graphs from the same family. This eliminates the large overhead incurred by prior RL approaches whose lack of generalizability necessitates re-training from scratch every time a new graph is to be placed.
by Ravichandra Addanki.
S.M. Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science
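The first key idea in the abstract above, representing placement as iterative improvements rather than a one-shot output, can be illustrated with a small self-contained sketch. Everything below (the four-operator graph, the cost model, and the greedy improvement rule) is invented for illustration; Placeto itself learns the improvement policy with RL and graph embeddings rather than using a fixed greedy rule.

```python
# Toy computation graph: op -> (compute cost, list of predecessor ops).
GRAPH = {
    "a": (2.0, []),
    "b": (3.0, ["a"]),
    "c": (4.0, ["a"]),
    "d": (1.0, ["b", "c"]),
}
DEVICES = [0, 1]
COMM_COST = 1.5  # made-up penalty per cross-device edge

def makespan(placement):
    """Crude cost model: max per-device compute load plus communication penalties."""
    load = {d: 0.0 for d in DEVICES}
    comm = 0.0
    for op, (cost, preds) in GRAPH.items():
        load[placement[op]] += cost
        comm += sum(COMM_COST for p in preds if placement[p] != placement[op])
    return max(load.values()) + comm

def improve_iteratively(placement):
    """Greedy analogue of iterative placement improvement: visit one op at a
    time and move it to the device that lowers the overall cost, until no
    single move helps."""
    improved = True
    while improved:
        improved = False
        for op in GRAPH:
            best = min(DEVICES, key=lambda d: makespan({**placement, op: d}))
            if makespan({**placement, op: best}) < makespan(placement):
                placement[op] = best
                improved = True
    return placement

start = {op: 0 for op in GRAPH}          # everything on device 0
final = improve_iteratively(dict(start))
print(makespan(start), makespan(final))  # → 10.0 8.0
```

Starting from all operators on one device, the loop repeatedly applies single-operator moves, so the cost can only decrease, mirroring the "iterative placement improvements" formulation.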
Johansson, Samuel, and Karol Wojtulewicz. "Machine learning algorithms in a distributed context". Thesis, Linköpings universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-148920.
Karimi, Ahmad Maroof. "Distributed Machine Learning Based Intrusion Detection System". University of Toledo / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1470401374.
Zam, Anton. "Evaluating Distributed Machine Learning using IoT Devices". Thesis, Mittuniversitetet, Institutionen för informationssystem och –teknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-42388.
The Internet of Things grows every year as new devices are added. Although some devices are in continuous use, a large share sit mostly idle, holding untapped processing power that could be put toward machine learning computations. Many methods exist for combining the processing power of multiple devices to carry out machine learning tasks; these are often called distributed machine learning methods. The main focus of this thesis is to evaluate whether such methods can be implemented on IoT devices and, if so, to measure how efficient and scalable they are. The method chosen for implementation, "MultiWorkerMirroredStrategy", was evaluated by comparing the training time, training accuracy, and evaluation accuracy of 2, 3, and 4 Raspberry Pis against a non-distributed setup on a single Raspberry Pi. The results showed that although computational power increased with every added device, training time increased while the other measurements stayed the same. The conclusion was that the communication overhead between devices was too high, making the method very inefficient; it would not scale without some form of optimization.
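The strategy evaluated in that thesis implements synchronous data-parallel training: each worker computes gradients on its own data shard, an all-reduce averages them, and every replica applies the identical update. That update rule can be sketched without any framework; the model, data, and learning rate below are invented for illustration.

```python
# Simulate synchronous data-parallel training (the scheme behind strategies
# such as TensorFlow's MultiWorkerMirroredStrategy) for a 1-D linear model
# y = w * x, minimising squared error.

def gradient(w, shard):
    """d/dw of the mean squared error over one worker's data shard."""
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def train(shards, w=0.0, lr=0.01, steps=100):
    for _ in range(steps):
        # Each worker computes a local gradient on its own shard...
        grads = [gradient(w, shard) for shard in shards]
        # ...then an all-reduce averages them, so every replica applies
        # the identical update and the model weights stay in sync.
        w -= lr * sum(grads) / len(grads)
    return w

# Data generated from y = 3x, split across two "workers".
data = [(x, 3.0 * x) for x in range(1, 9)]
shards = [data[:4], data[4:]]
print(round(train(shards), 3))  # → 3.0
```

In a real cluster the averaging step is where the communication overhead that the thesis measured comes from: every training step pays for a gradient exchange between all workers.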
Thompson, Simon Giles. "Distributed boosting algorithms". Thesis, University of Portsmouth, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285529.
Dahlberg, Leslie. "Evolutionary Computation in Continuous Optimization and Machine Learning". Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35674.
Ouyang, Hua. "Optimal stochastic and distributed algorithms for machine learning". Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49091.
Prueller, Hans. "Distributed online machine learning for mobile care systems". Thesis, De Montfort University, 2014. http://hdl.handle.net/2086/10875.
Konečný, Jakub. "Stochastic, distributed and federated optimization for machine learning". Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/31478.
Pełny tekst źródłaKsiążki na temat "Continuous and distributed machine learning"
Weiss, Gerhard. Distributed machine learning. Sankt Augustin: Infix, 1995.
Testas, Abdelaziz. Distributed Machine Learning with PySpark. Berkeley, CA: Apress, 2023. http://dx.doi.org/10.1007/978-1-4842-9751-3.
Amini, M. Hadi, ed. Distributed Machine Learning and Computing. Cham: Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-57567-9.
Jiang, Jiawei, Bin Cui, and Ce Zhang. Distributed Machine Learning and Gradient Optimization. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-3420-8.
Joshi, Gauri. Optimization Algorithms for Distributed Machine Learning. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-19067-4.
Sahoo, Jyoti Prakash, Asis Kumar Tripathy, Manoranjan Mohanty, Kuan-Ching Li, and Ajit Kumar Nayak, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-4807-6.
Rout, Rashmi Ranjan, Soumya Kanti Ghosh, Prasanta K. Jana, Asis Kumar Tripathy, Jyoti Prakash Sahoo, and Kuan-Ching Li, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-1018-0.
Tripathy, Asis Kumar, Mahasweta Sarkar, Jyoti Prakash Sahoo, Kuan-Ching Li, and Suchismita Chinara, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-15-4218-3.
Nanda, Umakanta, Asis Kumar Tripathy, Jyoti Prakash Sahoo, Mahasweta Sarkar, and Kuan-Ching Li, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-1841-2.
Chinara, Suchismita, Asis Kumar Tripathy, Kuan-Ching Li, Jyoti Prakash Sahoo, and Alekha Kumar Mishra, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-1203-2.
Pełny tekst źródłaCzęści książek na temat "Continuous and distributed machine learning"
Carter, Eric, and Matthew Hurst. "Continuous Delivery". In Agile Machine Learning, 59–69. Berkeley, CA: Apress, 2019. http://dx.doi.org/10.1007/978-1-4842-5107-2_3.
Yang, Qiang, Yang Liu, Yong Cheng, Yan Kang, Tianjian Chen, and Han Yu. "Distributed Machine Learning". In Federated Learning, 33–48. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-031-01585-4_3.
Galakatos, Alex, Andrew Crotty, and Tim Kraska. "Distributed Machine Learning". In Encyclopedia of Database Systems, 1–6. New York, NY: Springer New York, 2017. http://dx.doi.org/10.1007/978-1-4899-7993-3_80647-1.
Galakatos, Alex, Andrew Crotty, and Tim Kraska. "Distributed Machine Learning". In Encyclopedia of Database Systems, 1196–201. New York, NY: Springer New York, 2018. http://dx.doi.org/10.1007/978-1-4614-8265-9_80647.
Shultz, Thomas R., Scott E. Fahlman, Susan Craw, Periklis Andritsos, Panayiotis Tsaparas, Ricardo Silva, Chris Drummond, et al. "Continuous Attribute". In Encyclopedia of Machine Learning, 226. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_172.
Li, Jin, Ping Li, Zheli Liu, Xiaofeng Chen, and Tong Li. "Secure Distributed Learning". In Privacy-Preserving Machine Learning, 47–56. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-9139-3_4.
Ducoulombier, Antoine, and Michèle Sebag. "Continuous mimetic evolution". In Machine Learning: ECML-98, 334–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0026704.
Cleophas, Ton J., and Aeilko H. Zwinderman. "Continuous Sequential Techniques". In Machine Learning in Medicine, 187–94. Dordrecht: Springer Netherlands, 2013. http://dx.doi.org/10.1007/978-94-007-6886-4_18.
Chen, Zhiyuan, and Bing Liu. "Continuous Knowledge Learning in Chatbots". In Lifelong Machine Learning, 131–38. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-031-01581-6_8.
Liu, Mark. "Q-Learning with Continuous States". In Machine Learning, Animated, 285–300. Boca Raton: Chapman and Hall/CRC, 2023. http://dx.doi.org/10.1201/b23383-15.
Pełny tekst źródłaStreszczenia konferencji na temat "Continuous and distributed machine learning"
Belcastro, Loris, Fabrizio Marozzo, Aleandro Presta, and Domenico Talia. "A Spark-based Task Allocation Solution for Machine Learning in the Edge-Cloud Continuum". In 2024 20th International Conference on Distributed Computing in Smart Systems and the Internet of Things (DCOSS-IoT), 576–82. IEEE, 2024. http://dx.doi.org/10.1109/dcoss-iot61029.2024.00090.
Barros, Claudio D. T., Daniel N. R. da Silva, and Fabio A. M. Porto. "Machine Learning on Graph-Structured Data". In Anais Estendidos do Simpósio Brasileiro de Banco de Dados. Sociedade Brasileira de Computação - SBC, 2021. http://dx.doi.org/10.5753/sbbd_estendido.2021.18179.
Chepurnov, A., and N. Ershov. "APPLICATION OF MACHINE LEARNING METHODS FOR CROSS-CLASSIFICATION OF ALGORITHMS AND PROBLEMS OF MULTIVARIATE CONTINUOUS OPTIMIZATION". In 9th International Conference "Distributed Computing and Grid Technologies in Science and Education". Crossref, 2021. http://dx.doi.org/10.54546/mlit.2021.67.50.001.
Sartzetakis, Ippokratis, Polyzois Soumplis, Panagiotis Pantazopoulos, Konstantinos V. Katsaros, Vasilis Sourlas, and Emmanouel Manos Varvarigos. "Resource Allocation for Distributed Machine Learning at the Edge-Cloud Continuum". In ICC 2022 - IEEE International Conference on Communications. IEEE, 2022. http://dx.doi.org/10.1109/icc45855.2022.9838647.
Tiezzi, Matteo, Simone Marullo, Lapo Faggi, Enrico Meloni, Alessandro Betti, and Stefano Melacci. "Stochastic Coherence Over Attention Trajectory For Continuous Learning In Video Streams". In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/483.
Gupta, Sujasha, Srivatsava Krishnan, and Vishnubaba Sundaresan. "Structural Health Monitoring of Composite Structures via Machine Learning of Mechanoluminescence". In ASME 2019 Conference on Smart Materials, Adaptive Structures and Intelligent Systems. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/smasis2019-5697.
Mendoza, Alberto, Çağrı Cerrahoğlu, Alessandro Delfino, and Martin Sundin. "Signal Processing and Machine Learning for Effective Integration of Distributed Fiber Optic Sensing Data in Production Petrophysics". In 2022 SPWLA 63rd Annual Symposium. Society of Petrophysicists and Well Log Analysts, 2022. http://dx.doi.org/10.30632/spwla-2022-0016.
Sadigov, Teymur, Cagri Cerrahoglu, James Ramsay, Laurence Burchell, Sean Cavalero, Thomas Watson, Pradyumna Thiruvenkatanathan, and Martin Sundin. "Real-Time Water Injection Monitoring with Distributed Fiber Optics Using Physics-Informed Machine Learning". In Offshore Technology Conference. OTC, 2021. http://dx.doi.org/10.4043/30982-ms.
"Session details: 1st Workshop on Distributed Machine Learning for the Intelligent Computing Continuum (DML-ICC)". In UCC '21: 2021 IEEE/ACM 14th International Conference on Utility and Cloud Computing, edited by Luiz F. Bittencourt, Ian Foster, and Filip De Turck. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3517186.
Gao, Hongchang, Hanzi Xu, and Slobodan Vucetic. "Sample Efficient Decentralized Stochastic Frank-Wolfe Methods for Continuous DR-Submodular Maximization". In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/482.
Pełny tekst źródłaRaporty organizacyjne na temat "Continuous and distributed machine learning"
Shead, Timothy, Jonathan Berry, Cynthia Phillips, and Jared Saia. Information-Theoretically Secure Distributed Machine Learning. Office of Scientific and Technical Information (OSTI), November 2019. http://dx.doi.org/10.2172/1763277.
Lee, Ying-Ying, and Kyle Colangelo. Double debiased machine learning nonparametric inference with continuous treatments. The IFS, October 2019. http://dx.doi.org/10.1920/wp.cem.2019.5419.
Lee, Ying-Ying, and Kyle Colangelo. Double debiased machine learning nonparametric inference with continuous treatments. The IFS, December 2019. http://dx.doi.org/10.1920/wp.cem.2019.7219.
Huang, Amy, Katelyn Barnes, Joseph Bearer, Evan Chrisinger, and Christopher Stone. Integrating Distributed-Memory Machine Learning into Large-Scale HPC Simulations. Office of Scientific and Technical Information (OSTI), May 2018. http://dx.doi.org/10.2172/1460078.
Varastehpour, Soheil, Hamid Sharifzadeh, and Iman Ardekani. A Comprehensive Review of Deep Learning Algorithms. Unitec ePress, 2021. http://dx.doi.org/10.34074/ocds.092.
Liu, Xiaopei, Dan Liu, and Cong’e Tan. Gut microbiome-based machine learning for diagnostic prediction of liver fibrosis and cirrhosis: a systematic review and meta-analysis. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, May 2022. http://dx.doi.org/10.37766/inplasy2022.5.0133.
Choquette, Gary. PR-000-16209-WEB Data Management Best Practices Learned from CEPM. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), April 2019. http://dx.doi.org/10.55274/r0011568.
Visser, R., H. Kao, R. M. H. Dokht, A. B. Mahani, and S. Venables. A comprehensive earthquake catalogue for northeastern British Columbia: the northern Montney trend from 2017 to 2020 and the Kiskatinaw Seismic Monitoring and Mitigation Area from 2019 to 2020. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/329078.
Harris, L. B., P. Adiban, and E. Gloaguen. The role of enigmatic deep crustal and upper mantle structures on Au and magmatic Ni-Cu-PGE-Cr mineralization in the Superior Province. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/328984.