A selection of scholarly literature on the topic "Continuous and distributed machine learning"
Format your citation in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic "Continuous and distributed machine learning."
Next to every work in the list there is an "Add to bibliography" button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication as a ".pdf" file and read its abstract online, when such details are available in the source's metadata.
Journal articles on the topic "Continuous and distributed machine learning"
Stan, Ioan-Mihail, Siarhei Padolski, and Christopher Jon Lee. "Exploring the self-service model to visualize the results of the ATLAS Machine Learning analysis jobs in BigPanDA with Openshift OKD3." EPJ Web of Conferences 251 (2021): 02009. http://dx.doi.org/10.1051/epjconf/202125102009.
Yin, Zhongdong, Jingjing Tu, and Yonghai Xu. "Development of a Kernel Extreme Learning Machine Model for Capacity Selection of Distributed Generation Considering the Characteristics of Electric Vehicles." Applied Sciences 9, no. 12 (June 13, 2019): 2401. http://dx.doi.org/10.3390/app9122401.
Brophy, Eoin, Maarten De Vos, Geraldine Boylan, and Tomás Ward. "Estimation of Continuous Blood Pressure from PPG via a Federated Learning Approach." Sensors 21, no. 18 (September 21, 2021): 6311. http://dx.doi.org/10.3390/s21186311.
Vrachimis, Andreas, Stella Gkegka, and Kostas Kolomvatsos. "Resilient edge machine learning in smart city environments." Journal of Smart Cities and Society 2, no. 1 (July 7, 2023): 3–24. http://dx.doi.org/10.3233/scs-230005.
Musa, M. O., and E. E. Odokuma. "A framework for the detection of distributed denial of service attacks on network logs using ML and DL classifiers." Scientia Africana 22, no. 3 (January 25, 2024): 153–64. http://dx.doi.org/10.4314/sa.v22i3.14.
Oliveri, Giorgio, Lucas C. van Laake, Cesare Carissimo, Clara Miette, and Johannes T. B. Overvelde. "Continuous learning of emergent behavior in robotic matter." Proceedings of the National Academy of Sciences 118, no. 21 (May 10, 2021): e2017015118. http://dx.doi.org/10.1073/pnas.2017015118.
Kodaira, Daisuke, Kazuki Tsukazaki, Taiki Kure, and Junji Kondoh. "Improving Forecast Reliability for Geographically Distributed Photovoltaic Generations." Energies 14, no. 21 (November 4, 2021): 7340. http://dx.doi.org/10.3390/en14217340.
Hua, Xia, and Lei Han. "Design and Practical Application of Sports Visualization Platform Based on Tracking Algorithm." Computational Intelligence and Neuroscience 2022 (August 16, 2022): 1–9. http://dx.doi.org/10.1155/2022/4744939.
Rustam, Furqan, Muhammad Faheem Mushtaq, Ameer Hamza, Muhammad Shoaib Farooq, Anca Delia Jurcut, and Imran Ashraf. "Denial of Service Attack Classification Using Machine Learning with Multi-Features." Electronics 11, no. 22 (November 20, 2022): 3817. http://dx.doi.org/10.3390/electronics11223817.
Huang, Leqi. "Problems, solutions and improvements on federated learning model." Applied and Computational Engineering 22, no. 1 (October 23, 2023): 183–86. http://dx.doi.org/10.54254/2755-2721/22/20231215.
Повний текст джерелаДисертації з теми "Continuous and distributed machine learning"
Armond, Kenneth C. Jr. "Distributed Support Vector Machine Learning." ScholarWorks@UNO, 2008. http://scholarworks.uno.edu/td/711.
Addanki, Ravichandra. "Learning generalizable device placement algorithms for distributed machine learning." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122746.
Повний текст джерелаCataloged from PDF version of thesis.
Includes bibliographical references (pages 47-50).
We present Placeto, a reinforcement learning (RL) approach to efficiently find device placements for distributed neural network training. Unlike prior approaches that only find a device placement for a specific computation graph, Placeto can learn generalizable device placement policies that can be applied to any graph. We propose two key ideas in our approach: (1) we represent the policy as performing iterative placement improvements, rather than outputting a placement in one shot; (2) we use graph embeddings to capture relevant information about the structure of the computation graph, without relying on node labels for indexing. These ideas allow Placeto to train efficiently and generalize to unseen graphs. Our experiments show that Placeto requires up to 6.1× fewer training steps to find placements that are on par with or better than the best placements found by prior approaches. Moreover, Placeto is able to learn a generalizable placement policy for any given family of graphs, which can then be used without any retraining to predict optimized placements for unseen graphs from the same family. This eliminates the large overhead incurred by prior RL approaches whose lack of generalizability necessitates re-training from scratch every time a new graph is to be placed.
by Ravichandra Addanki.
S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science.
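The iterative-improvement idea in the abstract above can be illustrated with a small sketch. This is not Placeto's RL policy or its graph-embedding network; it is a plain greedy local search over a toy cost model (per-device compute load plus a fixed penalty for each cross-device edge), with all names and numbers illustrative.

```python
# Toy sketch (not Placeto): iterative placement improvement. A placement maps
# each graph node to a device; we repeatedly revisit nodes and move each to the
# device that lowers a simple cost, rather than emitting a placement in one shot.

def placement_cost(graph, costs, placement, n_devices):
    # Cost model: max per-device compute load, plus a unit penalty for every
    # edge whose endpoints sit on different devices (communication cost).
    load = [0.0] * n_devices
    for node, c in costs.items():
        load[placement[node]] += c
    comm = sum(1.0 for u, v in graph if placement[u] != placement[v])
    return max(load) + comm

def improve_placement(graph, costs, n_devices, sweeps=5):
    placement = {node: 0 for node in costs}  # start: everything on device 0
    for _ in range(sweeps):
        changed = False
        for node in costs:
            # Try every device for this node; keep the cheapest choice.
            best_dev = min(
                range(n_devices),
                key=lambda d: placement_cost(
                    graph, costs, {**placement, node: d}, n_devices),
            )
            if best_dev != placement[node]:
                placement[node] = best_dev
                changed = True
        if not changed:  # local optimum reached
            break
    return placement
```

On a four-node chain with equal node costs, for example, the search splits the chain between two devices, trading one communication edge for balanced load; the resulting cost is never worse than placing everything on a single device.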
Johansson, Samuel, and Karol Wojtulewicz. "Machine learning algorithms in a distributed context." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-148920.
Karimi, Ahmad Maroof. "Distributed Machine Learning Based Intrusion Detection System." University of Toledo / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1470401374.
Zam, Anton. "Evaluating Distributed Machine Learning using IoT Devices." Thesis, Mittuniversitetet, Institutionen för informationssystem och –teknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-42388.
The Internet of Things is growing every year, with new devices added all the time. Although some of these devices are continuously in use, a large share of them sit mostly idle, holding untapped processing power that could be used for machine learning computations. Many methods already exist for combining the processing power of multiple devices to compute machine learning tasks; these are often called distributed machine learning methods. The main focus of this thesis is to evaluate such distributed machine learning methods, to see whether they can be implemented on IoT devices and, if so, to measure how efficient and scalable they are. The method chosen for implementation was "MultiWorkerMirroredStrategy", and it was evaluated by comparing the training time, training accuracy, and evaluation accuracy of 2, 3, and 4 Raspberry Pis against a non-distributed machine learning method on a single Raspberry Pi. The results showed that although the computational power increased with every added device, the training time increased while the remaining measurements stayed the same. After the results were analyzed and discussed, the conclusion was that the overhead added by communication between the devices was too high, making this method very inefficient; it would not scale without some form of optimization.
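The strategy evaluated in the abstract above, TensorFlow's MultiWorkerMirroredStrategy, is a synchronous data-parallel scheme: each worker computes gradients on its own data shard, the gradients are averaged across workers (the all-reduce step), and every replica applies the identical update. A minimal sketch of that scheme in plain Python (not the TensorFlow API; the model, data, and function names here are illustrative):

```python
# Conceptual sketch of synchronous data-parallel training for a one-parameter
# linear model y = w * x, trained with mean-squared-error gradient descent.
# Each "worker" holds one shard of (x, y) pairs.

def local_gradient(w, shard):
    # Gradient of 0.5 * (w*x - y)^2 averaged over this worker's shard.
    return sum((w * x - y) * x for x, y in shard) / len(shard)

def train_step(w, shards, lr=0.1):
    # 1. Every worker computes a gradient on its own shard ...
    grads = [local_gradient(w, shard) for shard in shards]
    # 2. ... the gradients are averaged (this models the all-reduce, which is
    #    the communication step whose overhead the thesis measures) ...
    avg_grad = sum(grads) / len(grads)
    # 3. ... and all replicas apply the same update, staying in sync.
    return w - lr * avg_grad

def train(shards, steps=100):
    w = 0.0
    for _ in range(steps):
        w = train_step(w, shards)
    return w
```

For data drawn from y = 3x and split across two simulated workers, `train` converges to w near 3 regardless of how the data is sharded, because every replica applies the same averaged gradient. Step 2 is purely local arithmetic here; on real hardware it becomes network communication, which is where the overhead observed in the thesis arises.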
Thompson, Simon Giles. "Distributed boosting algorithms." Thesis, University of Portsmouth, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285529.
Dahlberg, Leslie. "Evolutionary Computation in Continuous Optimization and Machine Learning." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35674.
Ouyang, Hua. "Optimal stochastic and distributed algorithms for machine learning." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49091.
Prueller, Hans. "Distributed online machine learning for mobile care systems." Thesis, De Montfort University, 2014. http://hdl.handle.net/2086/10875.
Konečný, Jakub. "Stochastic, distributed and federated optimization for machine learning." Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/31478.
Повний текст джерелаКниги з теми "Continuous and distributed machine learning"
Weiss, Gerhard. Distributed machine learning. Sankt Augustin: Infix, 1995.
Testas, Abdelaziz. Distributed Machine Learning with PySpark. Berkeley, CA: Apress, 2023. http://dx.doi.org/10.1007/978-1-4842-9751-3.
Amini, M. Hadi, ed. Distributed Machine Learning and Computing. Cham: Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-57567-9.
Jiang, Jiawei, Bin Cui, and Ce Zhang. Distributed Machine Learning and Gradient Optimization. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-3420-8.
Joshi, Gauri. Optimization Algorithms for Distributed Machine Learning. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-19067-4.
Sahoo, Jyoti Prakash, Asis Kumar Tripathy, Manoranjan Mohanty, Kuan-Ching Li, and Ajit Kumar Nayak, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-4807-6.
Rout, Rashmi Ranjan, Soumya Kanti Ghosh, Prasanta K. Jana, Asis Kumar Tripathy, Jyoti Prakash Sahoo, and Kuan-Ching Li, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-1018-0.
Tripathy, Asis Kumar, Mahasweta Sarkar, Jyoti Prakash Sahoo, Kuan-Ching Li, and Suchismita Chinara, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-15-4218-3.
Nanda, Umakanta, Asis Kumar Tripathy, Jyoti Prakash Sahoo, Mahasweta Sarkar, and Kuan-Ching Li, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-1841-2.
Chinara, Suchismita, Asis Kumar Tripathy, Kuan-Ching Li, Jyoti Prakash Sahoo, and Alekha Kumar Mishra, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-1203-2.
Повний текст джерелаЧастини книг з теми "Continuous and distributed machine learning"
Carter, Eric, and Matthew Hurst. "Continuous Delivery." In Agile Machine Learning, 59–69. Berkeley, CA: Apress, 2019. http://dx.doi.org/10.1007/978-1-4842-5107-2_3.
Yang, Qiang, Yang Liu, Yong Cheng, Yan Kang, Tianjian Chen, and Han Yu. "Distributed Machine Learning." In Federated Learning, 33–48. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-031-01585-4_3.
Galakatos, Alex, Andrew Crotty, and Tim Kraska. "Distributed Machine Learning." In Encyclopedia of Database Systems, 1–6. New York, NY: Springer New York, 2017. http://dx.doi.org/10.1007/978-1-4899-7993-3_80647-1.
Galakatos, Alex, Andrew Crotty, and Tim Kraska. "Distributed Machine Learning." In Encyclopedia of Database Systems, 1196–201. New York, NY: Springer New York, 2018. http://dx.doi.org/10.1007/978-1-4614-8265-9_80647.
Shultz, Thomas R., Scott E. Fahlman, Susan Craw, Periklis Andritsos, Panayiotis Tsaparas, Ricardo Silva, Chris Drummond, et al. "Continuous Attribute." In Encyclopedia of Machine Learning, 226. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_172.
Li, Jin, Ping Li, Zheli Liu, Xiaofeng Chen, and Tong Li. "Secure Distributed Learning." In Privacy-Preserving Machine Learning, 47–56. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-9139-3_4.
Ducoulombier, Antoine, and Michèle Sebag. "Continuous mimetic evolution." In Machine Learning: ECML-98, 334–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0026704.
Cleophas, Ton J., and Aeilko H. Zwinderman. "Continuous Sequential Techniques." In Machine Learning in Medicine, 187–94. Dordrecht: Springer Netherlands, 2013. http://dx.doi.org/10.1007/978-94-007-6886-4_18.
Chen, Zhiyuan, and Bing Liu. "Continuous Knowledge Learning in Chatbots." In Lifelong Machine Learning, 131–38. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-031-01581-6_8.
Liu, Mark. "Q-Learning with Continuous States." In Machine Learning, Animated, 285–300. Boca Raton: Chapman and Hall/CRC, 2023. http://dx.doi.org/10.1201/b23383-15.
Повний текст джерелаТези доповідей конференцій з теми "Continuous and distributed machine learning"
Belcastro, Loris, Fabrizio Marozzo, Aleandro Presta, and Domenico Talia. "A Spark-based Task Allocation Solution for Machine Learning in the Edge-Cloud Continuum." In 2024 20th International Conference on Distributed Computing in Smart Systems and the Internet of Things (DCOSS-IoT), 576–82. IEEE, 2024. http://dx.doi.org/10.1109/dcoss-iot61029.2024.00090.
Barros, Claudio D. T., Daniel N. R. da Silva, and Fabio A. M. Porto. "Machine Learning on Graph-Structured Data." In Anais Estendidos do Simpósio Brasileiro de Banco de Dados. Sociedade Brasileira de Computação - SBC, 2021. http://dx.doi.org/10.5753/sbbd_estendido.2021.18179.
Chepurnov, A., and N. Ershov. "APPLICATION OF MACHINE LEARNING METHODS FOR CROSS-CLASSIFICATION OF ALGORITHMS AND PROBLEMS OF MULTIVARIATE CONTINUOUS OPTIMIZATION." In 9th International Conference "Distributed Computing and Grid Technologies in Science and Education". Crossref, 2021. http://dx.doi.org/10.54546/mlit.2021.67.50.001.
Sartzetakis, Ippokratis, Polyzois Soumplis, Panagiotis Pantazopoulos, Konstantinos V. Katsaros, Vasilis Sourlas, and Emmanouel Manos Varvarigos. "Resource Allocation for Distributed Machine Learning at the Edge-Cloud Continuum." In ICC 2022 - IEEE International Conference on Communications. IEEE, 2022. http://dx.doi.org/10.1109/icc45855.2022.9838647.
Tiezzi, Matteo, Simone Marullo, Lapo Faggi, Enrico Meloni, Alessandro Betti, and Stefano Melacci. "Stochastic Coherence Over Attention Trajectory For Continuous Learning In Video Streams." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/483.
Gupta, Sujasha, Srivatsava Krishnan, and Vishnubaba Sundaresan. "Structural Health Monitoring of Composite Structures via Machine Learning of Mechanoluminescence." In ASME 2019 Conference on Smart Materials, Adaptive Structures and Intelligent Systems. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/smasis2019-5697.
Mendoza, Alberto, Çağrı Cerrahoğlu, Alessandro Delfino, and Martin Sundin. "Signal Processing and Machine Learning for Effective Integration of Distributed Fiber Optic Sensing Data in Production Petrophysics." In 2022 SPWLA 63rd Annual Symposium. Society of Petrophysicists and Well Log Analysts, 2022. http://dx.doi.org/10.30632/spwla-2022-0016.
Sadigov, Teymur, Cagri Cerrahoglu, James Ramsay, Laurence Burchell, Sean Cavalero, Thomas Watson, Pradyumna Thiruvenkatanathan, and Martin Sundin. "Real-Time Water Injection Monitoring with Distributed Fiber Optics Using Physics-Informed Machine Learning." In Offshore Technology Conference. OTC, 2021. http://dx.doi.org/10.4043/30982-ms.
Повний текст джерела"Session details: 1st Workshop on Distributed Machine Learning for the Intelligent Computing Continuum (DML-ICC)." In UCC '21: 2021 IEEE/ACM 14th International Conference on Utility and Cloud Computing, edited by Luiz F. Bittencourt, Ian Foster, and Filip De Turck. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3517186.
Gao, Hongchang, Hanzi Xu, and Slobodan Vucetic. "Sample Efficient Decentralized Stochastic Frank-Wolfe Methods for Continuous DR-Submodular Maximization." In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/482.
Повний текст джерелаЗвіти організацій з теми "Continuous and distributed machine learning"
Shead, Timothy, Jonathan Berry, Cynthia Phillips, and Jared Saia. Information-Theoretically Secure Distributed Machine Learning. Office of Scientific and Technical Information (OSTI), November 2019. http://dx.doi.org/10.2172/1763277.
Lee, Ying-Ying, and Kyle Colangelo. Double debiased machine learning nonparametric inference with continuous treatments. The IFS, October 2019. http://dx.doi.org/10.1920/wp.cem.2019.5419.
Lee, Ying-Ying, and Kyle Colangelo. Double debiased machine learning nonparametric inference with continuous treatments. The IFS, December 2019. http://dx.doi.org/10.1920/wp.cem.2019.7219.
Huang, Amy, Katelyn Barnes, Joseph Bearer, Evan Chrisinger, and Christopher Stone. Integrating Distributed-Memory Machine Learning into Large-Scale HPC Simulations. Office of Scientific and Technical Information (OSTI), May 2018. http://dx.doi.org/10.2172/1460078.
Varastehpour, Soheil, Hamid Sharifzadeh, and Iman Ardekani. A Comprehensive Review of Deep Learning Algorithms. Unitec ePress, 2021. http://dx.doi.org/10.34074/ocds.092.
Liu, Xiaopei, Dan Liu, and Cong’e Tan. Gut microbiome-based machine learning for diagnostic prediction of liver fibrosis and cirrhosis: a systematic review and meta-analysis. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, May 2022. http://dx.doi.org/10.37766/inplasy2022.5.0133.
Choquette, Gary. PR-000-16209-WEB Data Management Best Practices Learned from CEPM. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), April 2019. http://dx.doi.org/10.55274/r0011568.
Visser, R., H. Kao, R. M. H. Dokht, A. B. Mahani, and S. Venables. A comprehensive earthquake catalogue for northeastern British Columbia: the northern Montney trend from 2017 to 2020 and the Kiskatinaw Seismic Monitoring and Mitigation Area from 2019 to 2020. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/329078.
Harris, L. B., P. Adiban, and E. Gloaguen. The role of enigmatic deep crustal and upper mantle structures on Au and magmatic Ni-Cu-PGE-Cr mineralization in the Superior Province. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/328984.