Academic literature on the topic "Continuous and distributed machine learning"
Create an accurate reference in APA, MLA, Chicago, Harvard, and several other citation styles
Contents
Browse the thematic lists of journal articles, books, theses, conference papers, and other academic sources on the topic "Continuous and distributed machine learning".
Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference for the selected source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, when this information is included in the metadata.
Journal articles on the topic "Continuous and distributed machine learning"
Stan, Ioan-Mihail, Siarhei Padolski and Christopher Jon Lee. "Exploring the self-service model to visualize the results of the ATLAS Machine Learning analysis jobs in BigPanDA with Openshift OKD3." EPJ Web of Conferences 251 (2021): 02009. http://dx.doi.org/10.1051/epjconf/202125102009.
Yin, Zhongdong, Jingjing Tu and Yonghai Xu. "Development of a Kernel Extreme Learning Machine Model for Capacity Selection of Distributed Generation Considering the Characteristics of Electric Vehicles." Applied Sciences 9, no. 12 (June 13, 2019): 2401. http://dx.doi.org/10.3390/app9122401.
Brophy, Eoin, Maarten De Vos, Geraldine Boylan and Tomás Ward. "Estimation of Continuous Blood Pressure from PPG via a Federated Learning Approach." Sensors 21, no. 18 (September 21, 2021): 6311. http://dx.doi.org/10.3390/s21186311.
Vrachimis, Andreas, Stella Gkegka and Kostas Kolomvatsos. "Resilient edge machine learning in smart city environments." Journal of Smart Cities and Society 2, no. 1 (July 7, 2023): 3–24. http://dx.doi.org/10.3233/scs-230005.
Musa, M. O., and E. E. Odokuma. "A framework for the detection of distributed denial of service attacks on network logs using ML and DL classifiers." Scientia Africana 22, no. 3 (January 25, 2024): 153–64. http://dx.doi.org/10.4314/sa.v22i3.14.
Oliveri, Giorgio, Lucas C. van Laake, Cesare Carissimo, Clara Miette and Johannes T. B. Overvelde. "Continuous learning of emergent behavior in robotic matter." Proceedings of the National Academy of Sciences 118, no. 21 (May 10, 2021): e2017015118. http://dx.doi.org/10.1073/pnas.2017015118.
Kodaira, Daisuke, Kazuki Tsukazaki, Taiki Kure and Junji Kondoh. "Improving Forecast Reliability for Geographically Distributed Photovoltaic Generations." Energies 14, no. 21 (November 4, 2021): 7340. http://dx.doi.org/10.3390/en14217340.
Hua, Xia, and Lei Han. "Design and Practical Application of Sports Visualization Platform Based on Tracking Algorithm." Computational Intelligence and Neuroscience 2022 (August 16, 2022): 1–9. http://dx.doi.org/10.1155/2022/4744939.
Rustam, Furqan, Muhammad Faheem Mushtaq, Ameer Hamza, Muhammad Shoaib Farooq, Anca Delia Jurcut and Imran Ashraf. "Denial of Service Attack Classification Using Machine Learning with Multi-Features." Electronics 11, no. 22 (November 20, 2022): 3817. http://dx.doi.org/10.3390/electronics11223817.
Huang, Leqi. "Problems, solutions and improvements on federated learning model." Applied and Computational Engineering 22, no. 1 (October 23, 2023): 183–86. http://dx.doi.org/10.54254/2755-2721/22/20231215.
Texte intégralThèses sur le sujet "Continuous and distributed machine learning"
Armond, Kenneth C. Jr. « Distributed Support Vector Machine Learning ». ScholarWorks@UNO, 2008. http://scholarworks.uno.edu/td/711.
Texte intégralAddanki, Ravichandra. « Learning generalizable device placement algorithms for distributed machine learning ». Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122746.
Texte intégralCataloged from PDF version of thesis.
Includes bibliographical references (pages 47-50).
We present Placeto, a reinforcement learning (RL) approach to efficiently find device placements for distributed neural network training. Unlike prior approaches that only find a device placement for a specific computation graph, Placeto can learn generalizable device placement policies that can be applied to any graph. We propose two key ideas in our approach: (1) we represent the policy as performing iterative placement improvements, rather than outputting a placement in one shot; (2) we use graph embeddings to capture relevant information about the structure of the computation graph, without relying on node labels for indexing. These ideas allow Placeto to train efficiently and generalize to unseen graphs. Our experiments show that Placeto requires up to 6.1 x fewer training steps to find placements that are on par with or better than the best placements found by prior approaches. Moreover, Placeto is able to learn a generalizable placement policy for any given family of graphs, which can then be used without any retraining to predict optimized placements for unseen graphs from the same family. This eliminates the large overhead incurred by prior RL approaches whose lack of generalizability necessitates re-training from scratch every time a new graph is to be placed.
S.M. thesis by Ravichandra Addanki, Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science.
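The iterative placement-improvement idea summarized in the Placeto abstract above can be illustrated with a short, self-contained sketch. This is not the thesis code: the learned graph-embedding policy is replaced by a hypothetical greedy rule and measured runtimes by a placeholder cost model, purely to show what refining a device placement one node at a time (rather than emitting it in one shot) looks like.

    # Illustrative sketch only, not Placeto itself: the RL policy and runtime
    # measurements are stand-ins, chosen so the loop structure is visible.
    import random

    def estimated_cost(placement, devices):
        """Placeholder cost model; a real system would measure or simulate runtime."""
        load = {d: 0 for d in devices}
        for device in placement.values():
            load[device] += 1
        return max(load.values()) - min(load.values())  # crude imbalance penalty

    def refine_placement(nodes, devices, steps=100):
        # Start from a random placement and improve it step by step,
        # instead of outputting the whole placement in one shot.
        placement = {node: random.choice(devices) for node in nodes}
        for _ in range(steps):
            node = random.choice(nodes)  # node visited at this step
            placement[node] = min(
                devices, key=lambda d: estimated_cost({**placement, node: d}, devices)
            )
        return placement

    if __name__ == "__main__":
        graph_nodes = [f"op{i}" for i in range(20)]  # stand-in computation graph
        print(refine_placement(graph_nodes, devices=["gpu:0", "gpu:1", "cpu:0"]))

In the approach described in the abstract, the per-step decision would come from a reinforcement-learning policy operating on graph embeddings, which is what allows the same policy to be reused on unseen graphs without retraining.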
Johansson, Samuel, and Karol Wojtulewicz. "Machine learning algorithms in a distributed context." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-148920.
Karimi, Ahmad Maroof. "Distributed Machine Learning Based Intrusion Detection System." University of Toledo / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1470401374.
Zam, Anton. "Evaluating Distributed Machine Learning using IoT Devices." Thesis, Mittuniversitetet, Institutionen för informationssystem och –teknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-42388.
Texte intégralInternet of things is growing every year with new devices being added all the time. Although some of the devices are continuously in use a large amount of them are mostly idle and sitting on untapped processing power that could be used to compute machine learning computations. There currently exist a lot of different methods to combine the processing power of multiple devices to compute machine learning task these are often called distributed machine learning methods. The main focus of this thesis is to evaluate these distributed machine learning methods to see if they could be implemented on IoT devices and if so, measure how efficient and scalable these methods are. The method chosen for implementation was called “MultiWorkerMirrorStrategy” and this method was evaluated by comparing the training time, training accuracy and evaluation accuracy of 2,3 and 4 Raspberry pi:s with a nondistributed machine learning method with 1 Raspberry pi. The results showed that although the computational power increased with every added device the training time increased while the rest of the measurements stayed the same. After the results were analyzed and discussed the conclusion of this were that the overhead added for communicating between devices were to high resulting in this method being very inefficient and wouldn’t scale without some sort of optimization being added.
Thompson, Simon Giles. "Distributed boosting algorithms." Thesis, University of Portsmouth, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285529.
Dahlberg, Leslie. "Evolutionary Computation in Continuous Optimization and Machine Learning." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35674.
Ouyang, Hua. "Optimal stochastic and distributed algorithms for machine learning." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49091.
Prueller, Hans. "Distributed online machine learning for mobile care systems." Thesis, De Montfort University, 2014. http://hdl.handle.net/2086/10875.
Konečný, Jakub. "Stochastic, distributed and federated optimization for machine learning." Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/31478.
Texte intégralLivres sur le sujet "Continuous and distributed machine learning"
Weiss, Gerhard. Distributed machine learning. Sankt Augustin : Infix, 1995.
Trouver le texte intégralTestas, Abdelaziz. Distributed Machine Learning with PySpark. Berkeley, CA : Apress, 2023. http://dx.doi.org/10.1007/978-1-4842-9751-3.
Texte intégralAmini, M. Hadi, dir. Distributed Machine Learning and Computing. Cham : Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-57567-9.
Texte intégralJiang, Jiawei, Bin Cui et Ce Zhang. Distributed Machine Learning and Gradient Optimization. Singapore : Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-3420-8.
Texte intégralJoshi, Gauri. Optimization Algorithms for Distributed Machine Learning. Cham : Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-19067-4.
Texte intégralSahoo, Jyoti Prakash, Asis Kumar Tripathy, Manoranjan Mohanty, Kuan-Ching Li et Ajit Kumar Nayak, dir. Advances in Distributed Computing and Machine Learning. Singapore : Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-4807-6.
Texte intégralRout, Rashmi Ranjan, Soumya Kanti Ghosh, Prasanta K. Jana, Asis Kumar Tripathy, Jyoti Prakash Sahoo et Kuan-Ching Li, dir. Advances in Distributed Computing and Machine Learning. Singapore : Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-1018-0.
Texte intégralTripathy, Asis Kumar, Mahasweta Sarkar, Jyoti Prakash Sahoo, Kuan-Ching Li et Suchismita Chinara, dir. Advances in Distributed Computing and Machine Learning. Singapore : Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-15-4218-3.
Texte intégralNanda, Umakanta, Asis Kumar Tripathy, Jyoti Prakash Sahoo, Mahasweta Sarkar et Kuan-Ching Li, dir. Advances in Distributed Computing and Machine Learning. Singapore : Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-1841-2.
Texte intégralChinara, Suchismita, Asis Kumar Tripathy, Kuan-Ching Li, Jyoti Prakash Sahoo et Alekha Kumar Mishra, dir. Advances in Distributed Computing and Machine Learning. Singapore : Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-1203-2.
Book chapters on the topic "Continuous and distributed machine learning"
Carter, Eric, and Matthew Hurst. "Continuous Delivery." In Agile Machine Learning, 59–69. Berkeley, CA: Apress, 2019. http://dx.doi.org/10.1007/978-1-4842-5107-2_3.
Yang, Qiang, Yang Liu, Yong Cheng, Yan Kang, Tianjian Chen and Han Yu. "Distributed Machine Learning." In Federated Learning, 33–48. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-031-01585-4_3.
Galakatos, Alex, Andrew Crotty and Tim Kraska. "Distributed Machine Learning." In Encyclopedia of Database Systems, 1–6. New York, NY: Springer New York, 2017. http://dx.doi.org/10.1007/978-1-4899-7993-3_80647-1.
Galakatos, Alex, Andrew Crotty and Tim Kraska. "Distributed Machine Learning." In Encyclopedia of Database Systems, 1196–201. New York, NY: Springer New York, 2018. http://dx.doi.org/10.1007/978-1-4614-8265-9_80647.
Shultz, Thomas R., Scott E. Fahlman, Susan Craw, Periklis Andritsos, Panayiotis Tsaparas, Ricardo Silva, Chris Drummond et al. "Continuous Attribute." In Encyclopedia of Machine Learning, 226. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_172.
Li, Jin, Ping Li, Zheli Liu, Xiaofeng Chen and Tong Li. "Secure Distributed Learning." In Privacy-Preserving Machine Learning, 47–56. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-9139-3_4.
Ducoulombier, Antoine, and Michèle Sebag. "Continuous mimetic evolution." In Machine Learning: ECML-98, 334–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0026704.
Cleophas, Ton J., and Aeilko H. Zwinderman. "Continuous Sequential Techniques." In Machine Learning in Medicine, 187–94. Dordrecht: Springer Netherlands, 2013. http://dx.doi.org/10.1007/978-94-007-6886-4_18.
Chen, Zhiyuan, and Bing Liu. "Continuous Knowledge Learning in Chatbots." In Lifelong Machine Learning, 131–38. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-031-01581-6_8.
Liu, Mark. "Q-Learning with Continuous States." In Machine Learning, Animated, 285–300. Boca Raton: Chapman and Hall/CRC, 2023. http://dx.doi.org/10.1201/b23383-15.
Conference papers on the topic "Continuous and distributed machine learning"
Belcastro, Loris, Fabrizio Marozzo, Aleandro Presta and Domenico Talia. "A Spark-based Task Allocation Solution for Machine Learning in the Edge-Cloud Continuum." In 2024 20th International Conference on Distributed Computing in Smart Systems and the Internet of Things (DCOSS-IoT), 576–82. IEEE, 2024. http://dx.doi.org/10.1109/dcoss-iot61029.2024.00090.
Barros, Claudio D. T., Daniel N. R. da Silva and Fabio A. M. Porto. "Machine Learning on Graph-Structured Data." In Anais Estendidos do Simpósio Brasileiro de Banco de Dados. Sociedade Brasileira de Computação - SBC, 2021. http://dx.doi.org/10.5753/sbbd_estendido.2021.18179.
Chepurnov, A., and N. Ershov. "APPLICATION OF MACHINE LEARNING METHODS FOR CROSS-CLASSIFICATION OF ALGORITHMS AND PROBLEMS OF MULTIVARIATE CONTINUOUS OPTIMIZATION." In 9th International Conference "Distributed Computing and Grid Technologies in Science and Education". Crossref, 2021. http://dx.doi.org/10.54546/mlit.2021.67.50.001.
Sartzetakis, Ippokratis, Polyzois Soumplis, Panagiotis Pantazopoulos, Konstantinos V. Katsaros, Vasilis Sourlas and Emmanouel Manos Varvarigos. "Resource Allocation for Distributed Machine Learning at the Edge-Cloud Continuum." In ICC 2022 - IEEE International Conference on Communications. IEEE, 2022. http://dx.doi.org/10.1109/icc45855.2022.9838647.
Tiezzi, Matteo, Simone Marullo, Lapo Faggi, Enrico Meloni, Alessandro Betti and Stefano Melacci. "Stochastic Coherence Over Attention Trajectory For Continuous Learning In Video Streams." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/483.
Gupta, Sujasha, Srivatsava Krishnan and Vishnubaba Sundaresan. "Structural Health Monitoring of Composite Structures via Machine Learning of Mechanoluminescence." In ASME 2019 Conference on Smart Materials, Adaptive Structures and Intelligent Systems. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/smasis2019-5697.
Mendoza, Alberto, Çağrı Cerrahoğlu, Alessandro Delfino and Martin Sundin. "Signal Processing and Machine Learning for Effective Integration of Distributed Fiber Optic Sensing Data in Production Petrophysics." In 2022 SPWLA 63rd Annual Symposium. Society of Petrophysicists and Well Log Analysts, 2022. http://dx.doi.org/10.30632/spwla-2022-0016.
Sadigov, Teymur, Cagri Cerrahoglu, James Ramsay, Laurence Burchell, Sean Cavalero, Thomas Watson, Pradyumna Thiruvenkatanathan and Martin Sundin. "Real-Time Water Injection Monitoring with Distributed Fiber Optics Using Physics-Informed Machine Learning." In Offshore Technology Conference. OTC, 2021. http://dx.doi.org/10.4043/30982-ms.
"Session details: 1st Workshop on Distributed Machine Learning for the Intelligent Computing Continuum (DML-ICC)." In UCC '21: 2021 IEEE/ACM 14th International Conference on Utility and Cloud Computing, edited by Luiz F. Bittencourt, Ian Foster and Filip De Turck. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3517186.
Gao, Hongchang, Hanzi Xu and Slobodan Vucetic. "Sample Efficient Decentralized Stochastic Frank-Wolfe Methods for Continuous DR-Submodular Maximization." In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/482.
Reports by organizations on the topic "Continuous and distributed machine learning"
Shead, Timothy, Jonathan Berry, Cynthia Phillips and Jared Saia. Information-Theoretically Secure Distributed Machine Learning. Office of Scientific and Technical Information (OSTI), November 2019. http://dx.doi.org/10.2172/1763277.
Lee, Ying-Ying, and Kyle Colangelo. Double debiased machine learning nonparametric inference with continuous treatments. The IFS, October 2019. http://dx.doi.org/10.1920/wp.cem.2019.5419.
Lee, Ying-Ying, and Kyle Colangelo. Double debiased machine learning nonparametric inference with continuous treatments. The IFS, December 2019. http://dx.doi.org/10.1920/wp.cem.2019.7219.
Huang, Amy, Katelyn Barnes, Joseph Bearer, Evan Chrisinger and Christopher Stone. Integrating Distributed-Memory Machine Learning into Large-Scale HPC Simulations. Office of Scientific and Technical Information (OSTI), May 2018. http://dx.doi.org/10.2172/1460078.
Varastehpour, Soheil, Hamid Sharifzadeh and Iman Ardekani. A Comprehensive Review of Deep Learning Algorithms. Unitec ePress, 2021. http://dx.doi.org/10.34074/ocds.092.
Liu, Xiaopei, Dan Liu and Cong’e Tan. Gut microbiome-based machine learning for diagnostic prediction of liver fibrosis and cirrhosis: a systematic review and meta-analysis. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, May 2022. http://dx.doi.org/10.37766/inplasy2022.5.0133.
Choquette, Gary. PR-000-16209-WEB Data Management Best Practices Learned from CEPM. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), April 2019. http://dx.doi.org/10.55274/r0011568.
Visser, R., H. Kao, R. M. H. Dokht, A. B. Mahani and S. Venables. A comprehensive earthquake catalogue for northeastern British Columbia: the northern Montney trend from 2017 to 2020 and the Kiskatinaw Seismic Monitoring and Mitigation Area from 2019 to 2020. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/329078.
Harris, L. B., P. Adiban and E. Gloaguen. The role of enigmatic deep crustal and upper mantle structures on Au and magmatic Ni-Cu-PGE-Cr mineralization in the Superior Province. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/328984.