Selected scientific literature on the topic "Continuous and distributed machine learning"
Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles
Consult the list of current articles, books, theses, conference proceedings, and other scholarly sources relevant to the topic "Continuous and distributed machine learning".
Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read its abstract online, when it is available in the metadata.
Journal articles on the topic "Continuous and distributed machine learning"
Stan, Ioan-Mihail, Siarhei Padolski, and Christopher Jon Lee. "Exploring the self-service model to visualize the results of the ATLAS Machine Learning analysis jobs in BigPanDA with Openshift OKD3". EPJ Web of Conferences 251 (2021): 02009. http://dx.doi.org/10.1051/epjconf/202125102009.
Yin, Zhongdong, Jingjing Tu, and Yonghai Xu. "Development of a Kernel Extreme Learning Machine Model for Capacity Selection of Distributed Generation Considering the Characteristics of Electric Vehicles". Applied Sciences 9, no. 12 (June 13, 2019): 2401. http://dx.doi.org/10.3390/app9122401.
Brophy, Eoin, Maarten De Vos, Geraldine Boylan, and Tomás Ward. "Estimation of Continuous Blood Pressure from PPG via a Federated Learning Approach". Sensors 21, no. 18 (September 21, 2021): 6311. http://dx.doi.org/10.3390/s21186311.
Vrachimis, Andreas, Stella Gkegka, and Kostas Kolomvatsos. "Resilient edge machine learning in smart city environments". Journal of Smart Cities and Society 2, no. 1 (July 7, 2023): 3–24. http://dx.doi.org/10.3233/scs-230005.
Musa, M. O., and E. E. Odokuma. "A framework for the detection of distributed denial of service attacks on network logs using ML and DL classifiers". Scientia Africana 22, no. 3 (January 25, 2024): 153–64. http://dx.doi.org/10.4314/sa.v22i3.14.
Oliveri, Giorgio, Lucas C. van Laake, Cesare Carissimo, Clara Miette, and Johannes T. B. Overvelde. "Continuous learning of emergent behavior in robotic matter". Proceedings of the National Academy of Sciences 118, no. 21 (May 10, 2021): e2017015118. http://dx.doi.org/10.1073/pnas.2017015118.
Kodaira, Daisuke, Kazuki Tsukazaki, Taiki Kure, and Junji Kondoh. "Improving Forecast Reliability for Geographically Distributed Photovoltaic Generations". Energies 14, no. 21 (November 4, 2021): 7340. http://dx.doi.org/10.3390/en14217340.
Hua, Xia, and Lei Han. "Design and Practical Application of Sports Visualization Platform Based on Tracking Algorithm". Computational Intelligence and Neuroscience 2022 (August 16, 2022): 1–9. http://dx.doi.org/10.1155/2022/4744939.
Rustam, Furqan, Muhammad Faheem Mushtaq, Ameer Hamza, Muhammad Shoaib Farooq, Anca Delia Jurcut, and Imran Ashraf. "Denial of Service Attack Classification Using Machine Learning with Multi-Features". Electronics 11, no. 22 (November 20, 2022): 3817. http://dx.doi.org/10.3390/electronics11223817.
Huang, Leqi. "Problems, solutions and improvements on federated learning model". Applied and Computational Engineering 22, no. 1 (October 23, 2023): 183–86. http://dx.doi.org/10.54254/2755-2721/22/20231215.
Theses / dissertations on the topic "Continuous and distributed machine learning"
Armond, Kenneth C. Jr. "Distributed Support Vector Machine Learning". ScholarWorks@UNO, 2008. http://scholarworks.uno.edu/td/711.
Addanki, Ravichandra. "Learning generalizable device placement algorithms for distributed machine learning". Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122746.
Texto completo da fonteCataloged from PDF version of thesis.
Includes bibliographical references (pages 47-50).
We present Placeto, a reinforcement learning (RL) approach to efficiently find device placements for distributed neural network training. Unlike prior approaches that only find a device placement for a specific computation graph, Placeto can learn generalizable device placement policies that can be applied to any graph. We propose two key ideas in our approach: (1) we represent the policy as performing iterative placement improvements, rather than outputting a placement in one shot; (2) we use graph embeddings to capture relevant information about the structure of the computation graph, without relying on node labels for indexing. These ideas allow Placeto to train efficiently and generalize to unseen graphs. Our experiments show that Placeto requires up to 6.1 x fewer training steps to find placements that are on par with or better than the best placements found by prior approaches. Moreover, Placeto is able to learn a generalizable placement policy for any given family of graphs, which can then be used without any retraining to predict optimized placements for unseen graphs from the same family. This eliminates the large overhead incurred by prior RL approaches whose lack of generalizability necessitates re-training from scratch every time a new graph is to be placed.
S.M. thesis by Ravichandra Addanki, Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science.
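The abstract above describes placement as a sequence of per-node improvement steps rather than a one-shot assignment. The following minimal Python sketch illustrates only that iterative structure; it is not the Placeto implementation, the toy graph and cost model are invented for illustration, and the greedy per-node rule stands in for the learned reinforcement learning policy over graph embeddings:

    import random

    def estimated_runtime(graph, placement):
        # Hypothetical cost model: count cross-device edges (communication)
        # and add the load of the most heavily loaded device.
        comm = sum(1 for u, v in graph["edges"] if placement[u] != placement[v])
        load = {}
        for node, device in placement.items():
            load[device] = load.get(device, 0) + graph["cost"][node]
        return comm + max(load.values())

    def improve_placement(graph, devices, sweeps=3):
        # Start from a random placement and refine it one node at a time,
        # mirroring the iterative-improvement formulation of the policy.
        placement = {node: random.choice(devices) for node in graph["cost"]}
        for _ in range(sweeps):
            for node in graph["cost"]:
                placement[node] = min(
                    devices,
                    key=lambda d: estimated_runtime(graph, {**placement, node: d}),
                )
        return placement

    toy_graph = {"cost": {"a": 2, "b": 1, "c": 3}, "edges": [("a", "b"), ("b", "c")]}
    print(improve_placement(toy_graph, devices=["device:0", "device:1"]))

In Placeto, the per-node decision comes from a trained policy acting on graph embeddings, which is what allows it to generalize to unseen graphs without retraining.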
Johansson, Samuel, and Karol Wojtulewicz. "Machine learning algorithms in a distributed context". Thesis, Linköpings universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-148920.
Karimi, Ahmad Maroof. "Distributed Machine Learning Based Intrusion Detection System". University of Toledo / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1470401374.
Zam, Anton. "Evaluating Distributed Machine Learning using IoT Devices". Thesis, Mittuniversitetet, Institutionen för informationssystem och –teknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-42388.
The Internet of Things grows every year as new devices are added. Although some devices are in continuous use, a large share of them are mostly idle, sitting on untapped processing power that could be used for machine learning computations. Many methods exist for combining the processing power of multiple devices to carry out machine learning tasks; these are often called distributed machine learning methods. The main focus of this thesis is to evaluate such distributed machine learning methods, to see whether they can be implemented on IoT devices and, if so, to measure how efficient and scalable they are. The method chosen for implementation was "MultiWorkerMirroredStrategy", and it was evaluated by comparing the training time, training accuracy, and evaluation accuracy of 2, 3, and 4 Raspberry Pis against a non-distributed machine learning setup on a single Raspberry Pi. The results showed that although the available computational power increased with every added device, the training time also increased while the other measurements stayed the same. After the results were analyzed and discussed, the conclusion was that the communication overhead between devices was too high, making the method inefficient and unable to scale without some form of optimization.
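For context, the strategy evaluated in this thesis corresponds to TensorFlow's tf.distribute.MultiWorkerMirroredStrategy (assuming the standard TensorFlow API; the worker hostnames, port, and toy model below are placeholders). Each Raspberry Pi would run the same script with its own task index in TF_CONFIG, and gradients are synchronized across workers after every step:

    import json
    import os

    import numpy as np
    import tensorflow as tf

    # Every worker runs this same script; only the task index differs per device.
    os.environ["TF_CONFIG"] = json.dumps({
        "cluster": {"worker": ["pi-node-0:12345", "pi-node-1:12345"]},  # hypothetical hosts
        "task": {"type": "worker", "index": 0},
    })

    strategy = tf.distribute.MultiWorkerMirroredStrategy()

    # Variables must be created inside the strategy scope so that updates
    # are all-reduced across all participating workers.
    with strategy.scope():
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
            tf.keras.layers.Dense(1),
        ])
        model.compile(optimizer="adam", loss="mse")

    # Toy regression data stands in for the dataset used in the thesis.
    x = np.random.rand(256, 10).astype("float32")
    y = np.random.rand(256, 1).astype("float32")
    model.fit(x, y, epochs=2, batch_size=32)

The per-step synchronization is also where the communication overhead discussed in the abstract is incurred, and it grows with the number of workers.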
Thompson, Simon Giles. "Distributed boosting algorithms". Thesis, University of Portsmouth, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285529.
Dahlberg, Leslie. "Evolutionary Computation in Continuous Optimization and Machine Learning". Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35674.
Ouyang, Hua. "Optimal stochastic and distributed algorithms for machine learning". Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49091.
Prueller, Hans. "Distributed online machine learning for mobile care systems". Thesis, De Montfort University, 2014. http://hdl.handle.net/2086/10875.
Konečný, Jakub. "Stochastic, distributed and federated optimization for machine learning". Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/31478.
Books on the topic "Continuous and distributed machine learning"
Weiss, Gerhard. Distributed machine learning. Sankt Augustin: Infix, 1995.
Testas, Abdelaziz. Distributed Machine Learning with PySpark. Berkeley, CA: Apress, 2023. http://dx.doi.org/10.1007/978-1-4842-9751-3.
Amini, M. Hadi, ed. Distributed Machine Learning and Computing. Cham: Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-57567-9.
Jiang, Jiawei, Bin Cui, and Ce Zhang. Distributed Machine Learning and Gradient Optimization. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-3420-8.
Joshi, Gauri. Optimization Algorithms for Distributed Machine Learning. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-19067-4.
Sahoo, Jyoti Prakash, Asis Kumar Tripathy, Manoranjan Mohanty, Kuan-Ching Li, and Ajit Kumar Nayak, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-4807-6.
Rout, Rashmi Ranjan, Soumya Kanti Ghosh, Prasanta K. Jana, Asis Kumar Tripathy, Jyoti Prakash Sahoo, and Kuan-Ching Li, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-1018-0.
Tripathy, Asis Kumar, Mahasweta Sarkar, Jyoti Prakash Sahoo, Kuan-Ching Li, and Suchismita Chinara, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-15-4218-3.
Nanda, Umakanta, Asis Kumar Tripathy, Jyoti Prakash Sahoo, Mahasweta Sarkar, and Kuan-Ching Li, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-1841-2.
Chinara, Suchismita, Asis Kumar Tripathy, Kuan-Ching Li, Jyoti Prakash Sahoo, and Alekha Kumar Mishra, eds. Advances in Distributed Computing and Machine Learning. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-1203-2.
Texto completo da fonteCapítulos de livros sobre o assunto "Continuous and distributed machine learning"
Carter, Eric, and Matthew Hurst. "Continuous Delivery". In Agile Machine Learning, 59–69. Berkeley, CA: Apress, 2019. http://dx.doi.org/10.1007/978-1-4842-5107-2_3.
Yang, Qiang, Yang Liu, Yong Cheng, Yan Kang, Tianjian Chen, and Han Yu. "Distributed Machine Learning". In Federated Learning, 33–48. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-031-01585-4_3.
Galakatos, Alex, Andrew Crotty, and Tim Kraska. "Distributed Machine Learning". In Encyclopedia of Database Systems, 1–6. New York, NY: Springer New York, 2017. http://dx.doi.org/10.1007/978-1-4899-7993-3_80647-1.
Galakatos, Alex, Andrew Crotty, and Tim Kraska. "Distributed Machine Learning". In Encyclopedia of Database Systems, 1196–201. New York, NY: Springer New York, 2018. http://dx.doi.org/10.1007/978-1-4614-8265-9_80647.
Shultz, Thomas R., Scott E. Fahlman, Susan Craw, Periklis Andritsos, Panayiotis Tsaparas, Ricardo Silva, Chris Drummond et al. "Continuous Attribute". In Encyclopedia of Machine Learning, 226. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_172.
Li, Jin, Ping Li, Zheli Liu, Xiaofeng Chen, and Tong Li. "Secure Distributed Learning". In Privacy-Preserving Machine Learning, 47–56. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-9139-3_4.
Ducoulombier, Antoine, and Michèle Sebag. "Continuous mimetic evolution". In Machine Learning: ECML-98, 334–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0026704.
Cleophas, Ton J., and Aeilko H. Zwinderman. "Continuous Sequential Techniques". In Machine Learning in Medicine, 187–94. Dordrecht: Springer Netherlands, 2013. http://dx.doi.org/10.1007/978-94-007-6886-4_18.
Chen, Zhiyuan, and Bing Liu. "Continuous Knowledge Learning in Chatbots". In Lifelong Machine Learning, 131–38. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-031-01581-6_8.
Liu, Mark. "Q-Learning with Continuous States". In Machine Learning, Animated, 285–300. Boca Raton: Chapman and Hall/CRC, 2023. http://dx.doi.org/10.1201/b23383-15.
Conference papers on the topic "Continuous and distributed machine learning"
Belcastro, Loris, Fabrizio Marozzo, Aleandro Presta, and Domenico Talia. "A Spark-based Task Allocation Solution for Machine Learning in the Edge-Cloud Continuum". In 2024 20th International Conference on Distributed Computing in Smart Systems and the Internet of Things (DCOSS-IoT), 576–82. IEEE, 2024. http://dx.doi.org/10.1109/dcoss-iot61029.2024.00090.
Barros, Claudio D. T., Daniel N. R. da Silva, and Fabio A. M. Porto. "Machine Learning on Graph-Structured Data". In Anais Estendidos do Simpósio Brasileiro de Banco de Dados. Sociedade Brasileira de Computação - SBC, 2021. http://dx.doi.org/10.5753/sbbd_estendido.2021.18179.
Chepurnov, A., and N. Ershov. "APPLICATION OF MACHINE LEARNING METHODS FOR CROSS-CLASSIFICATION OF ALGORITHMS AND PROBLEMS OF MULTIVARIATE CONTINUOUS OPTIMIZATION". In 9th International Conference "Distributed Computing and Grid Technologies in Science and Education". Crossref, 2021. http://dx.doi.org/10.54546/mlit.2021.67.50.001.
Sartzetakis, Ippokratis, Polyzois Soumplis, Panagiotis Pantazopoulos, Konstantinos V. Katsaros, Vasilis Sourlas, and Emmanouel Manos Varvarigos. "Resource Allocation for Distributed Machine Learning at the Edge-Cloud Continuum". In ICC 2022 - IEEE International Conference on Communications. IEEE, 2022. http://dx.doi.org/10.1109/icc45855.2022.9838647.
Tiezzi, Matteo, Simone Marullo, Lapo Faggi, Enrico Meloni, Alessandro Betti, and Stefano Melacci. "Stochastic Coherence Over Attention Trajectory For Continuous Learning In Video Streams". In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/483.
Gupta, Sujasha, Srivatsava Krishnan, and Vishnubaba Sundaresan. "Structural Health Monitoring of Composite Structures via Machine Learning of Mechanoluminescence". In ASME 2019 Conference on Smart Materials, Adaptive Structures and Intelligent Systems. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/smasis2019-5697.
Mendoza, Alberto, Çağrı Cerrahoğlu, Alessandro Delfino, and Martin Sundin. "Signal Processing and Machine Learning for Effective Integration of Distributed Fiber Optic Sensing Data in Production Petrophysics". In 2022 SPWLA 63rd Annual Symposium. Society of Petrophysicists and Well Log Analysts, 2022. http://dx.doi.org/10.30632/spwla-2022-0016.
Sadigov, Teymur, Cagri Cerrahoglu, James Ramsay, Laurence Burchell, Sean Cavalero, Thomas Watson, Pradyumna Thiruvenkatanathan, and Martin Sundin. "Real-Time Water Injection Monitoring with Distributed Fiber Optics Using Physics-Informed Machine Learning". In Offshore Technology Conference. OTC, 2021. http://dx.doi.org/10.4043/30982-ms.
"Session details: 1st Workshop on Distributed Machine Learning for the Intelligent Computing Continuum (DML-ICC)". In UCC '21: 2021 IEEE/ACM 14th International Conference on Utility and Cloud Computing, edited by Luiz F. Bittencourt, Ian Foster, and Filip De Turck. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3517186.
Gao, Hongchang, Hanzi Xu, and Slobodan Vucetic. "Sample Efficient Decentralized Stochastic Frank-Wolfe Methods for Continuous DR-Submodular Maximization". In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/482.
Reports of organizations on the topic "Continuous and distributed machine learning"
Shead, Timothy, Jonathan Berry, Cynthia Phillips, and Jared Saia. Information-Theoretically Secure Distributed Machine Learning. Office of Scientific and Technical Information (OSTI), November 2019. http://dx.doi.org/10.2172/1763277.
Lee, Ying-Ying, and Kyle Colangelo. Double debiased machine learning nonparametric inference with continuous treatments. The IFS, October 2019. http://dx.doi.org/10.1920/wp.cem.2019.5419.
Lee, Ying-Ying, and Kyle Colangelo. Double debiased machine learning nonparametric inference with continuous treatments. The IFS, December 2019. http://dx.doi.org/10.1920/wp.cem.2019.7219.
Huang, Amy, Katelyn Barnes, Joseph Bearer, Evan Chrisinger, and Christopher Stone. Integrating Distributed-Memory Machine Learning into Large-Scale HPC Simulations. Office of Scientific and Technical Information (OSTI), May 2018. http://dx.doi.org/10.2172/1460078.
Varastehpour, Soheil, Hamid Sharifzadeh, and Iman Ardekani. A Comprehensive Review of Deep Learning Algorithms. Unitec ePress, 2021. http://dx.doi.org/10.34074/ocds.092.
Liu, Xiaopei, Dan Liu, and Cong’e Tan. Gut microbiome-based machine learning for diagnostic prediction of liver fibrosis and cirrhosis: a systematic review and meta-analysis. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, May 2022. http://dx.doi.org/10.37766/inplasy2022.5.0133.
Choquette, Gary. PR-000-16209-WEB Data Management Best Practices Learned from CEPM. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), April 2019. http://dx.doi.org/10.55274/r0011568.
Visser, R., H. Kao, R. M. H. Dokht, A. B. Mahani, and S. Venables. A comprehensive earthquake catalogue for northeastern British Columbia: the northern Montney trend from 2017 to 2020 and the Kiskatinaw Seismic Monitoring and Mitigation Area from 2019 to 2020. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/329078.
Harris, L. B., P. Adiban, and E. Gloaguen. The role of enigmatic deep crustal and upper mantle structures on Au and magmatic Ni-Cu-PGE-Cr mineralization in the Superior Province. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/328984.