Academic literature on the topic 'Federated learning'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Federated learning.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Federated learning"
Oktian, Yustus Eko, Brian Stanley, and Sang-Gon Lee. "Building Trusted Federated Learning on Blockchain." Symmetry 14, no. 7 (July 8, 2022): 1407. http://dx.doi.org/10.3390/sym14071407.
Li, Yanbin, Yue Li, Huanliang Xu, and Shougang Ren. "An Adaptive Communication-Efficient Federated Learning to Resist Gradient-Based Reconstruction Attacks." Security and Communication Networks 2021 (April 22, 2021): 1–16. http://dx.doi.org/10.1155/2021/9919030.
Bektemyssova, G. U., G. S. Bakirova, Sh G. Yermukhanbetova, A. Shyntore, D. B. Umutkulov, and Zh S. Mangysheva. "Analysis of the relevance and prospects of application of federate training." Bulletin of the National Engineering Academy of the Republic of Kazakhstan 92, no. 2 (June 30, 2024): 56–65. http://dx.doi.org/10.47533/2024.1606-146x.26.
Shkurti, Lamir, and Mennan Selimi. "AdaptiveMesh: Adaptive Federate Learning for Resource-Constrained Wireless Environments." International Journal of Online and Biomedical Engineering (iJOE) 20, no. 14 (November 14, 2024): 22–37. http://dx.doi.org/10.3991/ijoe.v20i14.50559.
Kholod, Ivan, Evgeny Yanaki, Dmitry Fomichev, Evgeniy Shalugin, Evgenia Novikova, Evgeny Filippov, and Mats Nordlund. "Open-Source Federated Learning Frameworks for IoT: A Comparative Review and Analysis." Sensors 21, no. 1 (December 29, 2020): 167. http://dx.doi.org/10.3390/s21010167.
Srinivas, C., S. Venkatramulu, V. Chandra Shekar Rao, B. Raghuram, K. Vinay Kumar, and Sreenivas Pratapagiri. "Decentralized Machine Learning based Energy Efficient Routing and Intrusion Detection in Unmanned Aerial Network (UAV)." International Journal on Recent and Innovation Trends in Computing and Communication 11, no. 6s (June 13, 2023): 517–27. http://dx.doi.org/10.17762/ijritcc.v11i6s.6960.
Tabaszewski, Maciej, Paweł Twardowski, Martyna Wiciak-Pikuła, Natalia Znojkiewicz, Agata Felusiak-Czyryca, and Jakub Czyżycki. "Machine Learning Approaches for Monitoring of Tool Wear during Grey Cast-Iron Turning." Materials 15, no. 12 (June 20, 2022): 4359. http://dx.doi.org/10.3390/ma15124359.
Launet, Laëtitia, Yuandou Wang, Adrián Colomer, Jorge Igual, Cristian Pulgarín-Ospina, Spiros Koulouzis, Riccardo Bianchi, et al. "Federating Medical Deep Learning Models from Private Jupyter Notebooks to Distributed Institutions." Applied Sciences 13, no. 2 (January 9, 2023): 919. http://dx.doi.org/10.3390/app13020919.
Parekh, Nisha Harish, and Vrushali Shinde. "Federated Learning: A Paradigm Shift in Collaborative Machine Learning." International Journal of Scientific Research in Engineering and Management 08, no. 11 (November 10, 2024): 1–6. http://dx.doi.org/10.55041/ijsrem38501.
Шубин, Б., Т. Максимюк, О. Яремко, Л. Фабрі, and Д. Мрозек. "МОДЕЛЬ ІНТЕГРАЦІЇ ФЕДЕРАТИВНОГО НАВЧАННЯ В МЕРЕЖІ МОБІЛЬНОГО ЗВ’ЯЗКУ 5-ГО ПОКОЛІННЯ" [A model for integrating federated learning into 5th-generation mobile networks]. Information and communication technologies, electronic engineering 2, no. 1 (August 2022): 26–35. http://dx.doi.org/10.23939/ictee2022.01.026.
Dissertations / Theses on the topic "Federated learning"
Eriksson, Henrik. "Federated Learning in Large Scale Networks : Exploring Hierarchical Federated Learning." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-292744.
Federated learning faces a challenge in handling data with a high degree of heterogeneity, and in some cases it can be unsuitable to use an approach where one and the same model is trained for use by all nodes in the network. Different approaches to this problem have been investigated, such as adapting the trained model to each node, or clustering the nodes in the network and training a separate model for each cluster, within which the data is less heterogeneous. This work studies the possibilities of improving the performance of the local models by exploiting the hierarchical arrangement that arises when the participating nodes in the network are grouped into clusters. Experiments are carried out with a Long Short-Term Memory network performing time-series forecasting to evaluate different approaches that exploit the hierarchical arrangement, comparing them with standard federated learning approaches. The experiments use a dataset collected by Ericsson AB, consisting of handovers from base stations in a European city. The hierarchical approaches showed no advantages over the standard two-level approaches.
Taiello, Riccardo. "Apprentissage automatique sécurisé pour l'analyse collaborative des données de santé à grande échelle" [Secure machine learning for large-scale collaborative analysis of health data]. Electronic Thesis or Diss., Université Côte d'Azur, 2024. http://www.theses.fr/2024COAZ4031.
This PhD thesis explores the integration of privacy preservation, medical imaging, and Federated Learning (FL) using advanced cryptographic methods. Within the context of medical image analysis, we develop a privacy-preserving image registration (PPIR) framework. This framework addresses the challenge of registering images confidentially, without revealing their contents. By extending classical registration paradigms, we incorporate cryptographic tools like secure multi-party computation and homomorphic encryption to perform these operations securely. These tools are vital as they prevent data leakage during processing. Given the challenges associated with the performance and scalability of cryptographic methods in high-dimensional data, we optimize our image registration operations using gradient approximations. Our focus extends to increasingly complex registration methods, such as rigid, affine, and non-linear approaches using cubic splines or diffeomorphisms, parameterized by time-varying velocity fields. We demonstrate how these sophisticated registration methods can integrate privacy-preserving mechanisms effectively across various tasks. Concurrently, the thesis addresses the challenge of stragglers in FL, emphasizing the role of Secure Aggregation (SA) in collaborative model training. We introduce "Eagle", a synchronous SA scheme designed to optimize participation by late-arriving devices, significantly enhancing computational and communication efficiencies. We also present "Owl", tailored for buffered asynchronous FL settings, consistently outperforming earlier solutions. Furthermore, in the realm of Buffered AsyncSA, we propose two novel approaches: "Buffalo" and "Buffalo+". "Buffalo" advances SA techniques for Buffered AsyncSA, while "Buffalo+" counters sophisticated attacks that traditional methods fail to detect, such as model replacement.
This solution leverages the properties of incremental hash functions and explores the sparsity in the quantization of local gradients from client models. Both Buffalo and Buffalo+ are validated theoretically and experimentally, demonstrating their effectiveness in a new cross-device FL task for medical devices. Finally, this thesis has devoted particular attention to the translation of privacy-preserving tools in real-world applications, notably through the FL open-source framework Fed-BioMed. Contributions concern the introduction of one of the first practical SA implementations specifically designed for cross-silo FL among hospitals, showcasing several practical use cases.
Mäenpää, Dylan. "Towards Peer-to-Peer Federated Learning: Algorithms and Comparisons to Centralized Federated Learning." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-176778.
Liang, Jiarong. "Federated Learning for Bioimage Classification." Thesis, Uppsala universitet, Institutionen för biologisk grundutbildning, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-420615.
Zhao, Qiwei. "Federated Learning with Heterogeneous Challenge." Thesis, The University of Sydney, 2022. https://hdl.handle.net/2123/27399.
Carlsson, Robert. "Privacy-Preserved Federated Learning: A survey of applicable machine learning algorithms in a federated environment." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-424383.
Dinh, The Canh. "Distributed Algorithms for Fast and Personalized Federated Learning." Thesis, The University of Sydney, 2023. https://hdl.handle.net/2123/30019.
Felix, Johannes Morsbach. "Hardened Model Aggregation for Federated Learning backed by Distributed Trust: Towards Decentralizing Federated Learning Using a Blockchain." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-423621.
Leconte, Louis. "Compression and federated learning: an approach to frugal machine learning." Electronic Thesis or Diss., Sorbonne université, 2024. http://www.theses.fr/2024SORUS107.
“Intelligent” devices and tools are gradually becoming the standard, as the implementation of algorithms based on artificial neural networks is experiencing widespread development. Neural networks consist of non-linear machine learning models that manipulate high-dimensional objects and obtain state-of-the-art performances in various areas, such as image recognition, speech recognition, natural language processing, and recommendation systems. However, training a neural network on a device with lower computing capacity can be challenging, as it can imply cutting back on memory, computing time or power. A natural approach to simplify this training is to use quantized neural networks, whose parameters and operations use efficient low-bit primitives. However, optimizing a function over a discrete set in high dimension is complex, and can still be prohibitively expensive in terms of computational power. For this reason, many modern applications use a network of devices to store individual data and share the computational load. A new approach, federated learning, considers a distributed environment: Data is stored on devices and a centralized server orchestrates the training process across multiple devices. In this thesis, we investigate different aspects of (stochastic) optimization with the goal of reducing energy costs for potentially very heterogeneous devices. The first two contributions of this work are dedicated to the case of quantized neural networks. Our first idea is based on an annealing strategy: we formulate the discrete optimization problem as a constrained optimization problem (where the size of the constraint is reduced over iterations). We then focus on a heuristic for training binary deep neural networks. In this particular framework, the parameters of the neural networks can only have two values. The rest of the thesis is about efficient federated learning.
Following our contributions developed for training quantized neural networks, we integrate them into a federated environment. Then, we propose a novel unbiased compression technique that can be used in any gradient-based distributed optimization framework. Our final contributions address the particular case of asynchronous federated learning, where devices have different computational speeds and/or access to bandwidth. We first propose a contribution that reweights the contributions of distributed devices. Then, in our final work, through a detailed queuing dynamics analysis, we propose a significant improvement to the complexity bounds provided in the literature on asynchronous federated learning. In summary, this thesis presents novel contributions to the field of quantized neural networks and federated learning by addressing critical challenges and providing innovative solutions for efficient and sustainable learning in a distributed and heterogeneous environment. Although the potential benefits are promising, especially in terms of energy savings, caution is needed as a rebound effect could occur.
Adapa, Supriya. "TensorFlow Federated Learning: Application to Decentralized Data." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021.
Books on the topic "Federated learning"
Yang, Qiang, Yang Liu, Yong Cheng, Yan Kang, Tianjian Chen, and Han Yu. Federated Learning. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-031-01585-4.
Ludwig, Heiko, and Nathalie Baracaldo, eds. Federated Learning. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-96896-0.
Yang, Qiang, Lixin Fan, and Han Yu, eds. Federated Learning. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-63076-8.
Jin, Yaochu, Hangyu Zhu, Jinjin Xu, and Yang Chen. Federated Learning. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-19-7083-2.
Uddin, M. Irfan, and Wali Khan Mashwani. Federated Learning. Boca Raton: CRC Press, 2024. http://dx.doi.org/10.1201/9781003466581.
Sahoo, Jayakrushna, Mariya Ouaissa, and Akarsh K. Nair. Federated Learning. New York: Apple Academic Press, 2024. http://dx.doi.org/10.1201/9781003497196.
Rehman, Muhammad Habib ur, and Mohamed Medhat Gaber, eds. Federated Learning Systems. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-70604-3.
Goebel, Randy, Han Yu, Boi Faltings, Lixin Fan, and Zehui Xiong, eds. Trustworthy Federated Learning. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-28996-5.
Razavi-Far, Roozbeh, Boyu Wang, Matthew E. Taylor, and Qiang Yang, eds. Federated and Transfer Learning. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-11748-0.
Krishnan, Saravanan, A. Jose Anand, R. Srinivasan, R. Kavitha, and S. Suresh. Handbook on Federated Learning. Boca Raton: CRC Press, 2023. http://dx.doi.org/10.1201/9781003384854.
Book chapters on the topic "Federated learning"
Rehman, Atiq Ur, Samir Brahim Belhaouari, Tanya Stanko, and Vladimir Gorovoy. "Divide to Federate Clustering Concept for Unsupervised Learning." In Proceedings of Seventh International Congress on Information and Communication Technology, 19–29. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-2397-5_3.
Huang, Hai, Wei Wu, Xin Tang, and Zhong Zhou. "Federate Migration in Grid-Based Virtual Wargame Collaborative Environment." In Technologies for E-Learning and Digital Entertainment, 606–15. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11736639_75.
Jin, Yaochu, Hangyu Zhu, Jinjin Xu, and Yang Chen. "Summary and Outlook." In Federated Learning, 213–15. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-7083-2_5.
Jin, Yaochu, Hangyu Zhu, Jinjin Xu, and Yang Chen. "Introduction." In Federated Learning, 1–92. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-7083-2_1.
Jin, Yaochu, Hangyu Zhu, Jinjin Xu, and Yang Chen. "Communication Efficient Federated Learning." In Federated Learning, 93–137. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-7083-2_2.
Jin, Yaochu, Hangyu Zhu, Jinjin Xu, and Yang Chen. "Secure Federated Learning." In Federated Learning, 165–212. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-7083-2_4.
Jin, Yaochu, Hangyu Zhu, Jinjin Xu, and Yang Chen. "Evolutionary Multi-objective Federated Learning." In Federated Learning, 139–64. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-7083-2_3.
Yang, Qiang, Yang Liu, Yong Cheng, Yan Kang, Tianjian Chen, and Han Yu. "Incentive Mechanism Design for Federated Learning." In Federated Learning, 95–105. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-031-01585-4_7.
Yang, Qiang, Yang Liu, Yong Cheng, Yan Kang, Tianjian Chen, and Han Yu. "Introduction." In Federated Learning, 1–15. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-031-01585-4_1.
Yang, Qiang, Yang Liu, Yong Cheng, Yan Kang, Tianjian Chen, and Han Yu. "Vertical Federated Learning." In Federated Learning, 69–81. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-031-01585-4_5.
Conference papers on the topic "Federated learning"
Seo, Seonguk, Jinkyu Kim, Geeho Kim, and Bohyung Han. "Relaxed Contrastive Learning for Federated Learning." In 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 12279–88. IEEE, 2024. http://dx.doi.org/10.1109/cvpr52733.2024.01167.
Albuquerque, R. A., L. P. Dias, Momo Ziazet, K. Vandikas, S. Ickin, B. Jaumard, C. Natalino, L. Wosinska, P. Monti, and E. Wong. "Asynchronous Federated Split Learning." In 2024 IEEE 8th International Conference on Fog and Edge Computing (ICFEC), 11–18. IEEE, 2024. http://dx.doi.org/10.1109/icfec61590.2024.00010.
Oh, Seungeun, Jihong Park, Praneeth Vepakomma, Sihun Baek, Ramesh Raskar, Mehdi Bennis, and Seong-Lyun Kim. "LocFedMix-SL: Localize, Federate, and Mix for Improved Scalability, Convergence, and Latency in Split Learning." In WWW '22: The ACM Web Conference 2022. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3485447.3512153.
Costa, Arthur N. F. Martins da, and Pedro Silva. "Computação, Saúde e Segurança: Explorando o Potencial da Aprendizagem Federada na Detecção de Arritmias Cardíacas" [Computing, Health, and Security: Exploring the Potential of Federated Learning for Cardiac Arrhythmia Detection]. In Escola Regional de Computação Aplicada à Saúde. Sociedade Brasileira de Computação - SBC, 2024. http://dx.doi.org/10.5753/ercas.2024.238587.
Liu, Gaoyang, Xiaoqiang Ma, Yang Yang, Chen Wang, and Jiangchuan Liu. "FedEraser: Enabling Efficient Client-Level Data Removal from Federated Learning Models." In 2021 IEEE/ACM 29th International Symposium on Quality of Service (IWQOS). IEEE, 2021. http://dx.doi.org/10.1109/iwqos52092.2021.9521274.
Dupuy, Christophe, Tanya G. Roosta, Leo Long, Clement Chung, Rahul Gupta, and Salman Avestimehr. "Learnings from Federated Learning in The Real World." In ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2022. http://dx.doi.org/10.1109/icassp43922.2022.9747113.
da Silva, Vinicios B., Renan R. de Oliveira, Antonio Oliveira-Jr, and Ronaldo M. da Costa. "Treinamento Federado Aplicado à Segmentação do Ventrículo Esquerdo" [Federated Training Applied to Left-Ventricle Segmentation]. In Escola Regional de Informática de Goiás. Sociedade Brasileira de Computação, 2023. http://dx.doi.org/10.5753/erigo.2023.237317.
Chen, Zhikun, Daofeng Li, Ming Zhao, Sihai Zhang, and Jinkang Zhu. "Semi-Federated Learning." In 2020 IEEE Wireless Communications and Networking Conference (WCNC). IEEE, 2020. http://dx.doi.org/10.1109/wcnc45663.2020.9120453.
Rizk, Elsa, Stefan Vlaski, and Ali H. Sayed. "Dynamic Federated Learning." In 2020 IEEE 21st International Workshop on Signal Processing Advances in Wireless Communications (SPAWC). IEEE, 2020. http://dx.doi.org/10.1109/spawc48557.2020.9154327.
Reports on the topic "Federated learning"
Shteyn, Anastasia, Konrad Kollnig, and Calum Inverarity. Federated learning: an introduction [report]. Open Data Institute, January 2023. http://dx.doi.org/10.61557/vnfu8593.
Wang, Yixuan. Federated Learning User Friendly Web App. Ames (Iowa): Iowa State University, August 2023. http://dx.doi.org/10.31274/cc-20240624-720.
Sokolovsky, Dmitry, Sergey Sokolov, and Alexey Rezaykin. e-learning course "Informatics". SIB-Expertise, January 2024. http://dx.doi.org/10.12731/er0785.29012024.
Ferraz, Claudio, Frederico Finan, and Diana Moreira. Corrupting Learning: Evidence from Missing Federal Education Funds in Brazil. Cambridge, MA: National Bureau of Economic Research, June 2012. http://dx.doi.org/10.3386/w18150.
Inman, Robert, and Daniel Rubinfeld. Federal Institutions and the Democratic Transition: Learning from South Africa. Cambridge, MA: National Bureau of Economic Research, January 2008. http://dx.doi.org/10.3386/w13733.
Eugenio, Evercita. Federated Learning and Differential Privacy: What might AI-Enhanced co-design of microelectronics learn?. Office of Scientific and Technical Information (OSTI), May 2022. http://dx.doi.org/10.2172/1868417.
Worley, Sean, Scott Palmer, and Nathan Woods. Building, Sustaining and Improving: Using Federal Funds for Summer Learning and Afterschool. Education Counsel, July 2022. http://dx.doi.org/10.59656/yd-os9931.001.
Davis, Allison Crean, John Hitchcock, Beth-Ann Tek, Holly Bozeman, Kristen Pugh, Clarissa McKithen, and Molly Hershey-Arista. A National Call to Action for Summer Learning: How Did States Respond? Westat, July 2023. http://dx.doi.org/10.59656/yd-os6574.001.
Badrinarayan, Aneesha, and Linda Darling-Hammond. Developing State Assessment Systems That Support Teaching and Learning: What Can the Federal Government Do? Learning Policy Institute, April 2023. http://dx.doi.org/10.54300/885.821.
Hart, Nick, Sara Stefanik, Christopher Murell, and Karol Olejniczak. Blueprints for Learning: A Synthesis of Federal Evidence-Building Plans Under the Evidence Act. Data Foundation, June 2024. http://dx.doi.org/10.15868/socialsector.43901.