Academic literature on the topic 'Federated averaging'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Federated averaging.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Federated averaging"
Dhada, Maharshi, Amit Kumar Jain, and Ajith Kumar Parlikad. "Empirical Convergence Analysis of Federated Averaging for Failure Prognosis." IFAC-PapersOnLine 53, no. 3 (2020): 360–65. http://dx.doi.org/10.1016/j.ifacol.2020.11.058.
Balakrishnan, Ravikumar, Mustafa Akdeniz, Sagar Dhakal, Arjun Anand, Ariela Zeira, and Nageen Himayat. "Resource Management and Model Personalization for Federated Learning over Wireless Edge Networks." Journal of Sensor and Actuator Networks 10, no. 1 (February 23, 2021): 17. http://dx.doi.org/10.3390/jsan10010017.
Xiao, Peng, Samuel Cheng, Vladimir Stankovic, and Dejan Vukobratovic. "Averaging Is Probably Not the Optimum Way of Aggregating Parameters in Federated Learning." Entropy 22, no. 3 (March 11, 2020): 314. http://dx.doi.org/10.3390/e22030314.
Wu, Xing, Zhaowang Liang, and Jianjia Wang. "FedMed: A Federated Learning Framework for Language Modeling." Sensors 20, no. 14 (July 21, 2020): 4048. http://dx.doi.org/10.3390/s20144048.
Shao, Rulin, Hongyu He, Ziwei Chen, Hui Liu, and Dianbo Liu. "Stochastic Channel-Based Federated Learning With Neural Network Pruning for Medical Data Privacy Preservation: Model Development and Experimental Validation." JMIR Formative Research 4, no. 12 (December 22, 2020): e17265. http://dx.doi.org/10.2196/17265.
Asad, Muhammad, Ahmed Moustafa, and Takayuki Ito. "FedOpt: Towards Communication Efficiency and Privacy Preservation in Federated Learning." Applied Sciences 10, no. 8 (April 21, 2020): 2864. http://dx.doi.org/10.3390/app10082864.
Munoz-Martin, Joan Francesc, David Llaveria, Christoph Herbert, Miriam Pablos, Hyuk Park, and Adriano Camps. "Soil Moisture Estimation Synergy Using GNSS-R and L-Band Microwave Radiometry Data from FSSCat/FMPL-2." Remote Sensing 13, no. 5 (March 5, 2021): 994. http://dx.doi.org/10.3390/rs13050994.
Casado, Fernando E., Dylan Lema, Marcos F. Criado, Roberto Iglesias, Carlos V. Regueiro, and Senén Barro. "Concept drift detection and adaptation for federated and continual learning." Multimedia Tools and Applications, July 17, 2021. http://dx.doi.org/10.1007/s11042-021-11219-x.
Imteaj, Ahmed, and M. Hadi Amini. "FedPARL: Client Activity and Resource-Oriented Lightweight Federated Learning Model for Resource-Constrained Heterogeneous IoT Environment." Frontiers in Communications and Networks 2 (April 29, 2021). http://dx.doi.org/10.3389/frcmn.2021.657653.
Full textDissertations / Theses on the topic "Federated averaging"
Backstad, Sebastian. "Federated Averaging Deep Q-Network: A Distributed Deep Reinforcement Learning Algorithm." Thesis, Umeå universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-149637.
Langelaar, Johannes, and Mattsson Adam Strömme. "Federated Neural Collaborative Filtering for privacy-preserving recommender systems." Thesis, Uppsala universitet, Avdelningen för systemteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-446913.
Reddy, Sashlin. "A comparative analysis of dynamic averaging techniques in federated learning." Thesis, 2020. https://hdl.handle.net/10539/31059.
Full textDue to the advancements in mobile technology and user privacy concerns, federated learning has emerged as a popular machine learning (ML) method to push training of statistical models to the edge. Federated learning involves training a shared model under the coordination of a centralized server from a federation of participating clients. In practice federated learning methods have to overcome large network delays and bandwidth limits. To overcome the communication bottlenecks, recent works propose methods to reduce the communication frequency that have negligible impact on model accuracy also defined as model performance. Naive methods reduce the number of communication rounds in order to reduce the communication frequency. However, it is possible to invest communication more efficiently through dynamic communication protocols. This is deemed as dynamic averaging. Few have addressed such protocols. More so, few works base this dynamic averaging protocol on the diversity of the data and the loss. In this work, we introduce dynamic averaging frameworks based on the diversity of the data as well as the loss encountered by each client. This overcomes the assumption that each client participates equally and addresses the properties of federated learning. Results show that the overall communication overhead is reduced with negligible decrease in accuracy
Book chapters on the topic "Federated averaging"
Remedios, Samuel W., John A. Butman, Bennett A. Landman, and Dzung L. Pham. "Federated Gradient Averaging for Multi-Site Training with Momentum-Based Optimizers." In Domain Adaptation and Representation Transfer, and Distributed and Collaborative Learning, 170–80. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60548-3_17.
Zhao, Fengpan, Yan Huang, Saide Zhu, Venkata Malladi, and Yubao Wu. "A Weighted Federated Averaging Framework to Reduce the Negative Influence from the Dishonest Users." In Security, Privacy, and Anonymity in Computation, Communication, and Storage, 241–50. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-68851-6_17.
Full textConference papers on the topic "Federated averaging"
Wang, Zheng, Xiaoliang Fan, Jianzhong Qi, Chenglu Wen, Cheng Wang, and Rongshan Yu. "Federated Learning with Fair Averaging." In Thirtieth International Joint Conference on Artificial Intelligence (IJCAI-21). California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/223.
Ro, Jae, Mingqing Chen, Rajiv Mathews, Mehryar Mohri, and Ananda Theertha Suresh. "Communication-Efficient Agnostic Federated Averaging." In Interspeech 2021. ISCA: ISCA, 2021. http://dx.doi.org/10.21437/interspeech.2021-153.
Li, Yiwei, Tsung-Hui Chang, and Chong-Yung Chi. "Secure Federated Averaging Algorithm with Differential Privacy." In 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP). IEEE, 2020. http://dx.doi.org/10.1109/mlsp49062.2020.9231531.
Desai, Nirmit, and Dinesh Verma. "Properties of federated averaging on highly distributed data." In Artificial Intelligence and Machine Learning for Multi-Domain Operations Applications, edited by Tien Pham. SPIE, 2019. http://dx.doi.org/10.1117/12.2518941.
Wang, Shuai, Richard Cornelius Suwandi, and Tsung-Hui Chang. "Demystifying Model Averaging for Communication-Efficient Federated Matrix Factorization." In ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2021. http://dx.doi.org/10.1109/icassp39728.2021.9413927.