Academic literature on the topic 'Federated learning applications'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Federated learning applications.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Federated learning applications"

1

Saha, Sudipan, and Tahir Ahmad. "Federated transfer learning: Concept and applications." Intelligenza Artificiale 15, no. 1 (July 28, 2021): 35–44. http://dx.doi.org/10.3233/ia-200075.

Abstract:
Development of Artificial Intelligence (AI) is inherently tied to the development of data. However, in most industries data exists in the form of isolated islands, with limited scope for sharing between different organizations. This is a hindrance to the further development of AI. In the last few years, federated learning has emerged as a possible solution to this problem that does not compromise user privacy. Among the different variants of federated learning, federated transfer learning (FTL) is noteworthy in that it allows knowledge to be transferred across domains that do not have many overlapping features and users. In this work we provide a comprehensive survey of the existing works on this topic. In more detail, we study the background of FTL and its different existing applications. We further analyze FTL from privacy and machine learning perspectives.
2

Launet, Laëtitia, Yuandou Wang, Adrián Colomer, Jorge Igual, Cristian Pulgarín-Ospina, Spiros Koulouzis, Riccardo Bianchi, et al. "Federating Medical Deep Learning Models from Private Jupyter Notebooks to Distributed Institutions." Applied Sciences 13, no. 2 (January 9, 2023): 919. http://dx.doi.org/10.3390/app13020919.

Abstract:
Deep learning-based algorithms have led to tremendous progress over the last few years, but they face a bottleneck as their optimal development relies heavily on access to large datasets. To mitigate this limitation, cross-silo federated learning has emerged as a way to train collaborative models among multiple institutions without having to share the raw data used for model training. However, although artificial intelligence experts have the expertise to develop state-of-the-art models and actively share their code through notebook environments, implementing a federated learning system in real-world applications entails significant engineering and deployment effort. To reduce the complexity of federation setups and bridge the gap between federated learning and notebook users, this paper introduces the Notebook Federator, a solution that leverages the Jupyter environment as part of the federated learning pipeline and simplifies its automation. The feasibility of this approach is demonstrated with a collaborative model solving a digital pathology image analysis task, in which the federated model reaches an accuracy of 0.8633 on the test set, compared to 0.7881, 0.6514, and 0.8096 for the centralized configurations of the individual institutions. As a fast and reproducible tool, the proposed solution enables the deployment of a cross-country federated environment in only a few minutes.
3

Benedict, Shajulin, Deepumon Saji, Rajesh P. Sukumaran, and Bhagyalakshmi M. "Blockchain-Enabled Federated Learning on Kubernetes for Air Quality Prediction Applications." Journal of Artificial Intelligence and Capsule Networks 3, no. 3 (August 30, 2021): 196–217. http://dx.doi.org/10.36548/jaicn.2021.3.004.

Abstract:
The biggest realization of Machine Learning (ML) in societal applications, including air quality prediction, has been the inclusion of novel learning techniques focused on solving privacy and scalability issues, which capture the inventiveness of tens of thousands of data scientists. Transferring learning models across multiple regions or locations has been a considerable challenge, as sufficient technologies were not adopted in the recent past. This paper proposes a Blockchain-enabled Federated Learning Air Quality Prediction (BFL-AQP) framework on a Kubernetes cluster, which transfers the learning-model parameters of ML algorithms across distributed cluster nodes and predicts the air quality parameters of different locations. Experiments were carried out to explore the framework and transfer learning models of air quality prediction parameters. In addition, the performance aspects of increasing the Kubernetes cluster nodes of blockchains in the federated learning environment were studied; the paper reports the time taken to establish seven blockchain organizations on top of the Kubernetes cluster and investigates two federated learning algorithms, Federated Random Forests (FRF) and Federated Linear Regression (FLR), for air quality prediction.
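The Federated Linear Regression (FLR) idea mentioned in the abstract can be illustrated with a minimal sketch: each node fits an ordinary least-squares model on its private data, and a coordinator averages the coefficients weighted by sample counts. This is an illustrative simplification, not the BFL-AQP framework itself; all names are made up.

```python
import numpy as np

def local_fit(X, y):
    # Each node fits ordinary least squares on its private data only.
    return np.linalg.lstsq(X, y, rcond=None)[0]

def federated_average(coefs, sizes):
    # Coordinator: sample-size-weighted average of the local coefficients.
    return np.average(np.stack(coefs), axis=0, weights=sizes)

rng = np.random.default_rng(0)
true_beta = np.array([2.0, -1.0])
clients = []
for n in (50, 80, 120):          # three nodes with different data volumes
    X = rng.normal(size=(n, 2))
    y = X @ true_beta + 0.01 * rng.normal(size=n)
    clients.append((X, y))

coefs = [local_fit(X, y) for X, y in clients]
global_beta = federated_average(coefs, [len(y) for _, y in clients])
```

Only the fitted coefficients cross the network; the raw measurements stay on each node.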
4

Li, Li, Yuxi Fan, Mike Tse, and Kuo-Yi Lin. "A review of applications in federated learning." Computers & Industrial Engineering 149 (November 2020): 106854. http://dx.doi.org/10.1016/j.cie.2020.106854.

5

Amiri, Mohammad Mohammadi, Tolga M. Duman, Deniz Gunduz, Sanjeev R. Kulkarni, and H. Vincent Poor. "Blind Federated Edge Learning." IEEE Transactions on Wireless Communications 20, no. 8 (August 2021): 5129–43. http://dx.doi.org/10.1109/twc.2021.3065920.

6

Fu, Xingbo, Binchi Zhang, Yushun Dong, Chen Chen, and Jundong Li. "Federated Graph Machine Learning." ACM SIGKDD Explorations Newsletter 24, no. 2 (November 29, 2022): 32–47. http://dx.doi.org/10.1145/3575637.3575644.

Abstract:
Graph machine learning has gained great attention in both academia and industry recently. Most of the graph machine learning models, such as Graph Neural Networks (GNNs), are trained over massive graph data. However, in many real-world scenarios, such as hospitalization prediction in healthcare systems, the graph data is usually stored at multiple data owners and cannot be directly accessed by any other parties due to privacy concerns and regulation restrictions. Federated Graph Machine Learning (FGML) is a promising solution to tackle this challenge by training graph machine learning models in a federated manner. In this survey, we conduct a comprehensive review of the literature in FGML. Specifically, we first provide a new taxonomy to divide the existing problems in FGML into two settings, namely, FL with structured data and structured FL. Then, we review the mainstream techniques in each setting and elaborate on how they address the challenges under FGML. In addition, we summarize the real-world applications of FGML from different domains and introduce open graph datasets and platforms adopted in FGML. Finally, we present several limitations in the existing studies with promising research directions in this field.
7

Liu, Yang, Anbu Huang, Yun Luo, He Huang, Youzhi Liu, Yuanyuan Chen, Lican Feng, Tianjian Chen, Han Yu, and Qiang Yang. "FedVision: An Online Visual Object Detection Platform Powered by Federated Learning." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 08 (April 3, 2020): 13172–79. http://dx.doi.org/10.1609/aaai.v34i08.7021.

Abstract:
Visual object detection is a computer vision-based artificial intelligence (AI) technique which has many practical applications (e.g., fire hazard monitoring). However, due to privacy concerns and the high cost of transmitting video data, it is highly challenging to build object detection models on centrally stored large training datasets following the current approach. Federated learning (FL) is a promising approach to resolve this challenge. Nevertheless, there is currently no easy-to-use tool that enables computer vision application developers who are not experts in federated learning to conveniently leverage this technology and apply it in their systems. In this paper, we report FedVision, a machine learning engineering platform to support the development of federated learning powered computer vision applications. The platform has been deployed through a collaboration between WeBank and Extreme Vision to help customers develop computer vision-based safety monitoring solutions in smart city applications. Over four months of usage, it has achieved significant efficiency improvement and cost reduction while removing the need to transmit sensitive data for three major corporate customers. To the best of our knowledge, this is the first real application of FL in computer vision-based tasks.
8

Mun, Hyunsu, and Youngseok Lee. "Internet Traffic Classification with Federated Learning." Electronics 10, no. 1 (December 28, 2020): 27. http://dx.doi.org/10.3390/electronics10010027.

Abstract:
As Internet traffic classification is a typical problem for ISPs or mobile carriers, there have been many studies based on statistical packet header information, deep packet inspection, or machine learning. Due to recent advances in end-to-end encryption and dynamic port policies, machine or deep learning has become essential for improving the accuracy of packet classification. In addition, ISPs or mobile carriers should carefully deal with the privacy issue while collecting user packets for accounting or security. The recent development of distributed machine learning, called federated learning, collaboratively carries out machine learning jobs on the clients without uploading data to a central server. Although federated learning provides an on-device learning framework towards user privacy protection, its feasibility and performance for Internet traffic classification have not been fully examined. In this paper, we propose a federated-learning traffic classification protocol (FLIC), which can achieve an accuracy comparable to centralized deep learning for Internet application identification without privacy leakage. FLIC can classify new applications on-the-fly when a participant joins the learning process with a new application, which has not been done in previous works. By implementing the prototype of FLIC clients and a server with TensorFlow, the clients gather packets, perform the on-device training job, and exchange the training results with the FLIC server. In addition, we demonstrate that federated learning-based packet classification achieves an accuracy of 88% under non-independent and identically distributed (non-IID) traffic across clients, and an accuracy of 92% when a new application is added dynamically as a client joins the learning process.
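The non-IID setting the FLIC evaluation refers to is commonly simulated with a sort-and-shard partition: samples are sorted by label, cut into label-contiguous shards, and each client receives only a few shards, skewing its local class distribution. A minimal sketch with illustrative names (not FLIC's actual protocol):

```python
import numpy as np

def non_iid_partition(labels, n_clients, shards_per_client=2, seed=0):
    # Sort samples by label, cut into label-contiguous shards, and hand
    # each client a few shards, so local class distributions are skewed.
    rng = np.random.default_rng(seed)
    order = np.argsort(labels, kind="stable")
    shards = np.array_split(order, n_clients * shards_per_client)
    shard_ids = rng.permutation(len(shards))
    return [
        np.concatenate(
            [shards[i] for i in
             shard_ids[c * shards_per_client:(c + 1) * shards_per_client]]
        )
        for c in range(n_clients)
    ]

labels = np.repeat(np.arange(4), 25)   # 100 samples, 4 balanced classes
parts = non_iid_partition(labels, n_clients=5)
```

Each client ends up with at most a couple of classes per shard, while the union of all partitions still covers the full dataset exactly once.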
9

Yang, Qiang. "Toward Responsible AI: An Overview of Federated Learning for User-centered Privacy-preserving Computing." ACM Transactions on Interactive Intelligent Systems 11, no. 3-4 (December 31, 2021): 1–22. http://dx.doi.org/10.1145/3485875.

Abstract:
With the rapid advances of Artificial Intelligence (AI) technologies and applications, there is increasing concern about the development and application of responsible AI technologies. Building AI technologies or machine-learning models often requires massive amounts of data, which may include sensitive, private user information collected from different sites or countries. Privacy, security, and data governance constraints rule out a brute-force process in the acquisition and integration of these data. It is thus a serious challenge to protect user privacy while achieving high-performance models. This article reviews recent progress of federated learning in addressing this challenge in the context of privacy-preserving computing. Federated learning allows global AI models to be trained and used among multiple decentralized data sources with high security and privacy guarantees, as well as sound incentive mechanisms. This article presents the background, motivations, definitions, architectures, and applications of federated learning as a new paradigm for building privacy-preserving, responsible AI ecosystems.
10

Yang, Zhaohui, Mingzhe Chen, Kai-Kit Wong, H. Vincent Poor, and Shuguang Cui. "Federated Learning for 6G: Applications, Challenges, and Opportunities." Engineering 8 (January 2022): 33–41. http://dx.doi.org/10.1016/j.eng.2021.12.002.


Dissertations / Theses on the topic "Federated learning applications"

1

Bhogi, Keerthana. "Two New Applications of Tensors to Machine Learning for Wireless Communications." Thesis, Virginia Tech, 2021. http://hdl.handle.net/10919/104970.

Abstract:
With the increasing number of wireless devices and the phenomenal amount of data that is being generated by them, there is a growing interest in the wireless communications community to complement the traditional model-driven design approaches with data-driven machine learning (ML)-based solutions. However, managing the large-scale multi-dimensional data to maintain the efficiency and scalability of the ML algorithms has obviously been a challenge. Tensors provide a useful framework to represent multi-dimensional data in an integrated manner by preserving relationships in data across different dimensions. This thesis studies two new applications of tensors to ML for wireless communications where the tensor structure of the concerned data is exploited in novel ways. The first contribution of this thesis is a tensor learning-based low-complexity precoder codebook design technique for a full-dimension multiple-input multiple-output (FD-MIMO) system with a uniform planar antenna (UPA) array at the transmitter (Tx) whose channel distribution is available through a dataset. Represented as a tensor, the FD-MIMO channel is further decomposed using a tensor decomposition technique to obtain an optimal precoder which is a function of Kronecker-Product (KP) of two low-dimensional precoders, each corresponding to the horizontal and vertical dimensions of the FD-MIMO channel. From the design perspective, we have made contributions in deriving a criterion for optimal product precoder codebooks using the obtained low-dimensional precoders. We show that this product codebook design problem is an unsupervised clustering problem on a Cartesian Product Grassmann Manifold (CPM), where the optimal cluster centroids form the desired codebook. 
We further simplify this clustering problem to a $K$-means algorithm on the low-dimensional factor Grassmann manifolds (GMs) of the CPM, which correspond to the horizontal and vertical dimensions of the UPA, thus significantly reducing the complexity of precoder codebook construction compared to existing codebook learning techniques. The second contribution of this thesis is a tensor-based bandwidth-efficient gradient communication technique for federated learning (FL) with convolutional neural networks (CNNs). Concisely, FL is a decentralized ML approach that allows an ML model to be trained jointly, coordinated by a server, using the data generated by distributed users, by sharing only the local gradients with the server and not the raw data. Here, we focus on efficient compression and reconstruction of convolutional gradients at the users and the server, respectively. To reduce the gradient communication overhead, we compress the sparse gradients at the users to obtain their low-dimensional estimates using a compressive sensing (CS)-based technique and transmit them to the server for joint training of the CNN. We exploit a natural tensor structure offered by the convolutional gradients to demonstrate the correlation of a gradient element with its neighbors. We propose a novel prior for the convolutional gradients that captures the described spatial consistency along with their sparse nature in an appropriate way. We further propose a novel Bayesian reconstruction algorithm based on the Generalized Approximate Message Passing (GAMP) framework that exploits this prior information about the gradients. Through numerical simulations, we demonstrate that the developed gradient reconstruction method improves the convergence of the CNN model.
Master of Science
The increase in the number of wireless and mobile devices has led to the generation of massive amounts of multi-modal data at the users in various real-world applications, including wireless communications. This has led to an increasing interest in machine learning (ML)-based data-driven techniques for communication system design. The native setting of ML is centralized, where all the data is available on a single device. However, the distributed nature of the users and their data has also motivated the development of distributed ML techniques. Since the success of ML techniques is grounded in their data-based nature, there is a need to maintain the efficiency and scalability of the algorithms to manage the large-scale data. Tensors are multi-dimensional arrays that provide an integrated way of representing multi-modal data. Tensor algebra and tensor decompositions have enabled the extension of several classical ML techniques to tensor-based ML techniques in various application domains such as computer vision, data mining, image processing, and wireless communications. Tensor-based ML techniques have been shown to improve the performance of ML models because of their ability to leverage the underlying structural information in the data. In this thesis, we present two new applications of tensors to ML for wireless applications and show how the tensor structure of the concerned data can be exploited and incorporated in different ways. The first contribution is a tensor learning-based precoder codebook design technique for full-dimension multiple-input multiple-output (FD-MIMO) systems, where we develop a scheme for designing low-complexity product precoder codebooks by identifying and leveraging a tensor representation of the FD-MIMO channel.
The second contribution is a tensor-based gradient communication scheme for a decentralized ML technique known as federated learning (FL) with convolutional neural networks (CNNs), where we design a novel bandwidth-efficient gradient compression-reconstruction algorithm that leverages a tensor structure of the convolutional gradients. The numerical simulations in both applications demonstrate that exploiting the underlying tensor structure in the data provides significant gains in their respective performance criteria.
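The gradient compression and reconstruction idea the thesis develops can be illustrated, in a much simpler form, with top-k sparsification: transmit only the largest-magnitude gradient entries (indices plus values) and reconstruct a sparse estimate at the server. This sketch is not the thesis's CS/GAMP-based method, just a common baseline illustrating the two steps; all names are made up.

```python
import numpy as np

def topk_compress(grad, k):
    # User side: transmit only the k largest-magnitude entries.
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx], grad.shape

def topk_decompress(idx, vals, shape):
    # Server side: zeros everywhere except the kept entries.
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = vals
    return flat.reshape(shape)

g = np.array([[0.9, -0.01], [0.02, -1.2]])   # a toy "gradient"
idx, vals, shape = topk_compress(g, k=2)
g_hat = topk_decompress(idx, vals, shape)    # only the two big entries survive
```

Bandwidth drops from one float per parameter to k index-value pairs, at the cost of a sparse approximation of the true gradient.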
2

Adapa, Supriya. "TensorFlow Federated Learning: Application to Decentralized Data." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021.

Abstract:
Machine learning is a complex discipline. But implementing machine learning models is far less daunting and difficult than it used to be, thanks to machine learning frameworks such as Google's TensorFlow Federated that ease the process of acquiring data, training models, serving predictions, and refining future results. There are an estimated 3 billion smartphones in the world and 7 billion connected devices. These phones and devices are constantly generating new data. Traditional analytics and machine learning need that data to be centrally collected before it is processed to yield insights, ML models, and ultimately better products. This centralized approach can be problematic if the data is sensitive or expensive to centralize. Wouldn't it be better if we could run the data analysis and machine learning right on the devices where that data is generated, and still be able to aggregate together what's been learned? TensorFlow Federated (TFF) is an open-source framework for experimenting with machine learning and other computations on decentralized data. It implements an approach called Federated Learning (FL), which enables many participating clients to train shared ML models while keeping their data locally. TFF was designed based on Google's experience developing federated learning technology, where it powers ML models for mobile keyboard predictions and on-device search. With TFF, a flexible, open framework for locally simulating decentralized computations is available to all TensorFlow users. Using Twitter datasets, we performed text classification of positive and negative tweets from a Twitter account with machine learning.
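The federated averaging loop that frameworks like TFF automate can be sketched in plain NumPy: each client runs a few local gradient steps on its private data, and the server averages the resulting weights. This is an illustrative simulation of the FedAvg idea, not TFF code; all names are made up.

```python
import numpy as np

def local_step(w, X, y, lr=0.1, epochs=5):
    # A client runs a few gradient-descent epochs on its own data only
    # (linear model, mean squared error).
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fed_avg(client_data, rounds=20, dim=2):
    w = np.zeros(dim)                         # global model at the server
    for _ in range(rounds):
        # Only model weights travel; raw data never leaves a client.
        updates = [local_step(w.copy(), X, y) for X, y in client_data]
        sizes = [len(y) for _, y in client_data]
        w = np.average(np.stack(updates), axis=0, weights=sizes)
    return w

rng = np.random.default_rng(1)
beta = np.array([1.0, 3.0])
data = []
for n in (30, 60):                            # two clients, unequal data
    X = rng.normal(size=(n, 2))
    data.append((X, X @ beta + 0.01 * rng.normal(size=n)))

w = fed_avg(data)                             # converges close to beta
```

The sample-size weighting in the averaging step is what makes the aggregate equivalent (in expectation) to training on the pooled data.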

Books on the topic "Federated learning applications"

1

Yadav, Satya Prakash, Bhoopesh Singh Bhati, Dharmendra Prasad Mahato, and Sachin Kumar, eds. Federated Learning for IoT Applications. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85559-8.

2

Kumar, Sachin, Satya Prakash Yadav, Dharmendra Prasad Mahato, and Bhoopesh Singh Bhati. Federated Learning for IoT Applications. Springer International Publishing AG, 2021.

3

Federated Learning with Python: Design and Implement a Federated Learning System and Develop Applications Using Existing Frameworks. Packt Publishing, Limited, 2022.

4

Baracaldo, Nathalie, and Heiko Ludwig. Federated Learning: A Comprehensive Overview of Methods and Applications. Springer International Publishing AG, 2022.


Book chapters on the topic "Federated learning applications"

1

Yang, Qiang, Yang Liu, Yong Cheng, Yan Kang, Tianjian Chen, and Han Yu. "Selected Applications." In Federated Learning, 133–41. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-031-01585-4_10.

2

Kishor, Kaushal. "Personalized Federated Learning." In Federated Learning for IoT Applications, 31–52. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85559-8_3.

3

Kishor, Kaushal. "Communication-Efficient Federated Learning." In Federated Learning for IoT Applications, 135–56. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85559-8_9.

4

Pandey, Mohit, Shubhangi Pandey, and Ajit Kumar. "Introduction to Federated Learning." In Federated Learning for IoT Applications, 1–17. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85559-8_1.

5

Gupta, Deena Nath, Rajendra Kumar, and Ashwani Kumar. "Federated Learning for IoT Devices." In Federated Learning for IoT Applications, 19–29. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85559-8_2.

6

Solanki, Tanu, Bipin Kumar Rai, and Shivani Sharma. "Federated Learning Using Tensor Flow." In Federated Learning for IoT Applications, 157–67. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85559-8_10.

7

Gupta, Deena Nath, Rajendra Kumar, and Shamsul Haque Ansari. "Federated Learning for an IoT Application." In Federated Learning for IoT Applications, 53–66. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85559-8_4.

8

Gupta, Sugandh, and Sapna Katiyar. "Communication-Efficient Federated Learning in Wireless-Edge Architecture." In Federated Learning for IoT Applications, 117–34. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85559-8_8.

9

Khan, Rijwan, Mahima Gupta, Pallavi Kumari, and Narendra Kumar. "A Prospective Study of Federated Machine Learning in Medical Science." In Federated Learning for IoT Applications, 105–16. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85559-8_7.

10

Bhati, Nitesh Singh, Garvit Chugh, and Bhoopesh Singh Bhati. "Federated Machine Learning with Data Mining in Healthcare." In Federated Learning for IoT Applications, 231–42. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85559-8_15.


Conference papers on the topic "Federated learning applications"

1

Heusinger, Moritz, Christoph Raab, Fabrice Rossi, and Frank-Michael Schleif. "Federated Learning - Methods, Applications and beyond." In ESANN 2021 - European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning. Louvain-la-Neuve (Belgium): Ciaco - i6doc.com, 2021. http://dx.doi.org/10.14428/esann/2021.es2021-4.

2

Sultana, Khadija, Khandakar Ahmed, Bruce Gu, and Hua Wang. "Elastic Optimized Edge Federated Learning." In 2022 International Conference on Networking and Network Applications (NaNA). IEEE, 2022. http://dx.doi.org/10.1109/nana56854.2022.00056.

3

Li, Qinbin, Bingsheng He, and Dawn Song. "Practical One-Shot Federated Learning for Cross-Silo Setting." In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/205.

Abstract:
Federated learning enables multiple parties to collaboratively learn a model without exchanging their data. While most existing federated learning algorithms need many rounds to converge, one-shot federated learning (i.e., federated learning with a single communication round) is a promising approach to make federated learning practical in the cross-silo setting. However, existing one-shot algorithms only support specific models and do not provide any privacy guarantees, which significantly limits their application in practice. In this paper, we propose a practical one-shot federated learning algorithm named FedKT. By utilizing the knowledge transfer technique, FedKT can be applied to any classification model and can flexibly achieve differential privacy guarantees. Our experiments on various tasks show that FedKT can significantly outperform the other state-of-the-art federated learning algorithms with a single communication round.
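The single-communication-round idea can be illustrated in miniature: each client trains a model once on its private data, ships it to the server, and the server ensembles the models by majority vote. This sketch is far simpler than FedKT's knowledge-transfer scheme and offers no privacy guarantees; all names are made up.

```python
import numpy as np

def train_local(X, y):
    # One-shot local training: a nearest-centroid classifier per client.
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(model, x):
    classes = list(model)
    dists = [np.linalg.norm(x - model[c]) for c in classes]
    return classes[int(np.argmin(dists))]

def one_shot_ensemble(models, x):
    # Server side, after the single communication round: majority vote.
    votes = [predict(m, x) for m in models]
    vals, counts = np.unique(votes, return_counts=True)
    return vals[np.argmax(counts)]

rng = np.random.default_rng(2)
models = []
for _ in range(3):                      # three silos, each with private data
    X = np.vstack([rng.normal(0.0, 1.0, (20, 2)),
                   rng.normal(5.0, 1.0, (20, 2))])
    y = np.array([0] * 20 + [1] * 20)
    models.append(train_local(X, y))
```

Each silo communicates exactly once, which is what makes the one-shot setting attractive when rounds are expensive across organizations.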
4

Guberovic, Emanuel, Tomislav Lipic, and Igor Cavrak. "Dew Intelligence: Federated learning perspective." In 2021 IEEE 45th Annual Computers, Software, and Applications Conference (COMPSAC). IEEE, 2021. http://dx.doi.org/10.1109/compsac51774.2021.00274.

5

Dirir, Ahmed, Khaled Salah, Davor Svetinovic, Raja Jayaraman, Ibrar Yaqoob, and Salil S. Kanhere. "Blockchain-Based Decentralized Federated Learning." In 2022 Fourth International Conference on Blockchain Computing and Applications (BCCA). IEEE, 2022. http://dx.doi.org/10.1109/bcca55292.2022.9921963.

6

Fang, Minghong, Jia Liu, Neil Zhenqiang Gong, and Elizabeth S. Bentley. "AFLGuard: Byzantine-robust Asynchronous Federated Learning." In ACSAC: Annual Computer Security Applications Conference. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3564625.3567991.

7

Han, Mingqi, Xinghua Sun, Sihui Zheng, Xijun Wang, and Hongzhou Tan. "Resource Rationing for Federated Learning with Reinforcement Learning." In 2021 Computing, Communications and IoT Applications (ComComAp). IEEE, 2021. http://dx.doi.org/10.1109/comcomap53641.2021.9653111.

8

Hao, Meng, Hongwei Li, Guowen Xu, Hanxiao Chen, and Tianwei Zhang. "Efficient, Private and Robust Federated Learning." In ACSAC '21: Annual Computer Security Applications Conference. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3485832.3488014.

9

Chandran, Pravin, Raghavendra Bhat, Avinash Chakravarthy, and Srikanth Chandar. "Divide-and-Conquer Federated Learning Under Data Heterogeneity." In International Conference on AI, Machine Learning and Applications (AIMLA 2021). Academy and Industry Research Collaboration Center (AIRCC), 2021. http://dx.doi.org/10.5121/csit.2021.111302.

Abstract:
Federated Learning allows training on data stored in distributed devices without the need to centralize the training data, thereby maintaining data privacy. The ability to handle data heterogeneity (non-identical and independent distribution, or non-IID) is a key enabler for the wider deployment of Federated Learning. In this paper, we propose a novel Divide-and-Conquer training methodology that enables the use of the popular FedAvg aggregation algorithm by overcoming the acknowledged FedAvg limitations in non-IID environments. We propose a novel use of a cosine-distance-based weight divergence metric to determine the exact point where a deep learning network can be divided into class-agnostic initial layers and class-specific deep layers for performing Divide-and-Conquer training. We show that the methodology achieves trained-model accuracy at par with (and in certain cases exceeding) the numbers achieved by state-of-the-art algorithms such as FedProx, FedMA, etc. We also show that this methodology leads to compute and/or bandwidth optimizations under certain documented conditions.
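The cosine-distance weight divergence idea can be sketched as a per-layer comparison of two clients' weights: layers where divergence stays near zero are candidates for the shared, class-agnostic part of the network. The exact formulation in the paper may differ; all names here are illustrative.

```python
import numpy as np

def cosine_weight_divergence(layers_a, layers_b):
    # Per-layer divergence = 1 - cosine similarity of flattened weights.
    out = []
    for wa, wb in zip(layers_a, layers_b):
        a, b = wa.ravel(), wb.ravel()
        out.append(1.0 - a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return out

# Two clients agree on the first layer but diverge on the second,
# suggesting a split into class-agnostic vs. class-specific layers there.
a = [np.ones((2, 2)), np.array([1.0, 0.0])]
b = [np.ones((2, 2)), np.array([0.0, 1.0])]
div = cosine_weight_divergence(a, b)
```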
10

Wu, Xin, Zhi Wang, Jian Zhao, Yan Zhang, and Yu Wu. "FedBC: Blockchain-based Decentralized Federated Learning." In 2020 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA). IEEE, 2020. http://dx.doi.org/10.1109/icaica50127.2020.9182705.

