Journal articles on the topic 'Edge computing with artificial intelligence'

To see the other types of publications on this topic, follow the link: Edge computing with artificial intelligence.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Edge computing with artificial intelligence.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Chen, Songlin, Hong Wen, and Jinsong Wu. "Artificial Intelligence Based Traffic Control for Edge Computing Assisted Vehicle Networks." Journal of Internet Technology 23, no. 5 (September 2022): 989–96. http://dx.doi.org/10.53106/160792642022092305007.

Abstract:
Edge computing supported vehicle networks have attracted considerable attention in recent years from both industry and academia due to their extensive applications in urban traffic control systems. We present a general overview of Artificial Intelligence (AI)-based traffic control approaches, focusing mainly on dynamic traffic control via edge computing devices. A collaborative edge computing network embedded in the AI-based traffic control system is proposed to process the massive data from roadside sensors and shorten the real-time response time, which supports efficient traffic control and maximizes the utilization of computing resources according to the incident levels associated with different rescue schemes. Furthermore, several open research issues are discussed and future directions are indicated.
2

Deng, Shuiguang, Hailiang Zhao, Weijia Fang, Jianwei Yin, Schahram Dustdar, and Albert Y. Zomaya. "Edge Intelligence: The Confluence of Edge Computing and Artificial Intelligence." IEEE Internet of Things Journal 7, no. 8 (August 2020): 7457–69. http://dx.doi.org/10.1109/jiot.2020.2984887.

3

Debauche, Olivier, Meryem Elmoulat, Saïd Mahmoudi, Sidi Ahmed Mahmoudi, Adriano Guttadauria, Pierre Manneback, and Frédéric Lebeau. "Towards Landslides Early Warning System With Fog - Edge Computing And Artificial Intelligence." Journal of Ubiquitous Systems and Pervasive Networks 15, no. 02 (March 1, 2021): 11–17. http://dx.doi.org/10.5383/juspn.15.02.002.

Abstract:
Landslides are phenomena that cause significant human and economic losses. Researchers have investigated the prediction of high landslide susceptibility with various methodologies based upon statistical and mathematical models, in addition to artificial intelligence tools. These methodologies make it possible to determine the areas that could present a serious risk of landslides. Monitoring these risky areas is particularly important for developing an Early Warning System (EWS). In fact, the variety of landslide types makes their monitoring a sophisticated task to accomplish. Indeed, each landslide area has its own specificities and potential triggering factors; therefore, there is no single device that can monitor all types of landslides. Consequently, Wireless Sensor Networks (WSN) combined with the Internet of Things (IoT) make it possible to set up large-scale data acquisition systems. In addition, recent advances in Artificial Intelligence (AI) and Federated Learning (FL) make it possible to develop performant algorithms that analyze these data and predict landslide events early at the edge level (on gateways). In this case, these algorithms are trained at the fog level on specific hardware. The novelty of the work proposed in this paper is the integration of Federated Learning based on Fog-Edge approaches to continuously improve prediction models.
4

Zhou, Zhi, Xu Chen, En Li, Liekang Zeng, Ke Luo, and Junshan Zhang. "Edge Intelligence: Paving the Last Mile of Artificial Intelligence With Edge Computing." Proceedings of the IEEE 107, no. 8 (August 2019): 1738–62. http://dx.doi.org/10.1109/jproc.2019.2918951.

5

Sathish. "Artificial Intelligence based Edge Computing Framework for Optimization of Mobile Communication." Journal of ISMAC 2, no. 3 (July 9, 2020): 160–65. http://dx.doi.org/10.36548/jismac.2020.3.004.

Abstract:
To improve mobile service quality and accelerate content delivery, edge computing techniques provide an optimal solution for bridging device requirements and cloud capacity at the network edge. Advances in technologies such as edge computing and mobile communication have contributed greatly to these developments. The mobile edge system is enabled with Machine Learning techniques to improve edge system intelligence and to optimize communication, caching, and mobile edge computing. For this purpose, a smart framework based on artificial intelligence is developed, reducing unwanted communication load on the system while enhancing applications and optimizing the system dynamically. The models can be trained more accurately using the learning parameters exchanged between the edge nodes and the collaborating devices. The adaptivity and cognitive ability of the system toward the mobile communication system are enhanced despite the low learning overhead, helping it attain near-optimal performance. The opportunities and challenges of smart systems in the near future are also discussed in this paper.
6

Sun, Junlei. "The Legal Regulation of Artificial Intelligence and Edge Computing Automation Decision-Making Risk in Wireless Network Communication." Wireless Communications and Mobile Computing 2022 (March 12, 2022): 1–13. http://dx.doi.org/10.1155/2022/1303252.

Abstract:
This article is aimed at studying the legal regulation of artificial intelligence and edge computing automated decision-making risks in wireless network communications. Data under artificial intelligence are full of flexibility and vitality, which has changed the way data exist throughout society. The core of artificial intelligence is its algorithmic programs, which determine its existence. In this environment, society develops rapidly with unstoppable momentum. However, from a legal perspective, artificial intelligence exhibits algorithmic discrimination, such as gender discrimination, clothing discrimination, and racial discrimination, and it lacks openness, objectivity, and accountability. The consequences are sometimes serious enough to endanger the public interest of the entire society, leading to market disorder. Therefore, the problem of algorithmic discrimination in artificial intelligence remains to be solved. This article uses algorithms to regulate algorithmic discrimination and thereby reduce, to a certain extent, the harm it causes. First, this article introduces a regulation-oriented edge cloud computing architecture model. Distributed cloud computing can use subsystems to manage computing and storage resources and can make automated decisions when processing certain data. To reduce the impact of algorithmic discrimination and to diversify data so as to lower the probability of discrimination, an edge computing network data capture system is designed. The article also describes a BP neural network model, which is divided into an input layer, an output layer, and a hidden layer. Training samples are passed from the input layer through the hidden layer to the output layer; if the output does not meet expectations, the error is back-propagated and the connection weights are adjusted continuously. This paper proposes a deep learning system model for real-time artificial intelligence driven by edge computing. When this model is applied to legal regulation, it can cooperate with edge computing and artificial intelligence algorithms to provide high-precision automated decision-making. Finally, this paper designs an artificial intelligence-assisted automated decision-making experiment based on the theory of legal computing. It proposes a Bayesian algorithm that merges edge algorithms into artificial intelligence and verifies the feasibility of this hypothesis through experiments. The experimental results show that it has a certain ability to regulate algorithmic discrimination caused by artificial intelligence in legal regulation. It can improve the regulatory effects of laws and regulations to a certain extent, and the clustering effect of the improved edge computing Bayesian algorithm is increased by about 7.2%.
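Illustrative note: the BP network described above (input, hidden, and output layers, with errors back-propagated to adjust the connection weights) follows the textbook pattern. The minimal NumPy sketch below shows only that generic training loop; the layer sizes, learning rate, and synthetic data are assumptions for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative three-layer BP network: 4 inputs -> 8 hidden units -> 1 output.
n_in, n_hidden, n_out = 4, 8, 1
W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
lr = 0.1

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Synthetic training samples (assumption; the paper uses its own captured data).
X = rng.random((200, n_in))
y = (X.sum(axis=1, keepdims=True) > 2.0).astype(float)

for epoch in range(500):
    # Forward pass: input layer -> hidden layer -> output layer.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # If the output does not meet expectations, back-propagate the error
    # and keep adjusting the connection weights.
    err = y - out
    delta_out = err * out * (1.0 - out)
    delta_hidden = (delta_out @ W2.T) * h * (1.0 - h)
    W2 += lr * h.T @ delta_out
    W1 += lr * X.T @ delta_hidden

print("final mean absolute error:", float(np.abs(y - sigmoid(sigmoid(X @ W1) @ W2)).mean()))
```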
7

Elmoulat, Meryem, Olivier Debauche, Saïd Mahmoudi, Sidi Ahmed Mahmoudi, Pierre Manneback, and Frédéric Lebeau. "Edge Computing and Artificial Intelligence for Landslides Monitoring." Procedia Computer Science 177 (2020): 480–87. http://dx.doi.org/10.1016/j.procs.2020.10.066.

8

Huh, Jun-Ho, and Yeong-Seok Seo. "Understanding Edge Computing: Engineering Evolution With Artificial Intelligence." IEEE Access 7 (2019): 164229–45. http://dx.doi.org/10.1109/access.2019.2945338.

9

Wang, Shanshan. "Enterprise Management Optimization by Using Artificial Intelligence and Edge Computing." International Journal of Distributed Systems and Technologies 13, no. 3 (July 1, 2022): 1–9. http://dx.doi.org/10.4018/ijdst.307994.

Abstract:
In the internet era, huge amounts of data are generated every day. With the help of cloud computing, enterprises can store and analyze these data more conveniently. With the emergence of the internet of things, more hardware devices have joined the network and produce massive data. These data rely heavily on cloud computing for centralized processing and analysis. However, the rapid growth of data volume has exceeded the network throughput capacity of cloud computing. By deploying computing nodes at the edge of the local network, edge computing allows devices to complete data collection and preprocessing within the local network. Thus, it can overcome the low efficiency and large transmission delays of cloud computing for massive native data. This paper designs a human trajectory training system for enterprise management. The simulation demonstrates that the system can support human trajectory tracing and prediction for enterprise management.
10

Radanliev, Petar, David De Roure, Kevin Page, Max Van Kleek, Omar Santos, La’Treall Maddox, Pete Burnap, Eirini Anthi, and Carsten Maple. "Design of a dynamic and self-adapting system, supported with artificial intelligence, machine learning and real-time intelligence for predictive cyber risk analytics in extreme environments – cyber risk in the colonisation of Mars." Safety in Extreme Environments 2, no. 3 (October 2020): 219–30. http://dx.doi.org/10.1007/s42797-021-00025-1.

Abstract:
Multiple governmental agencies and private organisations have made commitments for the colonisation of Mars. Such colonisation requires complex systems and infrastructure that could be very costly to repair or replace in cases of cyber-attacks. This paper surveys deep learning algorithms, IoT cyber security and risk models, and established mathematical formulas to identify the best approach for developing a dynamic and self-adapting system for predictive cyber risk analytics supported with Artificial Intelligence and Machine Learning and real-time intelligence in edge computing. The paper presents a new mathematical approach for integrating concepts for cognition engine design, edge computing and Artificial Intelligence and Machine Learning to automate anomaly detection. This engine instigates a step change by applying Artificial Intelligence and Machine Learning embedded at the edge of IoT networks, to deliver safe and functional real-time intelligence for predictive cyber risk analytics. This will enhance capacities for risk analytics and assist in the creation of a comprehensive and systematic understanding of the opportunities and threats that arise when edge computing nodes are deployed, and when Artificial Intelligence and Machine Learning technologies are migrated to the periphery of the internet and into local IoT networks.
11

Corchado, Juan M., Sascha Ossowski, Sara Rodríguez-González, and Fernando De la Prieta. "Advances in Explainable Artificial Intelligence and Edge Computing Applications." Electronics 11, no. 19 (September 28, 2022): 3111. http://dx.doi.org/10.3390/electronics11193111.

12

Lin, Bor-Shing, Tiku Yu, Chih-Wei Peng, Chueh-Ho Lin, Hung-Kai Hsu, I.-Jung Lee, and Zhao Zhang. "Fall Detection System With Artificial Intelligence-Based Edge Computing." IEEE Access 10 (2022): 4328–39. http://dx.doi.org/10.1109/access.2021.3140164.

13

Muneeb, Muhammad, Kwang-Man Ko, and Young-Hoon Park. "A Fog Computing Architecture with Multi-Layer for Computing-Intensive IoT Applications." Applied Sciences 11, no. 24 (December 7, 2021): 11585. http://dx.doi.org/10.3390/app112411585.

Abstract:
The emerging era of IoT will be based on compute-intensive applications. These applications will increase the traffic volume of today’s network infrastructure and will have a greater impact on the emerging Fifth Generation (5G) system. Research is ongoing in many directions, such as how to automate the management and configuration of data analysis tasks over the cloud and edges, and how to achieve minimum latency and bandwidth consumption by optimizing task allocation. The major challenge for researchers is to push artificial intelligence to the edge in order to fully realize the potential of the fog computing paradigm. Intelligence-based fog computing frameworks for IoT-based applications exist, but research on Edge-Artificial Intelligence (Edge-AI) is still in its initial stage. Therefore, we chose to focus on data analytics and offloading in our proposed architecture. To address these problems, we have proposed a prototype of our architecture: a multi-layered architecture for data analysis between the cloud and fog computing layers that performs latency-sensitive analysis with low latency. The main goal of this research is to use this multi-layer fog computing platform to enhance real-time data analysis for IoT devices. Our research is based on the policy of the OpenFog Consortium, which offers not only good outcomes but also surveillance and data analysis functionalities. We show through case studies that our proposed prototype architecture outperforms a cloud-only environment in delay time, network usage, and energy consumption.
14

Kang, Zelun. "Research on risk prediction method of software robot based on artificial intelligence." Journal of Physics: Conference Series 2248, no. 1 (April 1, 2022): 012003. http://dx.doi.org/10.1088/1742-6596/2248/1/012003.

Abstract:
Aiming at the problem of software robots recognizing life scenes, this paper studies a recognition and judgment method based on AI edge computing. Building on artificial intelligence methods and the theory of edge computing, and through an analysis of the overall edge computing architecture, the scene judgment rules and recognition algorithms are clarified. In particular, the feature extraction and recognition part of the software robot's image recognition function is placed in the edge server, so that judgment and recognition can be realized quickly and system operation efficiency can be improved. Experimental verification shows that, under the same conditions, the identification error of the proposed method is lower and its calculation time is shorter than those of other software robots, far surpassing traditional identification methods. At the same time, a comparison of data receiving methods also shows that using edge computing is more efficient and can handle the recognition problems arising in the work of software robots.
15

Sodhro, Ali Hassan, Sandeep Pirbhulal, and Victor Hugo C. de Albuquerque. "Artificial Intelligence-Driven Mechanism for Edge Computing-Based Industrial Applications." IEEE Transactions on Industrial Informatics 15, no. 7 (July 2019): 4235–43. http://dx.doi.org/10.1109/tii.2019.2902878.

16

Debauche, Olivier, Saïd Mahmoudi, Sidi Ahmed Mahmoudi, Pierre Manneback, Jérôme Bindelle, and Frédéric Lebeau. "Edge Computing and Artificial Intelligence for Real-time Poultry Monitoring." Procedia Computer Science 175 (2020): 534–41. http://dx.doi.org/10.1016/j.procs.2020.07.076.

17

Carvalho, Gonçalo, Bruno Cabral, Vasco Pereira, and Jorge Bernardino. "Computation offloading in Edge Computing environments using Artificial Intelligence techniques." Engineering Applications of Artificial Intelligence 95 (October 2020): 103840. http://dx.doi.org/10.1016/j.engappai.2020.103840.

18

Martin, Jon, David Cantero, Maite González, Andrea Cabrera, Mikel Larrañaga, Evangelos Maltezos, Panagiotis Lioupis, et al. "Embedded Vision Intelligence for the Safety of Smart Cities." Journal of Imaging 8, no. 12 (December 14, 2022): 326. http://dx.doi.org/10.3390/jimaging8120326.

Abstract:
Advances in Artificial Intelligence (AI) and embedded systems have resulted in a recent increase in the use of image processing applications for smart cities' safety. This enables automated video surveillance at an affordable scale, increasing the data available and reducing the need for human intervention. At the same time, although deep learning is a very intensive task in terms of computing resources, hardware and software improvements have emerged that allow embedded systems to implement sophisticated machine learning algorithms at the edge. Additionally, new lightweight open-source middleware for resource-constrained devices, such as EdgeX Foundry, has appeared to facilitate the collection and processing of data at the sensor level, with communication capabilities to exchange data with a cloud enterprise application. The objective of this work is to describe the development of two Edge Smart Camera Systems for the safety of smart cities within the S4AllCities H2020 project. Hence, the work presents hardware and software modules developed within the project, including a custom hardware platform specifically developed for the deployment of deep learning models based on the I.MX8 Plus from NXP, which considerably reduces processing and inference times; a custom Video Analytics Edge Computing (VAEC) system deployed on a commercial NVIDIA Jetson TX2 platform, which provides high-level results on person detection processes; and an edge computing framework for the management of those two edge devices, namely the Distributed Edge Computing framework, DECIoT. To verify the utility and functionality of the systems, extensive experiments were performed. The results highlight their potential to provide enhanced situational awareness and demonstrate their suitability for edge machine vision applications for safety in smart cities.
19

Schwabe, Nils, Yexu Zhou, Leon Hielscher, Tobias Röddiger, Till Riedel, and Sebastian Reiter. "Tools and methods for Edge-AI-systems." at - Automatisierungstechnik 70, no. 9 (September 1, 2022): 767–76. http://dx.doi.org/10.1515/auto-2022-0023.

Abstract:
The enormous potential of artificial intelligence, especially artificial neural networks, when used for edge computing applications in cars, traffic lights or smart watches, has not yet been fully exploited today. The reasons for this are the computing, energy and memory requirements of modern neural networks, which typically cannot be met by embedded devices. This article provides a detailed summary of today’s challenges and gives a deeper insight into existing solutions that enable neural network performance with modern HW/SW co-design techniques.
20

Liu, Zhongle. "Analysis of Physical Expansion Training Based on Edge Computing and Artificial Intelligence." Mobile Information Systems 2021 (June 1, 2021): 1–9. http://dx.doi.org/10.1155/2021/9145952.

Abstract:
The effective development of physical expansion training benefits from the rapid development of computer technology, especially the integration of Edge Computing (EC) and Artificial Intelligence (AI) technology. Physical expansion training is mainly conducted in a collective form, and how to improve training quality to achieve results has attracted wide attention. Deep learning, a representative AI technology, and EC, which has evolved from traditional cloud computing, are both well suited to physical expansion training. Traditional EC methods suffer from problems such as high computing cost and long computing time. In this paper, deep learning technology is introduced to optimize EC methods. The EC cycle is set through the Internet of Things (IoT) topology to obtain the data upload speed. The CNN (Convolutional Neural Network) model incorporates deep reinforcement learning, implements the convolution calculations, and completes the EC resource allocation for each trainer’s wearable sensor device, realizing deep reinforcement learning-based optimization of EC. The experimental results show that the proposed method can effectively control the server’s occupancy time, the energy cost of the edge server, and the computing cost. The proposed method can also improve the resource allocation ability of EC, ensure a uniform speed of the computing process, and improve the efficiency of EC.
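Illustrative note: the abstract describes deep reinforcement learning allocating edge computing resources to each trainer's wearable device. The paper's CNN-based model is not reproduced here; as a hedged stand-in, the tabular Q-learning sketch below picks a resource level for a device from its load state, with the state space, reward, and parameters all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting (all values are assumptions): a device's backlog is one of 5 load
# states; the edge server can grant one of 3 resource levels each step.
n_states, n_actions = 5, 3
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.2

def step(state, action):
    # Higher resource levels drain the backlog faster but cost more energy.
    arrivals = rng.integers(0, 2)
    next_state = int(np.clip(state + arrivals - action, 0, n_states - 1))
    reward = -next_state - 0.5 * action  # penalise backlog and resource cost
    return next_state, reward

state = 0
for t in range(5000):
    action = rng.integers(n_actions) if rng.random() < eps else int(Q[state].argmax())
    next_state, reward = step(state, action)
    # Standard Q-learning update toward the reward plus discounted best future value.
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print("greedy resource level per load state:", Q.argmax(axis=1))
```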
21

Xu, Zhanyang, Wentao Liu, Jingwang Huang, Chenyi Yang, Jiawei Lu, and Haozhe Tan. "Artificial Intelligence for Securing IoT Services in Edge Computing: A Survey." Security and Communication Networks 2020 (September 14, 2020): 1–13. http://dx.doi.org/10.1155/2020/8872586.

Abstract:
With the explosive growth of data generated by Internet of Things (IoT) devices, the traditional cloud computing model, which transfers all data to the cloud for processing, has gradually failed to meet the real-time requirements of IoT services due to high network latency. Edge computing (EC), as a new computing paradigm, shifts data processing from the cloud to the edge nodes (ENs), greatly improving the Quality of Service (QoS) for IoT applications with low-latency requirements. However, compared to other endpoint devices such as smartphones or computers, distributed ENs are more vulnerable to attacks because of their restricted computing resources and storage. In a context where security and privacy preservation have become urgent issues for EC, great progress in artificial intelligence (AI) opens many possible windows to address the security challenges. The powerful learning ability of AI enables the system to identify malicious attacks more accurately and efficiently. Meanwhile, to a certain extent, transferring model parameters instead of raw data avoids privacy leakage. In this paper, a comprehensive survey of the contribution of AI to IoT security in EC is presented. First, the research status and some basic definitions are introduced. Next, the IoT service framework with EC is discussed. A survey of privacy preservation and blockchain for edge-enabled IoT services with AI is then presented. Finally, the open issues and challenges in the application of AI to IoT services based on EC are discussed.
22

Zhou, Chengcheng, Qian Liu, and Ruolei Zeng. "Novel Defense Schemes for Artificial Intelligence Deployed in Edge Computing Environment." Wireless Communications and Mobile Computing 2020 (August 3, 2020): 1–20. http://dx.doi.org/10.1155/2020/8832697.

Abstract:
The last few years have seen the great potential of artificial intelligence (AI) technology to efficiently and effectively deal with the incredible deluge of data generated by Internet of Things (IoT) devices. If all of this massive data is transferred to the cloud for intelligent processing, it not only brings considerable challenges to the network bandwidth but also cannot meet the needs of AI applications that require fast, real-time responses. Therefore, to achieve this requirement, mobile or multiaccess edge computing (MEC) is receiving a substantial amount of interest, and its importance is gradually becoming more prominent. However, with the emergence of edge intelligence, AI also suffers from several tremendous security threats in AI model training, AI model inference, and private data. This paper provides three novel defense strategies to tackle malicious attacks in these three aspects. First of all, we introduce a cloud-edge collaborative antiattack scheme to realize reliable incremental updating of AI by ensuring the security of the data generated in the training phase. Furthermore, we propose an edge-enhanced defense strategy based on an adaptive traceability and punishment mechanism to effectively and radically solve the security problem in the inference stage of the AI model. Finally, we establish a system model based on chaotic encryption with the three-layer architecture of MEC to effectively guarantee the security and privacy of the data during the construction of AI models. The experimental results of these three countermeasures verify the correctness of the conclusions and the feasibility of the methods.
23

Dai, Yueyue, Du Xu, Sabita Maharjan, Guanhua Qiao, and Yan Zhang. "Artificial Intelligence Empowered Edge Computing and Caching for Internet of Vehicles." IEEE Wireless Communications 26, no. 3 (June 2019): 12–18. http://dx.doi.org/10.1109/mwc.2019.1800411.

24

Wang, Pin, Zhijian Gao, Yimin Li, Lingyu Zeng, and Hongmei Zhong. "Design and Implementation of a Radioactive Source Intelligent Search Robot Based on Artificial Intelligence Edge Computing." Wireless Communications and Mobile Computing 2022 (May 17, 2022): 1–12. http://dx.doi.org/10.1155/2022/3940348.

Abstract:
Artificial intelligence is a very broad science consisting of different fields, such as machine learning and computer vision. In recent years, the world nuclear industry has developed vigorously; at the same time, incidents in which radioactive sources are lost occur from time to time. At present, most searches for radioactive sources are conducted manually, which is inefficient and leaves searchers vulnerable to radiation damage. Sending a robot to search an area where there may be an uncontrolled radioactive source is different: not only does it improve efficiency, it also protects people from radiation. Therefore, it is of great practical significance to design a radioactive source search robot. This paper introduces the design and implementation of a radioactive source intelligent search robot based on artificial intelligence edge computing, aiming to provide ideas and directions for research on such robots. A research method combining intelligent edge computing and gamma-ray imaging algorithms is proposed and used to carry out related experiments on the design and implementation of the edge computing-based intelligent search robot. The experimental results show that the average resolution of the radioactive source search robot is 90.55%, a prominent result.
25

Liu, Yang, Qingtian Wang, Haitao Liu, Jiaying Zong, and Fengyi Yang. "Edge Intelligence-Based RAN Architecture for 6G Internet of Things." Discrete Dynamics in Nature and Society 2022 (November 15, 2022): 1–11. http://dx.doi.org/10.1155/2022/4955498.

Abstract:
Edge Intelligence, which blends Artificial Intelligence (AI) with the Radio Access Network (RAN) and edge computing, is recommended as a crucial enabling technology for 6G to accommodate intelligent and efficient applications. In this study, we propose an Edge Intelligent Radio Access Network Architecture (EIRA) that introduces new intelligence modules, including broadband edge platforms that allow policies to interact with the virtualized RAN for various applications. We also develop a Markov chain-based RAN Intelligence Control (RIC) scheduling policy for allocating intelligence elements. Experimental results confirm that the virtualized RAN delivers on its performance promises in terms of throughput, latency, and resource utilization.
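Illustrative note: the abstract mentions a Markov chain-based RIC scheduling policy for allocating intelligence elements without giving details. The sketch below only illustrates the generic idea of deriving an allocation from a Markov chain's stationary distribution; the load states, transition probabilities, and element pool are assumptions, not the paper's policy.

```python
import numpy as np

# Illustrative Markov chain over three cell-load states (low, medium, high);
# the transition probabilities are invented, not taken from the paper.
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.5, 0.2],
    [0.1, 0.3, 0.6],
])

# Stationary distribution by power iteration.
pi = np.full(3, 1.0 / 3.0)
for _ in range(1000):
    pi = pi @ P
pi /= pi.sum()

# Allocate a pool of intelligence elements in proportion to long-run load
# (rounding may leave a unit unassigned; a real scheduler would rebalance).
total_elements = 12
allocation = np.round(pi * total_elements).astype(int)
print("stationary distribution:", np.round(pi, 3))
print("elements per load state:", allocation)
```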
26

Zeng, Zeng, Cen Chen, Bharadwaj Veeravalli, Keqin Li, and Joey Tianyi Zhou. "Introduction to the Special Issue on edge intelligence: Neurocomputing meets edge computing." Neurocomputing 472 (February 2022): 149–51. http://dx.doi.org/10.1016/j.neucom.2021.11.069.

27

Hao, Long, and Li-Min Zhou. "Evaluation Index of School Sports Resources Based on Artificial Intelligence and Edge Computing." Mobile Information Systems 2022 (January 13, 2022): 1–9. http://dx.doi.org/10.1155/2022/9925930.

Abstract:
As the demand for education continues to increase, the relative lack of physical resources has, to a certain extent, become a bottleneck hindering the development of school physical education. This research discusses an evaluation index system for school sports resources based on artificial intelligence and edge computing. Human resources, financial resources, and material resources are the three major categories of school sports resources in resource science. University stadium information publicity uses Internet technology to establish a sports information management platform and mobile Internet terminals to optimize university sports resources and stadium information management services. Artificial intelligence technology is used to improve venue information management: a comprehensive venue management information platform is established that collects multidimensional information, provides information resources and accurate information push, and links venue information with public fitness needs. Edge computing is used to realize nearby cloud processing of video data, reduce black-screen freezes during live broadcasts, improve data computing capabilities, and reduce users’ dependence on terminal device performance, while a smart sports resource platform combined with artificial intelligence (AI) supports smart communities, smart venues, and intelligent operations such as event services and safety prevention and control in important event venues. During the live broadcast of the student sports league, nearby cloud processing of video data is realized through edge computing, which improves data computing ability and reduces dependence on the user's own terminal equipment. In the academic survey of college physical education teachers, those with bachelor's degrees accounted for 26.99%, master's degrees for 60.3%, and doctoral degrees for 12.8%. This research will help the reasonable allocation of school sports resources.
28

Zhang, Zhonghua, Xifei Song, Lei Liu, Jie Yin, Yu Wang, and Dapeng Lan. "Recent Advances in Blockchain and Artificial Intelligence Integration: Feasibility Analysis, Research Issues, Applications, Challenges, and Future Work." Security and Communication Networks 2021 (June 24, 2021): 1–15. http://dx.doi.org/10.1155/2021/9991535.

Abstract:
Blockchain constructs a distributed point-to-point system, a secure and verifiable mechanism for decentralized transaction validation, and is widely used in the financial economy, the Internet of Things, big data, cloud computing, and edge computing. On the other hand, artificial intelligence technology is gradually promoting the intelligent development of various industries. As two of today's most promising technologies, blockchain and artificial intelligence have a natural advantage in converging: blockchain makes artificial intelligence more autonomous and credible, and artificial intelligence can push blockchain toward intelligence. In this paper, we analyze the combination of blockchain and artificial intelligence from a comprehensive, three-dimensional point of view. We first introduce the background of artificial intelligence and the concept, characteristics, and key technologies of blockchain, and subsequently analyze the feasibility of combining blockchain with artificial intelligence. Next, we summarize the domestic and international research work on the convergence of blockchain and artificial intelligence in this category. After that, we list some related application scenarios for the convergence of both technologies and also point out existing problems and challenges. Finally, we discuss future work.
29

Chen, Ching-Han, and Chao-Tsu Liu. "Person Re-Identification Microservice over Artificial Intelligence Internet of Things Edge Computing Gateway." Electronics 10, no. 18 (September 15, 2021): 2264. http://dx.doi.org/10.3390/electronics10182264.

Abstract:
With the increase in the number of surveillance cameras deployed globally, an important topic is person re-identification (Re-ID), which identifies the same person from multiple angles and directions across multiple cameras. However, because of the privacy issues involved in identifying individuals, Re-ID systems cannot send image data to the cloud, and these data must be processed on edge servers. At the same time, processing artificial intelligence (AI) algorithms through edge computing (EC) demands significantly more computing resources. Consequently, the traditional AI Internet of Things (AIoT) architecture is no longer sufficient. In this study, we designed a Re-ID system at the AIoT EC gateway, which utilizes a microservice to perform Re-ID calculations on EC and balances efficiency with privacy protection. Experimental results indicate that this architecture can provide sufficient Re-ID computing resources and allows the system to scale up or down flexibly to support different scenarios and demand loads.
30

Liu, Jia, Jianjian Xiang, Yongjun Jin, Renhua Liu, Jining Yan, and Lizhe Wang. "Boost Precision Agriculture with Unmanned Aerial Vehicle Remote Sensing and Edge Intelligence: A Survey." Remote Sensing 13, no. 21 (October 30, 2021): 4387. http://dx.doi.org/10.3390/rs13214387.

Abstract:
In recent years, unmanned aerial vehicles (UAVs) have emerged as a popular and cost-effective technology to capture high spatial and temporal resolution remote sensing (RS) images for a wide range of precision agriculture applications, which can help reduce costs and environmental impacts by providing detailed agricultural information to optimize field practices. Furthermore, deep learning (DL) has been successfully applied as an intelligent tool in agricultural applications such as weed detection and crop pest and disease detection. However, most DL-based methods place high computation, memory, and network demands on resources. Cloud computing can increase processing efficiency with high scalability and low cost, but it results in high latency and great pressure on network bandwidth. Edge intelligence, although still in its early stages, provides a promising solution for artificial intelligence (AI) applications on intelligent edge devices at the edge of the network, close to data sources. These devices have built-in processors enabling onboard analytics or AI (e.g., UAVs and Internet of Things gateways). Therefore, in this paper, a comprehensive survey on the latest developments in precision agriculture with UAV RS and edge intelligence is conducted for the first time. The major insights are as follows: (a) in terms of UAV systems, small or light, fixed-wing or industrial rotor-wing UAVs are widely used in precision agriculture; (b) sensors on UAVs can provide multi-source datasets, but there are only a few public UAV datasets for intelligent precision agriculture, mainly from RGB sensors and a few from multispectral and hyperspectral sensors; (c) DL-based UAV RS methods can be categorized into classification, object detection, and segmentation tasks, and convolutional neural networks and recurrent neural networks are the most commonly used network architectures; (d) cloud computing is a common solution for UAV RS data processing, while edge computing brings the computing close to the data sources; (e) edge intelligence is the convergence of artificial intelligence and edge computing, in which model compression, especially parameter pruning and quantization, is the most important and widely used technique at present, and typical edge resources include central processing units, graphics processing units, and field-programmable gate arrays.
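Illustrative note: insight (e) above names parameter pruning and quantization as the dominant model-compression techniques for edge intelligence. The NumPy sketch below shows magnitude pruning followed by 8-bit linear quantization of a single weight matrix; the sparsity level and bit width are arbitrary illustrative choices, not values from the survey.

```python
import numpy as np

rng = np.random.default_rng(2)
weights = rng.normal(size=(64, 64)).astype(np.float32)  # a stand-in layer

# Magnitude pruning: zero out the smallest 70% of weights (sparsity is an assumption).
threshold = np.quantile(np.abs(weights), 0.70)
pruned = np.where(np.abs(weights) >= threshold, weights, 0.0)

# 8-bit linear quantization of the surviving weights.
scale = np.abs(pruned).max() / 127.0
q = np.round(pruned / scale).astype(np.int8)   # compact form stored on the edge device
dequantized = q.astype(np.float32) * scale     # approximation used at inference time

print("sparsity:", float((pruned == 0).mean()))
print("max reconstruction error:", float(np.abs(pruned - dequantized).max()))
```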
31

Gomez Larrakoetxea, Nerea, Borja Sanz Urquijo, Jon Garcia Barruetabeña, and Iker Pastor Lopez. "Performance-Based Machine Learning Algorithm Selection Strategy in Edge Computing Environments." DYNA 98, no. 1 (January 1, 2023): 38–44. http://dx.doi.org/10.6036/10671.

Abstract:
Currently, most of the data collected in companies and industrial manufacturing environments through IoT devices is processed in the cloud [1]. Given the large volume of data that each company manages due to the emergence of IoT, cloud computing is not the best option for certain sectors such as automotive [3]. Within this sector, the quality perceived by the end customer is closely linked to the assembly line. These assembly lines collect a high number of variables (temperatures, pressures, pumps, etc.), and real-time prediction by means of small digital twins in this process would avoid both material and labor costs. Today, performing such anticipation through artificial intelligence models in real time is unfeasible due to the latency of cloud processing. Therefore, there is an imminent need to develop applications deployed at the edge of the network for the automotive manufacturing and painting process that enable the generation of digital twins. In this regard, the proposal of the 'Edge Computing' concept [2] seeks to mitigate this situation in part. For this reason, this paper focuses not only on demonstrating that splitting the data of a large model into small 'Edge' models maintains the predictive capacity of each algorithm, but also on stress tests (limits, constraints, etc.) performed on the 'Edge Computing' hardware technology currently available on the market that allows generating and processing this type of model. Key Words: Edge Computing, Internet of Things, Machine Learning, Industry 4.0
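Illustrative note: the abstract's central claim is that splitting a large model's data into small per-line 'Edge' models preserves predictive capacity. The sketch below only illustrates that kind of comparison on synthetic assembly-line data, training one global model and one model per station; the data, the partitioning, and the random-forest regressor are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic "assembly line" sensor data: temperature, pressure, pump speed.
X = rng.random((3000, 3))
y = 2 * X[:, 0] + X[:, 1] ** 2 - X[:, 2] + rng.normal(scale=0.05, size=3000)
station = rng.integers(0, 3, size=3000)  # which station/line produced each sample

X_tr, X_te, y_tr, y_te, s_tr, s_te = train_test_split(X, y, station, random_state=0)

# One large "cloud" model trained on everything.
cloud = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_tr, y_tr)
print("cloud model MAE:", mean_absolute_error(y_te, cloud.predict(X_te)))

# Small per-station "edge" models trained only on local data.
edge_pred = np.empty_like(y_te)
for s in range(3):
    model = RandomForestRegressor(n_estimators=50, random_state=0)
    model.fit(X_tr[s_tr == s], y_tr[s_tr == s])
    edge_pred[s_te == s] = model.predict(X_te[s_te == s])
print("edge models MAE:", mean_absolute_error(y_te, edge_pred))
```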
32

Zhang, Yongqiang, Hongchang Yu, Wanzhen Zhou, and Menghua Man. "Application and Research of IoT Architecture for End-Net-Cloud Edge Computing." Electronics 12, no. 1 (December 20, 2022): 1. http://dx.doi.org/10.3390/electronics12010001.

Abstract:
At the edge of the network, close to the source of the data, edge computing deploys computing, storage, and other capabilities to provide intelligent services in close proximity, offering low bandwidth consumption, low latency, and high security. It satisfies the transmission bandwidth, real-time, and security requirements of Internet of Things (IoT) application scenarios. Based on the IoT architecture, an IoT edge computing (EC-IoT) reference architecture is proposed, which contains three layers: the end edge, the network edge, and the cloud edge. Furthermore, the key technologies for applying artificial intelligence (AI) in the EC-IoT reference architecture are analyzed. Platforms for the different edge locations of the EC-IoT reference architecture are classified by comparing IoT edge computing platforms. On the basis of the EC-IoT reference architecture, an industrial Internet of Things (IIoT) edge computing solution, an Internet of Vehicles (IoV) edge computing architecture, and a reference architecture for an IoT edge gateway-based smart home are proposed. Finally, the trends and challenges of EC-IoT are examined; the EC-IoT architecture is expected to have very promising applications.
33

Chang, Wan-Jung, Chia-Hao Hsu, and Liang-Bi Chen. "A Pose Estimation-Based Fall Detection Methodology Using Artificial Intelligence Edge Computing." IEEE Access 9 (2021): 129965–76. http://dx.doi.org/10.1109/access.2021.3113824.

34

Hussain, Bilal, Qinghe Du, Ali Imran, and Muhammad Ali Imran. "Artificial Intelligence-Powered Mobile Edge Computing-Based Anomaly Detection in Cellular Networks." IEEE Transactions on Industrial Informatics 16, no. 8 (August 2020): 4986–96. http://dx.doi.org/10.1109/tii.2019.2953201.

35

Zhao, Lindong, Xuguang Zhang, Jianxin Chen, and Liang Zhou. "Physical Layer Security in the Age of Artificial Intelligence and Edge Computing." IEEE Wireless Communications 27, no. 5 (October 2020): 174–80. http://dx.doi.org/10.1109/mwc.001.2000044.

36

Debauche, Olivier, Said Mahmoudi, Sidi Ahmed Mahmoudi, Pierre Manneback, and Frédéric Lebeau. "Edge Computing and Artificial Intelligence Semantically Driven. Application to a Climatic Enclosure." Procedia Computer Science 175 (2020): 542–47. http://dx.doi.org/10.1016/j.procs.2020.07.077.

37

Liu, Yuxin, Tian Wang, Shaobo Zhang, Xuxun Liu, and Xiao Liu. "Artificial intelligence aware and security-enhanced traceback technique in mobile edge computing." Computer Communications 161 (September 2020): 375–86. http://dx.doi.org/10.1016/j.comcom.2020.08.006.

38

Yang, Lei, Xu Chen, Samir M. Perlaza, and Junshan Zhang. "Special Issue on Artificial-Intelligence-Powered Edge Computing for Internet of Things." IEEE Internet of Things Journal 7, no. 10 (October 2020): 9224–26. http://dx.doi.org/10.1109/jiot.2020.3019948.

39

Zhu, Ying. "Network Public Opinion Prediction and Control Based on Edge Computing and Artificial Intelligence New Paradigm." Wireless Communications and Mobile Computing 2021 (April 19, 2021): 1–11. http://dx.doi.org/10.1155/2021/5566647.

Abstract:
In this paper, an adaptive edge service placement mechanism based on online learning and a predictive edge service migration method based on a factor graph model are proposed to solve the edge computing service placement problem from the edge computing dimension. First, the time series describing the development of online public opinion is reconstructed from vectorized keyword index trends using the theory of chaotic phase space reconstruction. Second, the main index method is used to judge whether the time series of network public opinion data has chaotic characteristics. The simulation results show that network public opinion develops as a chaotic time series. Finally, the prediction model is improved by using a complex network topology. Simulation experiments on network public opinion and chaotic time series show that the improved model has the advantages of accuracy, rapidity, and self-adaptability and can be applied to other fields.
40

Allen, Timothy P., and Carver A. Mead. "A silicon retina for computing local edge orientations." Neural Networks 1 (January 1988): 481. http://dx.doi.org/10.1016/0893-6080(88)90503-5.

41

Ning, Zhaolong, Peiran Dong, Xiaojie Wang, Joel J. P. C. Rodrigues, and Feng Xia. "Deep Reinforcement Learning for Vehicular Edge Computing." ACM Transactions on Intelligent Systems and Technology 10, no. 6 (December 14, 2019): 1–24. http://dx.doi.org/10.1145/3317572.

42

Lapegna, Marco, Walter Balzano, Norbert Meyer, and Diego Romano. "Clustering Algorithms on Low-Power and High-Performance Devices for Edge Computing Environments." Sensors 21, no. 16 (August 10, 2021): 5395. http://dx.doi.org/10.3390/s21165395.

Abstract:
The synergy between Artificial Intelligence and the Edge Computing paradigm promises to transfer decision-making processes to the periphery of sensor networks without the involvement of central data servers. For this reason, we recently witnessed an impetuous development of devices that integrate sensors and computing resources in a single board to process data directly on the collection place. Due to the particular context where they are used, the main feature of these boards is the reduced energy consumption, even if they do not exhibit absolute computing powers comparable to modern high-end CPUs. Among the most popular Artificial Intelligence techniques, clustering algorithms are practical tools for discovering correlations or affinities within data collected in large datasets, but a parallel implementation is an essential requirement because of their high computational cost. Therefore, in the present work, we investigate how to implement clustering algorithms on parallel and low-energy devices for edge computing environments. In particular, we present the experiments related to two devices with different features: the quad-core UDOO X86 Advanced+ board and the GPU-based NVIDIA Jetson Nano board, evaluating them from the performance and the energy consumption points of view. The experiments show that they realize a more favorable trade-off between these two requirements than other high-end computing devices.
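Illustrative note: the clustering kernel the paper ports to low-power boards is of the k-means family. The plain NumPy Lloyd's k-means below is a minimal reference version of such a kernel; the dataset and the number of clusters are illustrative, and the distance computation is the part an edge implementation would parallelise across cores or offload to a GPU.

```python
import numpy as np

def kmeans(data, k, iters=50, seed=0):
    """Plain Lloyd's k-means on a NumPy array of shape (n_points, n_features)."""
    rng = np.random.default_rng(seed)
    centroids = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        # Assign every point to its nearest centroid (the compute-heavy step).
        dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster emptied out.
        for c in range(k):
            if np.any(labels == c):
                centroids[c] = data[labels == c].mean(axis=0)
    return centroids, labels

points = np.random.default_rng(4).random((1000, 2))
centroids, labels = kmeans(points, k=4)
print("cluster sizes:", np.bincount(labels))
```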
43

Guo, Yixuan. "Contextualized Design of IoT (Internet of Things) Finance for Edge Artificial Intelligence Computing." Computational Intelligence and Neuroscience 2022 (March 9, 2022): 1–10. http://dx.doi.org/10.1155/2022/6046957.

Abstract:
With the widespread application of IoT technology around the world, the new industry of IoT finance has emerged. Under this new business model, commercial banks and other financial institutions can provide safer and more convenient financial services, such as payment, financing, and asset management, through the application of IoT technology and communication network technology. In the cloud computing model, the local IoT terminal device transmits the collected data over the network to the cloud server, which performs the data processing. The cloud computing model can solve the problem of the poor performance of IoT devices, but with the rapidly increasing number of IoT terminal devices accessing the network, it is constrained by network bandwidth and performance bottlenecks, which bring a series of problems such as high latency, poor real-time performance, and low security. In this paper, based on the rapidly developing new industry of IoT finance, we construct a POT (Peaks Over Threshold) model to empirically analyze the operational risk of commercial banks using their risk loss data, estimate the corresponding ES values using the control variable method to measure the operational risk of traditional commercial banking and IoT finance respectively, and compare the total ES values of the two. The control variable method is used to reduce the frequency of each type of operational risk loss event of commercial banks in China.
44

Daniaty, Diah, Benny Firmansyah, Aan Ardiansyah, and Toni Efendi. "Analisis Bibliometrik pada Penerapan Artificial Intelligence di Smart Manufacturing." Seminar Nasional Official Statistics 2022, no. 1 (November 1, 2022): 491–506. http://dx.doi.org/10.34123/semnasoffstat.v2022i1.1120.

Abstract:
With the rapid advance of technology in the Industry 4.0 era, manufacturing systems are transforming and shifting toward the era of factory digitalization. This study investigates how the application of artificial intelligence (AI) in smart manufacturing is discussed in the current academic literature. Using bibliometric techniques, 399 publications were retrieved from the Scopus database for 2013 to 2022 and analyzed to identify changing patterns in AI research, the most productive journal sources, the most cited countries, the most influential studies, and the most relevant keywords. The latest topics related to the application of AI in smart manufacturing were also identified. The VOSviewer and Biblioshiny programs were used to perform the bibliometric analysis. AI research also focuses on the role of other technologies such as the internet of things (IoT), cloud computing, big data, edge computing, blockchain, and digital twins in supporting manufacturing activities, such as increasing automation, performing predictive analytics, and measuring performance. This study describes the views of academics and practitioners on what has been researched and identifies possible opportunities for future studies.
45

Lim, JongBeom. "Latency-Aware Task Scheduling for IoT Applications Based on Artificial Intelligence with Partitioning in Small-Scale Fog Computing Environments." Sensors 22, no. 19 (September 27, 2022): 7326. http://dx.doi.org/10.3390/s22197326.

Abstract:
Internet of Things applications have become popular because of their lightweight nature and usefulness, but they require low latency and short response times. Hence, Internet of Things applications are deployed with the fog management layer (software) on closely located edge servers (hardware) as per the requirements. Due to their lightweight properties, Internet of Things applications do not consume many computing resources; therefore, it is common for a small-scale data center to accommodate thousands of Internet of Things applications. However, in small-scale fog computing environments, task scheduling of applications is limited in its ability to offer low latency and response times. In this paper, we propose a latency-aware task scheduling method for Internet of Things applications based on artificial intelligence in small-scale fog computing environments. The core concept of the proposed task scheduling is to use artificial neural networks with partitioning capabilities. With the partitioning technique for artificial neural networks, multiple edge servers are able to learn and calculate hyperparameters in parallel, which reduces scheduling times and helps meet service level objectives. Performance evaluation against state-of-the-art studies shows the effectiveness and efficiency of the proposed task scheduling in small-scale fog computing environments, while introducing negligible energy consumption.
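Illustrative note: the abstract states that partitioning lets multiple edge servers learn and calculate hyperparameters in parallel, without detailing the scheme. The sketch below shows only the generic pattern of farming hyperparameter candidates out to parallel workers standing in for edge servers; the data, the MLP model, and the grid are assumptions, not the paper's method.

```python
from concurrent.futures import ProcessPoolExecutor

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
X = rng.random((400, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=400)

def evaluate(hidden_units):
    """Score one hyperparameter candidate; each edge server would run this locally."""
    model = MLPRegressor(hidden_layer_sizes=(hidden_units,), max_iter=2000, random_state=0)
    return hidden_units, cross_val_score(model, X, y, cv=3).mean()

if __name__ == "__main__":
    candidates = [4, 8, 16, 32]  # illustrative hyperparameter grid
    with ProcessPoolExecutor(max_workers=4) as pool:  # stand-ins for edge servers
        results = list(pool.map(evaluate, candidates))
    best = max(results, key=lambda r: r[1])
    print("best hidden layer size:", best[0], "score:", round(best[1], 3))
```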
46

Lu, Jinzhi, Xiaochen Zheng, and Dimitris Kiritsis. "Special Issue: Smart Resilient Manufacturing." Applied Sciences 13, no. 1 (December 29, 2022): 464. http://dx.doi.org/10.3390/app13010464.

Abstract:
During the past decades, the global manufacturing industries have been reshaped by the rapid development of advanced technologies, such as cyber-physical systems, Internet of Things, artificial intelligence (AI), machine learning, cloud/edge computing, smart sensing, advanced robotics, blockchain/distributed ledger technology, etc [...]
47

Yin, Guimei. "Intelligent framework for social robots based on artificial intelligence-driven mobile edge computing." Computers & Electrical Engineering 96 (December 2021): 107616. http://dx.doi.org/10.1016/j.compeleceng.2021.107616.

48

Chang, Zhuoqing, Shubo Liu, Xingxing Xiong, Zhaohui Cai, and Guoqing Tu. "A Survey of Recent Advances in Edge-Computing-Powered Artificial Intelligence of Things." IEEE Internet of Things Journal 8, no. 18 (September 15, 2021): 13849–75. http://dx.doi.org/10.1109/jiot.2021.3088875.

49

Lu, Yujun, Xiaoyong Hu, and Yu Su. "Framework of industrial networking sensing system based on edge computing and artificial intelligence." Journal of Intelligent & Fuzzy Systems 38, no. 1 (January 9, 2020): 283–91. http://dx.doi.org/10.3233/jifs-179403.

50

Premkumar, S., and AN Sigappi. "IoT-enabled edge computing model for smart irrigation system." Journal of Intelligent Systems 31, no. 1 (January 1, 2022): 632–50. http://dx.doi.org/10.1515/jisys-2022-0046.

Abstract:
Precision agriculture is a breakthrough in digital farming technology that facilitates the application of precise and exact amounts of water and fertilizer to the crop at the required time to increase yield. Since agriculture relies more on direct rainfall than on irrigation, and rainfall date predictions are easily available from web sources, integrating rainfall prediction with precision agriculture helps regulate water consumption on farms. In this work, an edge computing model is developed for predicting soil moisture in real time and managing water usage in accordance with rain prediction. A soil moisture prediction hybrid algorithm (SMPHA) has been developed that revolves around decision-making techniques using live environmental parameters, including weather parameters, to predict soil moisture through the impact of precipitation. Numerous regression + clustering algorithm combinations are evaluated, and it is inferred that XGBoost + k-means outperforms the other combinations and is therefore deployed in the edge model. This model is used as an intermediary between the end IoT devices and the cloud, saving computationally intensive processing that would otherwise be performed on cloud servers. The servers located on a local edge network perform the developed algorithmic computations. Avoiding transmission to the cloud yields significant savings in latency, response time, and computation power and therefore increases the efficiency of data transfer. The proposed edge computing model is implemented with a Raspberry Pi as the edge, Heroku as the cloud, and edge nodes as combinations of the Pi with actuators and sensors. The monitored data from the Pi are stored in a MongoDB web server controlled by a Web dashboard. Finally, the developed model is implemented in the cloud and at the edge, where the edge server implementation performs better in terms of latency, bandwidth, throughput, response time, and CPU memory usage.
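Illustrative note: the abstract reports that XGBoost + k-means was the best regression + clustering combination but does not spell out how the two are composed. The sketch below shows one plausible composition, using each reading's k-means cluster label as an extra feature for the XGBoost regressor, with synthetic data standing in for the sensor and weather feed; none of the feature names or parameter values come from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

rng = np.random.default_rng(6)

# Synthetic stand-ins for sensor/weather readings: temperature, humidity, rain forecast.
X = rng.random((2000, 3))
soil_moisture = 0.6 * X[:, 1] + 0.3 * X[:, 2] - 0.2 * X[:, 0] + rng.normal(scale=0.02, size=2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, soil_moisture, random_state=0)

# Step 1: k-means groups readings into weather regimes (k = 4 is an assumption).
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_tr)
X_tr_aug = np.column_stack([X_tr, km.predict(X_tr)])
X_te_aug = np.column_stack([X_te, km.predict(X_te)])

# Step 2: XGBoost regresses soil moisture, using the regime label as an extra feature.
model = XGBRegressor(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_tr_aug, y_tr)
print("MAE:", mean_absolute_error(y_te, model.predict(X_te_aug)))
```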