Journal articles on the topic "Beyond edge computing"

To see the other types of publications on this topic, follow the link: Beyond edge computing.

Get acquainted with the top 50 journal articles for research on the topic "Beyond edge computing".

Next to every work in the bibliography, the option "Add to bibliography" is available. Use it, and the bibliographic reference for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read the online annotation of the work, provided the relevant parameters are available in the metadata.

Browse journal articles from a wide range of disciplines and compile your bibliography correctly.

1

Tseng, Chia-Wei, Fan-Hsun Tseng, Yao-Tsung Yang, Chien-Chang Liu, and Li-Der Chou. "Task Scheduling for Edge Computing with Agile VNFs On-Demand Service Model toward 5G and Beyond". Wireless Communications and Mobile Computing 2018 (July 11, 2018): 1–13. http://dx.doi.org/10.1155/2018/7802797.

Annotation:
The demand for satisfying service requests, effectively allocating computing resources, and providing service on-demand application continuously increases along with the rapid development of the Internet. Edge computing is used to satisfy the low latency, network connection, and local data processing requirements and to alleviate the workload in the cloud. This paper proposes a gateway-based edge computing service model to reduce the latency of data transmission and the network bandwidth from and to the cloud. An on-demand computing resource allocation can be achieved by adjusting the task schedule of the edge gateway via the lightweight virtualization technology, Docker. The edge gateway can also process the service requests in the local network. The proposed edge computing service model not only eliminates the computation burden of the traditional cloud service model but also improves the operation efficiency of the edge computing nodes. This model can also be used for various innovation applications in the cloud-edge computing environment for 5G and beyond.
2

Sedhom, Germien G., Alshimaa H. Ismail, and Basma M. Yousef. "Literature Review and Novel Trends of Mobile Edge Computing for 5G and Beyond". Journal of Artificial Intelligence and Metaheuristics 2, no. 2 (2022): 18–28. http://dx.doi.org/10.54216/jaim.020202.

Annotation:
Because of the rapid evolution of communications technologies, such as the Internet of Things (IoT) and fifth generation (5G) systems and beyond, the latest developments have seen a fundamental change in mobile computing. Mobile computing is moved from central mobile cloud computing to mobile edge computing (MEC). Therefore, MEC is considered an essential technology for 5G technology and beyond. The MEC technology permits user equipment (UEs) to execute numerous high-computational operations by creating computing capabilities at the edge networks and inside access networks. Consequently, in this paper, we extensively address the role of MEC in 5G networks and beyond. Accordingly, we first investigate the MEC architecture, the characteristics of edge computing, and the MEC challenges. Then, the paper discusses the MEC use cases and service scenarios. Further, computations offloading is explored. Lastly, we propose upcoming research difficulties in incorporating MEC with the 5G system and beyond.
3

Passian, Ali, and Neena Imam. "Nanosystems, Edge Computing, and the Next Generation Computing Systems". Sensors 19, no. 18 (September 19, 2019): 4048. http://dx.doi.org/10.3390/s19184048.

Annotation:
It is widely recognized that nanoscience and nanotechnology and their subfields, such as nanophotonics, nanoelectronics, and nanomechanics, have had a tremendous impact on recent advances in sensing, imaging, and communication, with notable developments, including novel transistors and processor architectures. For example, in addition to being supremely fast, optical and photonic components and devices are capable of operating across multiple orders of magnitude length, power, and spectral scales, encompassing the range from macroscopic device sizes and kW energies to atomic domains and single-photon energies. The extreme versatility of the associated electromagnetic phenomena and applications, both classical and quantum, are therefore highly appealing to the rapidly evolving computing and communication realms, where innovations in both hardware and software are necessary to meet the growing speed and memory requirements. Development of all-optical components, photonic chips, interconnects, and processors will bring the speed of light, photon coherence properties, field confinement and enhancement, information-carrying capacity, and the broad spectrum of light into the high-performance computing, the internet of things, and industries related to cloud, fog, and recently edge computing. Conversely, owing to their extraordinary properties, 0D, 1D, and 2D materials are being explored as a physical basis for the next generation of logic components and processors. Carbon nanotubes, for example, have been recently used to create a new processor beyond proof of principle. These developments, in conjunction with neuromorphic and quantum computing, are envisioned to maintain the growth of computing power beyond the projected plateau for silicon technology. We survey the qualitative figures of merit of technologies of current interest for the next generation computing with an emphasis on edge computing.
4

Liu, Xiao, Jiong Jin, and Fang Dong. "Edge-Computing-Based Intelligent IoT: Architectures, Algorithms and Applications". Sensors 22, no. 12 (June 13, 2022): 4464. http://dx.doi.org/10.3390/s22124464.

Annotation:
With the rapid growth of the Internet of Things (IoT), 5G networks and beyond, the computing paradigm for intelligent IoT systems is shifting from conventional centralized-cloud computing to distributed edge computing [...]
5

Sodanapalli, Sana, Hewan Shrestha, Chandramohan Dhasarathan, Puviyarasi T., and Sam Goundar. "Recent Advances in Edge Computing Paradigms". International Journal of Fog Computing 4, no. 1 (January 2021): 37–51. http://dx.doi.org/10.4018/ijfc.2021010103.

Annotation:
Edge computing is an exciting new approach to network architecture that helps organizations break beyond the limitations imposed by traditional cloud-based networks. It has emerged as a viable and important architecture that supports distributed computing to deploy compute and storage resources closer to the data source. Edge and fog computing addresses three principles of network limitations of bandwidth, latency, congestion, and reliability. The research community sees edge computing at manufacturing, farming, network optimization, workplace safety, improved healthcare, transportation, etc. The promise of this technology will be realized through addressing new research challenges in the IoT paradigm and the design of highly-efficient communication technology with minimum cost and effort.
6

Lim, Yeonjoo, and Jong-Hyouk Lee. "Container-based Service Relocation for Beyond 5G Networks". 網際網路技術學刊 23, no. 4 (July 2022): 911–18. http://dx.doi.org/10.53106/160792642022072304026.

Annotation:
With the advent of 5G networks, various research works on Multi-access Edge Computing (MEC) to provide high-reliability and ultra-low latency services are being actively conducted. MEC is an intelligent service distributed cloud technology that provides a high level of personal services by deploying cloud servers to edge networks physically close to users. However, there is a technical issue to be solved, e.g., the service being used by a user does not exist in the new edge network, and there may even be situations in which the service cannot be provided in the new edge network. To address this, the service application must be relocated according to the location of the user's movement. Various research works are underway to solve this service relocation issue, e.g., cold/live migration studies have been carried out in legacy cloud environments. In this paper, we propose a container migration technique that guarantees a smooth service application relocation for mobile users. We design scenarios for adaptive handoff and describe the detailed operation process. In addition, we present our MEC testbed, which has been used to experiment with our container migration technique.
7

Kashkarov, D., and A. Koucheryavy. "THE MULTI-ACCESS EDGE COMPUTING APPLICATIONS AND DEVELOPMENT ANALYSIS FOR TELECOMMUNICATION NETWORKS". Telecom IT 8, no. 1 (April 2020): 28–33. http://dx.doi.org/10.31854/2307-1303-2020-8-1-28-33.

Annotation:
Research subject. The article is devoted to the analysis of the applications and development the multi-access edge computing on fifth-generation and beyond telecommunication networks. Method. System analysis. Core results. The identification of development perspectives of a multi-access edge computing on fifth-generation and beyond telecommunication networks. Practical relevance. The results of the article can be used by scientific organizations when planning the development of telecommunication networks, as well as universities in the educational process.
8

Durga, S., Esther Daniel, J. Andrew Onesimu, and Yuichi Sei. "Resource Provisioning Techniques in Multi-Access Edge Computing Environments: Outlook, Expression, and Beyond". Mobile Information Systems 2022 (December 19, 2022): 1–24. http://dx.doi.org/10.1155/2022/7283516.

Annotation:
Mobile cloud computing promises a research foundation in information and communication technology (ICT). Multi-access edge computing is an intermediate solution that reduces latency by delivering cloud computing services close to IoT and mobile clients (MCs), hence addressing the performance issues of mobile cloud computing. However, the provisioning of resources is a significant and challenging process in mobile cloud-based environments as it organizes the heterogeneous sensing and processing capacities to provide the customers with an elastic pool of resources. Resource provisioning techniques must meet quality of service (QoS) considerations such as availability, responsiveness, and reliability to avoid service-level agreement (SLA) breaches. This investigation is essential because of the unpredictable change in service demands from diverse regions and the limits of MEC’s available computing resources. In this study, resource provisioning approaches for mobile cloud computing are thoroughly and comparatively studied and classified as taxonomies of previous research. The paper concludes with an insightful summary that gives recommendations for future enhancements.
9

Lai, Shiwei, Rui Zhao, Shunpu Tang, Junjuan Xia, Fasheng Zhou, and Liseng Fan. "Intelligent secure mobile edge computing for beyond 5G wireless networks". Physical Communication 45 (April 2021): 101283. http://dx.doi.org/10.1016/j.phycom.2021.101283.

10

Madake, Vaibhav. "Edge Computing: Enhancing IoT and Beyond-Implications for Businesses and Consumers". International Journal for Research in Applied Science and Engineering Technology 12, no. 12 (March 31, 2024): 3077–85. http://dx.doi.org/10.22214/ijraset.2024.59598.

Annotation:
Abstract: In this paper, we explore how edge computing is revolutionizing the Internet of Things (IoT) ecosystem by processing data closer to its source, thereby reducing latency, conserving bandwidth, and improving data security. We investigate the implications of this technological shift for businesses and consumers, highlighting the opportunities and challenges it presents.
11

Mikulics, Martin, Joachim Mayer, and Hilde Helen Hardtdegen. "Cutting-edge nano-LED technology". Journal of Applied Physics 131, no. 11 (March 21, 2022): 110903. http://dx.doi.org/10.1063/5.0087279.

Annotation:
In this Perspective, we will introduce possible future developments on group III-nitride nano-LEDs, which are based on current achievements in this rapidly arising research-technological field. First, the challenges facing their fabrication and their characteristics will be reported. These developments will be set in a broader context with primary applications in lighting, display technology, biology, and sensing. In the following, we will center on advanced applications in microscopy, lithography, communication, and optical computing. We will discuss unconventional device applications and prospects for emerging photon source-based technologies. Beyond conventional and current achievements in optoelectronics, we will present hybrid nano-LED architectures. Novel device concepts potentially could play an essential role in future photon source developments and serve as a key component for optical computing. Therefore, forefront fully photon operated logic circuits, photon-based computational processors, and photon driving memories will be discussed. All these developments will play a significant role in a future highly secure, low energy consuming green IT. Besides today's environmentally friendly terrestrial industrial and information technologies, an enormous potential of nano-LED technology for a large range of applications especially in the next stage of space research is envisaged.
12

Sarah, Annisa, Gianfranco Nencioni, and Md Muhidul I. Khan. "Resource Allocation in Multi-access Edge Computing for 5G-and-beyond networks". Computer Networks 227 (May 2023): 109720. http://dx.doi.org/10.1016/j.comnet.2023.109720.

13

Palle, Ranadeep Reddy. "Exo-edge computing: Pushing the limits of decentralized processing beyond the cloud". International Journal of Engineering in Computer Science 1, no. 2 (July 1, 2019): 67–74. http://dx.doi.org/10.33545/26633582.2019.v1.i2a.98.

14

Ranaweera, Pasika, Anca Jurcut, and Madhusanka Liyanage. "MEC-enabled 5G Use Cases: A Survey on Security Vulnerabilities and Countermeasures". ACM Computing Surveys 54, no. 9 (December 31, 2022): 1–37. http://dx.doi.org/10.1145/3474552.

Annotation:
The future of mobile and internet technologies are manifesting advancements beyond the existing scope of science. The concepts of automated driving, augmented-reality, and machine-type-communication are quite sophisticated and require an elevation of the current mobile infrastructure for launching. The fifth-generation (5G) mobile technology serves as the solution, though it lacks a proximate networking infrastructure to satisfy the service guarantees. Multi-access Edge Computing (MEC) envisages such an edge computing platform. In this survey, we are revealing security vulnerabilities of key 5G-based use cases deployed in the MEC context. Probable security flows of each case are specified, while countermeasures are proposed for mitigating them.
15

Enjie Liu, Youbing Zhao, and Abimbola Efunogbon. "Boosting smarter digital health care with 5G and beyond networks". ITU Journal on Future and Evolving Technologies 4, no. 1 (March 10, 2023): 157–65. http://dx.doi.org/10.52953/gjnn6958.

Annotation:
With 5G and beyond on the horizon, ultra-fast and low latency data transmission on the cloud and via the Internet will enable more intelligent and interactive medical and health-care applications. This paper presents a review of 5G technologies and their related applications in the health-care sector. The introduction to 5G technology covers software defined networking, the 5G architecture and edge computing. The second part of the paper then presents the opportunities provided by 5G to the health-care sector and employs medical imaging applications as central examples to demonstrate the impacts of 5G and the cloud. Finally, this paper summarizes the benefits brought by 5G and cloud computing to the health-care sector.
16

Singh, Malkeet, and Mohit Angurala. "5G Technology based Edge Computing in UAV Networks for Resource Allocation with Routing using Federated Learning Access Network and Trajectory Routing Protocol". International Journal on Future Revolution in Computer Science & Communication Engineering 8, no. 2 (June 30, 2022): 51–61. http://dx.doi.org/10.17762/ijfrcsce.v8i2.2101.

Annotation:
UAVs (Unmanned Aerial Vehicles) are being utilised more frequently in Beyond Fifth Generation (B5G) wireless communication networks that are equipped with a high-computation paradigm and intelligent applications. Due to the growing number of IoT (Internet of Things) devices in smart environments, these networks have the potential to produce a sizeable volume of heterogeneous data. This research proposes a novel machine learning technique for UAV-based edge computing resource allocation and routing. The UAV-enabled MEC method for emerging IoT applications, as well as the role of machine learning (ML), is analysed. Resource allocation in the UAV-assisted edge computing system is carried out using a Monte Carlo federated learning based access network, and routing through the UAV network is performed with a trajectory-based deterministic reinforcement collaborative routing protocol. We specifically conduct an experimental investigation of the tradeoff between the communication cost and the computation of the two possible methodologies. The key findings show that, despite the longer connection latency, the computation offloading strategy achieves a significantly greater throughput than the edge computing approach.
17

Tang, Yajuan, Shiwei Lai, Yanyi Rao, Wen Zhou, Fusheng Zhu, Liming Chen, Dan Deng et al. "Intelligent Distributed Data Storage for Wireless Communications in B5G Networks". ICST Transactions on Mobile Communications and Applications 7, no. 2 (August 25, 2022): e2. http://dx.doi.org/10.4108/eetmca.v7i2.2415.

Annotation:
With the deployment and commercialization of the fifth-generation (5G) mobile communication network, the access nodes and data volume of wireless network show a massive and blowout growth trend. Taking beyond 5G (B5G) edge intelligent network as the research object, based on the deep integration of storage / computing and communication, this paper focuses on the theory and key technology of system intelligent transmission, so as to effectively support the related applications of B5G edge intelligent network in the future. This paper analyzes the research status of data storage, studies the real field distributed storage computing system, and designs the corresponding flashback shift code and error correction scheme with low storage space overhead.
18

Camps, Oscar, Stavros G. Stavrinides, and Rodrigo Picos. "Stochastic Computing Implementation of Chaotic Systems". Mathematics 9, no. 4 (February 13, 2021): 375. http://dx.doi.org/10.3390/math9040375.

Annotation:
An exploding demand for processing capabilities related to the emergence of the Internet of Things (IoT), Artificial Intelligence (AI), and big data, has led to the quest for increasingly efficient ways to expeditiously process the rapidly increasing amount of data. These ways include different approaches like improved devices capable of going further in the more Moore path but also new devices and architectures capable of going beyond Moore and getting more than Moore. Among the solutions being proposed, Stochastic Computing has positioned itself as a very reasonable alternative for low-power, low-area, low-speed, and adjustable precision calculations—four key-points beneficial to edge computing. On the other hand, chaotic circuits and systems appear to be an attractive solution for (low-power, green) secure data transmission in the frame of edge computing and IoT in general. Classical implementations of this class of circuits require intensive and precise calculations. This paper discusses the use of the Stochastic Computing (SC) framework for the implementation of nonlinear systems, showing that it can provide results comparable to those of classical integration, with much simpler hardware, paving the way for relevant applications.
19

Frankston, Bob. "Mobile-Edge Computing Versus The Internet?: Looking beyond the literal meaning of MEC". IEEE Consumer Electronics Magazine 5, no. 4 (October 2016): 75–76. http://dx.doi.org/10.1109/mce.2016.2590158.

20

Zheng, Zengwei, Mingxuan Zhou, Yuanyi Chen, Meimei Huo, and Dan Chen. "Enabling real-time road anomaly detection via mobile edge computing". International Journal of Distributed Sensor Networks 15, no. 11 (November 2019): 155014771989131. http://dx.doi.org/10.1177/1550147719891319.

Annotation:
To discover road anomalies, a large number of detection methods have been proposed. Most of them apply classification techniques by extracting time and frequency features from the acceleration data. Existing methods are time-consuming since these methods perform on the whole datasets. In addition, few of them pay attention to the similarity of the data itself when vehicle passes over the road anomalies. In this article, we propose QF-COTE, a real-time road anomaly detection system via mobile edge computing. Specifically, QF-COTE consists of two phases: (1) Quick filter. This phase is designed to roughly extract road anomaly segments by applying random forest filter and can be performed on the edge node. (2) Road anomaly detection. In this phase, we utilize collective of transformation-based ensembles to detect road anomalies and can be performed on the cloud node. We show that our method performs clearly beyond some existing methods in both detection performance and running time. To support this conclusion, experiments are conducted based on two real-world data sets and the results are statistically analyzed. We also conduct two experiments to explore the influence of velocity and sample rate. We expect to lay the first step to some new thoughts to the field of real-time road anomalies detection in subsequent work.
21

Alsamhi, Saeed Hamood, Alexey V. Shvetsov, Santosh Kumar, Jahan Hassan, Mohammed A. Alhartomi, Svetlana V. Shvetsova, Radhya Sahal, and Ammar Hawbani. "Computing in the Sky: A Survey on Intelligent Ubiquitous Computing for UAV-Assisted 6G Networks and Industry 4.0/5.0". Drones 6, no. 7 (July 18, 2022): 177. http://dx.doi.org/10.3390/drones6070177.

Annotation:
Unmanned Aerial Vehicles (UAVs) are increasingly being used in a high-computation paradigm enabled with smart applications in the Beyond Fifth Generation (B5G) wireless communication networks. These networks have an avenue for generating a considerable amount of heterogeneous data by the expanding number of Internet of Things (IoT) devices in smart environments. However, storing and processing massive data with limited computational capability and energy availability at local nodes in the IoT network has been a significant difficulty, mainly when deploying Artificial Intelligence (AI) techniques to extract discriminatory information from the massive amount of data for different tasks. Therefore, Mobile Edge Computing (MEC) has evolved as a promising computing paradigm leveraged with efficient technology to improve the quality of services of edge devices and network performance better than cloud computing networks, addressing challenging problems of latency and computation-intensive offloading in a UAV-assisted framework. This paper provides a comprehensive review of intelligent UAV computing technology to enable 6G networks over smart environments. We highlight the utility of UAV computing and the critical role of Federated Learning (FL) in meeting the challenges related to energy, security, task offloading, and latency of IoT data in smart environments. We present the reader with an insight into UAV computing, advantages, applications, and challenges that can provide helpful guidance for future research.
22

Rahimi, Hamed, Yvan Picaud, Kamal Deep Singh, Giyyarpuram Madhusudan, Salvatore Costanzo, and Olivier Boissier. "Design and Simulation of a Hybrid Architecture for Edge Computing in 5G and Beyond". IEEE Transactions on Computers 70, no. 8 (August 1, 2021): 1213–24. http://dx.doi.org/10.1109/tc.2021.3066579.

23

Yang, Hui, Yongshen Liang, Jiaqi Yuan, Qiuyan Yao, Ao Yu, and Jie Zhang. "Distributed Blockchain-Based Trusted Multidomain Collaboration for Mobile Edge Computing in 5G and Beyond". IEEE Transactions on Industrial Informatics 16, no. 11 (November 2020): 7094–104. http://dx.doi.org/10.1109/tii.2020.2964563.

24

Wan, Zheng, and Yan Li. "Deep Reinforcement Learning-Based Collaborative Video Caching and Transcoding in Clustered and Intelligent Edge B5G Networks". Wireless Communications and Mobile Computing 2020 (December 12, 2020): 1–16. http://dx.doi.org/10.1155/2020/6684293.

Annotation:
In the next-generation wireless communications system of Beyond 5G networks, video streaming services have held a surprising proportion of the whole network traffic. Furthermore, the user preference and demand towards a specific video might be different because of the heterogeneity of users’ processing capabilities and the variation of network condition. Thus, it is a complicated decision problem with high-dimensional state spaces to choose appropriate quality videos according to users’ actual network condition. To address this issue, in this paper, a Content Distribution Network and Cluster-based Mobile Edge Computing framework has been proposed to enhance the ability of caching and computing and promote the collaboration among edge severs. Then, we develop a novel deep reinforcement learning-based framework to automatically obtain the intracluster collaborative caching and transcoding decisions, which are executed based on video popularity, user requirement prediction, and abilities of edge servers. Simulation results demonstrate that the quality of video streaming service can be significantly improved by using the designed deep reinforcement learning-based algorithm with less backhaul consumption and processing costs.
25

Kumar, Priyanka Rajan, and Sonia Goel. "Empowering Smart Cities with Fog Computing: A Versatile Framework for Enhanced Healthcare Services and Beyond". International Journal on Recent and Innovation Trends in Computing and Communication 11, no. 9 (October 27, 2023): 335–41. http://dx.doi.org/10.17762/ijritcc.v11i9.8363.

Annotation:
Fog Computing represents a distributed computing infrastructure strategically positioned at the network's edge, acting as an intermediate layer between remote cloud services and the data-generating smart devices on the ground. Leveraging this concept, a flexible and efficient smart city design emerges, offering a diverse range of applications, including smart healthcare, car parking, power management, water management, and waste management. The implementation of Fog computing enables reduced data processing latency and equitable workload distribution across fog nodes. The smart city system comprises several layers, namely connection, real-time processing, neighborhood linking, main processing, and data server layers. The flexibility of this framework allows for the scaling up or down of layers depending on specific smart city applications. In a case study focused on Smart healthcare services, the iFogSim platform was utilized to evaluate the system's performance. Notably, the results demonstrated a significant reduction in network usage, data processing latency, and processing costs when compared to traditional cloud computing solutions. Consequently, this improvement in efficiency translated into an enhanced user experience, offering superior scalability and reliability to users utilizing smart city services, including healthcare facilities.
26

Majumder, D., and S. M. Kumar. "A distributed e-health management model with edge computing in healthcare framework". CARDIOMETRY, no. 22 (May 25, 2022): 444–55. http://dx.doi.org/10.18137/cardiometry.2022.22.444455.

Annotation:
Edge healthcare system is recognized as an acceptable paradigm for resolving this problem. The IoMT is divided into two sub-networks - intraWBANs and beyond-WBANs - based on the physical bonds of WBANs. Given the features of the healthcare systems, medical emergency, AoI and power depreciation are the prices of MUs. Intra-WBANs, a cooperative game shapes the wireless channel resource allocation problem. The Nash negotiation solution is used to get the unique optimum point in Pareto. MUs are regarded reasonable and perhaps egoistic in non-WBANs. Another non-cooperative activity is therefore developed to reduce overall system costs. The assessments of the performance of the system-wide cost and of the number of MUs gaining from edge computer systems are done to illustrate the success of our solution. Finally, for further effort, numerous barriers to research and open questions are highlighted.
27

Wen, Tai-Hao, Je-Min Hung, Wei-Hsing Huang, Chuan-Jia Jhang, Yun-Chen Lo, Hung-Hsi Hsu, Zhao-En Ke et al. "Fusion of memristor and digital compute-in-memory processing for energy-efficient edge computing". Science 384, no. 6693 (April 19, 2024): 325–32. http://dx.doi.org/10.1126/science.adf5538.

Annotation:
Artificial intelligence (AI) edge devices prefer employing high-capacity nonvolatile compute-in-memory (CIM) to achieve high energy efficiency and rapid wakeup-to-response with sufficient accuracy. Most previous works are based on either memristor-based CIMs, which suffer from accuracy loss and do not support training as a result of limited endurance, or digital static random-access memory (SRAM)–based CIMs, which suffer from large area requirements and volatile storage. We report an AI edge processor that uses a memristor-SRAM CIM-fusion scheme to simultaneously exploit the high accuracy of the digital SRAM CIM and the high energy-efficiency and storage density of the resistive random-access memory memristor CIM. This also enables adaptive local training to accommodate personalized characterization and user environment. The fusion processor achieved high CIM capacity, short wakeup-to-response latency (392 microseconds), high peak energy efficiency (77.64 teraoperations per second per watt), and robust accuracy (<0.5% accuracy loss). This work demonstrates that memristor technology has moved beyond in-lab development stages and now has manufacturability for AI edge processors.
28

Bendechache, Malika, Sergej Svorobej, Patricia Takako Endo, and Theo Lynn. "Simulating Resource Management across the Cloud-to-Thing Continuum: A Survey and Future Directions". Future Internet 12, no. 6 (May 29, 2020): 95. http://dx.doi.org/10.3390/fi12060095.

Annotation:
In recent years, there has been significant advancement in resource management mechanisms for cloud computing infrastructure performance in terms of cost, quality of service (QoS) and energy consumption. The emergence of the Internet of Things has led to the development of infrastructure that extends beyond centralised data centers from the cloud to the edge, the so-called cloud-to-thing continuum (C2T). This infrastructure is characterised by extreme heterogeneity, geographic distribution, and complexity, where the key performance indicators (KPIs) for the traditional model of cloud computing may no longer apply in the same way. Existing resource management mechanisms may not be suitable for such complex environments and therefore require thorough testing, validation and evaluation before even being considered for live system implementation. Similarly, previously discounted resource management proposals may be more relevant and worthy of revisiting. Simulation is a widely used technique in the development and evaluation of resource management mechanisms for cloud computing but is a relatively nascent research area for new C2T computing paradigms such as fog and edge computing. We present a methodical literature analysis of C2T resource management research using simulation software tools to assist researchers in identifying suitable methods, algorithms, and simulation approaches for future research. We analyse 35 research articles from a total collection of 317 journal articles published from January 2009 to March 2019. We present our descriptive and synthetic analysis from a variety of perspectives including resource management, C2T layer, and simulation.
29

Karjee, Jyotirmoy, Praveen Naik S, Kartik Anand, and Vanamala N. Bhargav. "Split computing: DNN inference partition with load balancing in IoT-edge platform for beyond 5G". Measurement: Sensors 23 (October 2022): 100409. http://dx.doi.org/10.1016/j.measen.2022.100409.

30

Yao, Chao, Xiaoyang Wang, Zijie Zheng, Guangyu Sun, and Lingyang Song. "EdgeFlow: Open-Source Multi-layer Data Flow Processing in Edge Computing for 5G and Beyond". IEEE Network 33, no. 2 (March 2019): 166–73. http://dx.doi.org/10.1109/mnet.2018.1800001.

31

Bhat, Showkat Ahmad, Ishfaq Bashir Sofi, and Chong-Yung Chi. "Edge Computing and Its Convergence With Blockchain in 5G and Beyond: Security, Challenges, and Opportunities". IEEE Access 8 (2020): 205340–73. http://dx.doi.org/10.1109/access.2020.3037108.

32

Narayanan, Arun, Arthur Sousa De Sena, Daniel Gutierrez-Rojas, Dick Carrillo Melgarejo, Hafiz Majid Hussain, Mehar Ullah, Suzan Bayhan, and Pedro H. J. Nardelli. "Key Advances in Pervasive Edge Computing for Industrial Internet of Things in 5G and Beyond". IEEE Access 8 (2020): 206734–54. http://dx.doi.org/10.1109/access.2020.3037717.

33

Jin, Jianzhi, Ruiling Li, Xiaolian Yang, Mengyuan Jin, and Fang Hu. "A network slicing algorithm for cloud-edge collaboration hybrid computing in 5G and beyond networks". Computers and Electrical Engineering 109 (August 2023): 108750. http://dx.doi.org/10.1016/j.compeleceng.2023.108750.

34

Bhalerao, Ayush. "A Comprehensive Survey on Predicting Human Psychological State through Facial Feature Analysis". International Journal for Research in Applied Science and Engineering Technology 12, no. 3 (March 31, 2024): 731–38. http://dx.doi.org/10.22214/ijraset.2024.58777.

Annotation:
Abstract: This study uses convolutional neural networks (CNNs) and image edge computing techniques to analyze facial features and forecast human psychological states in a novel way. Real-time or almost real-time face expression identification is the goal of the suggested approach, which will advance the developing field of affective computing and have potential uses in mental health evaluation and human-computer interaction. In this study, a variety of datasets with a broad range of facial expressions are used to train CNN models, guaranteeing consistent performance across a range of users and cultural backgrounds. Furthermore, the effective detection of human faces in photos is achieved by the use of the Haar Cascade Classifier, which improves the overall accuracy and dependability of the emotion recognition system. The algorithms’ efficiency is further increased by the addition of picture edge computing techniques, which makes them appropriate for deployment in contexts with limited resources. The suggested method’s accuracy in identifying and categorizing facial emotions is demonstrated by the experimental findings, indicating its potential practical applications. This research has implications for building mental health monitoring systems and enhancing user experience through technology, which goes beyond affective computing. This research fills important gaps in mental health screening and assistance while also enhancing the capabilities of facial expression recognition systems and making human-computer interaction interfaces more responsive and intuitive.
35

AKL, SELIM G., and Stefan D. Bruda. "PARALLEL REAL-TIME OPTIMIZATION: BEYOND SPEEDUP". Parallel Processing Letters 09, no. 04 (December 1999): 499–509. http://dx.doi.org/10.1142/s0129626499000463.

Annotation:
Traditionally, interest in parallel computation centered around the speedup provided by parallel algorithms over their sequential counterparts. In this paper, we ask a different type of question: Can parallel computers, due to their speed, do more than simply speed up the solution to a problem? We show that for real-time optimization problems, a parallel computer can obtain a solution that is better than that obtained by a sequential one. Specifically, a sequential and a parallel algorithm are exhibited for the problem of computing the best-possible approximation to the minimum-weight spanning tree of a connected, undirected and weighted graph whose vertices and edges are not all available at the outset, but instead arrive in real time. While the parallel algorithm succeeds in computing the exact minimum-weight spanning tree, the sequential algorithm can only manage to obtain an approximate solution. In the worst case, the ratio of the weight of the solution obtained sequentially to that of the solution computed in parallel can be arbitrarily large.
36

Lescano, Luis Freire, Marcos Lalama Flores, and Maria Pico Pico. "Distributed Facial Recognition in Visual Internet of Things (VIoT)-- An Intelligent Approach". Journal of Intelligent Systems and Internet of Things 10, no. 2 (2023): 18–23. http://dx.doi.org/10.54216/jisiot.100202.

Annotation:
In the rapidly evolving landscape of the Visual Internet of Things (VIoT), this paper presents a pioneering approach to distributed facial expression recognition—an intelligent system that holds transformative potential for security, human-computer interaction, and personalized services. Our journey unfolds with the development of the Light Vision Transformer (LVT) model, specifically engineered to operate on the resource-constrained edges of the VIoT network. Differentially private federated training ensures both the model's prowess and the preservation of user privacy. Through meticulous experimental evaluations, we validate the effectiveness and efficiency of our approach, shedding light on its scalability and ethical implications. This work is more than a technical endeavor; it symbolizes a commitment to responsible AI, balancing innovation with the preservation of individual rights. Our findings resonate beyond facial expression recognition, serving as a beacon for the VIoT community to explore the dynamic interplay between distributed computing, edge intelligence, and ethical considerations. As we stride towards a more connected and responsive world, this research paves the way for continued exploration, propelling VIoT technology towards a future that is both intelligent and ethically attuned.
37

Sarathkumar Rangarajan and Tahsien Al-Quraishi. "Navigating the Future of the Internet of Things: Emerging Trends and Transformative Applications". Babylonian Journal of Internet of Things 2023 (February 25, 2023): 8–12. http://dx.doi.org/10.58496/bjiot/2023/002.

Annotation:
This editorial navigates the transformative landscape of emerging technologies within the Internet of Things (IoT), aiming to unravel their interconnected impact and humanistic implications across diverse domains. As scholarly voyagers in the realm of technological innovation, this paper delineates the synergistic interplay between blockchain integration, edge computing, Artificial Intelligence (AI)/Machine Learning (ML), digital twins, and IoT-driven smart cities. Each section of this editorial unravels the significance and applications of these technologies: blockchain fortifies IoT security, edge computing enables real-time decision-making, AI/ML augments device intelligence, digital twins refine simulations, and IoT-driven smart cities encapsulate these advancements for societal betterment. Beyond technical expositions, this narrative aspires to humanize scholarly discourse, weaving ethical considerations, interdisciplinary collaborations, and societal implications into the fabric of technological advancement. The editorial concludes by synthesizing these insights, advocating for a more connected, sustainable, and ethically informed future. Through this scholarly expedition, the aim is to inspire dialogue, stimulate interdisciplinary collaborations, and chart a course toward a future where technology converges harmoniously with societal enhancement.
38

Spiga, Daniele, Diego Ciangottini, Mirco Tracolli, Tommaso Tedeschi, Daniele Cesini, Tommaso Boccali, Valentina Poggioni, Marco Baioletti, and Valentin Y. Kuznetsov. "Smart Caching at CMS: applying AI to XCache edge services". EPJ Web of Conferences 245 (2020): 04024. http://dx.doi.org/10.1051/epjconf/202024504024.

Annotation:
The projected Storage and Compute needs for the HL-LHC will be a factor up to 10 above what can be achieved by the evolution of current technology within a flat budget. The WLCG community is studying possible technical solutions to evolve the current computing in order to cope with the requirements; one of the main focus is resource optimization, with the ultimate aim of improving performance and efficiency, as well as simplifying and reducing operation costs. As of today the storage consolidation based on a Data Lake model is considered a good candidate for addressing HL-LHC data access challenges. The Data Lake model under evaluation can be seen as a logical system that hosts a distributed working set of analysis data. Compute power can be “close” to the lake, but also remote and thus completely external. In this context we expect data caching to play a central role as a technical solution to reduce the impact of latency and reduce network load. A geographically distributed caching layer will be functional to many satellite computing centers that might appear and disappear dynamically. In this talk we propose a system of caches, distributed at national level, describing both deployment and results of the studies made to measure the impact on the CPU efficiency. In this contribution, we also present the early results on novel caching strategy beyond the standard XRootD approach whose results will be a baseline for an AI-based smart caching system.
39

Tam, Prohim, Seungwoo Kang, Seyha Ros, and Seokhoon Kim. "Enhancing QoS with LSTM-Based Prediction for Congestion-Aware Aggregation Scheduling in Edge Federated Learning". Electronics 12, no. 17 (August 27, 2023): 3615. http://dx.doi.org/10.3390/electronics12173615.

Annotation:
The advancement of the sensing capabilities of end devices drives a variety of data-intensive insights, yielding valuable information for modelling intelligent industrial applications. To apply intelligent models in 5G and beyond, edge intelligence integrates edge computing systems and deep learning solutions, which enables distributed model training and inference. Edge federated learning (EFL) offers collaborative edge intelligence learning with distributed aggregation capabilities, promoting resource efficiency, participant inclusivity, and privacy preservation. However, the quality of service (QoS) faces challenges due to congestion problems that arise from the diverse models and data in practical architectures. In this paper, we develop a modified long short-term memory (LSTM)-based congestion-aware EFL (MLSTM-CEFL) approach that aims to enhance QoS in the final model convergence between end devices, edge aggregators, and the global server. Given the diversity of service types, MLSTM-CEFL proactively detects the congestion rates, adequately schedules the edge aggregations, and effectively prioritizes high mission-critical serving resources. The proposed system is formulated to handle time series analysis from local/edge model parameter loading, weighing the configuration of resource pooling properties at specific congestion intervals. The MLSTM-CEFL policy orchestrates the establishment of long-term paths for participant-aggregator scheduling and follows the expected QoS metrics after final averaging in multiple industrial application classes.
40

Dechouniotis, Dimitrios, Nikolaos Athanasopoulos, Aris Leivadeas, Nathalie Mitton, Raphael Jungers, and Symeon Papavassiliou. "Edge Computing Resource Allocation for Dynamic Networks: The DRUID-NET Vision and Perspective". Sensors 20, no. 8 (April 13, 2020): 2191. http://dx.doi.org/10.3390/s20082191.

Annotation:
The potential offered by the abundance of sensors, actuators, and communications in the Internet of Things (IoT) era is hindered by the limited computational capacity of local nodes. Several key challenges should be addressed to optimally and jointly exploit the network, computing, and storage resources, guaranteeing at the same time feasibility for time-critical and mission-critical tasks. We propose the DRUID-NET framework to take upon these challenges by dynamically distributing resources when the demand is rapidly varying. It includes analytic dynamical modeling of the resources, offered workload, and networking environment, incorporating phenomena typically met in wireless communications and mobile edge computing, together with new estimators of time-varying profiles. Building on this framework, we aim to develop novel resource allocation mechanisms that explicitly include service differentiation and context-awareness, being capable of guaranteeing well-defined Quality of Service (QoS) metrics. DRUID-NET goes beyond the state of the art in the design of control algorithms by incorporating resource allocation mechanisms to the decision strategy itself. To achieve these breakthroughs, we combine tools from Automata and Graph theory, Machine Learning, Modern Control Theory, and Network Theory. DRUID-NET constitutes the first truly holistic, multidisciplinary approach that extends recent, albeit fragmented results from all aforementioned fields, thus bridging the gap between efforts of different communities.
41

Jiya, Eli Adama, Faith Titobiloluwa Akinyemi, and Uriah A. Nwocha. "IoT and Edge Computing Technologies as Security Option for Train Service in Nigeria". Journal of Information Systems and Informatics 5, no. 3 (September 6, 2023): 1086–98. http://dx.doi.org/10.51519/journalisi.v5i3.560.

Annotation:
The revival of rail transport service in Nigeria in recent years came at a critical moment of insecurity in Nigeria, it promised not only to serve as an alternative to overloaded vehicles on the roads but was also thought to be a safer means of transportation. Due to kidnapping on many roads, both high and low-class Nigerians patronized rail transport. However, train attacks and terrorism are major challenges that have impacted the sector negatively. Though the government has tried to improve surveillance along train routes, however, the result has not been impressive. The current level of insecurity in Nigeria is beyond the use of traditional surveillance and monitoring systems. It requires the adoption of technology to fight attacks and to monitor the health of train facilities. While the insecurity challenges that have overwhelmed the security forces will not permit the assignment of more personnel to the rail tracks that stretch several kilometres across the country, the incorporation of IoT and edge computing can be an optimal solution to the challenges of constant security problems. Among the trending technologies, IoT is a viable option that the country can adopt to improve security in the sector. It will increase the confidence of passengers and improve revenue and growth in rail transport.
42

Klimenko, A. B. "Two-Criteria Technique for the Resource-Saving Computing in the Fog and Edge Network Tiers". Advanced Engineering Research 23, no. 1 (April 17, 2023): 85–94. http://dx.doi.org/10.23947/2687-1653-2023-23-1-85-94.

Annotation:
Introduction. At present, the concepts of fog and edge computing are used in a wide range of applications of various kinds. One of the key problems in the organization of computing in groups of mobile devices that make up the edge/fog layer is the mission assurance based on battery power availability. In this context, a lot of developments aimed at energy saving of device systems have been presented to date. However, one important aspect remains beyond the consideration of the problem of resource saving, namely, the issue of saving the residual resource of a computing device. The aim of this research is to formalize the workload distribution problem as two-criteria optimization problem, and to develop the basic solution technique. Materials and Methods. Within the framework of this article, an approach to resource saving is proposed. It is based on the evaluation of two device criteria: battery life and residual resource of a computing device. The residual resource of a computing device can be estimated using the probability of failure-free operation of the device, or as the reciprocal of the failure rate, taking into account that the exponential law of failure distribution is used in the simulation. From this, a model of the problem of two-criteria optimization is formulated, taking into account the dynamics of the network topology in the process of performing a user mission. The topology dynamics is reflected in the model as a sequence of topologies, each of which corresponds to a certain period of time of the system operation. Results. Based on the proposed model of the two-criteria optimization problem, a method was proposed for resource saving in the edge and foggy layers of the network. It reflected the specifics of the dynamic layers of the network, and also took into account the importance of the criteria for estimating the consumption of device resources. An experiment was conducted to evaluate the impact of the method of distributing tasks over a network cluster on the probability of failure-free operation of devices and on the average residual resource. Discussion and Conclusions. The conducted experiment has demonstrated the feasibility of using the developed method, since the distribution of tasks among executing devices had a significant impact (up to 25 % according to the results of the experiment) on the average residual resource of a computing device.
43

Al-Ansi, Ahmed, Abdullah M. Al-Ansi, Ammar Muthanna, Ibrahim A. Elgendy, and Andrey Koucheryavy. "Survey on Intelligence Edge Computing in 6G: Characteristics, Challenges, Potential Use Cases, and Market Drivers". Future Internet 13, no. 5 (April 30, 2021): 118. http://dx.doi.org/10.3390/fi13050118.

Annotation:
Intelligence Edge Computing (IEC) is the key enabler of emerging 5G technologies networks and beyond. IEC is considered to be a promising backbone of future services and wireless communication systems in 5G integration. In addition, IEC enables various use cases and applications, including autonomous vehicles, augmented and virtual reality, big data analytic, and other customer-oriented services. Moreover, it is one of the 5G technologies that most enhanced market drivers in different fields such as customer service, healthcare, education methods, IoT in agriculture and energy sustainability. However, 5G technological improvements face many challenges such as traffic volume, privacy, security, digitization capabilities, and required latency. Therefore, 6G is considered to be promising technology for the future. To this end, compared to other surveys, this paper provides a comprehensive survey and an inclusive overview of Intelligence Edge Computing (IEC) technologies in 6G focusing on main up-to-date characteristics, challenges, potential use cases and market drivers. Furthermore, we summarize research efforts on IEC in 5G from 2014 to 2021, in which the integration of IEC and 5G technologies are highlighted. Finally, open research challenges and new future directions in IEC with 6G networks will be discussed.
44

Samir, Rasha, Hadia El-Hennawy, and Hesham Elbadawy. "Cluster-Based Multi-User Multi-Server Caching Mechanism in Beyond 5G/6G MEC". Sensors 23, no. 2 (January 15, 2023): 996. http://dx.doi.org/10.3390/s23020996.

Annotation:
The work on perfecting the rapid proliferation of wireless technologies resulted in the development of wireless modeling standards, protocols, and control of wireless manipulators. Several mobile communication technology applications in different fields are dramatically revolutionized to deliver more value at less cost. Multiple-access Edge Computing (MEC) offers excellent advantages for Beyond 5G (B5G) and Sixth-Generation (6G) networks, reducing latency and bandwidth usage while increasing the capability of the edge to deliver multiple services to end users in real time. We propose a Cluster-based Multi-User Multi-Server (CMUMS) caching algorithm to optimize the MEC content caching mechanism and control the distribution of high-popular tasks. As part of our work, we address the problem of integer optimization of the content that will be cached and the list of hosting servers. Therefore, a higher direct hit rate will be achieved, a lower indirect hit rate will be achieved, and the overall time delay will be reduced. As a result of the implementation of this system model, maximum utilization of resources and development of a completely new level of services and innovative approaches will be possible.
45

Zhang, Chen, Celimuge Wu, Min Lin, Yangfei Lin, and William Liu. "Proximal Policy Optimization for Efficient D2D-Assisted Computation Offloading and Resource Allocation in Multi-Access Edge Computing". Future Internet 16, no. 1 (January 2, 2024): 19. http://dx.doi.org/10.3390/fi16010019.

Annotation:
In the advanced 5G and beyond networks, multi-access edge computing (MEC) is increasingly recognized as a promising technology, offering the dual advantages of reducing energy utilization in cloud data centers while catering to the demands for reliability and real-time responsiveness in end devices. However, the inherent complexity and variability of MEC networks pose significant challenges in computational offloading decisions. To tackle this problem, we propose a proximal policy optimization (PPO)-based Device-to-Device (D2D)-assisted computation offloading and resource allocation scheme. We construct a realistic MEC network environment and develop a Markov decision process (MDP) model that minimizes time loss and energy consumption. The integration of a D2D communication-based offloading framework allows for collaborative task offloading between end devices and MEC servers, enhancing both resource utilization and computational efficiency. The MDP model is solved using the PPO algorithm in deep reinforcement learning to derive an optimal policy for offloading and resource allocation. Extensive comparative analysis with three benchmarked approaches has confirmed our scheme’s superior performance in latency, energy consumption, and algorithmic convergence, demonstrating its potential to improve MEC network operations in the context of emerging 5G and beyond technologies.
46

Pan, Bitao, Fulong Yan, Xiaotao Guo, and Nicola Calabretta. "Experimental Assessment of Automatic Optical Metro Edge Computing Network for Beyond 5G Applications and Network Service Composition". Journal of Lightwave Technology 39, no. 10 (May 15, 2021): 3004–10. http://dx.doi.org/10.1109/jlt.2021.3064800.

47

Rapuzzi, R., and M. Repetto. "Building situational awareness for network threats in fog/edge computing: Emerging paradigms beyond the security perimeter model". Future Generation Computer Systems 85 (August 2018): 235–49. http://dx.doi.org/10.1016/j.future.2018.04.007.

48

Dias, Imali, Lihua Ruan, Chathurika Ranaweera, and Elaine Wong. "From 5G to beyond: Passive optical network and multi-access edge computing integration for latency-sensitive applications". Optical Fiber Technology 75 (January 2023): 103191. http://dx.doi.org/10.1016/j.yofte.2022.103191.

49

Ming, Zhao, Xiuhua Li, Chuan Sun, Qilin Fan, Xiaofei Wang, and Victor C. M. Leung. "Sleeping Cell Detection for Resiliency Enhancements in 5G/B5G Mobile Edge-Cloud Computing Networks". ACM Transactions on Sensor Networks 18, no. 3 (August 31, 2022): 1–30. http://dx.doi.org/10.1145/3512893.

Annotation:
The rapid increase of data traffic has brought great challenges to the maintenance and optimization of 5G and beyond, and some smart critical infrastructures, e.g., small base stations (SBSs) in cellular cells, are facing serious security and failure threats, causing resiliency degradation concerns. Among special smart critical infrastructure failures, the sleeping cell failure is hard to address since no alarm is generally triggered. Sleeping cells can remain undetected for a long time and can severely affect the quality of service/quality of experience to users. To enhance the resiliency of the SBSs in sleeping cells, we design a mobile edge-cloud computing system and propose a semi-supervised learning-based framework to dynamically detect the sleeping cells. Particularly, we consider two indicators, recovery proportion and recovery speed, to measure the resiliency of the SBSs. Moreover, in the proposed scheme, experts’ optimization experience and each period’s detection results can be utilized to iteratively improve the performance. Then we adopt a dataset from real-world networks for performance evaluation. Trace-driven evaluation results demonstrate that the proposed scheme outperforms existing sleeping cell detection schemes, and can also reduce the communication and runtime costs and enhance the resiliency of the SBSs.
50

Eluwole, Opeoluwa Tosin, and Mike Oluwatayo Ojo. "The Key Impacts of Softwarization in the Modern Era of 5G and the Internet of Things". International Journal of Interdisciplinary Telecommunications and Networking 12, no. 3 (July 2020): 16–27. http://dx.doi.org/10.4018/ijitn.2020070102.

Annotation:
Fascinating technologies, such as software defined networking (SDN), network function virtualization (NFV) and mobile edge computing (MEC) among others, have introduced software-enabling capabilities to telecommunications, mobile and wireless communications. To depict this systemic evolution, various terminologies, such as system cloudification, network programmability, advanced computing and most popularly, softwarization, have been used by numerous scholars. Softwarization is now becoming a fully established phenomenon, especially in the new era of the rapidly evolving Internet of Things (IoT), artificial intelligence (AI) and the looming 5G technology. Away from the research and development (R&D) focus on the technological capabilities of softwarization, this article highlights the main stakeholders in softwarization and underlines a tripartite influence of the systemic evolution i.e. technical, social and economic impacts, all of which will be vital in ensuring a sustainable 5G technology and beyond.