Academic literature on the topic 'Intelligent Edge Networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Intelligent Edge Networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Intelligent Edge Networks"

1

Li, Qian, Heng Liu, and Xiaoming Zhao. "IoT Networks-Aided Perception Vocal Music Singing Learning System and Piano Teaching with Edge Computing." Mobile Information Systems 2023 (April 28, 2023): 1–9. http://dx.doi.org/10.1155/2023/2074890.

Full text
Abstract:
Research on Internet of Things (IoT) networks and edge computing has been a hotspot in both industry and academia in recent years, especially for ambient intelligence and massive communication. As a typical application of IoT networks and edge computing, the intelligent perception vocal music singing learning system has attracted the attention of researchers in education and academia. Piano teaching is an important course for music majors in higher education; strengthening it can cultivate outstanding piano talents and promote the development of music as an art. This paper applies IoT perception technology to piano teaching, constructs an intelligent piano teaching system, and uses edge computing algorithms to accurately deploy sensors in the system, exploiting ambient intelligence and massive communication. The system includes data acquisition, data perception, data monitoring, and other modules, making piano teaching more humanized and intelligent. Experiments show that this research provides important guidance for applying IoT networks and edge computing, especially for ambient intelligence and massive communication.
APA, Harvard, Vancouver, ISO, and other styles
2

Musa, Salahadin Seid, Marco Zennaro, Mulugeta Libsie, and Ermanno Pietrosemoli. "Convergence of Information-Centric Networks and Edge Intelligence for IoV: Challenges and Future Directions." Future Internet 14, no. 7 (June 25, 2022): 192. http://dx.doi.org/10.3390/fi14070192.

Abstract:
Recently the Internet of Vehicles (IoV) has become a promising research area in the field of the Internet of Things (IoT), which enables vehicles to communicate and exchange real-time information with each other, as well as with infrastructure, people, and other sensors and actuators through various communication interfaces. The realization of IoV networks faces various communication and networking challenges to meet stringent requirements of low latency, dynamic topology, high data-rate connectivity, resource allocation, multiple access, and QoS. Advances in information-centric networks (ICN), edge computing (EC), and artificial intelligence (AI) will transform and help to realize the Intelligent Internet of Vehicles (IIoV). Information-centric networks have emerged as a paradigm promising to cope with the limitations of the current host-based network architecture (TCP/IP-based networks) by providing mobility support, efficient content distribution, scalability and security based on content names, regardless of their location. Edge computing (EC), on the other hand, is a key paradigm to provide computation, storage and other cloud services in close proximity to where they are requested, thus enabling the support of real-time services. It is promising for computation-intensive applications, such as autonomous and cooperative driving, and to alleviate storage burdens (by caching). AI has recently emerged as a powerful tool to break through obstacles in various research areas including that of intelligent transport systems (ITS). ITS are smart enough to make decisions based on the status of a great variety of inputs. The convergence of ICN and EC with AI empowerment will bring new opportunities while also raising not-yet-explored obstacles to realize Intelligent IoV. In this paper, we discuss the applicability of AI techniques in solving challenging vehicular problems and enhancing the learning capacity of edge devices and ICN networks. 
A comprehensive review is provided of utilizing intelligence in EC and ICN to address current challenges in their application to IIoV. In particular, we focus on intelligent edge computing and networking, offloading, intelligent mobility-aware caching and forwarding and overall network performance. Furthermore, we discuss potential solutions to the presented issues. Finally, we highlight potential research directions which may illuminate efforts to develop new intelligent IoV applications.
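The mobility-aware caching that this survey highlights can be pictured with a toy name-based cache at an edge node. The sketch below is illustrative only (the class name and interface are invented, not from the paper): it shows ICN-style retrieval by content name, regardless of content location, with simple LRU eviction.

```python
from collections import OrderedDict

class EdgeContentCache:
    """Minimal name-based (ICN-style) LRU cache for an edge node.

    Content is addressed by name rather than by host location, as in
    information-centric networking. This toy version evicts the least
    recently requested item once capacity is exceeded.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self._store = OrderedDict()  # content name -> payload

    def request(self, name, fetch_from_origin):
        """Return (payload, was_hit); fetch and cache on a miss."""
        if name in self._store:
            self._store.move_to_end(name)  # mark as recently used
            return self._store[name], True
        payload = fetch_from_origin(name)
        self._store[name] = payload
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict least recently used
        return payload, False
```

Serving repeated requests for popular named content from such a cache is what lets the edge cut backhaul traffic; the intelligent mobility-aware caching policies the survey reviews replace this simple LRU rule with learned ones.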
APA, Harvard, Vancouver, ISO, and other styles
3

Zhang, Jiaxin, Xing Zhang, Peng Wang, Liangjingrong Liu, and Yuanjun Wang. "Double-edge intelligent integrated satellite terrestrial networks." China Communications 17, no. 9 (September 2020): 128–46. http://dx.doi.org/10.23919/jcc.2020.09.011.

4

Zeydan, Engin, Josep Mangues-Bafalluy, and Yekta Turk. "Intelligent Service Orchestration in Edge Cloud Networks." IEEE Network 35, no. 6 (November 2021): 126–32. http://dx.doi.org/10.1109/mnet.101.2100214.

5

Pencheva, Evelina, Ivaylo Atanasov, and Ventsislav Trifonov. "Towards Intelligent, Programmable, and Open Railway Networks." Applied Sciences 12, no. 8 (April 17, 2022): 4062. http://dx.doi.org/10.3390/app12084062.

Abstract:
The virtualization and automation of network functions will be key features of future high-speed railway networks, which have to provide dependable, safe, and secure services. The virtualization of railway network functions will enable functions such as train control, train integrity protection, shunting control, and trackside monitoring and maintenance to be virtualized and to be run on general-purpose hardware. Network function virtualization combined with edge computing can deliver dynamic, low-latency, and reliable services. The automation of railway operations can be achieved by embedding intelligence into the network to optimize the railway operation performance and to enhance the passenger experience. This paper presents an innovative railway network architecture that features distributed intelligence, function cloudification and virtualization, openness, and programmability. The focus is on time-tolerant and time-sensitive intelligent services designed to follow the principles of service-oriented architecture. The interaction between identified logical identities is illustrated by use cases. The paper provides some details of the design of the interface between distributed intelligent services and presents the results of an emulation of the interface performance.
6

Alam, Tanweer, Baha Rababah, Arshad Ali, and Shamimul Qamar. "Distributed Intelligence at the Edge on IoT Networks." Annals of Emerging Technologies in Computing 4, no. 5 (December 20, 2020): 1–18. http://dx.doi.org/10.33166/aetic.2020.05.001.

Abstract:
The Internet of Things (IoT) has revolutionized how information received from physical objects or sensors is collected and stored. Smart devices are linked to a repository that stores the intelligent information produced by sensors on IoT-based smart objects. The IoT has now shifted from knowledge-based to operation-based technologies. The IoT integrates sensors, smart devices, and a smart grid of implementations to deliver smart strategies, and it is now considered an essential technology. Transmitting information to or from the cloud has recently been found to cause many network problems, including latency, power usage, security, and privacy. Distributed intelligence enables the IoT to make the right communication available at the right time and place. Distributed intelligence could strengthen the IoT in a variety of ways, including evaluating the integration of different big data and enhancing efficiency and distribution in huge IoT operations. When evaluating distributed intelligence in the IoT paradigm, the implementation of distributed intelligence services should take into consideration the transmission delay and bandwidth requirements of the network. In this article, distributed intelligence at the edge of IoT networks, along with its applications, opportunities, challenges, and future scope, is presented.
7

Tassiulas, Leandros. "Enabling Intelligent Services at the Network Edge." ACM SIGMETRICS Performance Evaluation Review 49, no. 1 (June 22, 2022): 69–70. http://dx.doi.org/10.1145/3543516.3453912.

Abstract:
The proliferation of novel mobile applications and the associated AI services necessitates a fresh view on the architecture, algorithms and services at the network edge in order to meet stringent performance requirements. Some recent work addressing these challenges is presented. In order to meet the requirement for low latency, the execution of computing tasks moves from the cloud to the network edge, closer to the end users. The joint optimization of service placement and request routing in dense mobile edge computing networks is considered. Multidimensional constraints are introduced to capture the storage requirements of the vast amounts of data needed. An algorithm that achieves close-to-optimal performance using a randomized rounding technique is presented. Recent advances in network virtualization and programmability enable realization of services as chains, where flows can be steered through a pre-defined sequence of functions deployed at different network locations. The optimal deployment of such service chains, where storage is a stringent constraint in addition to computation and bandwidth, is considered, and an approximation algorithm with provable performance guarantees is proposed and evaluated. Finally, the problem of traffic flow classification as it arises in firewalls and intrusion detection applications is presented. An approach for realizing such functions based on a novel two-stage deep learning method for attack detection is presented. Leveraging the high level of data plane programmability in modern network hardware, the realization of these mechanisms at the network edge is demonstrated.
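The randomized-rounding idea mentioned for service placement can be illustrated with a small sketch. Everything here (the function name, the greedy capacity fix-up) is an invented toy, not the paper's algorithm: given a fractional LP solution, each service is kept with probability equal to its fractional value, then per-node capacity is restored.

```python
import random

def randomized_round_placement(fractional, capacity, rng=random.Random(0)):
    """Round a fractional service-placement solution to an integral one.

    `fractional[node][svc]` is an LP value in [0, 1] for placing service
    `svc` on edge node `node`. Each service is placed independently with
    probability equal to its fractional value; storage capacity is then
    enforced by keeping the placements with the largest LP values.
    (Toy illustration of randomized rounding, not the paper's scheme.)
    """
    placement = {}
    for node, svcs in fractional.items():
        chosen = [s for s, x in svcs.items() if rng.random() < x]
        # Enforce per-node storage capacity: keep the largest LP values.
        chosen.sort(key=lambda s: svcs[s], reverse=True)
        placement[node] = set(chosen[:capacity[node]])
    return placement
```

The appeal of the technique is that the expected cost of the rounded solution matches the LP objective, which is where close-to-optimal guarantees of the kind the abstract mentions come from.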
8

Guo, Hongzhi, Jiajia Liu, Ju Ren, and Yanning Zhang. "Intelligent Task Offloading in Vehicular Edge Computing Networks." IEEE Wireless Communications 27, no. 4 (August 2020): 126–32. http://dx.doi.org/10.1109/mwc.001.1900489.

9

Bourechak, Amira, Ouarda Zedadra, Mohamed Nadjib Kouahla, Antonio Guerrieri, Hamid Seridi, and Giancarlo Fortino. "At the Confluence of Artificial Intelligence and Edge Computing in IoT-Based Applications: A Review and New Perspectives." Sensors 23, no. 3 (February 2, 2023): 1639. http://dx.doi.org/10.3390/s23031639.

Abstract:
Given its advantages in low latency, fast response, context-aware services, mobility, and privacy preservation, edge computing has emerged as the key support for intelligent applications and 5G/6G Internet of things (IoT) networks. This technology extends the cloud by providing intermediate services at the edge of the network and improving the quality of service for latency-sensitive applications. Many AI-based solutions with machine learning, deep learning, and swarm intelligence have exhibited the high potential to perform intelligent cognitive sensing, intelligent network management, big data analytics, and security enhancement for edge-based smart applications. Despite its many benefits, there are still concerns about the required capabilities of intelligent edge computing to deal with the computational complexity of machine learning techniques for big IoT data analytics. Resource constraints of edge computing, distributed computing, efficient orchestration, and synchronization of resources are all factors that require attention for quality of service improvement and cost-effective development of edge-based smart applications. In this context, this paper aims to explore the confluence of AI and edge in many application domains in order to leverage the potential of the existing research around these factors and identify new perspectives. The confluence of edge computing and AI improves the quality of user experience in emergency situations, such as in the Internet of vehicles, where critical inaccuracies or delays can lead to damage and accidents. These are the same factors that most studies have used to evaluate the success of an edge-based application. In this review, we first provide an in-depth analysis of the state of the art of AI in edge-based applications with a focus on eight application areas: smart agriculture, smart environment, smart grid, smart healthcare, smart industry, smart education, smart transportation, and security and privacy. 
Then, we present a qualitative comparison that emphasizes the main objective of the confluence, the roles and the use of artificial intelligence at the network edge, and the key enabling technologies for edge analytics. Then, open challenges, future research directions, and perspectives are identified and discussed. Finally, some conclusions are drawn.
10

Yang, Yang, Rui Lyu, Zhipeng Gao, Lanlan Rui, and Yu Yan. "Semisupervised Graph Neural Networks for Traffic Classification in Edge Networks." Discrete Dynamics in Nature and Society 2023 (July 3, 2023): 1–13. http://dx.doi.org/10.1155/2023/2879563.

Abstract:
Edge networking brings computation and data storage as close to the point of request as possible. Various intelligent devices are connected to the edge nodes where traffic packets flow. Traffic classification tasks are thought to be a keystone for network management; researchers can analyze packets captured to understand the traffic as it hits their network. However, the existing traffic classification framework needs to conduct a unified analysis, which leads to the huge bandwidth resources required in the process of transferring all captured packet files to train a global classifier. In this paper, a semisupervised graph neural network traffic classifier is proposed for cloud-edge architecture so that cloud servers and edge nodes could cooperate to perform the traffic classification tasks in order to deliver low latency and save bandwidth on the edge nodes. To preserve the structural information and interrelationships conveyed in packets within a session, we transform traffic sessions into graphs. We segment the frequently combined consecutive packets into granules, which are later transformed into the nodes in graphs. Edges could extract the adjacency of the granules in the sessions; the edge node side then selects the highly representative samples and sends them to the cloud server; the server side uses graph neural networks to perform semisupervised classification tasks on the selected training set. Our method has been trained and tested on several datasets, such as the VPN-nonVPN dataset, and the experimental results show good performance on accuracy, recall, and F-score.
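The session-to-graph construction this abstract describes can be sketched in a few lines. This is a hedged toy version (fixed-size granules, raw packet sizes as node features) rather than the authors' actual pipeline, which segments frequently combined consecutive packets into variable granules:

```python
def session_to_graph(packet_sizes, granule_len=3):
    """Turn one traffic session into a simple chain graph.

    Consecutive packets are grouped into fixed-size "granules"; each
    granule becomes a node whose feature is the list of packet sizes it
    contains, and edges link adjacent granules so their ordering within
    the session is preserved.
    """
    nodes = [packet_sizes[i:i + granule_len]
             for i in range(0, len(packet_sizes), granule_len)]
    edges = [(i, i + 1) for i in range(len(nodes) - 1)]
    return nodes, edges
```

A graph neural network would then classify these session graphs; representing sessions this way is what preserves the structural information and packet interrelationships the paper relies on.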

Dissertations / Theses on the topic "Intelligent Edge Networks"

1

Mestoukirdi, Mohamad. "Reliable and Communication-Efficient Federated Learning for Future Intelligent Edge Networks." Electronic Thesis or Diss., Sorbonne université, 2023. http://www.theses.fr/2023SORUS432.

Abstract:
In the realm of future 6G wireless networks, integrating the intelligent edge through the advent of AI signifies a momentous leap forward, promising revolutionary advancements in wireless communication. This integration fosters a harmonious synergy, capitalizing on the collective potential of these transformative technologies. Central to this integration is the role of federated learning, a decentralized learning paradigm that upholds data privacy while harnessing the collective intelligence of interconnected devices. By embracing federated learning, 6G networks can unlock a myriad of benefits for both wireless networks and edge devices. On one hand, wireless networks gain the ability to exploit data-driven solutions, surpassing the limitations of traditional model-driven approaches. In particular, leveraging real-time data insights will empower 6G networks to adapt, optimize performance, and enhance network efficiency dynamically. On the other hand, edge devices benefit from personalized experiences and tailored solutions catered to their specific requirements. Specifically, edge devices will experience improved performance and reduced latency through localized decision-making, real-time processing, and reduced reliance on centralized infrastructure. In the first part of the thesis, we tackle the predicament of statistical heterogeneity in federated learning stemming from divergent data distributions among devices' datasets. Rather than training a conventional one-model-fits-all, which often performs poorly with non-IID data, we propose a user-centric set of rules that produce personalized models tailored to each user's objectives. To mitigate the prohibitive communication overhead associated with training a distinct personalized model for each user, users are partitioned into clusters based on the similarity of their objectives. This enables collective training of cohort-specific personalized models. As a result, the total number of personalized models trained is reduced. This reduction lessens the consumption of wireless resources required to transmit model updates across bandwidth-limited wireless channels. In the second part, our focus shifts towards integrating remote IoT devices into the intelligent edge by leveraging unmanned aerial vehicles (UAVs) as a federated learning orchestrator. While previous studies have extensively explored the potential of UAVs as flying base stations or relays in wireless networks, their utilization in facilitating model training is still a relatively new area of research. In this context, we leverage UAV mobility to bypass the unfavorable channel conditions in rural areas and establish learning grounds for remote IoT devices. However, UAV deployments pose challenges in terms of scheduling and trajectory design. To this end, a joint optimization of UAV trajectory, device scheduling, and learning performance is formulated and solved using convex optimization techniques and graph theory. In the third and final part of this thesis, we take a critical look at the communication overhead imposed by federated learning on wireless networks. While compression techniques such as quantization and sparsification of model updates are widely used, they often achieve communication efficiency at the cost of reduced model performance. We employ over-parameterized random networks to approximate target networks through parameter pruning rather than direct optimization to overcome this limitation. This approach has been demonstrated to require transmitting no more than a single bit of information per model parameter. We show that state-of-the-art (SoTA) methods fail to capitalize on the full attainable advantages in terms of communication efficiency using this approach. Accordingly, we propose a regularized loss function that considers the entropy of transmitted updates, resulting in notable improvements to communication and memory efficiency during federated training on edge devices without sacrificing accuracy.
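The first part's clustering of users by objective similarity can be sketched as follows. The grouping rule here (greedy cosine-similarity clustering of model-update vectors, then per-cluster averaging) is an assumption for illustration, not the thesis's exact method:

```python
def cosine(u, v):
    """Cosine similarity between two update vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv)

def cluster_users(updates, threshold=0.5):
    """Greedily group users whose model updates point in similar directions.

    Each cluster then trains one cohort-specific personalized model, so far
    fewer models are communicated than with one model per user. (Toy sketch
    of clustered personalized federated learning; invented grouping rule.)
    """
    clusters = []  # each cluster is a list of user indices
    for uid, upd in enumerate(updates):
        for cluster in clusters:
            rep = updates[cluster[0]]  # first member as representative
            if cosine(upd, rep) >= threshold:
                cluster.append(uid)
                break
        else:
            clusters.append([uid])
    return clusters

def cluster_average(updates, cluster):
    """FedAvg within one cluster: coordinate-wise mean of member updates."""
    members = [updates[i] for i in cluster]
    return [sum(col) / len(members) for col in zip(*members)]
```

With k clusters instead of n users, only k personalized models need to be trained and transmitted each round, which is the wireless-resource saving the abstract describes.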
2

Sigwele, Tshiamo, Yim Fun Hu, M. Ali, Jiachen Hou, M. Susanto, and H. Fitriawan. "An intelligent edge computing based semantic gateway for healthcare systems interoperability and collaboration." IEEE, 2018. http://hdl.handle.net/10454/17552.

Abstract:
The use of Information and Communications Technology (ICTs) in healthcare has the potential of minimizing medical errors, reducing healthcare cost and improving collaboration between healthcare systems which can dramatically improve the healthcare service quality. However interoperability within different healthcare systems (clinics/hospitals/pharmacies) remains an issue of further research due to a lack of collaboration and exchange of healthcare information. To solve this problem, cross healthcare system collaboration is required. This paper proposes a conceptual semantic based healthcare collaboration framework based on Internet of Things (IoT) infrastructure that is able to offer a secure cross system information and knowledge exchange between different healthcare systems seamlessly that is readable by both machines and humans. In the proposed framework, an intelligent semantic gateway is introduced where a web application with restful Application Programming Interface (API) is used to expose the healthcare information of each system for collaboration. A case study that exposed the patient's data between two different healthcare systems was practically demonstrated where a pharmacist can access the patient's electronic prescription from the clinic.
British Council Institutional Links grant under the BEIS-managed Newton Fund.
3

Hasanaj, Enis, Albert Aveler, and William Söder. "Cooperative edge deepfake detection." Thesis, Jönköping University, JTH, Avdelningen för datateknik och informatik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-53790.

Abstract:
Deepfakes are an emerging problem in social media and for celebrities and political profiles, it can be devastating to their reputation if the technology ends up in the wrong hands. Creating deepfakes is becoming increasingly easy. Attempts have been made at detecting whether a face in an image is real or not but training these machine learning models can be a very time-consuming process. This research proposes a solution to training deepfake detection models cooperatively on the edge. This is done in order to evaluate if the training process, among other things, can be made more efficient with this approach.  The feasibility of edge training is evaluated by training machine learning models on several different types of iPhone devices. The models are trained using the YOLOv2 object detection system.  To test if the YOLOv2 object detection system is able to distinguish between real and fake human faces in images, several models are trained on a computer. Each model is trained with either different number of iterations or different subsets of data, since these metrics have been identified as important to the performance of the models. The performance of the models is evaluated by measuring the accuracy in detecting deepfakes.  Additionally, the deepfake detection models trained on a computer are ensembled using the bagging ensemble method. This is done in order to evaluate the feasibility of cooperatively training a deepfake detection model by combining several models.  Results show that the proposed solution is not feasible due to the time the training process takes on each mobile device. Additionally, each trained model is about 200 MB, and the size of the ensemble model grows linearly by each model added to the ensemble. This can cause the ensemble model to grow to several hundred gigabytes in size.
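The linear growth of the bagged ensemble reported in this abstract is easy to make concrete. The helper names below are invented; the majority-vote combiner is the standard bagging rule, assumed here rather than taken from the thesis:

```python
def ensemble_size_mb(num_models, model_size_mb=200):
    """Storage grows linearly with member count (~200 MB per model,
    the figure reported in the thesis)."""
    return num_models * model_size_mb

def bagged_predict(models, image):
    """Majority vote over independently trained detectors (bagging).

    `models` are callables returning True for "deepfake"; this interface
    is an assumption for illustration only.
    """
    votes = sum(1 for m in models if m(image))
    return votes > len(models) / 2
```

At 200 MB per member, an ensemble of 1,000 edge-trained models already reaches about 200 GB, which is the scaling problem that led the authors to judge the approach infeasible.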
4

Kattadige, Chamara Manoj Madarasinghe. "Network and Content Intelligence for 360 Degree Video Streaming Optimization." Thesis, The University of Sydney, 2023. https://hdl.handle.net/2123/29904.

Abstract:
In recent years, 360° videos, a.k.a. spherical frames, became popular among users, creating an immersive streaming experience. Along with the advances in smartphone and Head Mounted Device (HMD) technology, many content providers now host and stream 360° videos in both on-demand and live streaming modes. Therefore, many different applications have already arisen leveraging these immersive videos, especially to give viewers an impression of presence in a digital environment. For example, with 360° videos, it is now possible to connect people in a remote meeting in an interactive way, which essentially increases the productivity of the meeting. Also, creating interactive learning materials using 360° videos will help deliver learning outcomes to students effectively. However, streaming 360° videos is not an easy task for several reasons. First, 360° video frames are 4-6 times larger than normal video frames at the same perceived quality, so delivering these videos demands higher network bandwidth. Second, processing relatively larger frames requires more computational resources at the end devices, particularly for end-user devices with limited resources. This impacts not only the delivery of 360° videos but also many other applications running on shared resources. Third, these videos need to be streamed with very low latency due to their interactive nature. Inability to satisfy these requirements can result in poor Quality of Experience (QoE) for the user. For example, insufficient bandwidth incurs frequent rebuffering and poor video quality. Also, inadequate computational capacity can cause faster battery draining and unnecessary heating of the device, causing discomfort to the user. Motion or cyber-sickness will be prevalent if there is unnecessary delay in streaming. 
These circumstances hinder providing immersive streaming experiences to the much-needed communities, especially those who do not have enough network resources. To address the above challenges, we believe that enhancements to the three main components of the video streaming pipeline (server, network, and client) are essential. Starting from the network, it is beneficial for network providers to identify 360° video flows as early as possible and understand their behaviour in the network, so as to effectively allocate sufficient resources for this video delivery without compromising the quality of other services. Content servers, at one end of this streaming pipeline, require efficient 360° video frame processing mechanisms to support adaptive video streaming mechanisms such as ABR (Adaptive Bit Rate) based streaming and VP-aware streaming, a paradigm unique to 360° videos that selects only the part of the larger video frame that falls within the user-visible region. On the other end, the client can be combined with edge-assisted streaming to deliver 360° video content with reduced latency and higher quality. Following the above optimization strategies, in this thesis we first propose a mechanism named 360NorVic to extract 360° video flows from encrypted video traffic and analyze their traffic characteristics. We propose Machine Learning (ML) models to classify 360° and normal videos under different scenarios such as offline, near real-time, VP-aware streaming, and Mobile Network Operator (MNO) level streaming. Having extracted 360° video traffic traces, both packet- and flow-level, at high accuracy, we analyze and understand the differences between 360° and normal video patterns in the encrypted traffic domain, which is beneficial for effective resource optimization to enhance 360° video delivery. 
Second, we present a WGAN (Wasserstein Generative Adversarial Network) based data generation mechanism (namely VideoTrain++) to synthesize encrypted network video traffic from minimal data. Leveraging synthetic data, we show improved performance in 360° video traffic analysis, especially in ML-based classification in 360NorVic. Third, we propose an effective 360° video frame partitioning mechanism (namely VASTile) at the server side to support VP-aware 360° video streaming with dynamic (variable) tiles of different sizes and locations on the frame. VASTile takes a visual attention map on the video frames as input and applies a computational geometry approach to generate a non-overlapping tile configuration that covers the video frames adaptively to the visual attention. We present VASTile as a scalable approach for video frame processing at the servers and a method to reduce bandwidth consumption in network data transmission. Finally, by applying VASTile to the individual user VP at the client side and utilizing the cache storage of Multi-Access Edge Computing (MEC) servers, we propose OpCASH, a mechanism to personalize 360° video streaming with dynamic tiles with edge assistance. By proposing an ILP-based solution to effectively select cached variable tiles from MEC servers that may not be identical to the VP tiles requested by the user but still cover the same VP region, OpCASH maximizes cache utilization and reduces the number of requests to content servers in the congested core network. With this approach, we demonstrate gains in latency, bandwidth saving, and video quality in personalized 360° video streaming.
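OpCASH's selection of cached tiles that cover the viewport is formulated as an ILP in the thesis; a greedy set-cover heuristic gives the flavor of the problem in a few lines. The grid-cell representation and all names below are assumptions for illustration, not the thesis's formulation:

```python
def greedy_tile_cover(viewport_cells, cached_tiles):
    """Greedily pick cached tiles until the viewport region is covered.

    `viewport_cells` is the set of grid cells the user's viewport needs;
    each entry of `cached_tiles` is the set of cells one cached tile
    spans. Repeatedly take the tile covering the most still-uncovered
    cells (a classic set-cover heuristic standing in for the ILP).
    Returns chosen tile indices, or None if full coverage is impossible.
    """
    uncovered = set(viewport_cells)
    chosen = []
    while uncovered:
        best = max(range(len(cached_tiles)),
                   key=lambda i: len(uncovered & cached_tiles[i]))
        gain = uncovered & cached_tiles[best]
        if not gain:
            return None  # the cache cannot cover this viewport
        chosen.append(best)
        uncovered -= gain
    return chosen
```

Every viewport cell served from a cached tile is a request that never reaches the congested core network, which is the cache-utilization gain the abstract describes.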
APA, Harvard, Vancouver, ISO, and other styles
5

Abernot, Madeleine. "Digital oscillatory neural network implementation on FPGA for edge artificial intelligence applications and learning." Electronic Thesis or Diss., Université de Montpellier (2022-....), 2023. http://www.theses.fr/2023UMONS074.

Full text
Abstract:
Au cours des dernières décennies, la multiplication des objets embarqués dans de nombreux domaines a considérablement augmenté la quantité de données à traiter et la complexité des tâches à résoudre, motivant l'émergence d'algorithmes probabilistes d'apprentissage tels que l'intelligence artificielle (IA) et les réseaux de neurones artificiels (ANN). Cependant, les systèmes matériels pour le calcul embarqué basés sur l'architecture von Neumann ne sont pas efficaces pour traiter cette quantité de données. C'est pourquoi des paradigmes neuromorphiques dotés d'une mémoire distribuée sont étudiés, s'inspirant de la structure et de la représentation de l'information des réseaux de neurones biologiques. Dernièrement, la plupart de la recherche autour des paradigmes neuromorphiques a exploré les réseaux de neurones à impulsions ou spiking neural networks (SNNs), qui s'inspirent des impulsions utilisées pour transmettre l'information dans les réseaux biologiques. Les SNNs encodent l'information temporellement à l'aide d'impulsions pour assurer un calcul de données continues naturel et à faible énergie. Récemment, les réseaux de neurones oscillatoires (ONN) sont apparus comme un paradigme neuromorphique alternatif pour du calcul temporel, rapide et efficace, à basse consommation. Les ONNs sont des réseaux d'oscillateurs couplés qui émulent les propriétés de calcul collectif des zones du cerveau par le biais d'oscillations. Les récentes implémentations d'ONN, combinées à l'émergence de composants compacts à faible consommation d'énergie, encouragent le développement des ONNs pour le calcul embarqué. L'état de l'art de l'ONN le configure comme un réseau de Hopfield oscillatoire (OHN) avec une architecture d'oscillateurs entièrement couplés pour effectuer de la reconnaissance de formes avec une précision limitée. Cependant, le grand nombre de synapses de l'architecture limite l'implémentation de larges ONNs et le champ des applications de l'ONN.
Cette thèse étudie si et comment l'ONN peut résoudre des applications significatives d'IA embarquée, à l'aide d'une preuve de concept de l'ONN implémenté en digital sur FPGA. Tout d'abord, ce travail explore de nouveaux algorithmes d'apprentissage pour OHN, non supervisés et supervisés, pour améliorer la précision et pour intégrer de l'apprentissage continu sur puce. Ensuite, cette thèse étudie de nouvelles architectures pour l'ONN en s'inspirant des architectures en couches des ANNs, pour créer dans un premier temps des couches d'OHN en cascade, puis des réseaux ONN multi-couches. Les nouveaux algorithmes d'apprentissage et les nouvelles architectures sont démontrés avec l'ONN digital pour des applications d'IA embarquée, telles que la robotique avec de l'évitement d'obstacles et le traitement d'images avec de la reconnaissance de formes, de la détection de contours, de l'extraction d'amers, ou de la classification.
In the last decades, the multiplication of edge devices in many industry domains drastically increased the amount of data to treat and the complexity of tasks to solve, motivating the emergence of probabilistic machine learning algorithms with artificial intelligence (AI) and artificial neural networks (ANNs). However, classical edge hardware systems based on the von Neumann architecture cannot efficiently handle this large amount of data. Thus, novel neuromorphic computing paradigms with distributed memory are explored, mimicking the structure and data representation of biological neural networks. Lately, most of the neuromorphic paradigm research has focused on spiking neural networks (SNNs), taking inspiration from signal transmission through spikes in biological networks. In SNNs, information is transmitted through spikes using the time domain to provide natural and low-energy continuous data computation. Recently, oscillatory neural networks (ONNs) appeared as an alternative neuromorphic paradigm for low-power, fast, and efficient time-domain computation. ONNs are networks of coupled oscillators emulating the collective computational properties of brain areas through oscillations. Recent ONN implementations, combined with the emergence of low-power compact devices for ONN, encourage novel attention to ONNs for edge computing. The state-of-the-art ONN is configured as an oscillatory Hopfield network (OHN) with fully coupled recurrent connections to perform pattern recognition with limited accuracy. However, the large number of OHN synapses limits the scalability of ONN implementations and the ONN application scope. The focus of this thesis is to study if and how ONNs can solve meaningful edge AI applications using a proof of concept of the ONN paradigm with a digital implementation on FPGA. First, it explores novel learning algorithms for OHN, unsupervised and supervised, to improve accuracy and to provide continual on-chip learning.
Then, it studies novel ONN architectures, taking inspiration from state-of-the-art layered ANN models, to create cascaded OHNs and multi-layer ONNs. Novel learning algorithms and architectures are demonstrated with the digital design performing edge AI applications, from image processing with pattern recognition, image edge detection, feature extraction, and image classification, to robotics applications with obstacle avoidance.
APA, Harvard, Vancouver, ISO, and other styles
6

Laroui, Mohammed. "Distributed edge computing for enhanced IoT devices and new generation network efficiency." Electronic Thesis or Diss., Université Paris Cité, 2022. http://www.theses.fr/2022UNIP7078.

Full text
Abstract:
Dans le cloud computing, les services et les ressources sont centralisés dans des centres de données auxquels l'utilisateur peut accéder à partir de ses appareils connectés. L'infrastructure cloud traditionnelle sera confrontée à une série de défis en raison de la centralisation du calcul, du stockage et de la mise en réseau dans un petit nombre de centres de données, et de la longue distance entre les appareils connectés et les centres de données distants. Pour répondre à ce besoin, l'edge computing s'appuie sur un modèle dans lequel les ressources de calcul sont distribuées dans le edge de réseau selon les besoins, tout en décentralisant le traitement des données du cloud vers le edge autant que possible. Ainsi, il est possible d'avoir rapidement des informations exploitables basées sur des données qui varient dans le temps. Dans cette thèse, nous proposons de nouveaux modèles d'optimisation pour optimiser l'utilisation des ressources dans le edge de réseau pour deux domaines de recherche de l'edge computing, le "service offloading" et le "vehicular edge computing". Nous étudions différents cas d'utilisation dans chaque domaine de recherche. Premièrement, pour le "service offloading", nous proposons des algorithmes optimaux pour le placement des services dans les serveurs edge (Tasks, Virtual Network Functions (VNF), Service Function Chain (SFC)) en tenant compte des contraintes de ressources de calcul. De plus, pour le "vehicular edge computing", nous proposons des modèles exacts liés à la maximisation de la couverture des véhicules par les taxis et les Unmanned Aerial Vehicles (UAV) pour les applications de streaming vidéo en ligne. De plus, nous proposons un offloading de VNFs edge-autopilot dans le edge de réseau pour la conduite autonome. Les résultats de l'évaluation montrent l'efficacité des algorithmes proposés dans les réseaux avec un nombre limité d'appareils en termes de temps, de coût et d'utilisation des ressources.
Pour faire face aux réseaux denses avec un nombre élevé d'appareils et des problèmes d'évolutivité, nous proposons des algorithmes à grande échelle qui prennent en charge une énorme quantité d'appareils, de données et de demandes d'utilisateurs. Des algorithmes heuristiques sont proposés pour l'orchestration SFC et la couverture maximale des serveurs edge mobiles (véhicules). De plus, des algorithmes d'intelligence artificielle (apprentissage automatique, apprentissage profond et apprentissage par renforcement profond) sont utilisés pour le placement des "5G VNF slices", le placement des "VNF-autopilot" et la navigation autonome des drones. Les résultats numériques sont proches de ceux des algorithmes exacts, avec une haute efficacité en temps.
Traditional cloud infrastructure will face a series of challenges due to the centralization of computing, storage, and networking in a small number of data centers, and the long distance between connected devices and remote data centers. To meet this challenge, edge computing is a promising approach that provides resources closer to IoT devices. In the cloud computing model, compute resources and services are often centralized in large data centers that end-users access through the network. This model has important economic value and efficient resource-sharing capabilities. New forms of end-user experience, such as the Internet of Things, require computing resources near the end-user devices at the network edge. To meet this need, edge computing relies on a model in which computing resources are distributed to the edge of a network as needed, while decentralizing data processing from the cloud to the edge as much as possible. Thus, it is possible to quickly obtain actionable information based on data that varies over time. In this thesis, we propose novel optimization models to optimize resource utilization at the network edge for two edge computing research directions: service offloading and vehicular edge computing. We study different use cases in each research direction. First, for service offloading, we propose optimal algorithms for service placement at the network edge (Tasks, Virtual Network Functions (VNF), Service Function Chain (SFC)) by taking the computing resource constraints into account. Moreover, for vehicular edge computing, we propose exact models for maximizing the coverage of vehicles by both taxis and Unmanned Aerial Vehicles (UAVs) for online video streaming applications. In addition, we propose optimal edge-autopilot VNF offloading at the network edge for autonomous driving.
The evaluation results show the efficiency of the proposed algorithms in small-scale networks in terms of time, cost, and resource utilization. To deal with dense networks with a high number of devices and with scalability issues, we propose large-scale algorithms that support a huge number of devices, data, and user requests. Heuristic algorithms are proposed for SFC orchestration and for maximum coverage by mobile edge servers (vehicles). Moreover, artificial intelligence algorithms (machine learning, deep learning, and deep reinforcement learning) are used for 5G VNF slice placement, edge-autopilot VNF placement, and autonomous UAV navigation. The numerical results are close to those of the exact algorithms, with high efficiency in terms of time.
APA, Harvard, Vancouver, ISO, and other styles
7

Minerva, Roberto. "Will the Telco survive to an ever changing world ? Technical considerations leading to disruptive scenarios." Thesis, Evry, Institut national des télécommunications, 2013. http://www.theses.fr/2013TELE0011/document.

Full text
Abstract:
Le secteur des télécommunications passe par une phase délicate en raison de profondes mutations technologiques, principalement motivées par le développement de l'Internet. Elles ont un impact majeur sur l'industrie des télécommunications dans son ensemble et, par conséquent, sur les futurs déploiements des nouveaux réseaux, plateformes et services. L'évolution de l'Internet a un impact particulièrement fort sur les opérateurs des télécommunications (Telcos). En fait, l'industrie des télécommunications est à la veille de changements majeurs en raison de nombreux facteurs, comme par exemple la banalisation progressive de la connectivité, la domination dans le domaine des services de sociétés du web (Webcos), l'importance croissante de solutions à base de logiciels et la flexibilité qu'elles introduisent (par rapport au système statique des opérateurs télécoms). Cette thèse élabore, propose et compare les scénarios possibles basés sur des solutions et des approches qui sont technologiquement viables. Les scénarios identifiés couvrent un large éventail de possibilités: 1) Telco traditionnel; 2) Telco transporteur de Bits; 3) Telco facilitateur de Plateforme; 4) Telco fournisseur de services; 5) Disparition des Telco. Pour chaque scénario, une plateforme viable (selon le point de vue des opérateurs télécoms) est décrite avec ses avantages potentiels et le portefeuille de services qui pourraient être fournis
The telecommunications industry is going through a difficult phase because of profound technological changes, mainly originated by the development of the Internet. These changes have a major impact on the telecommunications industry as a whole and, consequently, on the future deployment of new networks, platforms, and services. The evolution of the Internet has a particularly strong impact on telecommunications operators (Telcos). In fact, the telecommunications industry is on the verge of major changes due to many factors, such as the gradual commoditization of connectivity, the dominance of web services companies (Webcos), and the growing importance of software-based solutions that introduce flexibility (compared to the static systems of telecom operators). This thesis develops, proposes, and compares plausible future scenarios based on solutions and approaches that will be technologically feasible and viable. The identified scenarios cover a wide range of possibilities: 1) Traditional Telco; 2) Telco as Bit Carrier; 3) Telco as Platform Provider; 4) Telco as Service Provider; 5) Telco Disappearance. For each scenario, a viable platform (from the point of view of telecom operators) is described, highlighting the enabled service portfolio and its potential benefits.
APA, Harvard, Vancouver, ISO, and other styles
8

PELUSO, VALENTINO. "Optimization Tools for ConvNets on the Edge." Doctoral thesis, Politecnico di Torino, 2020. http://hdl.handle.net/11583/2845792.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Busacca, Fabio Antonino. "AI for Resource Allocation and Resource Allocation for AI: a two-fold paradigm at the network edge." Doctoral thesis, Università degli Studi di Palermo, 2022. https://hdl.handle.net/10447/573371.

Full text
Abstract:
5G-and-beyond and Internet of Things (IoT) technologies are pushing a shift from the classic cloud-centric view of the network to a new edge-centric vision. In this perspective, computation, communication, and storage resources are moved closer to the user, to the benefit of network responsiveness/latency and of improved context-awareness, that is, the ability to tailor network services to the live user experience. However, these improvements do not come for free: edge networks are highly constrained and do not match the resource abundance of their cloud counterparts. Proper management of the few available resources is therefore of crucial importance to improve network performance in terms of responsiveness, throughput, and power consumption. However, networks in the so-called Age of Big Data result from the dynamic interactions of massive amounts of heterogeneous devices. As a consequence, traditional model-based Resource Allocation algorithms fail to cope with these dynamic and complex networks, and are being replaced by more flexible AI-based techniques. In this way, it is possible to design intelligent resource allocation frameworks, able to quickly adapt to the ever-changing dynamics of the network edge and to best exploit the few available resources. Hence, Artificial Intelligence (AI), and more specifically Machine Learning (ML) techniques, can clearly play a fundamental role in boosting and supporting resource allocation techniques at the edge. But can AI/ML benefit from optimal Resource Allocation? Recently, the evolution towards Distributed and Federated Learning approaches, i.e., where the learning process takes place in parallel at several devices, has brought important advantages in terms of reducing the computational load of ML algorithms and the amount of information transmitted by the network nodes, and in terms of privacy.
However, the scarceness of energy, processing, and possibly communication resources at the edge, especially in the IoT case, calls for proper resource management frameworks. In this view, the available resources should be assigned so as to reduce the learning time, while also keeping an eye on the energy consumption of the network nodes. According to this perspective, a two-fold paradigm can emerge at the network edge, where AI can boost the performance of Resource Allocation and, vice versa, optimal Resource Allocation techniques can speed up the learning process of AI algorithms. Part I of this thesis explores the first topic, i.e., the use of AI to support Resource Allocation at the edge, with a specific focus on two use cases, namely UAV-assisted cellular networks and vehicular networks. Part II deals instead with the topic of Resource Allocation for AI and, specifically, with the integration of Federated Learning techniques with the LoRa LPWAN protocol. The designed integration framework has been validated both in simulation environments and, most importantly, on the Colosseum platform, the biggest channel emulator in the world.
APA, Harvard, Vancouver, ISO, and other styles
10

Minerva, Roberto. "Will the Telco survive to an ever changing world ? Technical considerations leading to disruptive scenarios." Electronic Thesis or Diss., Evry, Institut national des télécommunications, 2013. http://www.theses.fr/2013TELE0011.

Full text
Abstract:
Le secteur des télécommunications passe par une phase délicate en raison de profondes mutations technologiques, principalement motivées par le développement de l'Internet. Elles ont un impact majeur sur l'industrie des télécommunications dans son ensemble et, par conséquent, sur les futurs déploiements des nouveaux réseaux, plateformes et services. L'évolution de l'Internet a un impact particulièrement fort sur les opérateurs des télécommunications (Telcos). En fait, l'industrie des télécommunications est à la veille de changements majeurs en raison de nombreux facteurs, comme par exemple la banalisation progressive de la connectivité, la domination dans le domaine des services de sociétés du web (Webcos), l'importance croissante de solutions à base de logiciels et la flexibilité qu'elles introduisent (par rapport au système statique des opérateurs télécoms). Cette thèse élabore, propose et compare les scénarios possibles basés sur des solutions et des approches qui sont technologiquement viables. Les scénarios identifiés couvrent un large éventail de possibilités: 1) Telco traditionnel; 2) Telco transporteur de Bits; 3) Telco facilitateur de Plateforme; 4) Telco fournisseur de services; 5) Disparition des Telco. Pour chaque scénario, une plateforme viable (selon le point de vue des opérateurs télécoms) est décrite avec ses avantages potentiels et le portefeuille de services qui pourraient être fournis
The telecommunications industry is going through a difficult phase because of profound technological changes, mainly originated by the development of the Internet. These changes have a major impact on the telecommunications industry as a whole and, consequently, on the future deployment of new networks, platforms, and services. The evolution of the Internet has a particularly strong impact on telecommunications operators (Telcos). In fact, the telecommunications industry is on the verge of major changes due to many factors, such as the gradual commoditization of connectivity, the dominance of web services companies (Webcos), and the growing importance of software-based solutions that introduce flexibility (compared to the static systems of telecom operators). This thesis develops, proposes, and compares plausible future scenarios based on solutions and approaches that will be technologically feasible and viable. The identified scenarios cover a wide range of possibilities: 1) Traditional Telco; 2) Telco as Bit Carrier; 3) Telco as Platform Provider; 4) Telco as Service Provider; 5) Telco Disappearance. For each scenario, a viable platform (from the point of view of telecom operators) is described, highlighting the enabled service portfolio and its potential benefits.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Intelligent Edge Networks"

1

Groscurth, Chris R. Future-Ready Leadership. ABC-CLIO, LLC, 2018. http://dx.doi.org/10.5040/9798400655357.

Full text
Abstract:
Provides executive leadership teams with the information, tools, and advice they need to lead their organizations into the "future of work," characterized by transformative, smart, and connected technologies already under way, including artificial intelligence, the Internet of Things, and automation. The technological and economic forces of the fourth industrial revolution (4IR) are shifting organizations in radical new directions. Automation is taking place not only in factories but also in retail environments, and it is not just powerful or precise: it is intelligent, and it learns. Leaders must learn to rely on new sources of data, analytics, and intelligence in their efforts to anticipate emerging trends, forecast unforeseen consequences, make sense of systems and complexity, communicate constantly, build strong networks based on trust, and ultimately, win a following. Future-Ready Leadership is an invaluable resource for leaders and leadership educators seeking to transform 4IR trends into a source of collaborative (as opposed to competitive) advantage. A blueprint for reshaping the future of work, the book meets readers' "awareness need" by exploring cutting-edge research on technology's impact on the workplace. Each chapter uses data to set up a specific future-of-work leadership challenge, offering readers practical solutions and advice, actionable recommendations, and tools for reflection and action that can be put into practice right away.
APA, Harvard, Vancouver, ISO, and other styles
2

Jantsch, Axel, Amir M. Rahmani, Pasi Liljeberg, and Jürgo-Sören Preden. Fog Computing in the Internet of Things: Intelligence at the Edge. Springer, 2018.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Jantsch, Axel, Amir M. Rahmani, Pasi Liljeberg, and Jürgo-Sören Preden. Fog Computing in the Internet of Things: Intelligence at the Edge. Springer, 2017.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Stachnio, Konrad. Civilization in Overdrive: Conversations at the Edge of the Human Future. Clarity Press, Inc., 2020.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Zhang, Liang-Jie, Bedir Tekinerdogan, Shijun Liu, and Mikio Aoyama. Edge Computing – EDGE 2018: Second International Conference, Held as Part of the Services Conference Federation, SCF 2018, Seattle, WA, USA, June ... Springer, 2018.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Muggleton, Stephen, and Nicholas Chater, eds. Human-Like Machine Intelligence. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198862536.001.0001.

Full text
Abstract:
In recent years there has been increasing excitement concerning the potential of Artificial Intelligence to transform human society. This book addresses the leading edge of research in this area. The research described aims to address present incompatibilities between human and machine reasoning and learning approaches. According to the influential US funding agency DARPA (originator of the Internet and self-driving cars), this new area represents the Third Wave of Artificial Intelligence (3AI, 2020s–2030s) and is being actively investigated in the US, Europe, and China. The EPSRC's UK network on Human-Like Computing (HLC) was one of the first internationally to initiate and support research specifically in this area. Starting activities in 2018, the network represents around sixty leading UK groups of Artificial Intelligence and Cognitive Science researchers involved in the development of the interdisciplinary area of HLC. The research of network groups aims to address key unsolved problems at the interface between Psychology and Computer Science. The chapters of this book have been authored by a mixture of these UK and other international specialists, based on recent workshops and discussions at the Machine Intelligence 20 and 21 workshops (2016, 2019) and the Third Wave Artificial Intelligence workshop (2019). Some of the key questions addressed by the Human-Like Computing programme include how AI systems might 1) explain their decisions effectively, 2) interact with human beings in natural language, 3) learn from small numbers of examples, and 4) learn with minimal supervision. Solving such fundamental problems involves new foundational research both in the Psychology of perception and interaction and in the development of novel algorithmic approaches in Artificial Intelligence.
APA, Harvard, Vancouver, ISO, and other styles
7

Civilization in Overdrive: Conversations at the Edge of the Human Future. Clarity Press, Inc., 2020.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Taylor, Brian L. Machine Learning: A Quick Guide to Artificial Intelligence, Neural Network and Cutting Edge Deep Learning Techniques for Beginners. Independently Published, 2019.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Madhu, G., Sandeep Kautish, A. Govardhan, and Avinash Sharma, eds. Emerging Computational Approaches in Telehealth and Telemedicine: A Look at The Post-COVID-19 Landscape. BENTHAM SCIENCE PUBLISHERS, 2022. http://dx.doi.org/10.2174/97898150792721220101.

Full text
Abstract:
This book gives an overview of innovative approaches in telehealth and telemedicine. The goal of the content is to inform readers about recent computer applications in e-health, including Internet of Things (IoT) and Internet of Medical Things (IoMT) technology. The 9 chapters will guide readers in determining the urgency to intervene in specific medical cases and in assessing risk to healthcare workers. The focus on telehealth, along with telemedicine, encompasses a broader spectrum of remote healthcare services for the reader to understand. Chapters cover the following topics: - A COVID-19 care system for virus precaution, prevention, and treatment - The Internet of Things (IoT) in Telemedicine - Artificial Intelligence for Remote Patient Monitoring systems - Machine Learning in Telemedicine - Convolutional Neural Networks for the detection and prediction of melanoma in skin lesions - COVID-19 virus contact tracing via mobile apps - IoT and Cloud convergence in healthcare - Lung cancer classification and detection using deep learning - Telemedicine in India. This book will assist students, academics, and medical professionals in learning about cutting-edge telemedicine technologies. It will also inform beginner researchers in medicine about upcoming trends, problems, and future research paths in telehealth and telemedicine for infectious disease control and cancer diagnosis.
APA, Harvard, Vancouver, ISO, and other styles
10

Falco, Gregory J., and Eric Rosenbach. Confronting Cyber Risk. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780197526545.001.0001.

Full text
Abstract:
Confronting Cyber Risk: An Embedded Endurance Strategy for Cybersecurity is a practical leadership handbook defining a new strategy for improving cybersecurity and mitigating cyber risk. Written by two leading experts with extensive professional experience in cybersecurity, the book provides CEOs and cyber newcomers alike with novel, concrete guidance on how to implement a cutting-edge strategy to mitigate an organization’s overall risk to malicious cyberattacks. Using short, real-world case studies, the book highlights the need to address attack prevention and the resilience of each digital asset while also accounting for an incident’s potential impact on overall operations. In a world of hackers, artificial intelligence, and persistent ransomware attacks, the Embedded Endurance strategy embraces the reality of interdependent digital assets and provides an approach that addresses cyber risk at both the micro level (people, networks, systems and data) and the macro level (the organization). Most books about cybersecurity focus entirely on technology; the Embedded Endurance strategy recognizes the need for sophisticated thinking about hardware and software while also extending beyond to address operational, reputational and litigation risk. This book both provides the reader with a solid grounding in important prevention-focused technologies—such as cloud-based security and intrusion detection—and emphasizes the important role of incident response. By implementing an Embedded Endurance strategy, you can guide your team to blunt major cyber incidents with preventative and resilience measures engaged systematically across your organization.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Intelligent Edge Networks"

1

Yao, Haipeng, and Mohsen Guizani. "Mobile Edge Computing Enabled Intelligent IoT." In Wireless Networks, 271–350. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-26987-5_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Thiruvasagam, Prabhu Kaliyammal, and Manikantan Srinivasan. "Intelligent edge computing for B5G networks." In AI in Wireless for Beyond 5G Networks, 122–46. Boca Raton: CRC Press, 2023. http://dx.doi.org/10.1201/9781003303527-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Jin, Wenquan, Minh Quang Hoang, Luong Trung Kien, and Le Anh Ngoc. "Continuous Deep Learning Based on Knowledge Transfer in Edge Computing." In Intelligent Systems and Networks, 488–95. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-4725-6_59.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Michail-Alexandros, Kourtis, Christinakis Dimitris, Xilouris George, Thanos Sarlas, Soenen Thomas, and Kourtis Anastasios. "Evaluation of Edge Technologies Over 5G Networks." In Advances in Intelligent Systems and Computing, 407–17. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-40690-5_40.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Jin, Wenquan, Vijender Kumar Solanki, Anh Ngoc Le, and Dohyeun Kim. "Real-Time Inference Approach Based on Gateway-Centric Edge Computing for Intelligent Services." In Intelligent Systems and Networks, 355–61. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-2094-2_44.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

To, Hai-Thien, Trung-Kien Le, and Chi-Luan Le. "Real-Time End-to-End 3D Human Pose Prediction on AI Edge Devices." In Intelligent Systems and Networks, 248–55. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-2094-2_31.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Nguyen, Thuong H. N., Quy C. Nguyen, Viet H. H. Ngo, Fabien Ferrero, and Tuan V. Pham. "Edge AI Implementation for Recognizing Sounds Created by Human Activities in Smart Offices Design Concepts." In Intelligent Systems and Networks, 608–14. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-3394-3_70.

8. Dong, Hoang-Nhu, Nguyen-Xuan Ha, and Dang-Minh Tuan. "A New Approach for Large-Scale Face-Feature Matching Based on LSH and FPGA for Edge Processing." In Intelligent Systems and Networks, 337–44. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-2094-2_42.

9. Gaurav, Akshat, B. B. Gupta, and Kwok Tai Chui. "Edge Computing-Based DDoS Attack Detection for Intelligent Transportation Systems." In Lecture Notes in Networks and Systems, 175–84. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-8664-1_16.

10. Márquez-Sánchez, Sergio, Sergio Alonso-Rollán, Francisco Pinto-Santos, Aiman Erbad, Muhammad Hanan Abdul Ibrar, Javier Hernandez Fernandez, Mahdi Houchati, and Juan Manuel Corchado. "Adaptive and Intelligent Edge Computing Based Building Energy Management System." In Lecture Notes in Networks and Systems, 37–48. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-36957-5_4.


Conference papers on the topic "Intelligent Edge Networks"

1. Tang, Jianhang, Jiangtian Nie, Wei Yang Bryan Lim, Yang Zhang, Zehui Xiong, Dusit Niyato, and Mohsen Guizani. "Intelligent Edge-Aided Network Slicing for 5G and Beyond Networks." In ICC 2022 - IEEE International Conference on Communications. IEEE, 2022. http://dx.doi.org/10.1109/icc45855.2022.9882270.

2. Hu, Haoji, Hangguan Shan, Zhuolin Zheng, Zhuojun Huang, Chengfei Cai, Chuankun Wang, Xiaojian Zhen, Lu Yu, Zhaoyang Zhang, and Tony Q. S. Quek. "Intelligent Video Surveillance based on Mobile Edge Networks." In 2018 IEEE International Conference on Communication Systems (ICCS). IEEE, 2018. http://dx.doi.org/10.1109/iccs.2018.8689194.

3. Lloret, Jaime. "Intelligent systems for multimedia delivery in software defined networks." In 2018 Third International Conference on Fog and Mobile Edge Computing (FMEC). IEEE, 2018. http://dx.doi.org/10.1109/fmec.2018.8364037.

4. Li, Zhidu, Ji Lv, and Dapeng Wu. "Intelligent Emotion Detection Method in Mobile Edge Computing Networks." In 2020 IEEE/CIC International Conference on Communications in China (ICCC). IEEE, 2020. http://dx.doi.org/10.1109/iccc49849.2020.9238777.

5. Hesselbach, Xavier. "Intelligent Network Slicing in the Multi-Access Edge Computing for 6G Networks." In 2023 23rd International Conference on Transparent Optical Networks (ICTON). IEEE, 2023. http://dx.doi.org/10.1109/icton59386.2023.10207532.

6. Kabir, Maliha, Teja Sree Mummadi, and Prabha Sundaravadivel. "Poster: Towards Edge-Intelligent Wearable for early Drowning Detection." In WUWNet'22: The 16th International Conference on Underwater Networks & Systems. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3567600.3569547.

7. Shi, Zhen-gang, and Qin-zi Li. "Edge Detection for Medical Image Based on PSO Algorithm." In 2010 3rd International Conference on Intelligent Networks and Intelligent Systems (ICINIS). IEEE, 2010. http://dx.doi.org/10.1109/icinis.2010.23.

8. Zhang, Xiuli, and Wei Liu. "The Research on the Methods of Image Edge Detection." In 2010 3rd International Conference on Intelligent Networks and Intelligent Systems (ICINIS). IEEE, 2010. http://dx.doi.org/10.1109/icinis.2010.70.

9. Horváth, Márton Áron. "Utilization of AI in 5G Edge Networks." In 1st Workshop on Intelligent Infocommunication Networks, Systems and Services (WI2NS2). Online: Budapest University of Technology and Economics, 2023. http://dx.doi.org/10.3311/wins2023-017.

10. Yang, Huapeng, Zhangqin Huang, Yu Liang, Xiaobo Zhang, Ling Huang, and Shen Qiu. "IVAS: An Intelligent Video Analysis System based on Edge Computing." In 2023 IEEE 48th Conference on Local Computer Networks (LCN). IEEE, 2023. http://dx.doi.org/10.1109/lcn58197.2023.10223361.


Reports on the topic "Intelligent Edge Networks"

1. Ruvinsky, Alicia, Timothy Garton, Daniel Chausse, Rajeev Agrawal, Harland Yu, and Ernest Miller. Accelerating the tactical decision process with High-Performance Computing (HPC) on the edge: motivation, framework, and use cases. Engineer Research and Development Center (U.S.), September 2021. http://dx.doi.org/10.21079/11681/42169.

Abstract:
Managing the ever-growing volume and velocity of data across the battlefield is a critical problem for warfighters. Solving this problem will require a fundamental change in how battlefield analyses are performed. A new approach to making decisions on the battlefield will eliminate data transport delays by moving the analytical capabilities closer to data sources. Decision cycles depend on the speed at which data can be captured and converted to actionable information for decision making. Real-time situational awareness is achieved by locating computational assets at the tactical edge. Accelerating the tactical decision process leverages capabilities in three technology areas: (1) High-Performance Computing (HPC), (2) Machine Learning (ML), and (3) Internet of Things (IoT). Exploiting these areas can reduce network traffic and shorten the time required to transform data into actionable information. Faster decision cycles may revolutionize battlefield operations. Presented is an overview of an artificial intelligence (AI) system design for near-real-time analytics in a tactical operational environment executing on co-located, mobile HPC hardware. The report contains the following sections: (1) an introduction describing motivation, background, and the state of the technology; (2) a description of the tactical decision process leveraging HPC, including the problem definition and a use case; and (3) the design of an HPC tactical data analytics framework enabling data to decisions.