Academic literature on the topic 'Edge computing with artificial intelligence'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Edge computing with artificial intelligence.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Edge computing with artificial intelligence"

1

Chen, Songlin, Hong Wen, and Jinsong Wu. "Artificial Intelligence Based Traffic Control for Edge Computing Assisted Vehicle Networks." Journal of Internet Technology (網際網路技術學刊) 23, no. 5 (September 2022): 989–96. http://dx.doi.org/10.53106/160792642022092305007.

Full text
Abstract:
Edge computing supported vehicle networks have attracted considerable attention in recent years from both industry and academia due to their extensive applications in urban traffic control systems. We present a general overview of Artificial Intelligence (AI)-based traffic control approaches, focusing mainly on dynamic traffic control via edge computing devices. A collaborative edge computing network embedded in the AI-based traffic control system is proposed to process the massive data from roadside sensors and shorten the real-time response time, which supports efficient traffic control and maximizes the utilization of computing resources according to the incident levels associated with different rescue schemes. Furthermore, several open research issues are discussed and future directions indicated.
APA, Harvard, Vancouver, ISO, and other styles
2

Deng, Shuiguang, Hailiang Zhao, Weijia Fang, Jianwei Yin, Schahram Dustdar, and Albert Y. Zomaya. "Edge Intelligence: The Confluence of Edge Computing and Artificial Intelligence." IEEE Internet of Things Journal 7, no. 8 (August 2020): 7457–69. http://dx.doi.org/10.1109/jiot.2020.2984887.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Debauche, Olivier, Meryem Elmoulat, Saïd Mahmoudi, Sidi Ahmed Mahmoudi, Adriano Guttadauria, Pierre Manneback, and Frédéric Lebeau. "Towards Landslides Early Warning System With Fog-Edge Computing And Artificial Intelligence." Journal of Ubiquitous Systems and Pervasive Networks 15, no. 02 (March 1, 2021): 11–17. http://dx.doi.org/10.5383/juspn.15.02.002.

Full text
Abstract:
Landslides are phenomena that cause significant human and economic losses. Researchers have investigated the prediction of high landslide susceptibility with various methodologies based upon statistical and mathematical models, in addition to artificial intelligence tools. These methodologies make it possible to determine the areas that could present a serious risk of landslides. Monitoring these risky areas is particularly important for developing an Early Warning System (EWS). As a matter of fact, the variety of landslide types makes their monitoring a sophisticated task to accomplish. Indeed, each landslide area has its own specificities and potential triggering factors; therefore, there is no single device that can monitor all types of landslides. Consequently, Wireless Sensor Networks (WSN) combined with the Internet of Things (IoT) allow large-scale data acquisition systems to be set up. In addition, recent advances in Artificial Intelligence (AI) and Federated Learning (FL) make it possible to develop performant algorithms that analyze these data and predict landslide events early at the edge level (on gateways). These algorithms are in this case trained at the fog level on specific hardware. The novelty of the work proposed in this paper is the integration of Federated Learning based on Fog-Edge approaches to continuously improve prediction models.
APA, Harvard, Vancouver, ISO, and other styles
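
The fog-edge setup described in the abstract above, where prediction models are trained locally and then combined to keep improving a shared model, is essentially federated averaging. Below is a minimal, self-contained sketch of that aggregation step; the gateway weight vectors and sample counts are invented for illustration and are not taken from the paper.

```python
import numpy as np

def federated_average(local_weights, sample_counts):
    """Aggregate gateway models by a sample-weighted average (FedAvg-style)."""
    total = sum(sample_counts)
    return sum(w * (n / total) for w, n in zip(local_weights, sample_counts))

# Hypothetical linear-model weights trained locally on three edge gateways.
gateway_weights = [np.array([0.8, -0.2]), np.array([1.1, 0.1]), np.array([0.9, -0.1])]
gateway_samples = [1200, 300, 500]  # number of local sensor readings per gateway

global_weights = federated_average(gateway_weights, gateway_samples)
print("Aggregated global model weights:", global_weights)
```

Weighting by the number of local samples lets gateways that observed more sensor data contribute more to the shared model, which is the usual design choice in federated averaging.
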
4

Zhou, Zhi, Xu Chen, En Li, Liekang Zeng, Ke Luo, and Junshan Zhang. "Edge Intelligence: Paving the Last Mile of Artificial Intelligence With Edge Computing." Proceedings of the IEEE 107, no. 8 (August 2019): 1738–62. http://dx.doi.org/10.1109/jproc.2019.2918951.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Sathish. "Artificial Intelligence based Edge Computing Framework for Optimization of Mobile Communication." Journal of ISMAC 2, no. 3 (July 9, 2020): 160–65. http://dx.doi.org/10.36548/jismac.2020.3.004.

Full text
Abstract:
To improve mobile service quality and accelerate content delivery, edge computing techniques provide an optimal solution for bridging device requirements and cloud capacity at the network edge. Advances in technologies like edge computing and mobile communication have contributed greatly to these developments. The mobile edge system is enabled with Machine Learning techniques in order to improve the edge system's intelligence and to optimize communication, caching, and mobile edge computing. For this purpose, a smart framework is developed based on artificial intelligence, enabling the reduction of unwanted communication load on the system as well as the enhancement of applications and dynamic optimization of the system. The models can be trained more accurately using the learning parameters that are exchanged between the edge nodes and the collaborating devices. The adaptivity and cognitive ability of the system are enhanced for the mobile communication system despite the low learning overhead, helping to attain near-optimal performance. The opportunities and challenges of smart systems in the near future are also discussed in this paper.
APA, Harvard, Vancouver, ISO, and other styles
6

Sun, Junlei. "The Legal Regulation of Artificial Intelligence and Edge Computing Automation Decision-Making Risk in Wireless Network Communication." Wireless Communications and Mobile Computing 2022 (March 12, 2022): 1–13. http://dx.doi.org/10.1155/2022/1303252.

Full text
Abstract:
This article is aimed at studying the legal regulation of artificial intelligence and edge computing automated decision-making risks in wireless network communications. Data under artificial intelligence are full of flexibility and vitality, which has changed the way data exist throughout society. The core of artificial intelligence is its various algorithm programs, which determine its existence. In this environment, society develops rapidly with unstoppable momentum. However, from a legal perspective, artificial intelligence exhibits algorithmic discrimination, such as gender discrimination, clothing discrimination, and racial discrimination. It does not possess openness, objectivity, and accountability. The consequences are sometimes serious enough to endanger the public interest of the entire society, leading to market disorder, etc. Therefore, the problem of artificial intelligence algorithm discrimination remains to be solved. This article uses algorithms to adjust algorithm discrimination in order to reduce, to a certain extent, the harm it causes. First of all, this article introduces a regulatory-based edge cloud computing architecture model. It is noted that distributed cloud computing can use subsystems to manage various computing and storage resources and can make automated decisions when processing certain data. In order to reduce the impact of algorithm discrimination and promote data diversification that lowers the probability of discrimination, an edge computing network data capture system is designed. The article also describes the BP neural network model, which is divided into an input layer, an output layer, and a hidden layer. The training samples are passed from the input layer to the output layer through the hidden layer; if the output does not meet expectations, the error is back-propagated and the connection weights are adjusted continuously. This paper proposes a deep learning system model for real-time artificial intelligence driven by edge computing. When this model is applied to legal regulation, it can cooperate with edge computing and artificial intelligence algorithms to provide high-precision automated decision-making. Finally, this paper designs an artificial intelligence-assisted automated decision-making experiment based on the theory of legal computing. It proposes a Bayesian algorithm that merges edge algorithms into artificial intelligence and verifies the feasibility of this hypothesis through experiments. The experimental results show that it has a certain ability to regulate algorithmic discrimination caused by artificial intelligence under legal regulation. It can improve the regulatory effect of laws and regulations to a certain extent, and the clustering effect of the improved artificial intelligence Bayesian algorithm with edge computing is increased by about 7.2%.
APA, Harvard, Vancouver, ISO, and other styles
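
The BP neural network described in the abstract above (training samples flowing from the input layer through the hidden layer to the output layer, with errors back-propagated to adjust connection weights) follows the standard backpropagation loop. The sketch below illustrates that loop on synthetic data; the layer sizes, learning rate, and data are assumptions made for the example, not values from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((100, 4))                                # 100 samples, 4 input features (synthetic)
y = (X.sum(axis=1, keepdims=True) > 2).astype(float)    # toy binary target

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Input -> hidden -> output weights, as in the described three-layer BP network.
W1 = rng.normal(scale=0.5, size=(4, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))
lr = 0.5

for epoch in range(500):
    # Forward pass: input layer -> hidden layer -> output layer.
    hidden = sigmoid(X @ W1)
    output = sigmoid(hidden @ W2)

    # Error is back-propagated and the connection weights adjusted.
    err_out = (output - y) * output * (1 - output)
    err_hid = (err_out @ W2.T) * hidden * (1 - hidden)
    W2 -= lr * hidden.T @ err_out / len(X)
    W1 -= lr * X.T @ err_hid / len(X)

accuracy = ((output > 0.5) == y).mean()
print(f"training accuracy after 500 epochs: {accuracy:.2f}")
```
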
7

Elmoulat, Meryem, Olivier Debauche, Saïd Mahmoudi, Sidi Ahmed Mahmoudi, Pierre Manneback, and Frédéric Lebeau. "Edge Computing and Artificial Intelligence for Landslides Monitoring." Procedia Computer Science 177 (2020): 480–87. http://dx.doi.org/10.1016/j.procs.2020.10.066.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Huh, Jun-Ho, and Yeong-Seok Seo. "Understanding Edge Computing: Engineering Evolution With Artificial Intelligence." IEEE Access 7 (2019): 164229–45. http://dx.doi.org/10.1109/access.2019.2945338.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Wang, Shanshan. "Enterprise Management Optimization by Using Artificial Intelligence and Edge Computing." International Journal of Distributed Systems and Technologies 13, no. 3 (July 1, 2022): 1–9. http://dx.doi.org/10.4018/ijdst.307994.

Full text
Abstract:
In the internet era, huge volumes of data are generated every day. With the help of cloud computing, enterprises can store and analyze these data more conveniently. With the emergence of the Internet of Things, more hardware devices have connected to the network and produced massive data. These data rely heavily on cloud computing for centralized processing and analysis. However, the rapid growth of data volume has exceeded the network throughput capacity of cloud computing. By deploying computing nodes at the edge of the local network, edge computing allows devices to complete data collection and preprocessing within the local network. Thus, it can overcome cloud computing's problems of low efficiency and long transmission delays for massive native data. This paper designs a human trajectory training system for enterprise management. The simulation demonstrates that the system can support human trajectory tracing and prediction for enterprise management.
APA, Harvard, Vancouver, ISO, and other styles
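
The pattern the abstract above relies on, collecting and preprocessing data within the local network so that only a compact summary travels to the cloud, can be illustrated with a short sketch. The sensor fields, window contents, and aggregation choices are hypothetical.

```python
import json
import statistics

def preprocess_window(readings):
    """Aggregate a window of raw sensor readings on the edge node
    so only a compact summary is sent to the cloud."""
    values = [r["value"] for r in readings]
    return {
        "device": readings[0]["device"],
        "count": len(values),
        "mean": statistics.fmean(values),
        "max": max(values),
        "min": min(values),
    }

# Hypothetical raw readings collected in the local network.
raw = [{"device": "sensor-01", "value": v} for v in (21.4, 21.9, 35.0, 22.1, 21.7)]
payload = json.dumps(preprocess_window(raw))
print("summary forwarded to cloud:", payload)  # far smaller than the raw stream
```
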
10

Radanliev, Petar, David De Roure, Kevin Page, Max Van Kleek, Omar Santos, La’Treall Maddox, Pete Burnap, Eirini Anthi, and Carsten Maple. "Design of a dynamic and self-adapting system, supported with artificial intelligence, machine learning and real-time intelligence for predictive cyber risk analytics in extreme environments – cyber risk in the colonisation of Mars." Safety in Extreme Environments 2, no. 3 (October 2020): 219–30. http://dx.doi.org/10.1007/s42797-021-00025-1.

Full text
Abstract:
Multiple governmental agencies and private organisations have made commitments to the colonisation of Mars. Such colonisation requires complex systems and infrastructure that could be very costly to repair or replace in cases of cyber-attacks. This paper surveys deep learning algorithms, IoT cyber security and risk models, and established mathematical formulas to identify the best approach for developing a dynamic and self-adapting system for predictive cyber risk analytics supported with Artificial Intelligence and Machine Learning and real-time intelligence in edge computing. The paper presents a new mathematical approach for integrating concepts for cognition engine design, edge computing, and Artificial Intelligence and Machine Learning to automate anomaly detection. This engine instigates a step change by applying Artificial Intelligence and Machine Learning embedded at the edge of IoT networks to deliver safe and functional real-time intelligence for predictive cyber risk analytics. This will enhance capacities for risk analytics and assist in the creation of a comprehensive and systematic understanding of the opportunities and threats that arise when edge computing nodes are deployed and when Artificial Intelligence and Machine Learning technologies are migrated to the periphery of the internet and into local IoT networks.
APA, Harvard, Vancouver, ISO, and other styles
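
One concrete way to read the abstract's "anomaly detection embedded at the edge of IoT networks" is a lightweight rolling statistic computed on the device itself. The sketch below uses a rolling z-score check; the threshold, window length, and traffic values are assumptions, not details from the paper.

```python
from collections import deque
import math

def zscore_anomalies(stream, window=20, threshold=3.0):
    """Flag values that deviate strongly from the recent rolling window."""
    recent = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(stream):
        if len(recent) >= 5:
            mean = sum(recent) / len(recent)
            var = sum((v - mean) ** 2 for v in recent) / len(recent)
            std = math.sqrt(var) or 1e-9
            if abs(x - mean) / std > threshold:
                flagged.append((i, x))
        recent.append(x)
    return flagged

# Hypothetical network-traffic metric sampled on an edge node.
metric = [10, 11, 9, 10, 12, 11, 10, 95, 10, 11, 9, 10]
print("anomalous samples:", zscore_anomalies(metric))
```
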

Dissertations / Theses on the topic "Edge computing with artificial intelligence"

1

Antonini, Mattia. "From Edge Computing to Edge Intelligence: exploring novel design approaches to intelligent IoT applications." Doctoral thesis, Università degli studi di Trento, 2021. http://hdl.handle.net/11572/308630.

Full text
Abstract:
The Internet of Things (IoT) has deeply changed how we interact with our world. Today, smart homes, self-driving cars, connected industries, and wearables are just a few mainstream applications where IoT plays the role of enabling technology. When IoT became popular, Cloud Computing was already a mature technology able to deliver the computing resources necessary to execute heavy tasks (e.g., data analytics, storage, AI tasks, etc.) on data coming from IoT devices, so practitioners started to design and implement their applications exploiting this approach. However, after a hype that lasted for a few years, cloud-centric approaches have started showing some of their main limitations when dealing with the connectivity of many devices with remote endpoints, such as high latency, bandwidth usage, big data volumes, reliability, privacy, and so on. At the same time, a few new distributed computing paradigms emerged and gained attention. Among them, Edge Computing allows the execution of applications to be shifted to the edge of the network (a partition of the network physically close to data sources) and provides improvements over the Cloud Computing paradigm. Its success has been fostered by new powerful embedded computing devices able to satisfy the ever-increasing computing requirements of many IoT applications. Given this context, how can next-generation IoT applications take advantage of the opportunity offered by Edge Computing to shift the processing from the cloud toward the data sources and exploit ever-more-powerful devices? This thesis provides the ingredients and the guidelines for practitioners to foster the migration from cloud-centric to novel distributed design approaches for IoT applications at the edge of the network, addressing the issues of the original approach. This requires designing the processing pipeline of applications by considering the system requirements and the constraints imposed by embedded devices. To make this process smoother, the transition is split into different steps, starting with the off-loading of processing (including Artificial Intelligence algorithms) to the edge of the network, then the distribution of computation across multiple edge devices and even closer to data sources based on system constraints, and, finally, the optimization of the processing pipeline and AI models to run efficiently on target IoT edge devices. Each step has been validated by delivering a real-world IoT application that fully exploits the novel approach. This paradigm shift leads the way toward the design of Edge Intelligence IoT applications that efficiently and reliably execute Artificial Intelligence models at the edge of the network.
APA, Harvard, Vancouver, ISO, and other styles
2

WoldeMichael, Helina Getachew. "Deployment of AI Model inside Docker on ARM-Cortex-based Single-Board Computer : Technologies, Capabilities, and Performance." Thesis, Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17267.

Full text
Abstract:
IoT has become tremendously popular. It provides information access, processing, and connectivity for a huge number of devices and sensors. IoT systems, however, often do not process the information locally but rather send it to remote locations in the Cloud. As a result, this adds a huge amount of data traffic to the network and additional delay to data processing. The latter can have a significant impact on applications that require fast response times, such as sophisticated artificial intelligence (AI) applications including augmented reality, face recognition, and object detection. Consequently, the edge computing paradigm, which enables computation on data near its source, has gained significant importance for achieving fast response times in recent years. IoT devices can be employed to provide computational resources at the edge of the network, near the sensors and actuators. The aim of this thesis work is to design and implement a kind of edge computing concept that brings AI models to a small embedded IoT device by the use of virtualization concepts. The use of virtualization technology enables the easy packing and shipping of applications to different hardware platforms. Additionally, this enables the mobility of AI models between edge devices and the Cloud. We implement an AI model inside a Docker container, which is deployed on a Firefly-RK3399 single-board computer (SBC). Furthermore, we conduct CPU and memory performance evaluations of Docker on the Firefly-RK3399. The methodology adopted to reach our goal is experimental research. First, different literature was studied to demonstrate by implementation the feasibility of our concept. Then we set up an experiment that covers the measurement of performance metrics by applying synthetic load in multiple scenarios. Results are validated by repeating the experiment and by statistical analysis. The results of this study show that an AI model can successfully be deployed and executed inside a Docker container on an ARM-Cortex-based single-board computer. A Docker image of the OpenFace face recognition model is built for the ARM architecture of the Firefly SBC. On the other hand, the performance evaluation reveals that the performance overhead of Docker in terms of CPU and memory is negligible. The research work covers the mechanisms by which AI applications can be containerized on the ARM architecture. We conclude that the methods can be applied to containerize software applications on ARM-based IoT devices. Furthermore, the insignificant overhead introduced by Docker facilitates the deployment of applications inside a container with little performance overhead. The functionality of the IoT device, i.e., the Firefly-RK3399, is exploited in this thesis. It is shown that the device is capable and powerful and gives an insight for further studies.
APA, Harvard, Vancouver, ISO, and other styles
3

Chen, Hsinchun, Jay F. Nunamaker, Richard E. Orwig, and Olga Titkova. "Information Visualization for Collaborative Computing." IEEE, 1998. http://hdl.handle.net/10150/105495.

Full text
Abstract:
Artificial Intelligence Lab, Department of MIS, University of Arizona
A prototype tool classifies output from an electronic meeting system into a manageable list of concepts, topics, or issues that a group can further evaluate. In an experiment with output from GroupSystems electronic meeting system, the tool's recall ability was comparable to that of a human facilitator, but took roughly a sixth of the time.
APA, Harvard, Vancouver, ISO, and other styles
4

Hutson, Matt 1978. "Artificial intelligence and musical creativity : computing Beethoven's tenth." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/85756.

Full text
Abstract:
Thesis (S.M. in Science Writing)--Massachusetts Institute of Technology, Dept. of Humanities, Program in Writing and Humanistic Studies, 2003.
Includes bibliographical references (p. 47-48).
by Matthew T. Hutson.
S.M. in Science Writing
APA, Harvard, Vancouver, ISO, and other styles
5

Shan, Qingshan. "Artificial intelligence for identifying impacts on smart composites." Thesis, Southampton Solent University, 2004. http://ssudl.solent.ac.uk/600/.

Full text
Abstract:
Identification of low-velocity impacts to composite structures has become increasingly important in the aerospace industry. Knowing when impacts have occurred would allow inspections to be scheduled only when necessary, and knowing the approximate impact location would allow for a localized search, saving time and expense. Additionally, an estimation of the impact magnitude could be used for damage prediction. This study experimentally investigated a methodology for impact identification. To achieve this approach, the following issues were covered in this study: impact detection, signal processing, feature extraction, and impact identification. In impact detection, the smart structures, two piezoelectric sensors embedded in composite structures, are designed to measure impact signals caused by foreign-object impact events. The impact signals were stored in computer system memory through the impact monitoring system developed in this study. In signal processing, the cross-correlation method was used to process the measured impact signals. This processing built the correlation between the impact signals and the location of impacts as well as the impact magnitude. In feature extraction, the initial feature data were gained from the cross-correlation results through point and segmentation processing. The final feature data were selected from the initial feature data with a fuzzy clustering method. In impact identification, adaptive neuro-fuzzy inference systems (ANFIS) were built with the feature data to identify the abscissas of the impact location, the ordinates of the impact location, and the impact magnitude. The parameters of the ANFISs were refined with a hybrid learning rule, i.e., the combination of Least Squares Estimation and the Steepest Descent algorithm. Real-time software developed in Visual Basic code manipulated the monitoring and control system for the impact experiments. Also, a software package developed with MATLAB implemented the impact identification and the system simulation.
APA, Harvard, Vancouver, ISO, and other styles
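
The cross-correlation step the abstract describes, relating the signals of the two embedded piezoelectric sensors to the impact location, can be sketched with NumPy as below. The synthetic transient, sampling rate, and injected delay are assumptions used only to show how the arrival-time difference is recovered.

```python
import numpy as np

fs = 50_000                      # assumed sampling rate in Hz
t = np.arange(0, 0.01, 1 / fs)
pulse = np.exp(-((t - 0.002) / 0.0004) ** 2)   # synthetic impact transient

delay_samples = 37               # unknown in practice; recovered below
sensor_a = pulse + 0.02 * np.random.default_rng(1).normal(size=t.size)
sensor_b = np.roll(pulse, delay_samples) + 0.02 * np.random.default_rng(2).normal(size=t.size)

# Cross-correlate the two sensor signals and locate the lag of peak correlation.
xcorr = np.correlate(sensor_b, sensor_a, mode="full")
lag = np.argmax(xcorr) - (sensor_a.size - 1)
print(f"estimated arrival-time difference: {lag} samples ({lag / fs * 1e6:.0f} µs)")
```
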
6

Tsui, Kwok Ching. "Neural network design using evolutionary computing." Thesis, King's College London (University of London), 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299918.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Samikwa, Eric. "Flood Prediction System Using IoT and Artificial Neural Networks with Edge Computing." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-280299.

Full text
Abstract:
Flood disasters affect millions of people across the world by causing severe loss of life and colossal damage to property. The Internet of Things (IoT) has been applied in areas such as flood prediction, flood monitoring, and flood detection. Although IoT technologies cannot stop the occurrence of flood disasters, they are exceptionally valuable apparatus for conveying disaster readiness and counteractive action data. Advances have been made in flood prediction using artificial neural networks (ANN). Despite the various advancements in flood prediction systems through the use of ANN, there has been less focus on the utilisation of edge computing for improved efficiency and reliability of such systems. In this thesis, a system for short-term flood prediction that uses IoT and ANN, where the prediction computation is carried out on a low-power edge device, is proposed. The system monitors real-time rainfall and water-level sensor data and predicts flood water levels ahead of time using long short-term memory (LSTM). The system can be deployed on battery power as it uses low-power IoT devices and communication technology. The results of evaluating a prototype of the system indicate good performance in terms of flood prediction accuracy and response time. The application of ANN with edge computing will help improve the efficiency of real-time flood early warning systems by bringing the prediction computation close to where data are collected.
APA, Harvard, Vancouver, ISO, and other styles
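
A minimal sketch of the kind of LSTM water-level predictor the thesis describes is shown below, using Keras. The window length, network size, and synthetic rainfall and water-level series are assumptions for illustration rather than the thesis configuration; in a real deployment the trained model would run on the low-power edge device.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(42)
steps = 500
rainfall = rng.gamma(shape=2.0, scale=1.0, size=steps)
water_level = np.convolve(rainfall, np.ones(5) / 5, mode="same") + 0.1 * rng.normal(size=steps)

# Build sliding windows: 24 past (rainfall, level) pairs -> next water level.
window = 24
features = np.stack([rainfall, water_level], axis=1)
X = np.stack([features[i:i + window] for i in range(steps - window)])
y = water_level[window:]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 2)),
    tf.keras.layers.LSTM(32),          # long short-term memory layer
    tf.keras.layers.Dense(1),          # predicted water level at the next step
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

print("next-step water-level forecast:", float(model.predict(X[-1:], verbose=0)[0, 0]))
```
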
8

Na, Jongwhoa. "Design and simulation of digital optical computing systems for artificial intelligence." Diss., The University of Arizona, 1994. http://hdl.handle.net/10150/186989.

Full text
Abstract:
Rule-based systems (RBSs) are one of the problem-solving methodologies in artificial intelligence. Although RBSs have vast potential in many application areas, the slow execution speed of current RBSs has prevented them from fully exploiting that potential. In this dissertation, to improve the speed of RBSs, we explore the use of optics for fast and parallel RBS architectures. First, we propose an electro-optical rule-based system (EORBS). Using two-dimensional knowledge representation and a monotonic reasoning scheme, EORBS provides a highly efficient implementation of the basic operations needed in rule-based systems, namely matching, selection, and rule firing. The execution speed of the proposed system is theoretically estimated and shown to be two orders of magnitude faster than current electronic systems. Although EORBS shows the best performance in execution speed compared to other RBSs, the monotonic reasoning scheme restricts its application domains. In order to overcome this limitation on the application domain of EORBS, a general-purpose RBS, called an Optical Content-Addressable Parallel Processor for Expert Systems (OCAPP-ES), is proposed. Using a general knowledge representation scheme and a parallel conflict resolution scheme, OCAPP-ES executes the three basic RBS operations on general knowledge (including variables, symbols, and numbers) in a highly parallel fashion. The performance of OCAPP-ES is theoretically estimated and shown to be an order of magnitude slower than that of EORBS. However, it is still an order of magnitude faster than any other RBS. Furthermore, OCAPP-ES is designed to support the general knowledge representation scheme so that it can be a high-speed general-purpose RBS. To verify the proposed architectures, we developed a modeling and simulation methodology for digital optical computing systems. The methodology predicts the maximum performance of a given optical computing architecture and evaluates its feasibility. As an application example, we apply this methodology to evaluate the feasibility and performance of OCAPP, which is the optical match unit of OCAPP-ES. The proposed methodology is intended to reduce optical computing systems' design time as well as the design risk associated with building a prototype system.
APA, Harvard, Vancouver, ISO, and other styles
9

Wagy, Mark David. "Enabling Machine Science through Distributed Human Computing." ScholarWorks @ UVM, 2016. http://scholarworks.uvm.edu/graddis/618.

Full text
Abstract:
Distributed human computing techniques have been shown to be effective ways of accessing the problem-solving capabilities of a large group of anonymous individuals over the World Wide Web. They have been successfully applied to such diverse domains as computer security, biology and astronomy. The success of distributed human computing in various domains suggests that it can be utilized for complex collaborative problem solving. Thus it could be used for "machine science": utilizing machines to facilitate the vetting of disparate human hypotheses for solving scientific and engineering problems. In this thesis, we show that machine science is possible through distributed human computing methods for some tasks. By enabling anonymous individuals to collaborate in a way that parallels the scientific method -- suggesting hypotheses, testing and then communicating them for vetting by other participants -- we demonstrate that a crowd can together define robot control strategies, design robot morphologies capable of fast-forward locomotion and contribute features to machine learning models for residential electric energy usage. We also introduce a new methodology for empowering a fully automated robot design system by seeding it with intuitions distilled from the crowd. Our findings suggest that increasingly large, diverse and complex collaborations that combine people and machines in the right way may enable problem solving in a wide range of fields.
APA, Harvard, Vancouver, ISO, and other styles
10

Hasanaj, Enis, Albert Aveler, and William Söder. "Cooperative edge deepfake detection." Thesis, Jönköping University, JTH, Avdelningen för datateknik och informatik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-53790.

Full text
Abstract:
Deepfakes are an emerging problem in social media, and for celebrities and political profiles it can be devastating to their reputation if the technology ends up in the wrong hands. Creating deepfakes is becoming increasingly easy. Attempts have been made at detecting whether a face in an image is real or not, but training these machine learning models can be a very time-consuming process. This research proposes a solution for training deepfake detection models cooperatively on the edge. This is done in order to evaluate whether the training process, among other things, can be made more efficient with this approach. The feasibility of edge training is evaluated by training machine learning models on several different types of iPhone devices. The models are trained using the YOLOv2 object detection system. To test whether the YOLOv2 object detection system is able to distinguish between real and fake human faces in images, several models are trained on a computer. Each model is trained with either a different number of iterations or different subsets of data, since these metrics have been identified as important to the performance of the models. The performance of the models is evaluated by measuring the accuracy in detecting deepfakes. Additionally, the deepfake detection models trained on a computer are ensembled using the bagging ensemble method. This is done in order to evaluate the feasibility of cooperatively training a deepfake detection model by combining several models. Results show that the proposed solution is not feasible due to the time the training process takes on each mobile device. Additionally, each trained model is about 200 MB, and the size of the ensemble model grows linearly with each model added to the ensemble. This can cause the ensemble model to grow to several hundred gigabytes in size.
APA, Harvard, Vancouver, ISO, and other styles
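
The bagging step the thesis evaluates, combining several independently trained detectors into one decision, can be illustrated with a simple majority vote over per-model outputs. The probabilities below are stand-ins, not results from the trained YOLOv2 models.

```python
import numpy as np

def bagged_decision(model_probabilities, threshold=0.5):
    """Majority vote over independently trained detectors (bagging-style ensembling).

    model_probabilities: array of shape (n_models, n_images) with each model's
    estimated probability that the face in an image is fake.
    """
    votes = (np.asarray(model_probabilities) >= threshold).astype(int)
    return votes.mean(axis=0) >= 0.5          # fake if at least half the models agree

# Hypothetical outputs from three detectors on four images.
probs = [
    [0.91, 0.20, 0.55, 0.48],
    [0.85, 0.35, 0.40, 0.60],
    [0.75, 0.10, 0.65, 0.30],
]
print("ensemble verdict (True = deepfake):", bagged_decision(probs))
```

Note that only the per-image probabilities need to be shared for this style of combination, which is far smaller than shipping the full model weights between devices.
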

Books on the topic "Edge computing with artificial intelligence"

1

Misra, Sanjay, Amit Kumar Tyagi, Vincenzo Piuri, and Lalit Garg, eds. Artificial Intelligence for Cloud and Edge Computing. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-80821-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Dubey, Hari Mohan, Manjaree Pandit, Laxmi Srivastava, and Bijaya Ketan Panigrahi, eds. Artificial Intelligence and Sustainable Computing. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-1220-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Rutkowski, Leszek, Rafał Scherer, Marcin Korytkowski, Witold Pedrycz, Ryszard Tadeusiewicz, and Jacek M. Zurada, eds. Artificial Intelligence and Soft Computing. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-87986-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Rutkowski, Leszek, Rafał Scherer, Marcin Korytkowski, Witold Pedrycz, Ryszard Tadeusiewicz, and Jacek M. Zurada, eds. Artificial Intelligence and Soft Computing. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-87897-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Nourani, Cyrus F. Artificial Intelligence and Computing Logic. Boca Raton: Apple Academic Press, 2021. http://dx.doi.org/10.1201/9781003180487.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

de Leon F. de Carvalho, Andre Ponce, Sara Rodríguez-González, Juan F. De Paz Santana, and Juan M. Corchado Rodríguez, eds. Distributed Computing and Artificial Intelligence. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-14883-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Rutkowski, Leszek, Marcin Korytkowski, Rafał Scherer, Ryszard Tadeusiewicz, Lotfi A. Zadeh, and Jacek M. Zurada, eds. Artificial Intelligence and Soft Computing. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-59060-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Rutkowski, Leszek, Marcin Korytkowski, Rafał Scherer, Ryszard Tadeusiewicz, Lotfi A. Zadeh, and Jacek M. Zurada, eds. Artificial Intelligence and Soft Computing. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-59063-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Edge computing with artificial intelligence"

1

Wang, Xiaofei, Yiwen Han, Victor C. M. Leung, Dusit Niyato, Xueqiang Yan, and Xu Chen. "Edge Computing for Artificial Intelligence." In Edge AI, 97–115. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-6186-3_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Parekh, Brinda, and Kiran Amin. "Edge Intelligence: A Robust Reinforcement of Edge Computing and Artificial Intelligence." In Innovations in Information and Communication Technologies (IICT-2020), 461–68. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-66218-9_55.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Keerthy, A. S., and S. Manju Priya. "Artificial Intelligence in Healthcare Databases." In Deep Learning and Edge Computing Solutions for High Performance Computing, 19–34. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-60265-9_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Sakib, Sadman, Mostafa M. Fouda, and Zubair Md Fadlullah. "Harnessing Artificial Intelligence for Secure ECG Analytics at the Edge for Cardiac Arrhythmia Classification." In Secure Edge Computing, 137–53. Boca Raton: CRC Press, 2021. http://dx.doi.org/10.1201/9781003028635-11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

El Ghmary, Mohamed, Tarik Chanyour, Youssef Hmimz, and Mohammed Ouçamah Cherkaoui Malki. "Processing Time and Computing Resources Optimization in a Mobile Edge Computing Node." In Embedded Systems and Artificial Intelligence, 99–108. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-0947-6_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Galkowski, Tomasz, and Miroslaw Pawlak. "Nonparametric Estimation of Edge Values of Regression Functions." In Artificial Intelligence and Soft Computing, 49–59. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-39384-1_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Grycuk, Rafał, Patryk Najgebauer, and Rafał Scherer. "Edge Detection-Based Full-Disc Solar Image Hashing." In Artificial Intelligence and Soft Computing, 243–51. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-23480-4_20.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Kangra, Kirti, and Jaswinder Singh. "Explainable Artificial Intelligence: Concepts and Current Progression." In Explainable Edge AI: A Futuristic Computing Perspective, 1–17. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-18292-1_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Grycuk, Rafał, Marcin Gabryel, Magdalena Scherer, and Sviatoslav Voloshynovskiy. "Image Descriptor Based on Edge Detection and Crawler Algorithm." In Artificial Intelligence and Soft Computing, 647–59. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-39384-1_57.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Najgebauer, Patryk, Tomasz Nowak, Jakub Romanowski, Janusz Rygał, and Marcin Korytkowski. "Representation of Edge Detection Results Based on Graph Theory." In Artificial Intelligence and Soft Computing, 588–601. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38658-9_54.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Edge computing with artificial intelligence"

1

Glavan, Alina Florina, and Constantin Viorel Marian. "Cognitive edge computing through artificial intelligence." In 2020 13th International Conference on Communications (COMM). IEEE, 2020. http://dx.doi.org/10.1109/comm48946.2020.9142010.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Goswami, Siddharth, and Sachin Sharma. "DNA Sequencing using Artificial Intelligence." In 2022 International Conference on Edge Computing and Applications (ICECAA). IEEE, 2022. http://dx.doi.org/10.1109/icecaa55415.2022.9936101.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ibn-Khedher, Hatem, Mohammed Laroui, Mouna Ben Mabrouk, Hassine Moungla, Hossam Afifi, Alberto Nai Oleari, and Ahmed E. Kamal. "Edge Computing Assisted Autonomous Driving Using Artificial Intelligence." In 2021 International Wireless Communications and Mobile Computing (IWCMC). IEEE, 2021. http://dx.doi.org/10.1109/iwcmc51323.2021.9498627.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Waguie, Francxa Tagne, and Fadi Al-Turjman. "Artificial Intelligence for Edge Computing Security: A Survey." In 2022 International Conference on Artificial Intelligence in Everything (AIE). IEEE, 2022. http://dx.doi.org/10.1109/aie57029.2022.00091.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Chen, Zhuang, Qian He, Lei Liu, Dapeng Lan, Hwei-Ming Chung, and Zhifei Mao. "An Artificial Intelligence Perspective on Mobile Edge Computing." In 2019 IEEE International Conference on Smart Internet of Things (SmartIoT). IEEE, 2019. http://dx.doi.org/10.1109/smartiot.2019.00024.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Riggio, Roberto, Estefania Coronado, Neiva Linder, Jovanka Adzic, Gianpiero Mastinu, Leonardo Goratti, Miguel Rosa, Hans Schotten, and Marco Pistore. "AI@EDGE: A Secure and Reusable Artificial Intelligence Platform for Edge Computing." In 2021 Joint European Conference on Networks and Communications & 6G Summit (EuCNC/6G Summit). IEEE, 2021. http://dx.doi.org/10.1109/eucnc/6gsummit51104.2021.9482440.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Fragkos, Georgios, Nicholas Kemp, Eirini Eleni Tsiropoulou, and Symeon Papavassiliou. "Artificial Intelligence Empowered UAVs Data Offloading in Mobile Edge Computing." In ICC 2020 - 2020 IEEE International Conference on Communications (ICC). IEEE, 2020. http://dx.doi.org/10.1109/icc40277.2020.9149115.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Thalluri, Lakshmi Narayana, Srikhakolanu Naga Venkat, Chintha Veera Venkata Durga Prasad, Dakavarapu Vinay Kumar, Kumpati Pavan Kumar, Addepalli V. S. Y. Narayana Sarma, and Sai Divya Adapa. "Artificial Intelligence Enabled Smart City IoT System using Edge Computing." In 2021 2nd International Conference on Smart Electronics and Communication (ICOSEC). IEEE, 2021. http://dx.doi.org/10.1109/icosec51865.2021.9591732.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Fan, Bing, and Youdan Shi. "Revenue-Aware Edge Server Deployment Algorithm in Edge Computing." In 2021 2nd International Conference on Big Data & Artificial Intelligence & Software Engineering (ICBASE). IEEE, 2021. http://dx.doi.org/10.1109/icbase53849.2021.00057.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Madhurika, B., and D. Naga Malleswari. "A Systematic Review on Artificial Intelligence-based Opinion Mining Models." In 2022 International Conference on Edge Computing and Applications (ICECAA). IEEE, 2022. http://dx.doi.org/10.1109/icecaa55415.2022.9936138.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Edge computing with artificial intelligence"

1

Hunt, Will, and Owen Daniels. Sustaining and Growing the U.S. Semiconductor Advantage: A Primer. Center for Security and Emerging Technology, June 2022. http://dx.doi.org/10.51593/20220006.

Full text
Abstract:
As an integral player in advanced semiconductor supply chains, the United States enjoys advantages over China in producing and accessing chips for artificial intelligence and other leading-edge computing technologies. However, a lack of domestic production capacity threatens U.S. semiconductor access. The United States can strengthen its advantages by working with allies and partners to prevent China from producing leading-edge chips and by reshoring its own domestic chipmaking capacity.
APA, Harvard, Vancouver, ISO, and other styles
2

Ruvinsky, Alicia, Timothy Garton, Daniel Chausse, Rajeev Agrawal, Harland Yu, and Ernest Miller. Accelerating the tactical decision process with High-Performance Computing (HPC) on the edge: motivation, framework, and use cases. Engineer Research and Development Center (U.S.), September 2021. http://dx.doi.org/10.21079/11681/42169.

Full text
Abstract:
Managing the ever-growing volume and velocity of data across the battlefield is a critical problem for warfighters. Solving this problem will require a fundamental change in how battlefield analyses are performed. A new approach to making decisions on the battlefield will eliminate data transport delays by moving the analytical capabilities closer to data sources. Decision cycles depend on the speed at which data can be captured and converted to actionable information for decision making. Real-time situational awareness is achieved by locating computational assets at the tactical edge. Accelerating the tactical decision process leverages capabilities in three technology areas: (1) High-Performance Computing (HPC), (2) Machine Learning (ML), and (3) the Internet of Things (IoT). Exploiting these areas can reduce network traffic and shorten the time required to transform data into actionable information. Faster decision cycles may revolutionize battlefield operations. Presented is an overview of an artificial intelligence (AI) system design for near-real-time analytics in a tactical operational environment executing on co-located, mobile HPC hardware. The report contains the following sections: (1) an introduction describing motivation, background, and the state of technology; (2) a description of the tactical decision process leveraging HPC, including problem definition and use case; and (3) an HPC tactical data analytics framework design enabling data-to-decisions.
APA, Harvard, Vancouver, ISO, and other styles
3

Lohn, Andrew, and Micah Musser. AI and Compute: How Much Longer Can Computing Power Drive Artificial Intelligence Progress? Center for Security and Emerging Technology, January 2022. http://dx.doi.org/10.51593/2021ca009.

Full text
Abstract:
Between 2012 and 2018, the amount of computing power used by record-breaking artificial intelligence models doubled every 3.4 months. Even with money pouring into the AI field, this trendline is unsustainable. Because of cost, hardware availability and engineering difficulties, the next decade of AI can't rely exclusively on applying more and more computing power to drive further progress.
APA, Harvard, Vancouver, ISO, and other styles
4

Hwang, Tim, and Emily Weinstein. Decoupling in Strategic Technologies: From Satellites to Artificial Intelligence. Center for Security and Emerging Technology, July 2022. http://dx.doi.org/10.51593/20200085.

Full text
Abstract:
Geopolitical tensions between the United States and China have sparked an ongoing dialogue in Washington about the phenomenon of “decoupling”—the use of public policy tools to separate the multifaceted economic ties that connect the two powers. This issue brief provides a historical lens on the efficacy of one specific aspect of this broader decoupling phenomenon: using export controls and related trade policies to prevent a rival from acquiring the equipment and know-how to catch up to the United States in cutting-edge, strategically important technologies.
APA, Harvard, Vancouver, ISO, and other styles
5

Grossberg, Stephen. Instrumentation for Scientific Computing in Neural Networks, Information Science, Artificial Intelligence, and Applied Mathematics. Fort Belvoir, VA: Defense Technical Information Center, October 1987. http://dx.doi.org/10.21236/ada189981.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Biagioni, David, John Farrell, Venu Garikapati, Peter Graf, Nalinrat Guba, Yi Hou, Wesley Jones, et al. Advanced Computing, Data Science, and Artificial Intelligence Research Opportunities for Energy-Focused Transportation Science. Office of Scientific and Technical Information (OSTI), July 2021. http://dx.doi.org/10.2172/1812196.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Perdigão, Rui A. P. Course on Nonlinear Frontiers: From Dynamical Systems, Information and Complexity to Cutting-Edge Physically Cognitive Artificial Intelligence. Meteoceanics, February 2021. http://dx.doi.org/10.46337/uc.210211.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Johns, Janet F. Proceedings of the Association for Computing Machinery Special Interest Group for Ada Artificial Intelligence Working Group, 1992 Summer Workshop Held in Seattle, Washington on June 24-27, 1992. Fort Belvoir, VA: Defense Technical Information Center, June 1993. http://dx.doi.org/10.21236/ada266422.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Buchanan, Ben. The AI Triad and What It Means for National Security Strategy. Center for Security and Emerging Technology, August 2020. http://dx.doi.org/10.51593/20200021.

Full text
Abstract:
One sentence summarizes the complexities of modern artificial intelligence: Machine learning systems use computing power to execute algorithms that learn from data. This AI triad of computing power, algorithms, and data offers a framework for decision-making in national security policy.
APA, Harvard, Vancouver, ISO, and other styles
10

Cary, Dakota. China’s CyberAI Talent Pipeline. Center for Security and Emerging Technology, July 2021. http://dx.doi.org/10.51593/2020ca017.

Full text
Abstract:
To what extent does China’s cultivation of talent in cybersecurity and AI matter in terms of competitiveness with other countries? Right now, it seems to have an edge: China’s 11 World-Class Cybersecurity Schools offer more classes on artificial intelligence and machine learning than do the 20 U.S. universities certified as Centers of Academic Excellence in Cyber Operations. This policy brief recommends tracking 13 research grants from the National Science Foundation that attempt to integrate AI into cybersecurity curricula.
APA, Harvard, Vancouver, ISO, and other styles