Journal articles on the topic 'Support secure data processing'

Consult the top 50 journal articles for your research on the topic 'Support secure data processing.'

1

Sun, Yan, and Shambhu Upadhyaya. "Secure and privacy preserving data processing support for active authentication." Information Systems Frontiers 17, no. 5 (July 29, 2015): 1007–15. http://dx.doi.org/10.1007/s10796-015-9587-9.

2

Jariwala, Vivaksha, Himanshu Patel, Parth Patel, and Devesh C. Jinwala. "Integrity and Privacy Preserving Secure Data Aggregation in Wireless Sensor Networks." International Journal of Distributed Systems and Technologies 5, no. 3 (July 2014): 77–99. http://dx.doi.org/10.4018/ijdst.2014070104.

Abstract:
Data aggregation based on in-network processing is useful for reducing the communication overhead in Wireless Sensor Networks (WSNs), since it lowers the overall number of packets sent to the base station. However, because data items sourced at different nodes are fused into a single one, the security of the aggregated data, as well as that of the aggregating node, demands critical investigation. This paper observes that although there have recently been substantial research attempts at proposing techniques for secure data aggregation, there is still a need for a coherent, integrated framework, and it proposes such a framework. The proposed framework implements a secure data aggregation protocol offering confidentiality, privacy, authentication, robustness, and data integrity over a defined data aggregation topology. In addition, the framework is based on a zero-configuration protocol that supports the generation of the aggregation topology and a key exchange procedure. The work supports the framework with a detailed security analysis and a performance evaluation on the TinyOS platform using TOSSIM as the base simulator. To the authors' knowledge, this is a unique attempt that integrates support for security features, topology generation, and key management in a single secure data aggregation protocol, substantiated with an elaborate experimental evaluation.
3

Wang, Ziheng, Heng Chen, and Weiguo Wu. "Client-Aware Negotiation for Secure and Efficient Data Transmission." Energies 13, no. 21 (November 4, 2020): 5777. http://dx.doi.org/10.3390/en13215777.

Abstract:
In Wireless Sensor Networks (WSNs), server clusters, and other systems requiring secure transmission, the overhead of data encryption and transmission is often not negligible. Unfortunately, a conflict exists between security and efficiency in processing data. Therefore, this paper proposes a strategy to overcome this conflict, called Client-Aware Negotiation for Secure and Efficient Data Transmission (CAN-SEAT). This strategy allows clients with different security transmission requirements to use the appropriate secure data transmission scheme without being modified. Two methods are designed for different clients. The first method is based on two-way authentication and renegotiation; after the handshake, the appropriate data security transmission scheme is selected according to the client requirements. The second method is based on redirection and can be applied when the client does not support two-way authentication or renegotiation. Considering the characteristics of different architectures, the paper classifies and discusses symmetric-key algorithms, asymmetric-key algorithms, and hardware encryption instructions. The CAN-SEAT strategy is tested in four application scenarios. Compared with the general transmission strategy, when only software encryption is used, the data processing and transmission cost can be reduced by 89.41% in the best case and by 15.40% in the worst case. When hardware encryption is supported, the cost can be reduced by 85.30% and 24.63%, respectively. Good results were obtained on XiLinx, FT-2000+, and Intel processor platforms. To the best of our knowledge, this is the first Client-Aware Negotiation (CAN) method to be successfully deployed on a general system. CAN-SEAT can be easily combined with other energy-efficient strategies.
4

Seniman, Seniman, Baihaqi Siregar, Rani Masyithah Pelle, and Fahmi Fahmi. "Securing sensor data transmission with ethernet elliptic curve cryptography secure socket layer on STM32F103 device." Indonesian Journal of Electrical Engineering and Computer Science 22, no. 1 (April 1, 2021): 507. http://dx.doi.org/10.11591/ijeecs.v22.i1.pp507-515.

Abstract:
Currently, there is no established method for securing data transmission in microcontroller systems and applications that communicate in a client-server scheme, whereas major modern computer systems use the secure socket layer (SSL) to establish secure communication. Espressif ESP-based microcontrollers do support SSL communication to secure data transmission, but only over Wi-Fi networks. Single-board-computer-based embedded systems fully support SSL communication, but at a very high price. On the other hand, the STM32F103 microcontroller, at a very affordable price, even cheaper than an Arduino board, offers the opportunity to build secure data communication using the SSL protocol based on the MbedTLS library. Using a WIZnet W5100/W5500 Ethernet shield, an STM32F103 SSL client device was successfully built in this study. The SSL client device supports the ECDHE-ECDSA-AES128-CBC-SHA256 cipher suite. The Apache web server must also be configured to support this cipher suite by generating an OpenSSL ECC (elliptic curve cryptography) certificate. The system was tested with the LM35 analog temperature sensor, and as a result, the STM32F103 SSL client successfully secured the data transmission to the Apache SSL web server. The communication time was 3 s for the first connection and 42 ms for each subsequent data transmission.
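For illustration, here is a minimal Python sketch of a TLS client pinned to that cipher suite; the server address, port, and certificate path are assumptions, and the paper's actual client is C firmware on the STM32 using MbedTLS rather than Python:

    import socket
    import ssl

    # Pin the client to the ECDHE-ECDSA AES128-CBC suite the STM32 client uses.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    ctx.maximum_version = ssl.TLSVersion.TLSv1_2      # CBC suites are TLS 1.2 only
    ctx.set_ciphers("ECDHE-ECDSA-AES128-SHA256")      # OpenSSL name for this suite
    ctx.load_verify_locations("server-ecc-cert.pem")  # self-signed ECC cert (assumed path)
    ctx.check_hostname = False                        # lab setup; keep True in production

    with socket.create_connection(("192.168.1.10", 443)) as sock:  # assumed server
        with ctx.wrap_socket(sock) as tls:
            tls.sendall(b"GET /temperature HTTP/1.1\r\nHost: sensor\r\n\r\n")
            print(tls.recv(1024))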
5

Marwan, Mbarek, Ali Karti, and Hassan Ouahmane. "Proposal for a Secure Data Sharing and Processing in Cloud Applications for Healthcare Domain." International Journal of Information Technology and Applied Sciences (IJITAS) 3, no. 1 (January 31, 2021): 10–17. http://dx.doi.org/10.52502/ijitas.v3i1.15.

Abstract:
Information Technology (IT) services have become an inherent component of almost all sectors. The health sector, likewise, has recently been integrating IT to meet the growing demand for medical data exchange and storage. The cloud has become a real hosting alternative to traditional on-premise software. In this model, health organizations not only have access to a wide range of services but, most importantly, are charged based on their usage of these cloud applications. In the healthcare domain especially, however, cloud computing remains challenging because of the sensitivity of health data. This work aims at improving access to medical data and sharing them securely across healthcare professionals, allowing real-time collaboration. From this perspective, the authors propose a hybrid cryptosystem based on AES and Paillier to prevent the disclosure of confidential data and to enable computation over encrypted data. Unlike most other solutions, the proposed framework adopts a proxy-based architecture to tackle issues regarding privacy concerns and access control. The system thus guarantees that only authorized users can view or use specific resources in a computing environment. To this end, the authors use the eXtensible Access Control Markup Language (XACML) standard to design and manage access control policies, and opt for the Abbreviated Language for Authorization (ALFA) tool to formulate XACML policies and define complex rules easily. The simulation results show that the proposal offers simple and efficient mechanisms for the secure use of cloud services within the healthcare domain. Consequently, this framework is an appropriate method to support collaboration among all entities involved in medical information exchange.
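As a rough illustration of the Paillier half of such a hybrid design, the sketch below uses the python-paillier (phe) package; the readings and the aggregation step are hypothetical and do not reproduce the paper's protocol:

    # pip install phe
    from phe import paillier

    pub, priv = paillier.generate_paillier_keypair(n_length=2048)

    # A proxy can combine encrypted vitals without ever seeing plaintext.
    enc_a = pub.encrypt(120)      # e.g., one systolic reading
    enc_b = pub.encrypt(118)
    enc_sum = enc_a + enc_b       # addition performed on ciphertexts
    assert priv.decrypt(enc_sum) == 238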
6

Alenezi, Mamdouh, Muhammad Usama, Khaled Almustafa, Waheed Iqbal, Muhammad Ali Raza, and Tanveer Khan. "An Efficient, Secure, and Queryable Encryption for NoSQL-Based Databases Hosted on Untrusted Cloud Environments." International Journal of Information Security and Privacy 13, no. 2 (April 2019): 14–31. http://dx.doi.org/10.4018/ijisp.2019040102.

Abstract:
NoSQL-based databases are attractive for storing and managing big data, mainly due to their high scalability and data modeling flexibility. However, security in NoSQL-based databases is weak, which raises concerns for users. Specifically, the security of data at rest is a major concern for users who deploy their NoSQL-based solutions in the cloud, because unauthorized access to the servers would easily expose the data. There have been some efforts to enable encryption of data at rest for NoSQL databases, but existing solutions do not support secure query processing or secure data communication over the Internet, and their performance is poor. In this article, the authors address the NoSQL data-at-rest security concern by introducing a system that can dynamically encrypt/decrypt data, support secure query processing, and seamlessly integrate with any NoSQL-based database. The proposed solution is based on a combination of chaotic encryption and Order Preserving Encryption (OPE). The experimental evaluation showed excellent results when the solution was integrated with MongoDB and compared with state-of-the-art existing work.
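To see why OPE enables query processing on encrypted data, consider this deliberately insecure toy sketch in Python (not the paper's chaotic/OPE construction): ciphertext order mirrors plaintext order, so an untrusted store can sort and answer range queries without decrypting:

    import hashlib
    import hmac

    KEY = b"demo-key"  # hypothetical secret

    def _gap(i: int) -> int:
        # Keyed pseudorandom positive increment for each plaintext step.
        d = hmac.new(KEY, i.to_bytes(4, "big"), hashlib.sha256).digest()
        return 1 + int.from_bytes(d[:2], "big")

    def ope_encrypt(x: int) -> int:
        # Strictly increasing in x, so plaintext ordering is preserved.
        return sum(_gap(i) for i in range(x + 1))

    assert ope_encrypt(7) < ope_encrypt(19) < ope_encrypt(42)
    # A server can therefore index ciphertexts and evaluate range predicates.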
7

Hamza, Rafik, Alzubair Hassan, Awad Ali, Mohammed Bakri Bashir, Samar M. Alqhtani, Tawfeeg Mohmmed Tawfeeg, and Adil Yousif. "Towards Secure Big Data Analysis via Fully Homomorphic Encryption Algorithms." Entropy 24, no. 4 (April 6, 2022): 519. http://dx.doi.org/10.3390/e24040519.

Abstract:
Privacy-preserving techniques allow private information to be used without compromising privacy. Most encryption algorithms, such as the Advanced Encryption Standard (AES) algorithm, cannot perform computational operations on encrypted data without first applying the decryption process. Homomorphic encryption algorithms provide innovative solutions to support computations on encrypted data while preserving the content of private information. However, these algorithms have some limitations, such as computational cost as well as the need for modifications for each case study. In this paper, we present a comprehensive overview of various homomorphic encryption tools for Big Data analysis and their applications. We also discuss a security framework for Big Data analysis while preserving privacy using homomorphic encryption algorithms. We highlight the fundamental features and tradeoffs that should be considered when choosing the right approach for Big Data applications in practice. We then present a comparison of popular current homomorphic encryption tools with respect to these identified characteristics. We examine the implementation results of various homomorphic encryption toolkits and compare their performances. Finally, we highlight some important issues and research opportunities. We aim to anticipate how homomorphic encryption technology will be useful for secure Big Data processing, especially to improve the utility and performance of privacy-preserving machine learning.
8

Alsaig, Alaa, Vangalur Alagar, Zaki Chammaa, and Nematollaah Shiri. "Characterization and Efficient Management of Big Data in IoT-Driven Smart City Development." Sensors 19, no. 11 (May 28, 2019): 2430. http://dx.doi.org/10.3390/s19112430.

Abstract:
The smart city is an emerging initiative for integrating Information and Communication Technologies (ICT) in effective ways to enhance citizens' quality of life through safe and secure context-aware services. Major technical challenges in realizing smart cities include optimizing resource use, delivering services without interruption at all times and in all respects, minimizing costs, and reducing resource consumption. To address these challenges, new techniques and technologies are required for modeling and processing the big data generated and used through the underlying Internet of Things (IoT). To this end, we propose a data-centric approach to IoT that conceptualizes the "things" from a service-oriented perspective and investigate efficient ways to identify, integrate, and manage big data. The data-centric approach is expected to better support the efficient management of data with the complexities inherent in IoT-generated big data. Furthermore, it supports the efficient and scalable query processing and reasoning techniques required in the development of smart city applications. This article reviews the literature and contributes to the foundations of smart city applications.
9

Meister, Sam, and Alexandra Chassanoff. "Integrating Digital Forensics Techniques into Curatorial Tasks: A Case Study." International Journal of Digital Curation 9, no. 2 (September 9, 2014): 6–16. http://dx.doi.org/10.2218/ijdc.v9i2.325.

Abstract:
In this paper, we investigate how digital forensics tools can support digital curation tasks around the acquisition, processing, management, and analysis of born-digital materials. Using a real-world born-digital collection as our use case, we describe how BitCurator, an open-source digital forensics software environment, supports fundamental curatorial activities such as secure data transfer, assurance of authenticity and integrity, and the identification and elimination of private and/or sensitive information. We also introduce a workflow diagram that articulates the processing steps for institutions handling born-digital materials. Finally, we review possibilities for further integration, development, and use of digital forensics tools.
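One of the named curatorial activities, integrity assurance, reduces in practice to fixity checking; a minimal Python sketch (the directory name is an assumption) might look like this:

    import hashlib
    from pathlib import Path

    def sha256(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    # Record digests at acquisition time ...
    manifest = {p.name: sha256(p) for p in Path("acquired_disk_image").iterdir()}
    # ... and re-compute later: matching digests demonstrate the materials are unchanged.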
10

Sánchez, Daniel, Andrés López, Florina Mendoza, and Patricia Arias Cabarcos. "DNS-Based Dynamic Authentication for Microservices in IoT." Proceedings 2, no. 19 (October 25, 2018): 1233. http://dx.doi.org/10.3390/proceedings2191233.

Abstract:
IoT devices provide real-time data to a rich ecosystem of services and applications that will be of utmost importance for ubiquitous computing. The volume of data and the involved subscribe/notify signaling will likely become a challenge for access and core networks as well. Designers may opt for microservice architectures and fog computing to address this challenge while offering the required flexibility for the main players of ubiquitous computing: nomadic users. Microservices require strong security support for fog computing in order to rely on nodes at the boundary of the network for secure data collection and processing. Low-cost IoT devices face outdated certificates and security support, due to the time elapsed from manufacture to deployment. In this paper we propose a solution based on microservice architectures together with DNSSEC, DANE, and chameleon signatures to overcome these difficulties. We show how trapdoors included in the certificates allow a secure and flexible delegation for offloading data collection and processing to the fog. The main result shows that this requires minimal device configuration at manufacture, thanks to DNSSEC support.
11

Przytarski, Dennis, Christoph Stach, Clémentine Gritti, and Bernhard Mitschang. "Query Processing in Blockchain Systems: Current State and Future Challenges." Future Internet 14, no. 1 (December 21, 2021): 1. http://dx.doi.org/10.3390/fi14010001.

Abstract:
When, in 2008, Satoshi Nakamoto envisioned the first distributed database management system that relied on a cryptographically secured chain of blocks to store data in an immutable and tamper-resistant manner, his primary use case was the introduction of a digital currency. Owing to this use case, the blockchain system was geared towards the efficient storage of data, whereas the processing of complex queries, such as provenance analyses of data history, was out of focus. The increasing use of Internet of Things technologies and the resulting digitization in many domains have, however, led to a plethora of novel use cases for a secure digital ledger. For instance, in the healthcare sector, blockchain systems are used for the secure storage and sharing of electronic health records, while the food industry applies such systems to enable reliable food-chain traceability, e.g., to prove compliance with cold chains. In these application domains, however, querying the current state is not sufficient; comprehensive history queries are required instead. Due to these altered usage modes involving more complex query types, it is questionable whether today's blockchain systems are prepared for this type of usage and whether such queries can be processed efficiently by them. In our paper, we therefore investigate novel use cases for blockchain systems and elicit their requirements towards a data store in terms of query capabilities. We review the state of the art in query support in blockchain systems and assess whether it is capable of meeting the requirements of such more sophisticated use cases. As a result, we identify future research challenges with regard to query processing in blockchain systems.
12

Sun, Shuang, Rong Du, and Shudong Chen. "A Secure and Computable Blockchain-Based Data Sharing Scheme in IoT System." Information 12, no. 2 (January 20, 2021): 47. http://dx.doi.org/10.3390/info12020047.

Abstract:
Internet of things (IoT) devices are expected to collect vast amounts of data that support different kinds of applications, such as health monitoring, smart homes, and traffic management. However, characteristics such as their resource-constrained nature, dynamicity, and large-scale growth bring challenges to secure IoT data sharing. Blockchain-based ciphertext-policy attribute-based encryption (CP-ABE) has been proposed to realize secure IoT data sharing. In blockchain-based CP-ABE data sharing schemes, the data are encrypted and stored in the cloud. Whenever users want to process the data, they must download and decrypt the ciphertext on the client side and, after processing the data, encrypt and upload the ciphertext to the cloud again. This negates the advantage of using cloud computing resources. Fully homomorphic encryption (FHE) and homomorphic signature technology may be adopted to realize ciphertext computation and correctness checking of ciphertext computation results. In this paper, we propose a secure and computable IoT data sharing system that lets users enjoy the computation convenience of the cloud end. Specifically, the proposed system integrates CP-ABE and FHE to realize secure IoT data sharing and ciphertext computation. In addition, we generate homomorphic signatures of ciphertexts to enable users to check the correctness of the ciphertext computation results. Moreover, to ensure that the cloud provides honest IoT data access control, storage, and computing services for users, we record the access policy of the data, the hash of the data, the signature of the ciphertext, and the homomorphic signature of the ciphertext on the blockchain. The performance evaluation and security analysis show that the proposed scheme is practical and secure.
13

Štufi, Martin, Boris Bačić, and Leonid Stoimenov. "Big Data Analytics and Processing Platform in Czech Republic Healthcare." Applied Sciences 10, no. 5 (March 2, 2020): 1705. http://dx.doi.org/10.3390/app10051705.

Abstract:
Big data analytics (BDA) in healthcare has made a positive difference in the integration of Artificial Intelligence (AI) into advances in analytical capabilities, while lowering the costs of medical care. The aim of this study is to improve the existing healthcare eSystem by implementing a Big Data Analytics (BDA) platform that meets the requirements of the Czech Republic National Health Service (Tender-Id. VZ0036628, No. Z2017-035520). In addition to providing analytical capabilities on Linux platforms supporting current and near-future AI with machine-learning and data-mining algorithms, there is a need for ethical considerations mandating new ways to preserve privacy, all of which are preconditioned by the growing body of regulations and expectations. The presented BDA platform has met all requirements (N > 100), including the healthcare industry-standard Transaction Processing Performance Council (TPC-H) decision support benchmark, in compliance with European Union (EU) and Czech Republic legislation. The presented Proof of Concept (PoC), since upgraded to a production environment, has unified isolated parts of Czech Republic healthcare over the past seven months. The reported PoC BDA platform, artefacts, and concepts are transferable to healthcare systems in other countries interested in developing or upgrading their own national healthcare infrastructure in a cost-effective, secure, scalable, and high-performance manner.
14

Shang, Jian, Runmin Guan, and Yuhao Tong. "Microgrid Data Security Sharing Method Based on Blockchain under Internet of Things Architecture." Wireless Communications and Mobile Computing 2022 (April 4, 2022): 1–10. http://dx.doi.org/10.1155/2022/9623934.

Abstract:
An efficient and secure data sharing mechanism can support a microgrid in achieving more accurate business control, while current data processing methods suffer from large computing overhead and low data sharing security. To address these problems, this paper proposes a blockchain-based microgrid data sharing method built on a cloud-edge-terminal processing architecture. First, the elliptic curve encryption algorithm is used on the edge side to reliably encrypt the data collected by terminal equipment, so as to improve the security and efficiency of microgrid key management. Then, in the cloud, a Reputation-Evaluation Practical Byzantine Fault Tolerant mechanism (REPBFT) based on smart contracts and reputation evaluation effectively manages the data sharing of edge computing devices, avoids wasting network computing resources, and further improves the efficiency of microgrid data sharing. The simulation results show that when the number of edge devices reaches 25, the computation and communication overheads of the proposed method are 63.46 ms and 2.66 KB, respectively, and when the processed data reaches 1024 KB, the security of the microgrid system remains 95%; the method can thus realize safe and reliable data sharing and interaction and stably support the optimal operation of the microgrid.
15

Zhang, Yinghui, Jiangfan Zhao, Dong Zheng, Kaixin Deng, Fangyuan Ren, Xiaokun Zheng, and Jiangang Shu. "Privacy-Preserving Data Aggregation against False Data Injection Attacks in Fog Computing." Sensors 18, no. 8 (August 13, 2018): 2659. http://dx.doi.org/10.3390/s18082659.

Abstract:
As an extension of cloud computing, fog computing has received more attention in recent years. It can solve problems such as high latency and the lack of support for mobility and location awareness in cloud computing. In the Internet of Things (IoT), a series of IoT devices can be connected to fog nodes that assist a cloud service center by storing and processing a part of the data in advance. This not only reduces the pressure of processing data but also improves real-time performance and service quality. However, data processing at fog nodes suffers from many challenging issues, such as false data injection attacks, data modification attacks, and violations of IoT devices' privacy. In this paper, based on the Paillier homomorphic encryption scheme, we use blinding factors to design a privacy-preserving data aggregation scheme for fog computing. Whether or not the fog node and the cloud control center are honest, the proposed scheme ensures that the injected data come from legitimate IoT devices and are neither modified nor leaked. The proposed scheme is also fault-tolerant, which means that the collection of data from other devices is not affected even if certain fog devices fail to work. In addition, the security analysis and performance evaluation indicate that the proposed scheme is secure and efficient.
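A minimal Python sketch of the blinding-factor idea, with the Paillier layer omitted and a dealer-style setup assumed: each device masks its reading with a blind, and the blinds are constructed to cancel so the aggregator learns only the sum:

    import random

    MOD = 2**61 - 1
    readings = [17, 23, 9, 31]  # hypothetical per-device measurements

    # Per-device blinding factors chosen to sum to zero mod MOD.
    blinds = [random.randrange(MOD) for _ in readings[:-1]]
    blinds.append((-sum(blinds)) % MOD)

    masked = [(r + b) % MOD for r, b in zip(readings, blinds)]
    aggregate = sum(masked) % MOD            # computed by the fog node
    assert aggregate == sum(readings) % MOD  # individual readings stay hidden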
16

Al Ridhawi, Ismaeel, Ouns Bouachir, Moayad Aloqaily, and Azzedine Boukerche. "Design Guidelines for Cooperative UAV-supported Services and Applications." ACM Computing Surveys 54, no. 9 (December 31, 2022): 1–35. http://dx.doi.org/10.1145/3467964.

Abstract:
Internet of Things (IoT) systems have advanced greatly in the past few years, especially with the support of Machine Learning (ML) and Artificial Intelligence (AI) solutions. Numerous AI-supported IoT devices are playing a significant role in providing complex and user-specific smart city services. Given the multitude of heterogeneous wireless networks, the plethora of computer and storage architectures and paradigms, and the abundance of mobile and vehicular IoT devices, true smart city experiences are only attainable through a cooperative intelligent and secure IoT framework. This article provides an extensive study on different cooperative systems and envisions a cooperative solution that supports the integration and collaboration among both centralized and distributed systems, in which intelligent AI-supported IoT devices such as smart UAVs provide support in the data collection, processing and service provisioning process. Moreover, secure and collaborative decentralized solutions such as Blockchain are considered in the service provisioning process to enable enhanced privacy and authentication features for IoT applications. As such, user-specific complex services and applications within smart city environments will be delivered and made available in a timely, secure, and efficient manner.
17

Navale, Vivek, Denis von Kaeppler, and Matthew McAuliffe. "An overview of biomedical platforms for managing research data." Journal of Data, Information and Management 3, no. 1 (January 23, 2021): 21–27. http://dx.doi.org/10.1007/s42488-020-00040-0.

Abstract:
Biomedical platforms provide the hardware and software to securely ingest, process, validate, curate, store, and share data. Many large-scale biomedical platforms use secure cloud computing technology for analyzing, integrating, and storing phenotypic, clinical, and genomic data. Several web-based platforms are available for researchers to access services and tools for biomedical research. The use of bio-containers can facilitate the integration of bioinformatics software with various data analysis pipelines. Adoption of Common Data Models, Common Data Elements, and Ontologies can increase the likelihood of data reuse. Managing biomedical Big Data will require the development of strategies that can efficiently leverage public cloud computing resources. The use of community-developed standards for data collection can foster the development of machine learning methods for data processing and analysis. Increasingly, platforms will need to support the integration of data from research across multiple disease areas.
18

Pascoal, Túlio, Jérémie Decouchant, Antoine Boutet, and Paulo Esteves-Verissimo. "DyPS: Dynamic, Private and Secure GWAS." Proceedings on Privacy Enhancing Technologies 2021, no. 2 (January 29, 2021): 214–34. http://dx.doi.org/10.2478/popets-2021-0025.

Abstract:
Genome-Wide Association Studies (GWAS) identify the genomic variations that are statistically associated with a particular phenotype (e.g., a disease). The confidence in GWAS results increases with the number of genomes analyzed, which encourages federated computations where biocenters would periodically share the genomes they have sequenced. However, for economic and legal reasons, this collaboration will only happen if biocenters cannot learn each other's data. In addition, GWAS releases should not jeopardize the privacy of the individuals whose genomes are used. We introduce DyPS, a novel framework to conduct dynamic privacy-preserving federated GWAS. DyPS leverages a Trusted Execution Environment to secure dynamic GWAS computations. Moreover, DyPS uses a scaling mechanism to speed up the releases of GWAS results according to the evolving number of genomes used in the study, even if individuals retract their participation consent. Lastly, DyPS also tolerates up to all-but-one colluding biocenters without privacy leaks. We implemented and extensively evaluated DyPS through several scenarios involving more than 6 million simulated genomes and up to 35,000 real genomes. Our evaluation shows that DyPS updates test statistics with a reasonable additional request processing delay (11% longer) compared to an approach that would update them with minimal delay but would lead to 8% of the genomes not being protected. In addition, DyPS can result in the same amount of aggregate statistics as a static release (i.e., at the end of the study), but can produce up to 2.6 times more statistics information during earlier dynamic releases. Besides, we show that DyPS can support a larger number of genomes and SNP positions without any significant performance penalty.
19

Hong, Jun, Tao Wen, Quan Guo, and Zhengwang Ye. "Secure kNN Computation and Integrity Assurance of Data Outsourcing in the Cloud." Mathematical Problems in Engineering 2017 (2017): 1–15. http://dx.doi.org/10.1155/2017/8109730.

Abstract:
As cloud computing has been popularized massively and rapidly, individuals and enterprises prefer outsourcing their databases to the cloud service provider (CSP) to save the expenditure of managing and maintaining the data. The outsourced databases are hosted, and query services offered to clients, by the CSP, whereas the CSP is not fully trusted. Consequently, security may be violated by multiple factors. Data privacy and query integrity are perceived as the two major factors keeping enterprises from outsourcing their databases. A novel scheme is proposed in this paper to effect k-nearest neighbors (kNN) queries and kNN query authentication on an encrypted outsourced spatial database. An asymmetric scalar-product-preserving encryption scheme is elucidated, in which data points and query points are encrypted with different encryption keys, and the CSP can determine the distance relation between encrypted data points and query points. Furthermore, the similarity search tree is extended to build a novel verifiable SS-tree that supports efficient kNN queries and kNN query verification. The security analysis and experiment results indicate that our scheme not only maintains the confidentiality of outsourced confidential data and query points but also has a lower kNN query processing and verification overhead than the MR-tree.
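The core trick of asymmetric scalar-product-preserving encryption can be shown in a few lines of Python; this toy sketch (after Wong et al.'s ASPE, with a random key matrix standing in for the paper's full scheme) encrypts data and query points under different keys while preserving dot products:

    import numpy as np

    rng = np.random.default_rng(0)
    M = rng.random((3, 3)) + np.eye(3)      # secret invertible key matrix (toy)

    def enc_data(p):   return M.T @ p                 # data-point key
    def enc_query(q):  return np.linalg.inv(M) @ q    # query-point key

    p = np.array([1.0, 2.0, 3.0])
    q = np.array([0.5, 1.5, 2.5])
    # (M^T p) . (M^-1 q) = p^T M M^-1 q = p . q
    assert np.isclose(enc_data(p) @ enc_query(q), p @ q)
    # The CSP can thus compare distances, and so rank kNN candidates, on ciphertexts.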
20

Chen, Yingwen, Bowen Hu, Hujie Yu, Zhimin Duan, and Junxin Huang. "A Threshold Proxy Re-Encryption Scheme for Secure IoT Data Sharing Based on Blockchain." Electronics 10, no. 19 (September 27, 2021): 2359. http://dx.doi.org/10.3390/electronics10192359.

Abstract:
The IoT devices deployed in various application scenarios generate massive data of immeasurable value every day. These data often contain users' personal privacy information, so there is an imperative need to guarantee the reliability and security of IoT data sharing. We propose a new encrypted data storing and sharing architecture that combines proxy re-encryption with blockchain technology. The consensus mechanism based on threshold proxy re-encryption eliminates dependence on third-party central service providers. Multiple consensus nodes in the blockchain network act as proxy service nodes to re-encrypt data and combine converted ciphertexts, and personal information is not disclosed anywhere in the procedure. That removes the restrictions on using a decentralized network to store and distribute private encrypted data safely. We conducted extensive simulation experiments to evaluate the performance of the proposed framework. The results show that the proposed architecture can meet extensive data access demands while adding only a tolerable time latency. Our scheme is an attempt to utilize threshold proxy re-encryption and a blockchain consensus algorithm to support IoT data sharing.
21

Cho, Jae Hyuk, Yunhee Kang, and Young B. Park. "Secure Delivery Scheme of Common Data Model for Decentralized Cloud Platforms." Applied Sciences 10, no. 20 (October 13, 2020): 7134. http://dx.doi.org/10.3390/app10207134.

Abstract:
The Common Data Model (CDM) is used to deal with problems caused by the various electronic medical record structures in distributed hospital information systems. The concept of the CDM is emerging as a collaborative method for exchanging data from each hospital in the same format and conducting various clinical studies based on shared data. The baseline CDM system is centralized, with an infrastructure typically controlled by a single entity with full authority, and the characteristics of this centralized design can pose serious security issues. Therefore, the proposed SC-CDM system is designed as a distributed ledger platform and provides data with a high level of confidentiality, security, and scalability. This framework provides a reference model that supports multiple channels, using secure CDM as an encryption method. The data confidentiality of the CDM is guaranteed by asymmetric and symmetric protocols. CDM delivery is protected by a symmetric key signed by the CDM creator, and lightweight distributed ledger transactions are maintained on the InterPlanetary File System (IPFS), which acts as a file share. To deliver an encrypted CDM on the SC-CDM platform, the CDM is encrypted with a block cipher under a random symmetric key and Initialization Vector (IV); the symmetric-key protocol is used for fast encryption of large-capacity data. SC-CDM implements its repository on IPFS for storing the encrypted CDM, while the symmetric key, two hash values, and IV are shared through the blockchain. Data confidentiality is guaranteed by ensuring that only registered users can access the data. In conclusion, SC-CDM is the first approach to demultiplexing with a data confidentiality proof based on asymmetric-key cryptography. We analyze and verify the security of SC-CDM by comparing qualitative factors and performance with an existing CDM. Moreover, we adopt a byte-level processing method with encryption to ensure efficiency while handling a large CDM.
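The hybrid pattern described, bulk data under a random symmetric key and IV with the key protected asymmetrically, might look as follows in Python with the cryptography package; RSA-OAEP stands in for the paper's exact key-wrapping protocol, and the payload is a placeholder:

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    cdm_bytes = b"...serialized CDM..."        # placeholder payload

    # Bulk encryption: random symmetric key and IV, AES in CTR mode.
    sym_key, iv = os.urandom(32), os.urandom(16)
    enc = Cipher(algorithms.AES(sym_key), modes.CTR(iv)).encryptor()
    ciphertext = enc.update(cdm_bytes) + enc.finalize()   # stored on IPFS

    # Key protection: wrap the symmetric key for the recipient.
    recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    wrapped_key = recipient.public_key().encrypt(
        sym_key,
        padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    # wrapped_key, the IV, and the hash values are what travel on-chain.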
22

Khalid, Haqi, Shaiful Jahari Hashim, Sharifah Mumtazah Syed Ahmad, Fazirulhisyam Hashim, and Muhammad Akmal Chaudhary. "Cross-SN: A Lightweight Authentication Scheme for a Multi-Server Platform Using IoT-Based Wireless Medical Sensor Network." Electronics 10, no. 7 (March 26, 2021): 790. http://dx.doi.org/10.3390/electronics10070790.

Abstract:
Several wireless devices and applications can be connected through wireless communication technologies to exchange data in future intelligent health systems (e.g., the Internet of Medical Things (IoMT)). Smart healthcare requires ample bandwidth, reliable and effective communications networks, energy-efficient operations, and quality of service (QoS) support. Healthcare service providers host multiple servers to ensure seamless services are provided to end users. By supporting a multi-server environment, healthcare medical sensors produce large volumes of data transmitted via servers, which would be impossible in a single-server architecture. Since the transmitted data are sensitive, secure online communication must be ensured: an adversary may try to interrupt the transmission and drop or modify the message. Many researchers have proposed authentication schemes to secure the data, but these schemes are vulnerable to specific attacks (modification attacks, replay attacks, server spoofing attacks, man-in-the-middle (MiTM) attacks, etc.). The absence of an authentication scheme that supports multi-server security in such distributed environments remains an issue. In this paper, a secure authentication scheme using wireless medical sensor networks for a multi-server environment, Cross-SN, is proposed. The scheme is implemented with a smart card, a password, and a user identity. Elliptic curve cryptography is utilized in the scheme, and Burrows–Abadi–Needham (BAN) logic is utilized to prove mutual authentication and to analyse the proposed scheme's security. It offers adequate protection against replay, impersonation, and privileged insider attacks, and secures communication among the multi-server parties that communicate with each other.
23

Vaghela, Uddhav, Simon Rabinowicz, Paris Bratsos, Guy Martin, Epameinondas Fritzilas, Sheraz Markar, Sanjay Purkayastha, et al. "Using a Secure, Continually Updating, Web Source Processing Pipeline to Support the Real-Time Data Synthesis and Analysis of Scientific Literature: Development and Validation Study." Journal of Medical Internet Research 23, no. 5 (May 6, 2021): e25714. http://dx.doi.org/10.2196/25714.

Abstract:
Background: The scale and quality of the global scientific response to the COVID-19 pandemic have unquestionably saved lives. However, the COVID-19 pandemic has also triggered an unprecedented “infodemic”; the velocity and volume of data production have overwhelmed many key stakeholders such as clinicians and policy makers, as they have been unable to process structured and unstructured data for evidence-based decision making. Solutions that aim to alleviate this data synthesis–related challenge are unable to capture heterogeneous web data in real time for the production of concomitant answers and are not based on the high-quality information in responses to a free-text query. Objective: The main objective of this project is to build a generic, real-time, continuously updating curation platform that can support the data synthesis and analysis of a scientific literature framework. Our secondary objective is to validate this platform and the curation methodology for COVID-19–related medical literature by expanding the COVID-19 Open Research Dataset via the addition of new, unstructured data. Methods: To create an infrastructure that addresses our objectives, the PanSurg Collaborative at Imperial College London has developed a unique data pipeline based on a web crawler extraction methodology. This data pipeline uses a novel curation methodology that adopts a human-in-the-loop approach for the characterization of quality, relevance, and key evidence across a range of scientific literature sources. Results: REDASA (Realtime Data Synthesis and Analysis) is now one of the world’s largest and most up-to-date sources of COVID-19–related evidence; it consists of 104,000 documents. By capturing curators’ critical appraisal methodologies through the discrete labeling and rating of information, REDASA rapidly developed a foundational, pooled, data science data set of over 1400 articles in under 2 weeks. These articles provide COVID-19–related information and represent around 10% of all papers about COVID-19. Conclusions: This data set can act as ground truth for the future implementation of a live, automated systematic review. The three benefits of REDASA’s design are as follows: (1) it adopts a user-friendly, human-in-the-loop methodology by embedding an efficient, user-friendly curation platform into a natural language processing search engine; (2) it provides a curated data set in the JavaScript Object Notation format for experienced academic reviewers’ critical appraisal choices and decision-making methodologies; and (3) due to the wide scope and depth of its web crawling method, REDASA has already captured one of the world’s largest COVID-19–related data corpora for searches and curation.
24

Fu, Yanxia, Yanli Ren, Guorui Feng, Xinpeng Zhang, and Chuan Qin. "Non-Interactive and Secure Data Aggregation Scheme for Internet of Things." Electronics 10, no. 20 (October 11, 2021): 2464. http://dx.doi.org/10.3390/electronics10202464.

Abstract:
The popularity of mobile devices in the Internet of Things has brought great convenience to people's lives. Massive data generated in the IoT are outsourced and stored on cloud platforms so that data aggregation and analysis can be performed on them. However, these data often contain sensitive information about mobile devices, so effective protection of mobile user privacy is the primary condition for the further development of IoT. Most current data aggregation schemes require a lot of interaction between users, so this paper designs a non-interactive secure multidimensional data aggregation scheme. The scheme adopts an additive secret sharing technique to mask the shared data and send it to two non-colluding servers, and each server then aggregates its ciphertexts. Unlike existing schemes, our proposed scheme achieves non-interaction between users, keeps the aggregation result confidential from the servers, and supports mobile users being offline. Finally, we perform an experimental evaluation that proves the effectiveness of our scheme.
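The additive secret-sharing step is simple enough to sketch directly; in this minimal Python illustration (one dimension, hypothetical values), each user splits a value into two shares, one per non-colluding server, and recombining the servers' partial sums reveals only the total:

    import random

    MOD = 2**61 - 1
    user_values = [5, 12, 7]  # hypothetical per-user data

    shares_s1, shares_s2 = [], []
    for v in user_values:
        r = random.randrange(MOD)
        shares_s1.append(r)               # share sent to server 1
        shares_s2.append((v - r) % MOD)   # share sent to server 2

    partial1 = sum(shares_s1) % MOD       # server 1 aggregates locally
    partial2 = sum(shares_s2) % MOD       # server 2 aggregates locally
    assert (partial1 + partial2) % MOD == sum(user_values) % MOD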
25

Rehman, Amjad, Khalid Haseeb, Tanzila Saba, Jaime Lloret, and Usman Tariq. "Secured Big Data Analytics for Decision-Oriented Medical System Using Internet of Things." Electronics 10, no. 11 (May 27, 2021): 1273. http://dx.doi.org/10.3390/electronics10111273.

Abstract:
The Internet of Medical Things (IoMT) has shown incredible development with the growth of medical systems using wireless information technologies. Medical devices are biosensors that can integrate with physical things to make smarter healthcare applications that collaborate over the Internet. In recent decades, many applications have been designed to monitor the physical health of patients and support expert teams in appropriate treatment. The medical devices are attached to patients' bodies and connected with a cloud computing system for obtaining and analyzing healthcare data. However, such medical devices rely on battery-powered sensors with tight constraints in terms of memory, transmission, and processing resources. Many healthcare solutions help the community monitor patients' conditions efficiently using cloud computing; however, they mostly incur latency in data collection and storage. Therefore, this paper presents a model for Secured Big Data analytics using an Edge–Cloud architecture (SBD-EC), which aims to provide distributed and timely computation for a decision-oriented medical system. Moreover, the mobile edges cooperate with the cloud level in a secure algorithm, achieving reliable availability of medical data with privacy and security against malicious actions. The performance of the proposed model is evaluated in simulations, and the results obtained demonstrate significant improvement over other solutions.
26

Wang, Jing, Libing Wu, Sherali Zeadally, Muhammad Khurram Khan, and Debiao He. "Privacy-preserving Data Aggregation against Malicious Data Mining Attack for IoT-enabled Smart Grid." ACM Transactions on Sensor Networks 17, no. 3 (June 21, 2021): 1–25. http://dx.doi.org/10.1145/3440249.

Abstract:
Internet of Things (IoT)-enabled smart grids can achieve more reliable and high-frequency data collection and transmission than existing grids. However, this frequent data processing may consume a lot of bandwidth and even put users' privacy at risk. Although many privacy-preserving data aggregation schemes have been proposed to solve the problem, they still suffer from security weaknesses or performance deficiencies, such as a lack of satisfactory data confidentiality and of resistance to malicious data mining attacks. To address these issues, we propose a novel privacy-preserving data aggregation scheme (called PDAM) for IoT-enabled smart grids, which supports efficient data source authentication and integrity checking as well as secure dynamic user join and exit. Unlike existing schemes, PDAM is resilient to malicious data mining attacks launched by internal or external attackers and can achieve perfect data confidentiality against not only a malicious aggregator but also a curious control center for an authorized user. The detailed security and performance analysis shows that our proposed PDAM satisfies several well-known security properties and desirable efficiency for a smart grid system. Moreover, comparative studies and experiments demonstrate that PDAM is superior to other recently proposed works in terms of both security and performance.
27

Sajid, Faiqa, Muhammad Abul Hassan, Ayaz Ali Khan, Muhammad Rizwan, Natalia Kryvinska, Karovič Vincent, and Inam Ullah Khan. "Secure and Efficient Data Storage Operations by Using Intelligent Classification Technique and RSA Algorithm in IoT-Based Cloud Computing." Scientific Programming 2022 (April 14, 2022): 1–10. http://dx.doi.org/10.1155/2022/2195646.

Abstract:
In mobile cloud services, smartphones may depend on IoT-based cloud infrastructure and storage services to carry out technical tasks such as search, information processing, and combined networking. Beyond traditional search services, the smart IoT-cloud also augments the conventional ad hoc structure by treating mobile devices as service hubs. This has many benefits, but several significant problems must be overcome to enhance the reliability of the cloud environment while the Internet of Things connects to and improves the decision support system of the entire network; similar issues apply to monitoring load, resilience, and other security risks in the cloud state. In this work, we examine various classification techniques in MATLAB using heart attack data and then protect that data with the help of the RSA algorithm in the mobile cloud. The algorithms tested are SVM, RF, DT, NB, and KNN. Based on the results, the classification techniques with the best accuracy on the heart attack test data will be recommended for use on large-scale data, and the collected data will be transferred to the mobile cloud for preservation using the RSA encryption algorithm.
28

Li, Jia, and Jie Zhang. "Privacy-Preserving Sports Wearable Data Fusion Framework." Computational Intelligence and Neuroscience 2022 (May 4, 2022): 1–7. http://dx.doi.org/10.1155/2022/6131971.

Abstract:
As the sports industry gains access to advanced training and preparation techniques, it is entering a new era in which real-time data processing services have a crucial role in improving physical fitness and avoiding injuries to athletes. The primary sports support methodology is based on multiple sensors, mainly wearables, often of different types and technologies, which collect somatometric data in real time and are usually analyzed with deep learning technologies. And while modern athletes train and prepare intelligently using the innovative techniques of available technology, there is considerable concern about the use of personal data: cyberattacks and possible data leaks could affect the sports industry and sports in general. To secure the personal data of athletes collected and analyzed by sports wearables, this paper presents a privacy-preserving sports wearable data fusion framework. This is an advanced methodology based on Lagrangian relaxation for the problem of multiple assignment and fusion of information from numerous sensors, and on differential privacy for access to databases with personal information, ensuring that this information remains personal and that no third party can disclose the identity of the athlete who provided the data.
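The differential-privacy component can be illustrated with the standard Laplace mechanism; this minimal Python sketch (epsilon, data, and query are assumptions, not the paper's exact mechanism) noises a count query over wearable records:

    import numpy as np

    def dp_count(records, predicate, epsilon=1.0):
        true_count = sum(1 for r in records if predicate(r))
        sensitivity = 1.0  # one athlete changes a count by at most 1
        noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
        return true_count + noise

    heart_rates = [142, 156, 171, 138, 163]  # hypothetical somatometric data
    print(dp_count(heart_rates, lambda hr: hr > 150))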
29

Vidal, Ivan, Borja Nogales, Diego Lopez, Juan Rodríguez, Francisco Valera, and Arturo Azcorra. "A Secure Link-Layer Connectivity Platform for Multi-Site NFV Services." Electronics 10, no. 15 (August 3, 2021): 1868. http://dx.doi.org/10.3390/electronics10151868.

Abstract:
Network Functions Virtualization (NFV) is a key technology for network automation and has been instrumental to materialize the disruptive view of 5G and beyond mobile networks. In particular, 5G embraces NFV to support the automated and agile provision of telecommunication and vertical services as a composition of versatile virtualized components, referred to as Virtual Network Functions (VNFs). It provides a high degree of flexibility in placing these components on distributed NFV infrastructures (e.g., at the network edge, close to end users). Still, this flexibility creates new challenges in terms of VNF connectivity. To address these challenges, we introduce a novel secure link-layer connectivity platform, L2S. Our solution can automatically be deployed and configured as a regular multi-site NFV service, providing the abstraction of a layer-2 switch that offers link-layer connectivity to VNFs deployed on remote NFV sites. Inter-site communications are effectively protected using existing security solutions and protocols, such as IP security (IPsec). We have developed a functional prototype of L2S using open-source software technologies. Our evaluation results indicate that this prototype can perform IP tunneling and cryptographic operations at Gb/s data rates. Finally, we have validated L2S using a multi-site NFV ecosystem at the Telefonica Open Network Innovation Centre (5TONIC), using our solution to support a multicast-based IP television service.
30

Marcos, Carlos, Arturo González-Ferrer, Mor Peleg, and Carlos Cavero. "Solving the interoperability challenge of a distributed complex patient guidance system: a data integrator based on HL7’s Virtual Medical Record standard." Journal of the American Medical Informatics Association 22, no. 3 (April 16, 2015): 587–99. http://dx.doi.org/10.1093/jamia/ocv003.

Abstract:
Objective: We show how the HL7 Virtual Medical Record (vMR) standard can be used to design and implement a data integrator (DI) component that collects patient information from heterogeneous sources and stores it into a personal health record, from which it can then retrieve data. Our working hypothesis is that the HL7 vMR standard in its release 1 version can properly capture the semantics needed to drive evidence-based clinical decision support systems. Materials and Methods: To achieve seamless communication between the personal health record and heterogeneous data consumers, we used a three-pronged approach. First, the choice of the HL7 vMR as a message model for all components accompanied by the use of medical vocabularies eases their semantic interoperability. Second, the DI follows a service-oriented approach to provide access to system components. Third, an XML database provides the data layer. Results: The DI supports requirements of a guideline-based clinical decision support system implemented in two clinical domains and settings, ensuring reliable and secure access, high performance, and simplicity of integration, while complying with standards for the storage and processing of patient information needed for decision support and analytics. This was tested within the framework of a multinational project (www.mobiguide-project.eu) aimed at developing a ubiquitous patient guidance system (PGS). Discussion: The vMR model with its extension mechanism is demonstrated to be effective for data integration and communication within a distributed PGS implemented for two clinical domains across different healthcare settings in two nations.
31

Madan, Suman, and Puneet Goswami. "A Technique for Securing Big Data Using K-Anonymization With a Hybrid Optimization Algorithm." International Journal of Operations Research and Information Systems 12, no. 4 (October 2021): 1–21. http://dx.doi.org/10.4018/ijoris.20211001.oa3.

Abstract:
Recent cloud-based techniques for data processing are scalable and secure, which increasingly attracts infrastructure support for big data applications. This paper proposes an effective anonymization-based privacy preservation model using the k-anonymization criterion and Grey Wolf-Cat Swarm Optimization (GWCSO) to attain privacy preservation in big data. The anonymization technique is processed by adapting the k-anonymization criterion for duplicating k records from the original database. The proposed GWCSO is developed by integrating the Grey Wolf Optimizer (GWO) and Cat Swarm Optimization (CSO) to construct the k-anonymized database, which reveals only essential details to end users while hiding confidential information. The experimental results of the proposed technique are compared with various existing techniques based on performance metrics such as Classification Accuracy (CA) and Information Loss (IL). The experimental results show that the proposed technique attains an improved CA value of 0.005 and an IL value of 0.798, respectively.
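The k-anonymization criterion itself is easy to state in code: every combination of quasi-identifier values must occur at least k times. A minimal Python check over already-generalized, hypothetical records:

    from collections import Counter

    def is_k_anonymous(rows, quasi_ids, k):
        groups = Counter(tuple(row[q] for q in quasi_ids) for row in rows)
        return all(count >= k for count in groups.values())

    records = [
        {"age": "30-40", "zip": "750**", "disease": "flu"},
        {"age": "30-40", "zip": "750**", "disease": "cold"},
        {"age": "40-50", "zip": "751**", "disease": "flu"},
        {"age": "40-50", "zip": "751**", "disease": "asthma"},
    ]
    print(is_k_anonymous(records, ["age", "zip"], k=2))  # True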
32

Tually, Peter, Johan Janssen, Simon Cowell, and John Walker. "A preliminary assessment of Internet-based nuclear telecardiology to support the clinical management of cardiac disease in a remote community." Journal of Telemedicine and Telecare 9, no. 1_suppl (June 2003): 69–71. http://dx.doi.org/10.1258/135763303322196411.

Abstract:
A portable nuclear medicine (NM) processing system was established in Kalgoorlie, and an acute myocardial perfusion scintigraphy (MPS) service was provided for the local regional hospital. After scanning the patient, the data were processed on a laptop computer and JPEG images were transmitted to a secure Web server. A secure email message, with the URL link enclosed and a provisional indication of normal or abnormal findings, was sent to the referring clinician from the NM facility. Use of the Internet allowed a group consultation between the NM technician, the referrer, and the cardiologist in Perth. During a three-month study period, 42 patients were referred for exclusion of acute coronary syndrome. Of these, 21 (50%) demonstrated abnormal perfusion studies, two of which were classified as requiring urgent medical intervention. Seventeen studies were normal (41%) and four (10%) were designated equivocal. There was an alteration in the treatment plan for 32 patients (76%), including four for whom admission or further investigation was deemed unwarranted. The results suggest that MPS findings, distributed via the Internet, allow for earlier risk stratification and have a direct effect on clinical decision making.
APA, Harvard, Vancouver, ISO, and other styles
33

Hanisah Kamaruzaman, Siti, Wan Nor Shuhadah Wan Nik, Mohamad Afendee Mohamed, and Zarina Mohamad. "Design and Implementation of Data-at-Rest Encryption for Hadoop." International Journal of Engineering & Technology 7, no. 2.15 (April 6, 2018): 54. http://dx.doi.org/10.14419/ijet.v7i2.15.11212.

Full text
Abstract:
The security aspects of cloud computing are paramount in order to ensure a high-quality Service Level Agreement (SLA) for cloud computing customers. This issue is more apparent when very large amounts of data are involved in this emerging computing environment. Hadoop is an open-source software framework that supports the storage and processing of large data sets in a distributed computing environment and is a well-known implementation of MapReduce. MapReduce is a common programming model to process and handle large amounts of data, specifically in big data analysis. Further, the Hadoop Distributed File System (HDFS) is a distributed, scalable and portable file system written in Java for the Hadoop framework. However, the main problem is that data at rest is not secure, as intruders can steal or alter the data stored in this computing environment. Therefore, the AES encryption algorithm has been implemented in HDFS to ensure the security of data stored in HDFS. It is shown that the implementation of the AES encryption algorithm is capable of securing data stored in HDFS to some extent.
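A minimal sketch of encrypting a data block at rest, assuming AES-GCM via the Python `cryptography` package; the paper does not state the mode or key size, and key management (in practice something like a Hadoop KMS) is out of scope here.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # in HDFS this would come from a key store
aesgcm = AESGCM(key)

block = b"contents of an HDFS block"
nonce = os.urandom(12)                     # must be unique per encryption
ciphertext = aesgcm.encrypt(nonce, block, None)

# On read, the same key and nonce recover (and authenticate) the plaintext.
assert aesgcm.decrypt(nonce, ciphertext, None) == block
```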
APA, Harvard, Vancouver, ISO, and other styles
34

Guan, Zhitao, Jing Li, Longfei Wu, Yue Zhang, Jun Wu, and Xiaojiang Du. "Achieving Efficient and Secure Data Acquisition for Cloud-Supported Internet of Things in Smart Grid." IEEE Internet of Things Journal 4, no. 6 (December 2017): 1934–44. http://dx.doi.org/10.1109/jiot.2017.2690522.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

SHYU, MEI-LING, SHU-CHING CHEN, QIBIN SUN, and HEATHER YU. "OVERVIEW AND FUTURE TRENDS OF MULTIMEDIA RESEARCH FOR CONTENT ACCESS AND DISTRIBUTION." International Journal of Semantic Computing 01, no. 01 (March 2007): 29–66. http://dx.doi.org/10.1142/s1793351x07000044.

Full text
Abstract:
The advances in information technology, computational capability, and communication networks have enabled large-scale data collection and the distribution of vast amounts of multimedia data to consumer and enterprise applications. With the proliferation of multimedia data and ever-growing requests for multimedia applications, reliable and efficient tools and techniques are urgently sought for multimedia content analysis and retrieval, as well as for secure media streaming, distribution and communication. Though many research efforts have been devoted to these aspects, the field is still far from maturity and many open issues remain. In this paper, an overview of the challenges and issues, as well as future trends in multimedia content access and distribution, is discussed. The focus areas include: (i) how to bridge the gap between semantic meaning and low-level media characteristics; (ii) how to handle the subjectivity of user perception; (iii) how to provide multimedia network communication support for media streaming and P2P (peer-to-peer) media distribution; and (iv) how to ensure the various aspects of secure media, including acquisition, processing, storage, and communication.
APA, Harvard, Vancouver, ISO, and other styles
36

Jaiswal, Rajyalakshmi, Sailesh Suryanarayan Iyer, and Kavita Arora. "Intensify Assured PHR Sharing in Cloud Using Entropy Based Signature Cryptosystem." ECS Transactions 107, no. 1 (April 24, 2022): 2593–98. http://dx.doi.org/10.1149/10701.2593ecst.

Full text
Abstract:
Personal Health Records (PHRs) in a cloud database are stored, manipulated, and shared with users for high-value healthcare. The challenge is the privacy of sensitive personal information and the protection of the data. In this paper, an attribute entropy key combined with a signature cryptosystem is proposed to move data stored on a third-party cloud server in a secure way. In the proposed work, the PHR data is taken as input and an attribute is selected, for which an attribute key is generated using the KEK (Key Encryption Key) algorithm. To secure the generated key, the entropy of the PHR data is calculated. This key is used to encrypt the PHR data. Then a signature based on a support value is generated for the PHR data and stored with the encrypted data in the cloud. When users want to access the data, decryption is allowed only after verification of both the generated private key and the signature, which ensures data integrity. Once the verifications succeed, authorized users are granted permission to access the data from the cloud. The results show that the suggested method performs better than other existing methods on parameters such as ciphertext size, query processing time, communication cost, and decryption time.
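The paper's KEK algorithm and support-value signature are not described in enough detail to reproduce; the hedged sketch below only shows one plausible way to compute the Shannon entropy of a PHR payload and fold it into a standard key derivation. All names and values are hypothetical.

```python
import hashlib
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy (bits per byte) of the PHR payload."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

phr = b'{"patient":"anon-42","bp":"120/80","hba1c":6.1}'  # toy record
entropy = shannon_entropy(phr)

# Fold the entropy value into the key derivation as salt material;
# 'attribute_secret' stands in for the selected-attribute KEK input.
salt = hashlib.sha256(f"{entropy:.6f}".encode()).digest()
attribute_secret = b"selected-attribute-value"
key = hashlib.pbkdf2_hmac("sha256", attribute_secret, salt, 100_000, dklen=32)
```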
APA, Harvard, Vancouver, ISO, and other styles
37

Abbas, Khizar, Lo’Ai A. Tawalbeh, Ahsan Rafiq, Ammar Muthanna, Ibrahim A. Elgendy, and Ahmed A. Abd El-Latif. "Convergence of Blockchain and IoT for Secure Transportation Systems in Smart Cities." Security and Communication Networks 2021 (April 22, 2021): 1–13. http://dx.doi.org/10.1155/2021/5597679.

Full text
Abstract:
Smart cities provide citizens with smart and advanced services to improve their quality of life. However, the collection, storage, processing, and analysis of the heterogeneous data generated by citizens entails certain difficulties. The development of the Internet of Things, cloud computing, social media, and other Industry 4.0 influencers has pushed technology into a smart society's framework, bringing potential vulnerabilities to sensor data, services, and smart city applications. These vulnerabilities lead to data security problems. We propose a decentralized data management system for smart and secure transportation that uses blockchain and the Internet of Things in a sustainable smart city environment to solve the data vulnerability problem. A smart transportation mobility system demands an interconnected transit system to ensure flexibility and efficiency. This article introduces the necessary background and then provides a Hyperledger Fabric-based data architecture that supports a secure, trusted, smart transportation system. The simulation results show the balance between the blockchain mining time and the number of blocks created. We also use an average transaction delay evaluation model to evaluate and test the proposed system's performance. The system addresses residents' and authorities' security challenges in the transportation systems of smart, sustainable cities and leads to better governance.
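Hyperledger Fabric's ledger involves channels, endorsement policies and an ordering service; the toy chain below illustrates only the hash-chaining that gives any blockchain ledger its tamper evidence. Transaction contents are invented.

```python
import hashlib
import json
import time

def make_block(prev_hash: str, transactions: list) -> dict:
    """Append-only block linked to its predecessor by hash."""
    block = {
        "prev_hash": prev_hash,
        "timestamp": time.time(),
        "transactions": transactions,
    }
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

genesis = make_block("0" * 64, [])
b1 = make_block(genesis["hash"], [{"vehicle": "bus-17", "event": "gps-update"}])
b2 = make_block(b1["hash"], [{"vehicle": "tram-03", "event": "door-open"}])
# Tampering with b1 changes its hash and breaks the link held by b2.
assert b2["prev_hash"] == b1["hash"]
```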
APA, Harvard, Vancouver, ISO, and other styles
38

Chang, Jan-Kai, Hui Fang, Christopher A. Bower, Enming Song, Xinge Yu, and John A. Rogers. "Materials and processing approaches for foundry-compatible transient electronics." Proceedings of the National Academy of Sciences 114, no. 28 (June 26, 2017): E5522—E5529. http://dx.doi.org/10.1073/pnas.1707849114.

Full text
Abstract:
Foundry-based routes to transient silicon electronic devices have the potential to serve as the manufacturing basis for “green” electronic devices, biodegradable implants, hardware secure data storage systems, and unrecoverable remote devices. This article introduces materials and processing approaches that enable state-of-the-art silicon complementary metal-oxide-semiconductor (CMOS) foundries to be leveraged for high-performance, water-soluble forms of electronics. The key elements are (i) collections of biodegradable electronic materials (e.g., silicon, tungsten, silicon nitride, silicon dioxide) and device architectures that are compatible with manufacturing procedures currently used in the integrated circuit industry, (ii) release schemes and transfer printing methods for integration of multiple ultrathin components formed in this way onto biodegradable polymer substrates, and (iii) planarization and metallization techniques to yield interconnected and fully functional systems. Various CMOS devices and circuit elements created in this fashion and detailed measurements of their electrical characteristics highlight the capabilities. Accelerated dissolution studies in aqueous environments reveal the chemical kinetics associated with the underlying transient behaviors. The results demonstrate the technical feasibility for using foundry-based routes to sophisticated forms of transient electronic devices, with functional capabilities and cost structures that could support diverse applications in the biomedical, military, industrial, and consumer industries.
APA, Harvard, Vancouver, ISO, and other styles
39

Navale, Vivek, and Matthew McAuliffe. "Long-term preservation of biomedical research data." F1000Research 7 (August 29, 2018): 1353. http://dx.doi.org/10.12688/f1000research.16015.1.

Full text
Abstract:
Genomics and molecular imaging, along with clinical and translational research, have transformed biomedical science into a data-intensive scientific endeavor. For researchers to benefit from Big Data sets, developing a long-term biomedical digital data preservation strategy is very important. In this opinion article, we discuss specific actions that researchers and institutions can take to make research data a continued resource even after research projects have reached the end of their lifecycle. The actions involve utilizing an Open Archival Information System model comprising six functional entities: Ingest, Access, Data Management, Archival Storage, Administration and Preservation Planning. We believe that the involvement of data stewards early in the digital data lifecycle management process can contribute significantly to the long-term preservation of biomedical data. Developing data collection strategies consistent with institutional policies, and encouraging the use of common data elements in clinical research, patient registries and other human subject research, can be advantageous for data sharing and integration purposes. Specifically, data stewards should engage with established repositories and curators at the onset of a research program to develop data sustainability plans for research data. Placing equal importance on the requirements of initial activities (e.g., collection, processing, storage) and of subsequent activities (data analysis, sharing) can improve data quality, provide traceability and support reproducibility. Preparing and tracking data provenance, and using common data elements and biomedical ontologies, are important for standardizing the data description, making the interpretation and reuse of data easier. The Big Data biomedical community requires a scalable platform that can support the diversity and complexity of data ingest modes (e.g., machine, software or human entry). Secure virtual workspaces to integrate and manipulate data, with shared software programs (e.g., bioinformatics tools), can facilitate the FAIR (Findable, Accessible, Interoperable and Reusable) use of data for near- and long-term research needs.
APA, Harvard, Vancouver, ISO, and other styles
40

Mehdi, Mubarak, Muhammad Taha Ajani, Hasan Tahir, Shahzaib Tahir, Zahoor Alizai, Fawad Khan, Qaiser Riaz, and Mehdi Hussain. "PUF-Based Key Generation Scheme for Secure Group Communication Using MEMS." Electronics 10, no. 14 (July 15, 2021): 1691. http://dx.doi.org/10.3390/electronics10141691.

Full text
Abstract:
Consumer electronics manufacturers have been incorporating support for 4G/5G communication technologies into many electronic devices. Thus, highly capable Internet of Things (IoT)-ready versions of electronic devices are being purchased, which will eventually replace traditional consumer electronics. With the goal of creating a smart environment, IoT devices enable data sharing, sensing, awareness, and increased control. Enabled by high-speed networks, IoT devices function in group settings, compounding the attack surface and leading to security and privacy concerns. This research is a study of the possibility of incorporating a physically unclonable function (PUF) as a basis for group key generation. The challenge lies in identifying device features that are unique, stable, reproducible and unpredictable by an adversary. Each device generates its own identity, leading to collaborative cryptographic key generation in a group setting. The research uses a comprehensive hardware testbed to demonstrate the viability of PUFs for the generation of a symmetric key through collaboration. Detailed analysis of the proposed setup and the symmetric key generation scheme has shown that the system is scalable and offers unrivalled advantages compared to conventional cryptographic implementations.
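A hedged toy sketch of the idea: a simulated noisy PUF response is stabilized by majority voting (real designs use fuzzy extractors and helper data instead), hashed into a device identity, and the identities are combined into a shared group key. Device seeds and parameters are invented.

```python
import hashlib
import random

def read_puf(device_seed: int, bits: int = 128, noise: float = 0.02) -> list:
    """Simulated PUF read: a device-unique bit pattern plus read noise."""
    rng = random.Random(device_seed)
    pattern = [rng.randint(0, 1) for _ in range(bits)]
    return [b ^ (random.random() < noise) for b in pattern]

def stable_bits(device_seed: int, reads: int = 15) -> bytes:
    """Majority vote over repeated reads (a stand-in for a fuzzy extractor)."""
    samples = [read_puf(device_seed) for _ in range(reads)]
    majority = "".join("1" if sum(col) * 2 > reads else "0"
                       for col in zip(*samples))
    return majority.encode()

# Each device hashes its stabilized response into an identity; the group
# key is then derived from all identities (the collaborative step).
identities = sorted(hashlib.sha256(stable_bits(seed)).digest()
                    for seed in (1, 2, 3))
group_key = hashlib.sha256(b"".join(identities)).digest()
```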
APA, Harvard, Vancouver, ISO, and other styles
41

Abdullah, Lamya, and Juan Quintero. "Sealed computation: a mechanism to support privacy-aware trustworthy cloud service." Information & Computer Security 27, no. 5 (November 11, 2019): 601–20. http://dx.doi.org/10.1108/ics-11-2018-0133.

Full text
Abstract:
Purpose: The purpose of this study is to propose an approach that avoids having to trust a single entity in cloud-based applications. In cloud computing, data processing is delegated to a remote party for efficiency and flexibility reasons. A practical user requirement usually is data privacy; hence, the confidentiality and integrity of data processing need to be protected. In the common cloud computing scenarios of today, this can only be achieved by assuming that the remote party does not act maliciously in any form. Design/methodology/approach: An approach that avoids having to trust a single entity is proposed. This approach is based on two concepts: the technical abstraction of sealed computation, i.e. a technical mechanism to confine privacy-aware processing of data within a tamper-proof hardware container, and the role of an auditing party that itself cannot add functionality to the system but is able to check whether the system (including the mechanism for sealed computation) works as expected. Findings: The abstract, technical and procedural requirements of these concepts are discussed and analysed, along with how they can be applied in practice. Originality/value: A preliminary version of this paper was published in the proceedings of the second International Workshop on SECurity and Privacy Requirements Engineering (SECPRE 2018).
APA, Harvard, Vancouver, ISO, and other styles
42

Nedeljković, Dušan, Živana Jakovljević, Zoran Miljković, and Miroslav Pajić. "Detection of cyber-attacks in systems with distributed control based on support vector regression." Telfor Journal 12, no. 2 (2020): 104–9. http://dx.doi.org/10.5937/telfor2002104n.

Full text
Abstract:
The concept of Industry 4.0 and the implementation of Cyber-Physical Systems (CPS) and the Internet of Things (IoT) in industrial plants are changing the way we manufacture. The introduction of the industrial IoT leads to ubiquitous (usually wireless) communication between devices in industrial control systems, thus introducing numerous security concerns and opening up a wide space for potential malicious threats and attacks. As a consequence of various cyber-attacks, fatal failures can occur in parts of the system or in the system as a whole. Therefore, security mechanisms must be developed to provide sufficient resilience to cyber-attacks and keep the system safe and protected. In this paper we present a method for the detection of attacks on sensor signals, based on ε-insensitive support vector regression (ε-SVR). The method is implemented on publicly available data obtained from the Secure Water Treatment (SWaT) testbed, as well as on a real-world continuous-time controlled electro-pneumatic positioning system. In both cases, the method successfully detected all considered attacks, without false positives.
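A minimal sketch of the residual-based detection pattern, using scikit-learn's SVR on a synthetic signal; the paper's exact features, windowing and thresholds are not reproduced, and the threshold rule below is an assumption.

```python
import numpy as np
from sklearn.svm import SVR

# Train on attack-free sensor data: predict the next sample from a
# sliding window of w previous samples.
w = 10
signal = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.01 * np.random.randn(2000)
X = np.array([signal[i:i + w] for i in range(len(signal) - w)])
y = signal[w:]

model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)

# At run time, flag a sample as attacked when the prediction residual
# exceeds a threshold calibrated on normal data (hypothetical rule).
residuals = np.abs(model.predict(X) - y)
threshold = residuals.mean() + 4 * residuals.std()
alarms = residuals > threshold
```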
APA, Harvard, Vancouver, ISO, and other styles
43

Kaushik, Shweta, and Charu Gandhi. "Capability Based Outsourced Data Access Control with Assured File Deletion and Efficient Revocation with Trust Factor in Cloud Computing." International Journal of Cloud Applications and Computing 10, no. 1 (January 2020): 64–84. http://dx.doi.org/10.4018/ijcac.2020010105.

Full text
Abstract:
Cloud computing has introduced a paradigm that supports data outsourcing to third parties for processing using commodity clusters. It allows the owner to outsource sensitive data and share it with authorized users while reducing computation and management costs. Since owners store sensitive data in the cloud, the requirements for access control and data security have also been increasing. To address these requirements, a safe, secure, and sound model is needed. Existing solutions to these problems use purely cryptographic techniques, which increases the computation cost. In this article, the security problems are solved by using a trusted third party and a quorum of key managers. A service provider is responsible for capability-based access control to ensure that only authorized users are able to access the data. Whenever data revocation is required, the data owner simply updates this information with the master key manager to revoke a specific number of shares. The model for the proposed work is presented, and its analysis shows how it introduces security features.
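The paper's token format and key-manager quorum are not specified here; as one plausible minimal realization of capability-based access control, the sketch below issues an HMAC-signed capability that a service provider can verify. Key and field names are hypothetical.

```python
import hashlib
import hmac
import json

OWNER_KEY = b"shared-secret-with-service-provider"  # hypothetical provisioning

def issue_capability(owner_key: bytes, user: str, file_id: str, rights: list) -> dict:
    """Owner issues a signed capability naming the user's rights on a file."""
    cap = {"user": user, "file": file_id, "rights": rights}
    mac = hmac.new(owner_key, json.dumps(cap, sort_keys=True).encode(),
                   hashlib.sha256).hexdigest()
    return {"cap": cap, "mac": mac}

def verify_capability(owner_key: bytes, token: dict, user: str, right: str) -> bool:
    """Service provider checks the MAC and the requested right."""
    expected = hmac.new(owner_key, json.dumps(token["cap"], sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, token["mac"])
            and token["cap"]["user"] == user
            and right in token["cap"]["rights"])

token = issue_capability(OWNER_KEY, "alice", "report.pdf", ["read"])
assert verify_capability(OWNER_KEY, token, "alice", "read")
assert not verify_capability(OWNER_KEY, token, "alice", "write")
```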
APA, Harvard, Vancouver, ISO, and other styles
44

Sarajcev, Petar, Antonijo Kunac, Goran Petrovic, and Marin Despalatovic. "Power System Transient Stability Assessment Using Stacked Autoencoder and Voting Ensemble." Energies 14, no. 11 (May 27, 2021): 3148. http://dx.doi.org/10.3390/en14113148.

Full text
Abstract:
The increased integration of renewable energy sources brings new challenges to secure and stable power system operation. Operational challenges emanating from reduced system inertia, in particular, will have important repercussions on power system transient stability assessment (TSA). At the same time, the rise of "big data" in the power system, from the development of wide-area monitoring systems, introduces new paradigms for dealing with these challenges. Transient stability concerns are drawing the attention of various stakeholders, as they can be the leading causes of major outages. The aim of this paper is to address the power system TSA problem from the perspective of data mining and machine learning (ML). A novel 3.8 GB open dataset of time-domain phasor measurement signals is built from dynamic simulations of the IEEE New England 39-bus test power system. A data processing pipeline is developed for feature engineering and statistical post-processing. A complete ML model is proposed for the TSA analysis, built from a denoising stacked autoencoder and a voting ensemble classifier. The ensemble pools predictions from a support vector machine and a random forest. Results from applying the classifier to the test power system are reported and discussed. The ML approach to the TSA problem is promising, since it is able to ingest huge amounts of data while retaining the ability to generalize and support real-time decisions.
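A minimal sketch of the voting-ensemble stage with scikit-learn; the denoising stacked autoencoder is omitted, so synthetic features stand in for its compressed representation of the phasor signals.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Stand-in features: in the paper these would be the autoencoder's
# compressed representation of the phasor measurement signals.
X, y = make_classification(n_samples=2000, n_features=32, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("svm", SVC(probability=True, random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ],
    voting="soft",  # pool class probabilities from both base models
).fit(X_tr, y_tr)

print("stable/unstable accuracy:", ensemble.score(X_te, y_te))
```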
APA, Harvard, Vancouver, ISO, and other styles
45

Kamaliah, Naily, and Alpha Fadila Juliana Rahman. "Menjaga Kualitas Pembelajaran Praktikum Pengolahan Data Secara Daring pada Mata Pelatihan Analisis dan Interpretasi Data." Andragogi: Jurnal Diklat Teknis Pendidikan dan Keagamaan 9, no. 1 (September 13, 2021): 24–32. http://dx.doi.org/10.36052/andragogi.v9i1.208.

Full text
Abstract:
[MAINTAINING THE QUALITY OF ONLINE DATA PROCESSING PRACTICUM LEARNING IN THE DATA ANALYSIS AND INTERPRETATION TRAINING SUBJECT]. Online learning during the SARS-CoV-2 pandemic is a necessity in order to avoid the spread of the virus, but its implementation is a challenge, especially for practicum learning. The implementation of online training is a response to the conditions of the SARS-CoV-2 pandemic, which forced classical training to migrate online. The purpose of this study was to find out which factors support the achievement of learning objectives in classical learning and in e-learning, and whether the learning strategies applied in e-learning can maintain the quality of training. The instrument in this study was participants' evaluations of learning and of the facilitator, across 3 waves of classical PPJFP and 3 waves of PPJFP e-learning, with a total of 126 training participants as respondents. Data processing and analysis were carried out using simple regression analysis, to find out the factors that influence the achievement of learning objectives, as well as an independent-samples t-test to compare classical (face-to-face) learning and e-learning. The results showed that the systematic presentation of learning materials and the ability to present the material were the spearheads of achievement in face-to-face learning. In synchronous learning through the e-learning platform, how the facilitator answers questions is the important point; moreover, e-learning at the synchronous stage is able to maintain the quality of learning.
APA, Harvard, Vancouver, ISO, and other styles
46

Tene Koyazo, Jacques, Moise Avoci Ugwiri, Aimé Lay-Ekuakille, Maria Fazio, Massimo Villari, and Consolatina Liguori. "Collaborative systems for telemedicine diagnosis accuracy." ACTA IMEKO 10, no. 3 (September 30, 2021): 192. http://dx.doi.org/10.21014/acta_imeko.v10i3.1133.

Full text
Abstract:
The transmission of medical data, and the possibility for distant healthcare structures to share experience about a given medical case, raises several conceptual and technical questions. Good remote healthcare monitoring faces more problems in personalized health data processing than the traditional methods used in many hospitals around the world today. The adoption of telemedicine in the healthcare sector has significantly changed medical collaboration. However, to provide good telemedicine services through new technologies such as cloud computing and cloud storage, a suitable and adaptable framework should be designed. Moreover, in the chain of medical information exchange between requesting agencies, including physicians, a secure and collaborative platform enhances the decision-making process. This paper provides an in-depth literature review of the interaction between telemedicine and cloud-based computing. In addition, the paper proposes a framework that allows various research organizations, healthcare sectors, and government agencies to log data, develop collaborative analyses, and support decision-making. The electrocardiogram (ECG) and electroencephalogram (EEG) case studies demonstrate the benefit of the proposed approach: data reduction and high-fidelity signal processing at the local level make it possible to communicate the extracted characteristic features to the cloud database.
APA, Harvard, Vancouver, ISO, and other styles
47

Mao, Jian, Wenqian Tian, Yan Zhang, Jian Cui, Hanjun Ma, Jingdong Bian, Jianwei Liu, and Jianhong Zhang. "Co-Check: Collaborative Outsourced Data Auditing in Multicloud Environment." Security and Communication Networks 2017 (2017): 1–13. http://dx.doi.org/10.1155/2017/2948025.

Full text
Abstract:
With the increasing demand for ubiquitous connectivity, wireless technology has significantly improved our daily lives. Meanwhile, together with cloud computing technology (e.g., cloud storage services and big data processing), new wireless networking technology has become the foundational infrastructure of emerging communication networks. In particular, cloud storage has been widely used in services such as data outsourcing and resource sharing among heterogeneous wireless environments because of its convenience, low cost, and flexibility. However, users/clients lose physical control of their data after outsourcing. Consequently, ensuring the integrity of the outsourced data becomes an important security requirement of cloud storage applications. In this paper, we present Co-Check, a collaborative multicloud data integrity auditing scheme based on BLS (Boneh-Lynn-Shacham) signatures and homomorphic tags. Under the proposed scheme, clients can audit their outsourced data in a one-round challenge-response interaction with low performance overhead. Our scheme also supports dynamic data maintenance. The theoretical analysis and experimental results illustrate that our scheme is provably secure and efficient.
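The toy below only shows the shape of a one-round challenge-response spot-check; the actual scheme returns compact aggregated proofs built from BLS signatures and homomorphic tags rather than the sampled blocks themselves. Block contents and sizes are invented.

```python
import hashlib
import secrets

# Client-side tags computed before outsourcing (plain hashes here,
# standing in for the scheme's homomorphic tags).
blocks = [f"data block {i}".encode() for i in range(100)]
tags = {i: hashlib.sha256(b).digest() for i, b in enumerate(blocks)}

def challenge(sample_size: int = 5):
    """Client picks random block indices to audit."""
    return [secrets.randbelow(len(blocks)) for _ in range(sample_size)]

def respond(idx):
    """Cloud returns the requested blocks (a real scheme would return
    a compact aggregated proof instead of the data itself)."""
    return [blocks[i] for i in idx]

idx = challenge()
returned = respond(idx)
# Client verifies each returned block against its locally stored tag.
ok = all(hashlib.sha256(b).digest() == tags[i] for i, b in zip(idx, returned))
print("audit passed:", ok)
```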
APA, Harvard, Vancouver, ISO, and other styles
48

Aleksandrov, Mitko, Sisi Zlatanova, and David J. Heslop. "Voxelisation Algorithms and Data Structures: A Review." Sensors 21, no. 24 (December 9, 2021): 8241. http://dx.doi.org/10.3390/s21248241.

Full text
Abstract:
Voxel-based data structures, algorithms, frameworks, and interfaces have been used in computer graphics and many other applications for decades. There is a general need for adequate digital representations, such as voxels, that ensure unified data structures, multi-resolution options, robust validation procedures and flexible algorithms for different 3D tasks. In this review, we evaluate the most common properties of, and algorithms for, the voxelisation of 2D and 3D objects. Many voxelisation algorithms and their characteristics are presented, targeting points, lines, triangles, surfaces and solids as geometric primitives. For lines, we identify three groups of algorithms, where the first two achieve different voxelisation connectivity, while the third covers the voxelisation of curves. Surface voxelisation is generally preferable to solid voxelisation, as it can be achieved faster and requires less memory if voxels are stored sparsely. We also evaluate the available voxel data structures, splitting them into static and dynamic grids according to how frequently the data structure is updated. Static grids are dominated by SVO-based data structures focusing on memory footprint reduction and attribute preservation, where SVDAG and SSVDAG are the most advanced methods. The state-of-the-art dynamic voxel data structure is NanoVDB, which is superior to the rest in terms of speed as well as support for out-of-core processing and data management, the key to handling large, dynamically changing scenes. Overall, this is the first review evaluating the available voxelisation algorithms for different geometric primitives as well as voxel data structures.
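A minimal sketch of the simplest primitive case the review covers: voxelising a point cloud into a sparse set of integer voxel indices. The point data and voxel size are arbitrary.

```python
import numpy as np

def voxelise_points(points: np.ndarray, voxel_size: float):
    """Map each 3D point to an integer voxel index; store occupied
    voxels sparsely as a set of (i, j, k) tuples."""
    indices = np.floor(points / voxel_size).astype(np.int64)
    return {tuple(ijk) for ijk in indices}

points = np.random.rand(10_000, 3)          # synthetic point cloud in [0, 1)^3
occupied = voxelise_points(points, voxel_size=0.05)
print(len(occupied), "occupied voxels out of", 20 ** 3)
```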
APA, Harvard, Vancouver, ISO, and other styles
49

Patel, Rashmi, Soon Nan Wee, Rajagopalan Ramaswamy, Simran Thadani, Jesisca Tandi, Ruchir Garg, Nathan Calvanese, et al. "NeuroBlu, an electronic health record (EHR) trusted research environment (TRE) to support mental healthcare analytics with real-world data." BMJ Open 12, no. 4 (April 2022): e057227. http://dx.doi.org/10.1136/bmjopen-2021-057227.

Full text
Abstract:
Purpose: NeuroBlu is a real-world data (RWD) repository that contains deidentified electronic health record (EHR) data from US mental healthcare providers operating the MindLinc EHR system. NeuroBlu enables users to perform statistical analysis through a secure web-based interface. Structured data are available for sociodemographic characteristics, mental health service contacts, hospital admissions, International Classification of Diseases (ICD-9/ICD-10) diagnoses, prescribed medications, family history of mental disorders, Clinical Global Impression - Severity and Improvement (CGI-S/CGI-I) and Global Assessment of Functioning (GAF). To further enhance the data set, natural language processing (NLP) tools have been applied to obtain mental state examination (MSE) and social/environmental data. This paper describes the development and implementation of NeuroBlu, the procedures to safeguard data integrity and security, and how the data set supports the generation of real-world evidence (RWE) in mental health. Participants: As of 31 July 2021, 562,940 individuals (48.9% men) were present in the data set, with a mean age of 33.4 years (SD: 18.4 years). The most frequently recorded diagnoses were substance use disorders (152,790 patients), major depressive disorder (129,120 patients) and anxiety disorders (103,923 patients). The median duration of follow-up was 7 months (IQR: 1.3 to 24.4 months). Findings to date: The data set has supported epidemiological studies demonstrating increased risk of psychiatric hospitalisation and reduced antidepressant treatment effectiveness among people with comorbid substance use disorders. It has also been used to develop data visualisation tools to support clinical decision-making, evaluate the comparative effectiveness of medications, derive models to predict treatment response, and develop NLP applications to obtain clinical information from unstructured EHR data. Future plans: The NeuroBlu data set will be further analysed to better understand factors related to poor clinical outcomes, treatment responsiveness and the development of predictive analytic tools that may be incorporated into the source EHR system to support real-time clinical decision-making in the delivery of mental healthcare services.
APA, Harvard, Vancouver, ISO, and other styles
50

Becker, Christoph, Luis Faria, and Kresimir Duretec. "Scalable decision support for digital preservation: an assessment." OCLC Systems & Services: International digital library perspectives 31, no. 1 (February 9, 2015): 11–34. http://dx.doi.org/10.1108/oclc-06-2014-0026.

Full text
Abstract:
Purpose – This article aims to evaluate a new architecture for scalable decision-making and control in preservation environments for its ability to address five key goals: scalable content profiling; monitoring of compliance, risks and opportunities; efficient creation of trustworthy plans; context awareness; and loosely coupled preservation ecosystems. Scalable decision support and business intelligence capabilities are required to effectively secure content over time. Design/methodology/approach – We conduct a systematic evaluation of the contributions of the SCAPE Planning and Watch suite to providing effective and scalable decision support capabilities. We discuss the quantitative and qualitative evaluation of advancing the state of the art and report on a case study with a national library. Findings – The system provides substantial capabilities for semi-automated, scalable decision-making and control of preservation functions in repositories. Well-defined interfaces allow flexible integration with diverse institutional environments. The free and open nature of the tool suite further encourages global take-up in the repository communities. Research limitations/implications – The article discusses a number of bottlenecks and factors limiting the real-world scalability of preservation environments. These include the data-intensive processing of large volumes of information, automated quality assurance for preservation actions, and the element of human decision-making. We outline open issues and future work. Practical implications – The open nature of the software suite enables stewardship organizations to integrate the components with their own preservation environments and to contribute to the ongoing improvement of the systems. Originality/value – The paper reports on innovative research and development to provide preservation capabilities. The results of the assessment demonstrate how the system advances the control of digital preservation operations from ad hoc decision-making to proactive, continuous preservation management, through a context-aware planning and monitoring cycle integrated with operational systems.
APA, Harvard, Vancouver, ISO, and other styles