Journal articles on the topic 'Data storage reduction'



Consult the top 50 journal articles for your research on the topic 'Data storage reduction.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Bostoen, Tom, Sape Mullender, and Yolande Berbers. "Power-reduction techniques for data-center storage systems." ACM Computing Surveys 45, no. 3 (June 2013): 1–38. http://dx.doi.org/10.1145/2480741.2480750.

2

Singhal, Shubhanshi, Pooja Sharma, Rajesh Kumar Aggarwal, and Vishal Passricha. "A Global Survey on Data Deduplication." International Journal of Grid and High Performance Computing 10, no. 4 (October 2018): 43–66. http://dx.doi.org/10.4018/ijghpc.2018100103.

Abstract:
This article describes how data deduplication, which is becoming popular in storage systems, efficiently eliminates redundant data by selecting and storing only a single instance of it. Digital data is growing much faster than storage volumes, which underscores the importance of data deduplication to scientists and researchers. Data deduplication is considered the most successful and efficient data reduction technique because it is computationally efficient and offers lossless reduction. It is applicable to various storage systems, i.e., local, distributed, and cloud storage. This article discusses the background, components, and key features of data deduplication, helping the reader to understand the design issues and challenges in this field.
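As a concrete illustration of the single-instance principle this survey covers, here is a minimal Python sketch of chunk-level deduplication; it is not taken from the article, and the 4 KiB fixed chunk size and in-memory store are illustrative assumptions:

```python
import hashlib

CHUNK_SIZE = 4096  # illustrative fixed chunk size (assumption)

class DedupStore:
    """Stores each unique chunk once, keyed by its SHA-256 digest."""

    def __init__(self):
        self.chunks = {}   # digest -> chunk bytes (single instance)
        self.files = {}    # filename -> list of digests ("recipe")

    def put(self, name, data):
        recipe = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # store only unseen chunks
            recipe.append(digest)
        self.files[name] = recipe

    def get(self, name):
        return b"".join(self.chunks[d] for d in self.files[name])

store = DedupStore()
store.put("a.bin", b"hello world" * 1000)
store.put("b.bin", b"hello world" * 1000)  # duplicate content, no new chunks
assert store.get("b.bin") == b"hello world" * 1000
print(len(store.chunks), "unique chunks stored")
```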
3

Tong, Yulai, Jiazhen Liu, Hua Wang, Ke Zhou, Rongfeng He, Qin Zhang, and Cheng Wang. "Sieve: A Learned Data-Skipping Index for Data Analytics." Proceedings of the VLDB Endowment 16, no. 11 (July 2023): 3214–26. http://dx.doi.org/10.14778/3611479.3611520.

Abstract:
Modern data analytics services are coupled with external data storage services, making I/O from remote cloud storage one of the dominant costs of query processing. Techniques such as columnar block-based data organization and compression have become standard practice for these services to save storage and processing cost. However, the problem of effectively skipping irrelevant blocks at low overhead remains open. Existing data-skipping efforts maintain lightweight summaries (e.g., min/max, histograms) for each block to filter irrelevant data. Such techniques, however, ignore patterns in real-world data, leading to ineffective use of the storage budget and potentially serious false positives. This paper presents Sieve, a learning-enhanced index designed to efficiently filter out irrelevant blocks by capturing data patterns. Specifically, Sieve utilizes piece-wise linear functions to capture block distribution trends over the key space. Based on the captured trends, Sieve trades off storage consumption and false positives by grouping neighboring keys with similar block distributions into a single region. We have evaluated Sieve using Presto, and experiments on real-world datasets demonstrate that Sieve achieves up to an 80% reduction in blocks accessed and a 42% reduction in query times compared to its counterparts.
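For context, the following Python sketch shows the baseline min/max summary ("zone map") skipping that the abstract says existing efforts maintain; Sieve's piece-wise linear modelling is not reproduced here, and the block contents and query range are illustrative assumptions:

```python
def build_summaries(blocks):
    # one lightweight (min, max) summary per block of keys
    return [(min(b), max(b)) for b in blocks]

def blocks_to_scan(summaries, lo, hi):
    # indices of blocks whose key range may intersect the query range [lo, hi]
    return [i for i, (bmin, bmax) in enumerate(summaries)
            if not (bmax < lo or bmin > hi)]

blocks = [[1, 3, 7], [8, 9, 15], [14, 20, 22], [40, 41, 45]]
summaries = build_summaries(blocks)
print(blocks_to_scan(summaries, 10, 21))  # -> [1, 2]; blocks 0 and 3 skipped
```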
4

Sheetal, Annabathula Phani, Giddaluru Lalitha, Arepalli Peda Gopi, and Vejendla Lakshman Narayana. "Secured Data Transmission with Integrated Fault Reduction Scheduling in Cloud Computing." Ingénierie des systèmes d information 26, no. 2 (April 30, 2021): 225–30. http://dx.doi.org/10.18280/isi.260209.

Abstract:
Cloud computing offers end users a scalable and cost-effective way to access multi-platform data. While cloud storage's features make it attractive, resource loss is also likely, so a fault-tolerant mechanism is required to achieve uninterrupted cloud service performance. The two widely used fault-tolerant mechanisms are task relocation and replication, but the replication approach leads to enormous storage and computing overhead as the number of tasks gradually increases. When a large number of faults occur, it creates more storage overhead and time complexity depending on task criticalities. An Integrated Fault Reduction Scheduling (IFRS) cloud computing model is used to resolve these problems. In this model, the probability of failure of a VM is calculated from its previous failures and active executions. A fault-related adaptive recovery timer is then maintained and modified depending on the fault type. Experimental findings showed that IFRS achieved 67% lower storage cost and 24% less response time compared with the current technique for sensitive tasks.
5

Chiang, David Ming-Huang, Chia-Ping Lin, and Mu-Chen Chen. "Data mining based storage assignment heuristics for travel distance reduction." Expert Systems 31, no. 1 (December 26, 2012): 81–90. http://dx.doi.org/10.1111/exsy.12006.

6

Szekely, Geza, Th. Lindblad, L. Hildingsson, and W. Klamra. "On the reduction of data storage from high-dispersion experiments." Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment 292, no. 2 (July 1990): 431–34. http://dx.doi.org/10.1016/0168-9002(90)90398-p.

7

Yasuda, Shin, Jiro Minabe, and Katsunori Kawano. "Optical noise reduction for dc-removed coaxial holographic data storage." Optics Letters 32, no. 2 (December 23, 2006): 160. http://dx.doi.org/10.1364/ol.32.000160.

8

Firtha, Ferenc. "Development of data reduction function for hyperspectral imaging." Progress in Agricultural Engineering Sciences 3, no. 1 (December 1, 2007): 67–88. http://dx.doi.org/10.1556/progress.3.2007.4.

Abstract:
Algorithms have been developed for controlling the calibration and measurement cycles of hyperspectral equipment. Special calibration and preprocessing methods were necessary to obtain a suitable signal level and acceptable repeatability of the measurements: the noise of the NIR sensor was decreased while the signal level was enhanced and stability was ensured. To investigate object properties over sample sets large enough for statistical analysis, the enormous acquired hypercube (gigabytes per object) must be reduced in real time by vector-to-scalar mathematical operators that extract the desired features. The algorithm developed was able to calculate the operator scores during scanning, and the resulting matrices were displayed as pseudo-images showing the distribution of the properties over the surface. The operators had to be determined by analysis of a sample set in preliminary experiments. Stored carrot was chosen as a model sample for investigating the detection of moisture loss from hyperspectral properties. Determining the proper operator for different tissues could help to analyze and model the drying process and to control storage. Hyperspectral data of different carrot cultivars were tested under different storage conditions. Using the improved measurement method, the spectral parameter of the chosen operator described the moisture loss of the different carrot tissues quite well.
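The vector-to-scalar reduction the abstract describes can be sketched in a few lines of NumPy; the cube dimensions and the normalized band-difference operator below are illustrative assumptions, not the operators used in the study:

```python
import numpy as np

# Illustrative vector-to-scalar reduction of a hyperspectral cube: each
# pixel's spectrum (a vector over bands) collapses to one score, producing
# a pseudo-image. Shapes and band indices are arbitrary assumptions.
rows, cols, bands = 240, 320, 256
cube = np.random.rand(rows, cols, bands).astype(np.float32)

def band_ratio(cube, b1, b2, eps=1e-6):
    # normalized difference of two bands, one scalar per pixel
    a, b = cube[..., b1], cube[..., b2]
    return (a - b) / (a + b + eps)

pseudo_image = band_ratio(cube, 180, 90)   # shape (rows, cols)
print(cube.nbytes // pseudo_image.nbytes, "x data reduction")  # = bands
```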
9

Abd Manan, Wan Nurazieelin Wan, and Mohamad Aizi Salamat. "Concept of minimizing the response time for reducing dynamic data redundancy in cloud computing." Indonesian Journal of Electrical Engineering and Computer Science 15, no. 3 (September 1, 2019): 1597. http://dx.doi.org/10.11591/ijeecs.v15.i3.pp1597-1602.

Abstract:
Reduction of dynamic data redundancy in cloud computing is one of the best ways to keep storage capacity from being fully utilized. Cloud storage is a part of cloud computing technology that is in high demand in organizations for reducing the cost of purchasing and maintaining storage infrastructure. An increase in the number of users requires a larger storage capacity for storing their data, and reducing dynamic data redundancy allows service providers to save energy and minimize maintenance cost. Recent research focuses more on data that is static in nature, despite its limited applicability compared with the dynamic data that characterizes cloud storage. Therefore, this paper theoretically compares various techniques for reduction of redundant dynamic data in cloud computing and suggests the best technique for the task in terms of response time.
10

Kim, Jang Hyun, and Hyunseok Yang. "TuC-1-4 Noise Reduction Method Using Extended Kalman Filter for Tilt Servo Control in Holographic Data Storage System." Proceedings of JSME-IIP/ASME-ISPS Joint Conference on Micromechatronics for Information and Precision Equipment: IIP/ISPS Joint MIPE 2015 (2015): TuC-1-4-1–TuC-1-4-3. http://dx.doi.org/10.1299/jsmemipe.2015._tuc-1-4-1.

11

Thwel, Tin Thein, and G. R. Sinha. "Efficient Data Deduplication Mechanism for Genomic Data." CSVTU International Journal of Biotechnology Bioinformatics and Biomedical 4, no. 2 (September 3, 2019): 52–58. http://dx.doi.org/10.30732/ijbbb.20190402004.

Abstract:
In the data science age, many people access health information and diagnoses using information technology, including telemedicine, and many researchers attempt to work with medical experts as well as in the bioinformatics area. In bioinformatics, collecting, storing, and processing the genomic data of human beings has become essential. Genomic data refers to the genome and DNA data of an organism. Unavoidably, genomic data requires a huge amount of storage for the customized software that analyzes it, and genome researchers are raising the alarm over big data. This paper attempts a significant reduction of data storage by applying a data deduplication process to a genomic data set. Data deduplication ('dedupe' for short) can reduce the amount of storage because of its single-instance storage nature, so it becomes one solution for optimizing the huge storage space needed for genomes. We have implemented a data deduplication method and applied it to genomic data; deduplication was performed using a secure hash algorithm, a B+ tree, and a sub-file-level chunking algorithm, implemented in an integrated approach. Files are separated into chunks with the Two Thresholds, Two Divisors (TTTD) algorithm, and a hash function is used to obtain chunk identifiers, from which indexing keys are constructed in a B+ tree-like index structure. This system can reduce storage space significantly when duplicated data exists. Preliminary testing was done using NCBI datasets.
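A simplified sketch of the pipeline the abstract outlines, i.e., content-defined chunking followed by hashing the chunks into identifiers, is shown below; the window size, divisor, and size thresholds are illustrative assumptions and do not reproduce the exact TTTD parameters or the B+ tree index:

```python
import hashlib

# Simplified content-defined chunking in the spirit of TTTD: a fingerprint
# over a small sliding window decides breakpoints, with minimum and maximum
# chunk sizes acting as the "two thresholds". Parameters are assumptions.
WIN, DIVISOR, MIN_SZ, MAX_SZ = 16, 512, 1024, 8192

def chunk_boundaries(data):
    bounds, start = [], 0
    for i in range(len(data)):
        if i - start < MIN_SZ:
            continue  # enforce minimum chunk size
        window = data[i - WIN:i]
        fp = int.from_bytes(hashlib.md5(window).digest()[:4], "big")
        if fp % DIVISOR == 0 or i - start >= MAX_SZ:
            bounds.append(i)
            start = i
    bounds.append(len(data))
    return bounds

def chunk_ids(data):
    ids, start = [], 0
    for end in chunk_boundaries(data):
        ids.append(hashlib.sha1(data[start:end]).hexdigest())
        start = end
    return ids  # identifiers to be used as keys in a B+ tree-style index

print(len(chunk_ids(bytes(50000))), "chunks")
```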
12

Ragavan, N., and C. Yesubai Rubavathi. "A Novel Big Data Storage Reduction Model for Drill Down Search." Computer Systems Science and Engineering 41, no. 1 (2022): 373–87. http://dx.doi.org/10.32604/csse.2022.020452.

13

Kim, Jang Hyun, Sang-Hoon Kim, Junho Yang, Hyunseok Yang, Jin Bae Park, and Young-Pil Park. "Integration of Overall Error Reduction Algorithms for Holographic Data Storage System." Japanese Journal of Applied Physics 46, no. 6B (June 22, 2007): 3802–11. http://dx.doi.org/10.1143/jjap.46.3802.

14

Ajdari, Mohammadamin, Patrick Raaf, Mostafa Kishani, Reza Salkhordeh, Hossein Asadi, and André Brinkmann. "An Enterprise-Grade Open-Source Data Reduction Architecture for All-Flash Storage Systems." ACM SIGMETRICS Performance Evaluation Review 50, no. 1 (June 20, 2022): 59–60. http://dx.doi.org/10.1145/3547353.3530963.

Abstract:
Data reduction technologies have proven their effectiveness to decrease the ever-growing demands on storage system capacities, but also introduce new complexity in the system I/O stack that can easily invalidate well-known best practices. In this paper, we conduct an extensive set of experiments on an enterprise all-flash storage (AFS) system equipped with an open-source data reduction module, i.e., RedHat VDO, and reveal novel observations on the performance gap between the state-of-the-art and the optimal AFS stack with integrated data reduction. We then offer cross-layer optimizations to enhance the performance of AFS, which range from deriving new optimal hardware RAID configurations up to modifications of the enterprise storage stack tailored to the major bottlenecks observed. By implementing all proposed optimizations in an enterprise AFS, we show up to 12.5x speedup over the baseline AFS with integrated data reduction, and up to 57x performance/cost improvement over an optimized AFS (with no data reduction) for 100% write, high-reduction workload scenarios.
15

Babu, P. Suresh, and Madhavi Kasa. "Reduction of Spatial Overhead in Decentralized Cloud Storage using IDA." International Journal of Engineering and Advanced Technology 10, no. 3 (February 28, 2021): 176–79. http://dx.doi.org/10.35940/ijeat.c2275.0210321.

Abstract:
Decentralized cloud storage reflects a significant change in large-scale storage performance and economics. Removing central control enables users to store and exchange data without depending on third-party storage providers [16]. Decentralization reduces the possibility of data failures and outages while at the same time increasing object storage protection and privacy. In decentralized cloud storage, data is kept at several locations on a decentralized network by individuals or groups who are incentivized to join, store, and keep data available; the servers used are hosted by individuals rather than a single organization. In this paper, an information dispersal algorithm is applied to decentralized cloud storage to reduce spatial overhead, providing more efficient performance than existing methodologies.
16

Hilth, William, David Ryckelynck, and Claire Menet. "Data Pruning of Tomographic Data for the Calibration of Strain Localization Models." Mathematical and Computational Applications 24, no. 1 (January 28, 2019): 18. http://dx.doi.org/10.3390/mca24010018.

Abstract:
The development and generalization of Digital Volume Correlation (DVC) on X-ray computed tomography data highlight the issue of long-term storage. The present paper proposes a new model-free method for pruning experimental data related to DVC, while preserving the ability to identify constitutive equations (i.e., closure equations in solid mechanics) reflecting strain localizations. The size of the remaining sampled data can be user-defined, depending on the needs concerning storage space. The proposed data pruning procedure is deeply linked to hyper-reduction techniques. The DVC data of a resin-bonded sand tested in uniaxial compression is used as an illustrating example. The relevance of the pruned data was tested afterwards for model calibration. A Finite Element Model Updating (FEMU) technique coupled with a hybrid hyper-reduction method was used to successfully calibrate a constitutive model of the resin-bonded sand with the pruned data only.
17

Kieffer, J., S. Petitdemange, and T. Vincent. "Real-time diffraction computed tomography data reduction." Journal of Synchrotron Radiation 25, no. 2 (February 20, 2018): 612–17. http://dx.doi.org/10.1107/s1600577518000607.

Abstract:
Diffraction imaging is an X-ray imaging method which uses the crystallinity information (cell parameter, orientation) as a signal to create an image pixel by pixel: a pencil beam is raster-scanned onto a sample and the (powder) diffraction signal is recorded by a large area detector. With the flux provided by third-generation synchrotrons and the speed of hybrid pixel detectors, the acquisition speed of these experiments is now limited by the transfer rate to the local storage as the data reduction can hardly be performed in real time. This contribution presents the benchmarking of a typical data analysis pipeline for a diffraction imaging experiment like the ones performed at ESRF ID15a and proposes some disruptive techniques to decode CIF binary format images using the computational power of graphics cards to be able to perform data reduction in real time.
18

Ajdari, Mohammadamin, Patrick Raaf, Mostafa Kishani, Reza Salkhordeh, Hossein Asadi, and André Brinkmann. "An Enterprise-Grade Open-Source Data Reduction Architecture for All-Flash Storage Systems." Proceedings of the ACM on Measurement and Analysis of Computing Systems 6, no. 2 (May 26, 2022): 1–27. http://dx.doi.org/10.1145/3530896.

Abstract:
All-flash storage (AFS) systems have become an essential infrastructure component to support enterprise applications, where sub-millisecond latency and very high throughput are required. Nevertheless, the price per capacity of solid-state drives (SSDs) is relatively high, which has encouraged system architects to adopt data reduction techniques, mainly deduplication and compression, in enterprise storage solutions. To provide higher reliability and performance, SSDs are typically grouped using redundant array of independent disks (RAID) configurations. Data reduction on top of RAID arrays, however, adds I/O overheads and also complicates the I/O patterns redirected to the underlying backend SSDs, which invalidates the best-practice configurations used in AFS. Unfortunately, existing works on the performance of data reduction do not consider its interaction and I/O overheads with other enterprise storage components including SSD arrays and RAID controllers. In this paper, using a real setup with enterprise-grade components and based on the open-source data reduction module RedHat VDO, we reveal novel observations on the performance gap between the state-of-the-art and the optimal all-flash storage stack with integrated data reduction. We therefore explore the I/O patterns at the storage entry point and compare them with those at the disk subsystem. Our analysis shows a significant amount of I/O overheads for guaranteeing consistency and avoiding data loss through data journaling, frequent small-sized metadata updates, and duplicate content verification. We accompany these observations with cross-layer optimizations to enhance the performance of AFS, which range from deriving new optimal hardware RAID configurations up to introducing changes to the enterprise storage stack. By analyzing the characteristics of I/O types and their overheads, we propose three techniques: (a) application-aware lazy persistence, (b) a fast, read-only I/O cache for duplicate verification, and (c) disaggregation of block maps and data by offloading block maps to a very fast persistent memory device. By consolidating all proposed optimizations and implementing them in an enterprise AFS, we show 1.3× to 12.5× speedup over the baseline AFS with 90% data reduction, and from 7.8× up to 57× performance/cost improvement over an optimized AFS (with no data reduction) running applications ranging from 100% read-only to 100% write-only accesses.
19

Londhe, Soudagar, and Manasi Patil. "Dimensional Reduction Techniques for Huge Volume of Data." International Journal for Research in Applied Science and Engineering Technology 10, no. 3 (March 31, 2022): 125–35. http://dx.doi.org/10.22214/ijraset.2022.40572.

Abstract:
Huge volumes of data and information accompany the ongoing advancement of collection tools, cloud storage, analytic techniques, and science and technology generally. With the appearance of complete genome sequences, the biomedical area has experienced exceptional progress: genomics has prompted the development of new high-throughput strategies that produce huge amounts of data, implying the exponential growth of numerous biological databases. This paper presents different linear and non-linear dimensionality reduction techniques and their validity for different kinds of datasets and application areas. Keywords: high-dimensional data, dimensionality reduction, linear techniques, non-linear techniques, feature extraction, feature selection, machine learning.
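As a minimal example of the linear techniques the paper surveys, the following NumPy sketch performs PCA via the singular value decomposition; the matrix sizes and component count are illustrative assumptions:

```python
import numpy as np

# Minimal PCA via SVD: project high-dimensional samples onto the top-k
# principal components. Sizes here are illustrative only.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 100))       # 500 samples, 100 features

def pca(X, k):
    Xc = X - X.mean(axis=0)           # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:k].T                 # k-dimensional embedding
    explained = (S[:k] ** 2).sum() / (S ** 2).sum()
    return Z, explained

Z, var = pca(X, 10)
print(Z.shape, f"{var:.1%} variance retained")
```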
20

Levine, Harold G. "Principles of Data Storage and Retrieval for Use in Qualitative Evaluations." Educational Evaluation and Policy Analysis 7, no. 2 (June 1985): 169–86. http://dx.doi.org/10.3102/01623737007002169.

Abstract:
In spite of increased interest in recent years in the mechanics of qualitative evaluation, few evaluators or other qualitative researchers have addressed the related issues of qualitative data storage-retrieval and data reduction-analysis. Generalized procedures for these tasks, which are generally known within the field of anthropology, have never been systematically examined or codified. This paper borrows from developments in information and library science to construct general principles of data storage and retrieval for the field-based investigator. The five principles that are examined are formatting, cross-referral, indexing (including thesauri design and cross-referencing), abstracting, and pagination. The principles are illustrated with examples from evaluations of school-based programs in the United States. Although the emphasis here is on techniques for manual data manipulation, the paper also explores the advantages of, and considerations for, computerized data storage and search. Finally, the paper also includes a discussion of the more obvious implications of decisions about data storage and retrieval for data reduction and analysis.
21

Bae, Kyongtae T., and Bruce R. Whiting. "CT Data Storage Reduction by Means of Compressing Projection Data Instead of Images: Feasibility Study." Radiology 219, no. 3 (June 2001): 850–55. http://dx.doi.org/10.1148/radiology.219.3.r01jn49850.

22

Lee, Jae-Seong, Sung-Yong Lim, Nak-Yeong Kim, Do-Hyung Kim, Kyoung-Su Park, No-Cheol Park, Hyun-Seok Yang, and Young-Pil Park. "Inter Pixel Interference Reduction using Interference Ratio Mask for Holographic Data Storage." Transactions of the Society of Information Storage Systems 7, no. 1 (March 25, 2011): 42–46. http://dx.doi.org/10.9797/tsiss.2011.7.1.042.

23

Burr, Geoffrey W., Hans Coufal, Robert K. Grygier, John A. Hoffnagle, and C. Michael Jefferson. "Noise reduction of page-oriented data storage by inverse filtering during recording." Optics Letters 23, no. 4 (February 15, 1998): 289. http://dx.doi.org/10.1364/ol.23.000289.

24

Jung, J. H. "Data reduction method of sine look-up tables in microprocessor's memory storage." Electronics Letters 46, no. 25 (2010): 1656. http://dx.doi.org/10.1049/el.2010.2546.

25

Choi, I. S., J. H. Lee, and H. T. Kim. "Efficient Reduction of Data Storage for Correlative Target Recognition Using Impulse Radar." Journal of Electromagnetic Waves and Applications 15, no. 6 (January 2001): 745–53. http://dx.doi.org/10.1163/156939301x00986.

26

Yen, Miao-Chiang, Shih-Yi Chang, and Li-Pin Chang. "Lightweight, Integrated Data Deduplication for Write Stress Reduction of Mobile Flash Storage." IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 37, no. 11 (November 2018): 2590–600. http://dx.doi.org/10.1109/tcad.2018.2857322.

27

Kang, Bong Joo, Sung Hun Kim, Yeong Yi An, and Byung Gil Choi. "Full-field digital mammography image data storage reduction using a crop tool." International Journal of Computer Assisted Radiology and Surgery 10, no. 5 (June 10, 2014): 509–15. http://dx.doi.org/10.1007/s11548-014-1087-8.

28

Vithya Vijayalakshmi, A., N. Veeraragavan, and L. Arockiam. "A Unified Model for Cloud Data Confidentiality." Asian Journal of Science and Applied Technology 7, no. 1 (May 5, 2018): 23–27. http://dx.doi.org/10.51983/ajsat-2018.7.1.2786.

Abstract:
Cloud computing is a developing technology that receives attention from both industry and academia. Cloud storage is one of the main benefits of cloud computing and is particularly attractive for users who need unpredictable amounts of storage for their enterprises. Minimal storage and processing cost is an obligatory requirement of all organizations and industries, while analysis of data and information is mandatory in all of them. Although cloud storage costs have fallen, customers face technical and security problems such as data integrity, confidentiality, and availability; without confidentiality, there is no guarantee for the data in the cloud. Researchers have proposed a number of techniques for data security in the cloud, yet many issues in cloud data storage remain. The common solution for ensuring confidentiality in cloud storage is to encrypt the data, but encryption alone fails to give high security to data in cloud storage. To give maximum protection to cloud storage, this paper proposes a unified model combining encryption and obfuscation: encryption is the process of converting the original text, whereas obfuscation is a process of encrypting numerical data. The researchers show that by combining these two data protection techniques, data on cloud storage is better protected.
29

Chen, Yingying, Bo Liu, Hongbo Liu, and Yudong Yao. "VLC-based Data Transfer and Energy Harvesting Mobile System." Journal of Ubiquitous Systems and Pervasive Networks 15, no. 01 (March 1, 2021): 01–09. http://dx.doi.org/10.5383/juspn.15.01.001.

Abstract:
This paper explores a low-cost portable visible light communication (VLC) system to support the increasing needs of lightweight mobile applications. VLC has grown rapidly in the past decade across many applications (e.g., indoor data transmission, human sensing, and visual MIMO) due to its RF interference immunity and inherently high security. However, most existing VLC systems rely heavily on fixed infrastructure, with limited adaptability to emerging lightweight mobile applications. This work proposes Light Storage, a portable VLC system that takes advantage of commercial smartphone flashlights as the transmitter and a solar panel equipped with both data reception and energy harvesting modules as the receiver. Light Storage achieves concurrent data transmission and energy harvesting from the visible light signals. It develops multi-level light-intensity data modulation to increase data throughput and integrates noise reduction functionality to allow portability under various lighting conditions. The system supports synchronization together with adaptive error correction to overcome both the linear and non-linear signal offsets caused by the limited timing control of commercial smartphones. Finally, the energy harvesting capability in Light Storage provides sufficient energy support for efficient short-range communication. Light Storage is validated in both indoor and outdoor environments, achieves over 98% data decoding accuracy, and demonstrates its potential as an important alternative for low-cost, portable, short-range communication.
30

Xie, H., L. Longuevergne, C. Ringler, and B. R. Scanlon. "Calibration and evaluation of a semi-distributed watershed model of Sub-Saharan Africa using GRACE data." Hydrology and Earth System Sciences 16, no. 9 (September 3, 2012): 3083–99. http://dx.doi.org/10.5194/hess-16-3083-2012.

Abstract:
Abstract. Irrigation development is rapidly expanding in mostly rainfed Sub-Saharan Africa. This expansion underscores the need for a more comprehensive understanding of water resources beyond surface water. Gravity Recovery and Climate Experiment (GRACE) satellites provide valuable information on spatio-temporal variability in water storage. The objective of this study was to calibrate and evaluate a semi-distributed regional-scale hydrologic model based on the Soil and Water Assessment Tool (SWAT) code for basins in Sub-Saharan Africa using seven-year (July 2002–April 2009) 10-day GRACE data and multi-site river discharge data. The analysis was conducted in a multi-criteria framework. In spite of the uncertainty arising from the tradeoff in optimising model parameters with respect to two non-commensurable criteria defined for two fluxes, SWAT was found to perform well in simulating total water storage variability in most areas of Sub-Saharan Africa, which have semi-arid and sub-humid climates, and that among various water storages represented in SWAT, water storage variations in soil, vadose zone and groundwater are dominant. The study also showed that the simulated total water storage variations tend to have less agreement with GRACE data in arid and equatorial humid regions, and model-based partitioning of total water storage variations into different water storage compartments may be highly uncertain. Thus, future work will be needed for model enhancement in these areas with inferior model fit and for uncertainty reduction in component-wise estimation of water storage variations.
31

Zhang, Shao Min, Hai Pu Dong, and Bao Yi Wang. "Research and Implementation of Optimizing CRS Code for Data Recovery in Cloud Storage System." Applied Mechanics and Materials 644-650 (September 2014): 1915–18. http://dx.doi.org/10.4028/www.scientific.net/amm.644-650.1915.

Abstract:
With the development of computer technology, massive information has brought huge challenges to storage system reliability. A heuristic greedy (HG) algorithm is proposed to optimize the calculation path and reduce XOR operations and computational complexity for data recovery; it applies Cauchy Reed-Solomon (CRS) codes to the cloud storage system HDFS and turns the multiplication operations of CRS coding into binary matrix multiplications. The performance analysis shows that it effectively improves the fault tolerance, storage efficiency, and timeliness of the cloud file system while reducing additional storage overhead.
32

He, Bin, and Yonggang Li. "Big Data Reduction and Optimization in Sensor Monitoring Network." Journal of Applied Mathematics 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/294591.

Abstract:
Wireless sensor networks (WSNs) are increasingly being utilized to monitor the structural health of underground subway tunnels, showing many promising advantages over traditional monitoring schemes. Meanwhile, as the network size increases, such systems become incapable of handling big data while ensuring efficient data communication, transmission, and storage. Data compression, considered a feasible solution to these issues, can reduce the volume of data travelling between sensor nodes. In this paper, an optimization algorithm based on spatial and temporal data compression is proposed to cope with these issues in WSNs in the underground tunnel environment. Spatial and temporal correlation functions are introduced for data compression and data recovery. It is verified that the proposed algorithm is applicable to WSNs in the underground tunnel.
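One simple way to exploit the temporal correlation the abstract mentions is deadband (threshold) reporting, sketched below in Python; the threshold value is an assumption, and the paper's actual correlation functions are not reproduced:

```python
# A node transmits a reading only when it differs from the last transmitted
# value by more than a threshold (a "deadband"); the sink reconstructs the
# series by holding the last received value. The 0.5-unit threshold is an
# illustrative assumption.
THRESHOLD = 0.5

def compress(readings):
    sent, last = [], None
    for t, v in enumerate(readings):
        if last is None or abs(v - last) > THRESHOLD:
            sent.append((t, v))
            last = v
    return sent

def reconstruct(sent, n):
    lookup, out, last = dict(sent), [], None
    for t in range(n):
        last = lookup.get(t, last)  # hold last received value
        out.append(last)
    return out

readings = [20.0, 20.1, 20.2, 21.0, 21.1, 25.0, 25.2]
sent = compress(readings)
print(f"transmitted {len(sent)} of {len(readings)} samples")  # 3 of 7
print(reconstruct(sent, len(readings)))
```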
33

Xie, H., L. Longuevergne, C. Ringler, and B. Scanlon. "Calibration and evaluation of a semi-distributed watershed model of sub-Saharan Africa using GRACE data." Hydrology and Earth System Sciences Discussions 9, no. 2 (February 17, 2012): 2071–120. http://dx.doi.org/10.5194/hessd-9-2071-2012.

Abstract:
Abstract. Irrigation development is rapidly expanding in mostly rainfed Sub-Saharan Africa. This expansion underscores the need for a more comprehensive understanding of water resources beyond surface water. Gravity Recovery and Climate Experiment (GRACE) satellites provide valuable information on spatio-temporal variability of water storage. The objective of this study was to calibrate and evaluate a semi-distributed regional-scale hydrological model, or a large-scale application of the Soil and Water Assessment Tool (SWAT) model, for basins in Sub-Saharan Africa using seven-year (2002–2009) 10-day GRACE data. Multi-site river discharge data were used as well, and the analysis was conducted in a multi-criteria framework. In spite of the uncertainty arising from the tradeoff in optimizing model parameters with respect to two non-commensurable criteria defined for two fluxes, it is concluded that SWAT can perform well in simulating total water storage variability in most areas of Sub-Saharan Africa, which have semi-arid and sub-humid climates, and that among various water storages represented in SWAT, the water storage variations from soil, the vadose zone, and groundwater are dominant. On the other hand, the study also showed that the simulated total water storage variations tend to have less agreement with the GRACE data in arid and equatorial humid regions, and the model-based partition of total water storage variations into different water storage compartments could be highly uncertain. Thus, future work will be needed for model enhancement in these areas with inferior model fit and for uncertainty reduction in component-wise estimation of water storage variations.
34

Osmanlıoğlu, Yusuf, Y. Sinan Hanay, and Oğuz Ergin. "Modifying the Data-Holding Components of the Microprocessors for Energy Efficiency." Journal of Circuits, Systems and Computers 18, no. 06 (October 2009): 1093–117. http://dx.doi.org/10.1142/s0218126609005599.

Abstract:
Storage components are a major source of energy dissipation in modern superscalar microprocessors. As instruction windows get larger with each generation of processors, register files and other structures that hold intermediate data become larger and dissipate more power. It is therefore important to find new ways to reduce the energy dissipation of data-holding components. Many values written to and read from these storage components are known to require fewer bits than the storage space provides: the upper-order bits of these values are unnecessary, and energy savings can be achieved by not writing and reading them. Floating-point data has different characteristics and offers special energy-saving opportunities. In this paper we analyze the bit-level statistics of data values and propose simple modifications to data-holding components that save significant energy inside the processor. Our results show that simple modifications to the storage components can achieve a 40% reduction in integer register file energy dissipation and up to a 60% reduction in the immediate field of the issue queue. We also show that energy dissipation can be halved for some floating-point benchmarks by identifying values that indicate a zero operand.
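The observation that upper-order bits are often redundant can be made concrete with a short sketch that counts the bits a two's-complement value actually needs; the 64-bit register width and the sample values are assumptions:

```python
def needed_bits(v):
    """Bits a two's-complement integer actually needs, including the sign
    bit; every higher-order bit is just sign extension."""
    return (v.bit_length() if v >= 0 else (~v).bit_length()) + 1

WIDTH = 64  # assumed register width
for v in [0, 3, -1, 255, -4096, 70000]:
    n = needed_bits(v)
    print(f"{v:>7}: {n:2d} bits needed, {WIDTH - n} upper bits redundant")
```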
35

Limouzade, Esmail. "Algorithm Acceleration and Data Storage Volume Reduction in Reliability Modeling Within Distribution Network." Research Journal of Applied Sciences, Engineering and Technology 5, no. 16 (April 30, 2013): 4149–54. http://dx.doi.org/10.19026/rjaset.5.4641.

36

Luo, Huizhang, Qing Liu, Zhenbo Qiao, Jinzhen Wang, Mengxiao Wang, and Hong Jiang. "DuoModel: Leveraging Reduced Model for Data Reduction and Re-Computation on HPC Storage." IEEE Letters of the Computer Society 1, no. 1 (January 1, 2018): 5–8. http://dx.doi.org/10.1109/lcos.2018.2855118.

37

Nobukawa, Teruyoshi, Daisuke Barada, Takanori Nomura, and Takashi Fukuda. "Orthogonal polarization encoding for reduction of interpixel cross talk in holographic data storage." Optics Express 25, no. 19 (September 7, 2017): 22425. http://dx.doi.org/10.1364/oe.25.022425.

38

van der Kooij, Jeroen, David Righton, Espen Strand, Kathrine Michalsen, Vilhjalmur Thorsteinsson, Henrik Svedäng, Francis C. Neat, and Stefan Neuenfeldt. "Life under pressure: insights from electronic data-storage tags into cod swimbladder function." ICES Journal of Marine Science 64, no. 7 (August 31, 2007): 1293–301. http://dx.doi.org/10.1093/icesjms/fsm119.

Abstract:
The behavioural response of cod (Gadus morhua) to sudden pressure reductions was investigated in a large electronic-tagging experiment using data collected from 141 cod tagged in five different areas of the Northeast Atlantic. More than 40% of cod exhibited a characteristic equilibration behaviour after a rapid pressure reduction caused either by capture before tagging, or by pressure reduction during a rapid ascent from the seabed, or when migrating to deeper water. The equilibration allowed the cod to regain demersal residence. The rate of descent averaged 10 m d−1 (ranging from 2 to 23 m d−1) over periods of less than a day to 1 month. Descent rates for cod on the Icelandic shelf were inversely related to fish length, i.e. smaller fish descended more rapidly, findings consistent with results achieved in the past under laboratory conditions. Modelling of swimbladder volume during equilibration suggested that cod were negatively buoyant for most of the time. The results imply that swimbladder functionality is retained after the probable barotrauma that would follow a large and rapid ascent, and that rates of gas exchange into the swimbladder may be naturally variable. These findings have implications for assumptions on discard mortality, the interpretation of cod behaviour, and its impact on biomass estimates obtained from acoustic surveys.
39

Wang, Shi Gang, Yong Yan, and Feng Juan Wang. "Measured Scan Data Reduction Combination Algorithm Based on Scan Line." Applied Mechanics and Materials 651-653 (September 2014): 2335–38. http://dx.doi.org/10.4028/www.scientific.net/amm.651-653.2335.

Abstract:
3D laser scanning has been a hot spot in measurement technology in recent years. In surface reconstruction for reverse engineering, 3D laser-scanned point cloud data is very large, which hinders computation, storage, and surface reconstruction. After reviewing the state of research on point cloud simplification methods at home and abroad, and through analysis of the minimum distance algorithm and the angle-chord height combined code method and their applicable engineering characteristics, a combination algorithm based on these two methods is proposed to simplify the point cloud data. The scanned point cloud is simplified line by line using MATLAB.
40

Ehatäht, Karl. "NANOAOD: a new compact event data format in CMS." EPJ Web of Conferences 245 (2020): 06002. http://dx.doi.org/10.1051/epjconf/202024506002.

Abstract:
The CMS Collaboration has recently commissioned a new compact data format, named NANOAOD, reducing the per-event storage space requirement to about 1-2 kB. This represents a factor 20 reduction in storage space compared to the MINIAOD data format used previously for physics analysis at CMS. We envisage that the information stored in the NANOAOD data format is sufficient to support the majority of CMS physics analyses. NANOAOD also facilitates the dissemination of analysis methods and the automation of standard workflows for deriving conditions and object calibrations. The latest developments of this project will be presented.
41

Baker, Allison H., Dorit M. Hammerling, Sheri A. Mickelson, Haiying Xu, Martin B. Stolpe, Philippe Naveau, Ben Sanderson, et al. "Evaluating lossy data compression on climate simulation data within a large ensemble." Geoscientific Model Development 9, no. 12 (December 7, 2016): 4381–403. http://dx.doi.org/10.5194/gmd-9-4381-2016.

Abstract:
Abstract. High-resolution Earth system model simulations generate enormous data volumes, and retaining the data from these simulations often strains institutional storage resources. Further, these exceedingly large storage requirements negatively impact science objectives, for example, by forcing reductions in data output frequency, simulation length, or ensemble size. To lessen data volumes from the Community Earth System Model (CESM), we advocate the use of lossy data compression techniques. While lossy data compression does not exactly preserve the original data (as lossless compression does), lossy techniques have an advantage in terms of smaller storage requirements. To preserve the integrity of the scientific simulation data, the effects of lossy data compression on the original data should, at a minimum, not be statistically distinguishable from the natural variability of the climate system, and previous preliminary work with data from CESM has shown this goal to be attainable. However, to ultimately convince climate scientists that it is acceptable to use lossy data compression, we provide climate scientists with access to publicly available climate data that have undergone lossy data compression. In particular, we report on the results of a lossy data compression experiment with output from the CESM Large Ensemble (CESM-LE) Community Project, in which we challenge climate scientists to examine features of the data relevant to their interests, and attempt to identify which of the ensemble members have been compressed and reconstructed. We find that while detecting distinguishing features is certainly possible, the compression effects noticeable in these features are often unimportant or disappear in post-processing analyses. In addition, we perform several analyses that directly compare the original data to the reconstructed data to investigate the preservation, or lack thereof, of specific features critical to climate science. Overall, we conclude that applying lossy data compression to climate simulation data is both advantageous in terms of data reduction and generally acceptable in terms of effects on scientific results.
42

Sardaraz, Muhammad, and Muhammad Tahir. "FCompress: An Algorithm for FASTQ Sequence Data Compression." Current Bioinformatics 14, no. 2 (January 7, 2019): 123–29. http://dx.doi.org/10.2174/1574893613666180322125337.

Abstract:
Background: Biological sequence data have increased at a rapid rate due to advancements in sequencing technologies and the falling cost of sequencing. This huge increase presents significant challenges to researchers: in addition to meaningful analysis, data storage is a challenge, as the growth in data production is outpacing storage capacity. Data compression is used to reduce the size of data, and thus reduces storage requirements as well as transmission cost over the internet. Objective: This article presents a novel compression algorithm (FCompress) for Next Generation Sequencing (NGS) data in FASTQ format. Method: The proposed algorithm uses bit manipulation and dictionary-based compression for bases. Headers are compressed with reference-based compression, whereas quality scores are compressed with Huffman coding. Results: The proposed algorithm is validated with experimental results on real datasets and compared with both general-purpose and specialized compression programs. Conclusion: The proposed algorithm produces a better compression ratio in a time comparable to other algorithms.
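The quality-score stage mentioned in the Method section can be illustrated with a minimal Huffman coder in Python; this is a generic sketch, not FCompress itself, and the sample quality string is an assumption:

```python
import heapq
from collections import Counter

# Minimal Huffman coding of a FASTQ quality string, the kind of entropy
# coding the abstract applies to quality scores.
def huffman_codes(text):
    # heap entries: [weight, unique tiebreak index, payload]
    heap = [[w, i, sym] for i, (sym, w) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        heapq.heappush(heap, [lo[0] + hi[0], i, (lo, hi)])  # merge subtrees
        i += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node[2], str):          # leaf: a quality symbol
            codes[node[2]] = prefix or "0"
        else:                                 # internal: recurse both ways
            walk(node[2][0], prefix + "0")
            walk(node[2][1], prefix + "1")
    walk(heap[0], "")
    return codes

quals = "IIIIIIIIIIIIIIIHHHHHHHHHGGGGGFFF###"   # assumed sample scores
codes = huffman_codes(quals)
bits = sum(len(codes[c]) for c in quals)
print(f"{bits} bits encoded vs {8 * len(quals)} bits raw")
```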
43

Park, Junho, Yoojin Chung, and Jongmoo Choi. "CoDR: Correlation-Based Data Reduction Scheme for Efficient Gathering of Heterogeneous Driving Data." Sensors 20, no. 6 (March 17, 2020): 1677. http://dx.doi.org/10.3390/s20061677.

Abstract:
A variety of deep learning techniques are actively employed for advanced driver assistance systems, which in turn require gathering lots of heterogeneous driving data, such as traffic conditions, driver behavior, vehicle status and location information. However, these different types of driving data easily become more than tens of GB per day, forming a significant hurdle due to the storage and network cost. To address this problem, this paper proposes a novel scheme, called CoDR, which can reduce data volume by considering the correlations among heterogeneous driving data. Among heterogeneous datasets, CoDR first chooses one set as a pivot data. Then, according to the objective of data collection, it identifies data ranges relevant to the objective from the pivot dataset. Finally, it investigates correlations among sets, and reduces data volume by eliminating irrelevant data from not only the pivot set but also other remaining datasets. CoDR gathers four heterogeneous driving datasets: two videos for front view and driver behavior, OBD-II and GPS data. We show that CoDR decreases data volume by up to 91%. We also present diverse analytical results that reveal the correlations among the four datasets, which can be exploited usefully for edge computing to reduce data volume on the spot.
44

Passricha, Vishal, Ashish Chopra, and Shubhanshi Singhal. "Secure Deduplication Scheme for Cloud Encrypted Data." International Journal of Advanced Pervasive and Ubiquitous Computing 11, no. 2 (April 2019): 27–40. http://dx.doi.org/10.4018/ijapuc.2019040103.

Abstract:
Cloud storage (CS) is gaining much popularity nowadays because it offers low-cost and convenient network storage services. In this big data era, the explosive growth in digital data moves users towards CS, but this places a lot of storage pressure on CS systems because a large volume of this data is redundant. Data deduplication is an effective data reduction technique. The dynamic nature of data makes security and ownership of data a very important issue. Proof-of-ownership schemes are a robust way to check the ownership claimed by any owner. However, this method affects the deduplication process because encryption methods have varying characteristics. A convergent encryption (CE) scheme is widely used for secure data deduplication. The problem with CE-based schemes is that the user can still decrypt the cloud data after having lost ownership of it. This article addresses the problem of ownership revocation by proposing a secure deduplication scheme for encrypted data. The proposed scheme enhances security against unauthorized encryption and poison attacks on the predicted set of data.
45

Nakamura, Yusuke, Taku Hoshizawa, and Yuzuru Takashima. "Coherent scattering noise reduction method with wavelength diversity detection for holographic data storage system." Japanese Journal of Applied Physics 56, no. 9S (August 23, 2017): 09NA08. http://dx.doi.org/10.7567/jjap.56.09na08.

46

Tayebi, Noureddine, Yuegang Zhang, Robert J. Chen, Quan Tran, Rong Chen, Yoshio Nishi, Qing Ma, and Valluri Rao. "An Ultraclean Tip-Wear Reduction Scheme for Ultrahigh Density Scanning Probe-Based Data Storage." ACS Nano 4, no. 10 (October 7, 2010): 5713–20. http://dx.doi.org/10.1021/nn1013512.

47

Yoshida, Shuhei, Yosuke Takahata, Shuma Horiuchi, and Manabu Yamamoto. "Spatial run-length limited code for reduction of hologram size in holographic data storage." Optics Communications 358 (January 2016): 103–7. http://dx.doi.org/10.1016/j.optcom.2015.08.088.

48

Sravani, Meesala, and Meesala Krishna Murthy. "Reduction of Data Leakage in Distributed Cloud Storage Systems Using Distributed Cloud Guard (DCG)." Indian Journal of Computer Science and Engineering 14, no. 3 (June 20, 2023): 444–50. http://dx.doi.org/10.21817/indjcse/2023/v14i3/231403014.

49

Singh, Gurmeet, Karan Vahi, Arun Ramakrishnan, Gaurang Mehta, Ewa Deelman, Henan Zhao, Rizos Sakellariou, et al. "Optimizing Workflow Data Footprint." Scientific Programming 15, no. 4 (2007): 249–68. http://dx.doi.org/10.1155/2007/701609.

Abstract:
In this paper we examine the issue of optimizing disk usage and scheduling large-scale scientific workflows onto distributed resources, where the workflows are data-intensive, requiring large amounts of data storage, and the resources have limited storage capacity. Our approach is two-fold: we minimize the amount of space a workflow requires during execution by removing data files at runtime when they are no longer needed, and we demonstrate that workflows may have to be restructured to reduce their overall data footprint. We show the results of our data management and workflow restructuring solutions using a Laser Interferometer Gravitational-Wave Observatory (LIGO) application and an astronomy application, Montage, running on a large-scale production grid, the Open Science Grid. We show that although reducing the data footprint of Montage by 48% can be achieved with dynamic data cleanup techniques, LIGO Scientific Collaboration workflows require additional restructuring to achieve a 56% reduction in data space usage. We also examine the cost of the workflow restructuring in terms of the application's runtime.
50

Jiang, Hou, Ning Lu, and Xuecheng Wang. "Assessing Carbon Reduction Potential of Rooftop PV in China through Remote Sensing Data-Driven Simulations." Sustainability 15, no. 4 (February 13, 2023): 3380. http://dx.doi.org/10.3390/su15043380.

Abstract:
Developing rooftop photovoltaic (PV) has become an important initiative for achieving carbon neutrality in China, but the carbon reduction potential assessment has not properly considered the spatial and temporal variability of PV generation and the curtailment in electricity dispatch. In this study, we propose a technical framework to fill the gap in assessing carbon reduction potential through remote sensing data-driven simulations. The spatio-temporal variations in rooftop PV generations were simulated on an hourly basis, and a dispatch analysis was then performed in combination with hourly load profiles to quantify the PV curtailment in different scenarios. Our results showed that the total rooftop PV potential in China reached 6.5 PWh yr−1, mainly concentrated in the eastern region where PV generation showed high variability. The carbon reduction from 100% flexible grids with 12 h of storage capacity is close to the theoretical maximum, while without storage, the potential may be halved. To maximize the carbon reduction potential, rooftop PV development should consider grid characteristics and regional differences. This study has important implications for the development of rooftop PV and the design of carbon-neutral pathways based on it.