Journal articles on the topic 'Partial Data processing'


Consult the top 50 journal articles for your research on the topic 'Partial Data processing.'


1. Wu, Xiaoying, Stefanos Souldatos, Dimitri Theodoratos, Theodore Dalamagas, Yannis Vassiliou, and Timos Sellis. "Processing and Evaluating Partial Tree Pattern Queries on XML Data." IEEE Transactions on Knowledge and Data Engineering 24, no. 12 (December 2012): 2244–59. http://dx.doi.org/10.1109/tkde.2011.137.

2. Zhang, Xiaoming, and Paul L. Rosin. "Superellipse fitting to partial data." Pattern Recognition 36, no. 3 (March 2003): 743–52. http://dx.doi.org/10.1016/s0031-3203(02)00088-2.

3. Gregor, Jiří, and František Pastuszek. "Novel Approach to Well Tests Data Processing." Environment and Natural Resources Research 12, no. 1 (May 27, 2022): 80. http://dx.doi.org/10.5539/enrr.v12n1p80.

Abstract: A new bounded well function is proposed for processing well-test data. A solution of the underlying partial differential equation with physically meaningful initial and boundary conditions is obtained via its Laplace transform, together with a proof of its uniqueness. A model for the distance dependence of drawdown is suggested. The results reveal the link between the unsteady and steady states of pumping. Related computational problems are discussed, and examples of processing actual data with these results are presented; they illustrate the high accuracy of the results and a considerable increase in the information obtainable from a well test.

4. Patapoutian, A. "Data-dependent synchronization in partial-response systems." IEEE Transactions on Signal Processing 54, no. 4 (April 2006): 1494–503. http://dx.doi.org/10.1109/tsp.2006.870587.

5. Kelly, S. I., C. Du, G. Rilling, and M. E. Davies. "Advanced image formation and processing of partial synthetic aperture radar data." IET Signal Processing 6, no. 5 (2012): 511. http://dx.doi.org/10.1049/iet-spr.2011.0073.

6. Veidenbergs, I., D. Blumberga, C. Rochas, F. Romagnoli, A. Blumberga, and M. Rošā. "Small-Scale Cogeneration Plant Data Processing and Analysis." Latvian Journal of Physics and Technical Sciences 45, no. 3 (September 1, 2008): 25–33. http://dx.doi.org/10.2478/v10047-008-0009-3.

Abstract: In this article, operational data on electricity and heat generation in a small-scale cogeneration plant are analysed. Measurements taken at the plant form the basis for estimating and evaluating the savings of primary energy in comparison with distributed energy production. The authors analyse the efficiency values for heat and electricity production in the cogeneration regime and the primary-energy savings when the plant operates under partial load.

7. White, Thomas A., Anton Barty, Francesco Stellato, James M. Holton, Richard A. Kirian, Nadia A. Zatsepin, and Henry N. Chapman. "Crystallographic data processing for free-electron laser sources." Acta Crystallographica Section D Biological Crystallography 69, no. 7 (June 15, 2013): 1231–40. http://dx.doi.org/10.1107/s0907444913013620.

Abstract: A processing pipeline for diffraction data acquired using the `serial crystallography' methodology with a free-electron laser source is described with reference to the crystallographic analysis suite CrystFEL and the pre-processing program Cheetah. A detailed analysis of the nature and impact of indexing ambiguities is presented. Simulations of the Monte Carlo integration scheme, which accounts for the partially recorded nature of the diffraction intensities, are presented and show that the integration of partial reflections could be made to converge more quickly if the bandwidth of the X-rays were to be increased by a small amount or if a slight convergence angle were introduced into the incident beam.

8. Lin, Yan, Shu Wen Guo, and Jing Hua Lin. "Improved Method of Data Processing for Flocculating Sedimentation Experiment." Advanced Materials Research 599 (November 2012): 340–43. http://dx.doi.org/10.4028/www.scientific.net/amr.599.340.

Abstract: The shortcomings of the current method of processing flocculating-sedimentation data are identified and analysed in this study, a new processing method is put forward, and its advantages are discussed. Compared with the conventional method, the new method does not require drawing isolines of partial removal efficiency; it is straightforward, practical and accurate, and deserves wider adoption.

9. Wei, Wei, Chongshi Gu, and Xiao Fu. "Processing Method of Missing Data in Dam Safety Monitoring." Mathematical Problems in Engineering 2021 (July 2, 2021): 1–12. http://dx.doi.org/10.1155/2021/9950874.

Abstract: The large amount of data obtained by dam safety monitoring provides the basis for evaluating the dam's operational state. Owing to interference caused by equipment failure and human error, loss of measurement data is common and even inevitable. Most traditional data processing methods for dam monitoring ignore the actual correlation between different measurement points, which hinders the objective diagnosis of dam safety and can even lead to misdiagnosis, so further study of how to process missing dam-monitoring data is needed. In this study, a data processing method based on partial distance, combining fuzzy C-means with long short-term memory (PDS-FCM-LSTM), is proposed to deal with missing dam-monitoring data. Fuzzy clustering is performed on measurement points of the same category deployed on the dam: the membership degree of each measurement point to each cluster center is described using the fuzzy C-means clustering algorithm based on partial distance (PDS-FCM), which determines the clustering results and preprocesses the missing data of the corresponding points. A bidirectional long short-term memory (LSTM) network is then applied to learn the pattern of changes of measurement values under identical clustering conditions, thereby processing the missing monitoring data effectively.

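The partial-distance idea underlying PDS-FCM, computing a distance over only the commonly observed components and rescaling by the fraction observed, can be sketched as follows. This is a generic illustration, not code from the paper; the NaN encoding of missing values and the function name are assumptions:

```python
import numpy as np

def partial_distance_sq(x, y):
    """Partial squared Euclidean distance between two vectors that may
    contain NaN for missing components: the sum of squared differences
    over commonly observed components, rescaled by n / (number observed)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    observed = ~np.isnan(x) & ~np.isnan(y)   # components present in both
    if not observed.any():
        raise ValueError("no commonly observed components")
    scale = x.size / observed.sum()
    return scale * float(np.sum((x[observed] - y[observed]) ** 2))
```

A fuzzy C-means variant would substitute this distance for the usual Euclidean one when updating membership degrees for points with missing measurements.
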
10. Van Gennip, Yves, and Carola-Bibiane Schönlieb. "Introduction: Big data and partial differential equations." European Journal of Applied Mathematics 28, no. 6 (November 7, 2017): 877–85. http://dx.doi.org/10.1017/s0956792517000304.

Abstract: Partial differential equations (PDEs) are expressions involving an unknown function in many independent variables and their partial derivatives up to a certain order. Since PDEs express continuous change, they have long been used to formulate a myriad of dynamical physical and biological phenomena: heat flow, optics, electrostatics and -dynamics, elasticity, fluid flow and many more. Many of these PDEs can be derived in a variational way, i.e. via minimization of an ‘energy’ functional. In this globalised and technologically advanced age, PDEs are also extensively used for modelling social situations (e.g. models for opinion formation, mathematical finance, crowd motion) and tasks in engineering (such as models for semiconductors, networks, and signal and image processing tasks). In particular, in recent years, there has been increasing interest from applied analysts in applying the models and techniques from variational methods and PDEs to tackle problems in data science. This issue of the European Journal of Applied Mathematics highlights some recent developments in this young and growing area. It gives a taste of endeavours in this realm in two exemplary contributions on PDEs on graphs [1, 2] and one on probabilistic domain decomposition for numerically solving large-scale PDEs [3].

11. Oyet, Alwell J., and Chukwudi Ogbonna. "Studentized Partial Score Tests for Variances in Longitudinal Data." International Journal of Wavelets, Multiresolution and Information Processing 12, no. 01 (December 2013): 1450002. http://dx.doi.org/10.1142/s0219691314500027.

Abstract: Koenker [9] studied a studentized version of Neyman's score statistic and obtained theoretical results indicating that the studentized version outperforms the score test under a linear model when the data come from a heavy-tailed t-distribution. That work, however, did not examine the size and power of the studentized test through a simulation study. Subsequently, Cai, Hurvich and Tsai [7], after a simulation study in a nonparametric setting, found that even when the data come from a normal population the score test is biased in estimating a pre-assigned level of significance, and recommended that the studentized score test be used in all situations. Several authors had earlier shown, however, that when the data come from a normal population, Neyman's partial score test is asymptotically unbiased in estimating a pre-assigned level of significance. In this paper we therefore derive the partial score statistic and its studentized version under various models, but conduct our simulation studies under the special case considered by Cai et al. [7] in order to examine the studentized test. We find that when the model of interest is nonparametric with uncorrelated errors, the power of the score test is generally higher than that of the studentized test, and the difference in power becomes more pronounced under the heavy-tailed t-distribution. In the normal case, both the partial score test and its studentized version control the size of the test well. We also find that if the score statistic is constructed from the underlying distribution of the data, it always outperforms the studentized test in both power and size.

12. Lekhov, M. V. "Data Processing Techniques for Pumping Tests with Multiple Partially Penetrating Wells." Engineering Geology 13, no. 4-5 (December 21, 2018): 98–107. http://dx.doi.org/10.25296/1993-5056-2018-13-4-5-98-107.

Abstract: In addition to correct planning, pumping tests with partially penetrating wells require methods of processing and parameter calculation that are understandable not only to hydrogeologists but also to geologists of related specialties. The data processing procedure should be reduced to simple plotting and formulas, which requires an understanding of the validity and limitations of the original expressions. Data analysis should begin with a meaningful diagnosis of pumping modes on the experimental plots. The analytical solutions developed in recent decades apply to homogeneous aquifers and involve complex mathematical functions with complex arguments, so their use requires software; without justification of the applicability of the chosen model, a mechanical computer "calculation" of pumping has serious drawbacks. The article presents methods for calculating hydraulic parameters from experimental pumping data that can be applied to non-homogeneous conditions and that take into account both the degree of partial penetration and near-well destruction. The techniques use the known straight-line methods on combined and area-tracing drawdown plots in semi-logarithmic coordinates, developed for fully penetrating wells. To do this, the well-known formulas are given a correction ζΣ for the degree of partial penetration, anisotropy, well inertia, and other factors, and the coordinate of the observation well r is replaced by the rated value rp = r·exp(ζΣ/2). ζΣ cannot be calculated except in the case of a homogeneous aquifer, so rp must instead be determined by fitting, seeking to merge the experimental well curves on the combined tracking plot. Applying the method requires a realistic view of the accuracy of the field test in general and of the procedure used in particular. The article is illustrated by plots and calculations from real field tests, and the presentation is accompanied by an analysis of the causes of defective pumping data.

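The coordinate substitution described in this abstract is a one-line computation; a minimal sketch, with the function name assumed and ζΣ supplied by the analyst or found by fitting:

```python
import math

def rated_radius(r, zeta_sigma):
    """Rated observation-well coordinate r_p = r * exp(zeta_sigma / 2),
    folding the partial-penetration (and related) corrections into the
    distance used on semi-log straight-line drawdown plots."""
    return r * math.exp(zeta_sigma / 2.0)
```

With ζΣ = 0 (a fully penetrating well in a homogeneous aquifer) the substitution leaves r unchanged; otherwise rp is varied until the experimental curves merge on the combined tracking plot.
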
13. Sridevi, G. M., Ashoka D V, and B. V. Ajay Prakash. "Partial Pseudo-Random Hashing for Transactional Memory Read/Write Data Processing and Validation." Karbala International Journal of Modern Science 8, no. 2 (May 5, 2022): 71–80. http://dx.doi.org/10.33640/2405-609x.3223.

14. Wimberley, Catriona, Georgios Angelis, Frederic Boisson, Paul Callaghan, Kristina Fischer, Bernd J. Pichler, Steven R. Meikle, Marie-Claude Grégoire, and Anthonin Reilhac. "Simulation-based optimisation of the PET data processing for Partial Saturation Approach protocols." NeuroImage 97 (August 2014): 29–40. http://dx.doi.org/10.1016/j.neuroimage.2014.04.010.

15. Peluso, Sebastiano, Pedro Ruivo, Paolo Romano, Francesco Quaglia, and Luis Rodrigues. "GMU: Genuine Multiversion Update-Serializable Partial Data Replication." IEEE Transactions on Parallel and Distributed Systems 27, no. 10 (October 1, 2016): 2911–25. http://dx.doi.org/10.1109/tpds.2015.2510998.

16. Chen, Jie, and Yang Yang. "Quantitative photo-acoustic tomography with partial data." Inverse Problems 28, no. 11 (October 26, 2012): 115014. http://dx.doi.org/10.1088/0266-5611/28/11/115014.

17. Doll, Moritz, André Froehly, and René Schulz. "A partial data problem in linear elasticity." Inverse Problems 36, no. 5 (May 1, 2020): 055016. http://dx.doi.org/10.1088/1361-6420/ab6e76.

18. Moisyshyn, V. M., A. P. Ivasiutyn, and V. R. Protsiuk. "Improved Methodology for Processing the Results of a Factorial Experiment." Precarpathian Bulletin of the Shevchenko Scientific Society, no. 17(64) (November 22, 2022): 75–95. http://dx.doi.org/10.31471/2304-7399-2022-17(64)-75-95.

Abstract: When processing the results of experimental studies of technical and other processes, it is necessary to establish a correlation between the independent and dependent variables; in the analysis of experimental data such a relationship is established using computer programs. The authors propose a program (App. 1) that calculates the parameters of ten empirical regression equations by the method of least squares; it was developed in the Visual Studio environment in C# using the Windows Forms framework for the Windows operating systems. The program can be used to process the results of studies conducted according to both classical and factorial (rational) plans. When analysing data from experiments conducted according to a factorial plan, the program determines the parameters of the partial empirical dependencies of the studied factor Y on independent external factors. The basic version of the method for creating an empirical multifactorial model of multiple nonlinear correlation from data obtained by rational experiment planning is the one proposed in "Methodology of processing the results of a factorial experiment". The authors supplement that method by determining the parameters of the partial empirical dependencies from logarithmic experimental data, averaged with the geometric mean for each independent factor; the parameters of the partial empirical dependencies used to build the multifactorial model are then determined from the antilogarithms of the averaged values.

19. Zaitsev, V. A. "Geochemical functions Add-In to MS Excel for geochemical and mineralogical data processing." Moscow University Bulletin. Series 4. Geology, no. 3 (December 15, 2022): 54–60. http://dx.doi.org/10.33623/0579-9406-2022-3-54-60.

Abstract: «Geochemical functions» is a new free Add-In to MS Excel for geochemists and mineralogists. It implements useful functions for routine operations in geochemical and mineralogical calculations: abundance normalization, calculation of empirical-formula coefficients from chemical and EMPA analyses, and statistical operations with partial and unequal data.

20. Sharma, Amit, Linda Johansson, Elin Dunevall, Weixiao Y. Wahlgren, Richard Neutze, and Gergely Katona. "Asymmetry in serial femtosecond crystallography data." Acta Crystallographica Section A Foundations and Advances 73, no. 2 (January 30, 2017): 93–101. http://dx.doi.org/10.1107/s2053273316018696.

Abstract: Serial crystallography is an increasingly important approach to protein crystallography that exploits both X-ray free-electron laser (XFEL) and synchrotron radiation. Serial crystallography recovers complete X-ray diffraction data by processing and merging diffraction images from thousands of randomly oriented non-uniform microcrystals, of which all observations are partial Bragg reflections. Random fluctuations in the XFEL pulse energy spectrum, variations in the size and shape of microcrystals, integrating over millions of weak partial observations and instabilities in the XFEL beam position lead to new types of experimental errors. The quality of Bragg intensity estimates deriving from serial crystallography is therefore contingent upon assumptions made while modeling these data. Here it is observed that serial femtosecond crystallography (SFX) Bragg reflections do not follow a unimodal Gaussian distribution and it is recommended that an idealized assumption of single Gaussian peak profiles be relaxed to incorporate apparent asymmetries when processing SFX data. The phenomenon is illustrated by re-analyzing data collected from microcrystals of the Blastochloris viridis photosynthetic reaction center and comparing these intensity observations with conventional synchrotron data. The results show that skewness in the SFX observations captures the essence of the Wilson plot and an empirical treatment is suggested that can help to separate the diffraction Bragg intensity from the background.

21. Yao, Xin, and Cho-Li Wang. "Probabilistic Consistency Guarantee in Partial Quorum-Based Data Store." IEEE Transactions on Parallel and Distributed Systems 31, no. 8 (August 1, 2020): 1815–27. http://dx.doi.org/10.1109/tpds.2020.2973619.

22. Park, Jun-Yong, and Sung-Kwun Oh. "A Comparative Study on CNN-based Pattern Classifier through Partial Discharge Data Processing Methods." Transactions of The Korean Institute of Electrical Engineers 70, no. 3 (March 31, 2021): 515–25. http://dx.doi.org/10.5370/kiee.2021.70.3.515.

23. Liu, Jianxun, Yiping Wen, Ting Li, and Xuyun Zhang. "A data-operation model based on partial vector space for batch processing in workflow." Concurrency and Computation: Practice and Experience 23, no. 16 (May 2, 2011): 1936–50. http://dx.doi.org/10.1002/cpe.1738.

24. Sauter, Nicholas K. "XFEL diffraction: developing processing methods to optimize data quality." Journal of Synchrotron Radiation 22, no. 2 (January 29, 2015): 239–48. http://dx.doi.org/10.1107/s1600577514028203.

Abstract: Serial crystallography, using either femtosecond X-ray pulses from free-electron laser sources or short synchrotron-radiation exposures, has the potential to reveal metalloprotein structural details while minimizing damage processes. However, deriving a self-consistent set of Bragg intensities from numerous still-crystal exposures remains a difficult problem, with optimal protocols likely to be quite different from those well established for rotation photography. Here several data processing issues unique to serial crystallography are examined. It is found that the limiting resolution differs for each shot, an effect that is likely to be due to both the sample heterogeneity and pulse-to-pulse variation in experimental conditions. Shots with lower resolution limits produce lower-quality models for predicting Bragg spot positions during the integration step. Also, still shots by their nature record only partial measurements of the Bragg intensity. An approximate model that corrects to the full-spot equivalent (with the simplifying assumption that the X-rays are monochromatic) brings the distribution of intensities closer to that expected from an ideal crystal, and improves the sharpness of anomalous difference Fourier peaks indicating metal positions.

25. Chen, Qianqiao, Vaibhawa Mishra, Jose Nunez-Yanez, and Georgios Zervas. "Reconfigurable Network Stream Processing on Virtualized FPGA Resources." International Journal of Reconfigurable Computing 2018 (2018): 1–11. http://dx.doi.org/10.1155/2018/8785903.

Abstract: Software-defined networking and network function virtualization have been proposed to address the ossification of the current Internet infrastructure. Network functions and services are implemented as software applications to increase the programmability of the network, but involving general-purpose processors in the data plane restricts the bandwidth of network services. Therefore, to retain both bandwidth and flexibility, an FPGA platform is suggested as a reconfigurable platform for delivering high-bandwidth virtual network functions on the data plane. In this paper, the FPGA resources are virtualized by interconnecting partially reconfigurable regions to deliver high-bandwidth reconfigurable processing of network streams. With partial reconfiguration technology, network functions on the platform can be configured without affecting other functions on the same FPGA device. The on-chip interconnect system is further evaluated by comparison with an existing network-on-chip system. A reconfiguration process is also proposed and demonstrated on the platform; it can take place during live network service and keeps the original function working while the partial bitstream is downloaded.

26. Nadarajah, Saralees. "A Comment on “Partial-Update NLMS Algorithms With Data-Selective Updating”." IEEE Transactions on Signal Processing 55, no. 6 (June 2007): 3148–49. http://dx.doi.org/10.1109/tsp.2007.893925.

27. Grace Yenin Edwige, Johnson, Adepo Joel, and Oumtanaga Souleymane. "A Mechanism for Detecting Partial Inferences in Data Warehouses." International Journal of Advanced Research 9, no. 03 (March 31, 2021): 369–78. http://dx.doi.org/10.21474/ijar01/12593.

Abstract: Data warehouses are widely used in Big Data and Business Intelligence for statistics on business activity; querying them multidimensionally yields aggregated results over the data. The confidential nature of certain data leads malicious users to seek means of deducing this information, among them data inference methods. To address these security problems, researchers have proposed several solutions based on warehouse architecture, the design phase, the cuboids of a data cube and the materialized views of multidimensional queries. In this work, we propose a mechanism for detecting inference in data warehouses. The objective of this approach is to highlight partial inferences during the execution of a multidimensional OLAP (Online Analytical Processing) query of SUM type, so as to prevent a data warehouse user from inferring sensitive information to which he or she has no access rights under the access control policy in force. Our study improves the model proposed in a previous study by Triki, which is based on average deviations, by proposing an optimal threshold that better detects inferences. The results we obtain improve on the previous study.

28. Zeng, Xue-Qiang, and Guo-Zheng Li. "Incremental partial least squares analysis of big streaming data." Pattern Recognition 47, no. 11 (November 2014): 3726–35. http://dx.doi.org/10.1016/j.patcog.2014.05.022.

29. Lézoray, Olivier, Vinh-Thong Ta, and Abderrahim Elmoataz. "Partial differences as tools for filtering data on graphs." Pattern Recognition Letters 31, no. 14 (October 2010): 2201–13. http://dx.doi.org/10.1016/j.patrec.2010.03.022.

30. Junmei, Shi. "Noise Data Removal and Image Restoration Based on Partial Differential Equation in Sports Image Recognition Technology." Advances in Mathematical Physics 2021 (November 24, 2021): 1–8. http://dx.doi.org/10.1155/2021/1179120.

Abstract: With the rapid development of image processing technology, the range of applications of image recognition is expanding. Processing, analysing and repairing graphics and images with computer and big-data techniques are the main ways to obtain and restore image data in complex environments. Addressing the low quality of image information captured during sports, this paper proposes removing noise data and repairing images using partial differential equations within image recognition technology. First, image recognition is used to track and acquire image information during sports, and a fourth-order partial differential equation is used to optimize and process the image. Then, for the problem of low image quality and blur introduced during transmission, denoising is carried out, and image restoration is studied using an adaptive diffusion function in the partial differential equation. The results show that this approach greatly improves blurred, poor-quality sports images and realizes automatic tracking of targets in sports images; in the restoration step it achieves the standard repair effect while reducing repair time. The approach is effective and applicable to image processing and restoration.

31. Apruzzese, Francesca, Ramin Reshadat, and Stephen T. Balke. "In-Line Monitoring of Polymer Processing. II: Spectral Data Analysis." Applied Spectroscopy 56, no. 10 (October 2002): 1268–74. http://dx.doi.org/10.1366/000370202760354713.

Abstract: The objective of this work was to examine the application of various multivariate methods to determine the composition of a flowing, molten, immiscible, polyethylene–polypropylene blend from near-infrared spectra. These spectra were acquired during processing by monitoring the melt with a fiber-optic-assisted in-line spectrometer. Undesired differences in spectra obtained from identical compositions were attributed to additive and multiplicative light scattering effects. Duplicate blend composition data were obtained over a range of 0 to 100% polyethylene. On the basis of previously published approaches, three data preprocessing methods were investigated: second derivative of absorbance with respect to wavelength (d2), multiplicative scatter correction (MSC), and a combination consisting of MSC followed by d2. The latter method was shown to substantially improve superposition of spectra and principal component analysis (PCA) scores. Also, fewer latent variables were required. The continuum regression (CR) approach, a method that encompasses ordinary least squares (OLS), partial least squares (PLS), and principal component regression (PCR) models, was then implemented and provided the best prediction model as one based on characteristics between those of PLS and OLS models.

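Of the preprocessing steps named in the abstract above, multiplicative scatter correction is compact to state: each spectrum is regressed on the mean spectrum, and the fitted additive and multiplicative terms are removed. A minimal sketch, not the authors' code; the function name and the spectra-as-rows layout are assumptions:

```python
import numpy as np

def msc(spectra):
    """Multiplicative scatter correction: fit each spectrum x as
    x ≈ a + b * m against the mean spectrum m, return (x - a) / b."""
    X = np.asarray(spectra, float)
    m = X.mean(axis=0)                  # reference (mean) spectrum
    corrected = np.empty_like(X)
    for i, x in enumerate(X):
        b, a = np.polyfit(m, x, 1)      # slope b, intercept a
        corrected[i] = (x - a) / b
    return corrected
```

If every spectrum is an affine transform of a common underlying spectrum, MSC maps them all onto the mean spectrum; the second-derivative (d2) step would then be applied to the corrected spectra.
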
32. Zampieri, Sandro. "Recursive partial realization for 2-D data arrays." Multidimensional Systems and Signal Processing 2, no. 2 (May 1991): 101–26. http://dx.doi.org/10.1007/bf01938220.

33. Ballim, Afzal, and Vincenzo Pallotta. "Robust methods in analysis of natural language data." Natural Language Engineering 8, no. 2-3 (June 2002): 93–96. http://dx.doi.org/10.1017/s1351324902002942.

Abstract: The automated analysis of natural language data has become a central issue in the design of intelligent information systems. Processing unconstrained natural language data is still considered as an AI-hard task. However, various analysis techniques have been proposed to address specific aspects of natural language. In particular, recent interest has been focused on providing approximate analysis techniques, assuming that when perfect analysis is not possible, partial results may be still very useful.

34. Zeldin, Oliver B., Aaron S. Brewster, Johan Hattne, Monarin Uervirojnangkoorn, Artem Y. Lyubimov, Qiangjun Zhou, Minglei Zhao, William I. Weis, Nicholas K. Sauter, and Axel T. Brunger. "Data Exploration Toolkit for serial diffraction experiments." Acta Crystallographica Section D Biological Crystallography 71, no. 2 (January 23, 2015): 352–56. http://dx.doi.org/10.1107/s1399004714025875.

Abstract: Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the `diffraction before destruction' nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing.

35

Kolnogorov, A. V. "Universal strategies for the two-alternative big data processing." Journal of Physics: Conference Series 2052, no. 1 (November 1, 2021): 012020. http://dx.doi.org/10.1088/1742-6596/2052/1/012020.

Full text
Abstract:
We consider the two-alternative processing of big data in the framework of the two-armed bandit problem. We assume that there are two processing methods with different, fixed but a priori unknown efficiencies, which may differ for a number of reasons, including legislative ones. Results of data processing are interpreted as random incomes. During the control process, one has to determine the more efficient method and ensure its primary usage. The difficulty of the problem is that its solution essentially depends on the distributions of one-step incomes corresponding to the results of data processing. However, in the case of big data we show that there are universal processing strategies for a wide class of distributions of one-step incomes. To this end, we consider the Gaussian two-armed bandit, which naturally arises when batch data processing is analyzed. The minimax risk and minimax strategy are sought as Bayesian ones corresponding to the worst-case prior distribution. We present a recursive integro-difference equation for computing the Bayesian risk and Bayesian strategy with respect to the worst-case prior distribution, and a second-order partial differential equation into which the integro-difference equation turns in the limiting case as the control horizon goes to infinity. We also show that, in the case of big data, processing data one by one is no more efficient than optimal batch processing for some types of distributions of one-step incomes, e.g. for Bernoulli and Poisson distributions. Numerical experiments are presented and show that the proposed universal strategies provide high performance in two-alternative big data processing.
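The batch two-armed bandit setting above can be sketched with a minimal strategy — a greedy rule with occasional exploration, not the paper's minimax/Bayesian strategy; the reward functions and parameters below are hypothetical:

```python
import random

def batch_bandit(reward_fns, n_batches, batch_size, eps=0.1, seed=0):
    """Toy batch two-armed bandit: each batch is processed entirely by one
    method (arm); we favour the arm with the higher observed mean income."""
    rng = random.Random(seed)
    totals, counts = [0.0, 0.0], [0, 0]
    for _ in range(n_batches):
        if counts[0] == 0:                       # try each arm once first
            arm = 0
        elif counts[1] == 0:
            arm = 1
        elif rng.random() < eps:                 # occasional exploration
            arm = rng.randrange(2)
        else:                                    # exploit the better mean
            arm = 0 if totals[0] / counts[0] >= totals[1] / counts[1] else 1
        # The whole batch of data items is processed with the chosen method.
        income = sum(reward_fns[arm](rng) for _ in range(batch_size))
        totals[arm] += income
        counts[arm] += batch_size
    return counts  # how many items each method ended up processing

# Hypothetical Bernoulli incomes: method 0 succeeds w.p. 0.2, method 1 w.p. 0.8.
counts = batch_bandit([lambda r: r.random() < 0.2,
                       lambda r: r.random() < 0.8],
                      n_batches=200, batch_size=50)
```

With clearly separated efficiencies, the greedy rule routes nearly all batches to the better method, mirroring the goal of determining the most efficient method and providing its primary usage.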
APA, Harvard, Vancouver, ISO, and other styles
36

Jin, Xi, Xing Zhang, Kaifeng Rao, Liang Tang, and Qiwei Xie. "Semi-supervised partial least squares." International Journal of Wavelets, Multiresolution and Information Processing 18, no. 03 (January 13, 2020): 2050014. http://dx.doi.org/10.1142/s0219691320500149.

Full text
Abstract:
Traditional supervised dimensionality reduction methods can often establish a good model only under the premise of a large number of samples. However, in real-world applications where labeled data are scarce, traditional methods tend to perform poorly because of overfitting. In such cases, unlabeled samples can be useful in improving the performance. In this paper, we propose a semi-supervised dimensionality reduction method based on partial least squares (PLS), which we call semi-supervised partial least squares (S2PLS). To combine the labeled and unlabeled samples into an S2PLS model, we first apply the PLS algorithm to unsupervised dimensionality reduction. Then, the final S2PLS model is established by ensembling the supervised and unsupervised PLS models, using the basic idea of the principal model analysis (PMA) method. Compared with unsupervised or supervised dimensionality reduction algorithms, S2PLS not only improves the prediction accuracy for the samples but also enhances the generalization ability of the model. Meanwhile, it can obtain good results even when there are only a few or no labeled samples. Experimental results on five UCI data sets also confirm the above properties of the S2PLS algorithm.
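The ensembling idea — blending a supervised PLS direction computed from labeled data with an unsupervised direction computed from all data — can be sketched in a one-component toy (not the authors' S2PLS algorithm; the blending weight `alpha` and all data are made up for illustration):

```python
import math

def first_pls_direction(X, y):
    """One-component PLS weight: w proportional to X^T y (covariance direction)."""
    d = len(X[0])
    w = [sum(X[i][j] * y[i] for i in range(len(X))) for j in range(d)]
    n = math.sqrt(sum(v * v for v in w)) or 1.0
    return [v / n for v in w]

def leading_direction(X, iters=50):
    """Leading principal direction of X (unsupervised), via power iteration
    on X^T X -- stands in here for the 'unsupervised PLS' component."""
    d = len(X[0])
    v = [1.0] * d
    for _ in range(iters):
        u = [sum(row[j] * v[j] for j in range(d)) for row in X]   # u = X v
        v = [sum(X[i][j] * u[i] for i in range(len(X))) for j in range(d)]
        n = math.sqrt(sum(x * x for x in v)) or 1.0
        v = [x / n for x in v]                                    # renormalise
    return v

def s2pls_direction(X_lab, y_lab, X_all, alpha=0.5):
    """Toy ensembling: blend the supervised and unsupervised directions."""
    ws = first_pls_direction(X_lab, y_lab)
    wu = leading_direction(X_all)
    # Align signs before blending so the two directions do not cancel.
    if sum(a * b for a, b in zip(ws, wu)) < 0:
        wu = [-x for x in wu]
    w = [alpha * a + (1 - alpha) * b for a, b in zip(ws, wu)]
    n = math.sqrt(sum(v * v for v in w)) or 1.0
    return [v / n for v in w]
```

When the unlabeled samples share the informative direction of the labeled ones, the blended direction stays close to it even with very few labels.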
APA, Harvard, Vancouver, ISO, and other styles
37

Kakarala, Ramakrishna. "Interpreting the Phase Spectrum in Fourier Analysis of Partial Ranking Data." Advances in Numerical Analysis 2012 (June 27, 2012): 1–15. http://dx.doi.org/10.1155/2012/579050.

Full text
Abstract:
Whenever ranking data are collected, such as in elections, surveys, and database searches, it is frequently the case that partial rankings are available instead of, or sometimes in addition to, full rankings. Statistical methods for partial rankings have been discussed in the literature. However, relatively little has been published on their Fourier analysis, perhaps because the abstract nature of the transforms involved impedes insight. This paper provides as its novel contribution an analysis of the Fourier transform for partial rankings, with particular attention to the first three ranks, emphasizing basic signal processing properties of transform magnitude and phase. It shows that the transform and its magnitude satisfy a projection invariance and analyzes the reconstruction of data from either magnitude or phase alone. The analysis is motivated by appeal to corresponding properties of the familiar DFT and by application to two real-world data sets.
APA, Harvard, Vancouver, ISO, and other styles
38

Nguyen, Thien An, and Jaejin Lee. "Simplified Two-Dimensional Generalized Partial Response Target of Holographic Data Storage Channel." Applied Sciences 12, no. 8 (April 18, 2022): 4070. http://dx.doi.org/10.3390/app12084070.

Full text
Abstract:
With a high capacity and fast data access rate, holographic data storage (HDS) is a potential candidate for future storage systems. However, for page-oriented data processing, two-dimensional (2D) interference appears strongly in HDS systems. Therefore, a new 2D generalized partial response (GPR) target is introduced to estimate the 2D interference. In addition, we propose a method to factor the 2D GPR target into two serial one-dimensional (1D) GPR targets. This lets us design a simple detection scheme composed of two serial 1D detectors instead of a complicated 2D detector. In simulations, the results show that the proposed scheme improves BER performance compared with the conventional 1D GPR target model.
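The factorization of a 2D response into two serial 1D responses is essentially kernel separability: if the 2D target is the outer product of a row target and a column target, filtering the rows and then the columns reproduces the full 2D filtering. A minimal sketch with hypothetical tap values (not the paper's GPR coefficients):

```python
def conv1d(seq, taps):
    """'Same'-length 1D convolution with zero padding outside the sequence."""
    k = len(taps) // 2
    n = len(seq)
    return [sum(taps[t] * seq[i + t - k]
                for t in range(len(taps)) if 0 <= i + t - k < n)
            for i in range(n)]

def apply_separable(page, row_taps, col_taps):
    """Model 2D ISI as two serial 1D targets: filter rows first, then columns.
    Equivalent to convolving with the outer product col_taps x row_taps."""
    rows = [conv1d(r, row_taps) for r in page]
    cols = list(zip(*rows))                       # transpose to reach columns
    out = [conv1d(list(c), col_taps) for c in cols]
    return [list(r) for r in zip(*out)]           # transpose back
```

For a single nonzero pixel, the output is just the outer product of the two tap vectors centered on that pixel, which is the separable 2D response.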
APA, Harvard, Vancouver, ISO, and other styles
39

Lee, Dong Hyeok, and Nam Je Park. "ROI-based efficient video data processing for large-scale cloud storage in intelligent CCTV environment." International Journal of Engineering & Technology 7, no. 3.3 (June 8, 2018): 151. http://dx.doi.org/10.14419/ijet.v7i2.33.13873.

Full text
Abstract:
Background/Objectives: The big data environment is being realized. Recently, an intelligent public safety environment founded on big-data-based image processing has been introduced, and accordingly, processing CCTV images is becoming more important day by day. Methods/Statistical analysis: In this paper, an efficient technique for sending image information to a mass cloud storage environment is proposed. With the offered method, only the ROI area is extracted and partial object images are transmitted; it has the strengths of higher efficiency and protected privacy through the application of a masking technique. Findings: It is common to apply the masking technique partially to face information; in this study, the privacy of the image data registered in the cloud storage is protected on the basis of this masking technique, and an efficient data transmission structure grounded in ROI area extraction is proposed. Improvements/Applications: With the offered method, only the ROI area is extracted and partial object images are transmitted, combining higher efficiency with privacy protection via masking.
APA, Harvard, Vancouver, ISO, and other styles
40

Xu, Guo Sheng. "Design of Image Processing System Based on FPGA." Advanced Materials Research 403-408 (November 2011): 1281–84. http://dx.doi.org/10.4028/www.scientific.net/amr.403-408.1281.

Full text
Abstract:
To speed up image acquisition and make full use of the effective information, a design method for a CCD partial image scanning system is presented. The system implements high-speed data collection, high-speed video data compression, real-time video data network transmission and real-time compressed picture data storage. The processed data are transferred to a PC through USB 2.0 in real time to reconstruct microscopic images of defects. Experiments show that the system has stable performance, real-time data transmission and high image quality, and is feasible with the algorithm and scheme proposed in this paper.
APA, Harvard, Vancouver, ISO, and other styles
41

Mills, Nicholas, Ethan M. Bensman, William L. Poehlman, Walter B. Ligon, and F. Alex Feltus. "Moving Just Enough Deep Sequencing Data to Get the Job Done." Bioinformatics and Biology Insights 13 (January 2019): 117793221985635. http://dx.doi.org/10.1177/1177932219856359.

Full text
Abstract:
Motivation: As the size of high-throughput DNA sequence datasets continues to grow, the cost of transferring and storing the datasets may prevent their processing in all but the largest data centers or commercial cloud providers. To lower this cost, it should be possible to process only a subset of the original data while still preserving the biological information of interest. Results: Using 4 high-throughput DNA sequence datasets of differing sequencing depth from 2 species as use cases, we demonstrate the effect of processing partial datasets on the number of detected RNA transcripts using an RNA-Seq workflow. We used transcript detection to decide on a cutoff point. We then physically transferred the minimal partial dataset and compared with the transfer of the full dataset, which showed a reduction of approximately 25% in the total transfer time. These results suggest that as sequencing datasets get larger, one way to speed up analysis is to simply transfer the minimal amount of data that still sufficiently detects biological signal. Availability: All results were generated using public datasets from NCBI and publicly available open source software.
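The cutoff idea — transfer only as much sequence data as still detects (nearly) the full transcript set — can be sketched as a threshold search over a detection curve; the numbers below are invented for illustration, not the paper's datasets:

```python
def minimal_fraction(detection_curve, threshold=0.95):
    """detection_curve: list of (fraction_of_reads, transcripts_detected)
    pairs, assumed roughly monotone. Return the smallest fraction whose
    detection count reaches `threshold` of the full-dataset count."""
    full = detection_curve[-1][1]
    for frac, detected in sorted(detection_curve):
        if detected >= threshold * full:
            return frac
    return 1.0

# Hypothetical saturation curve: detection flattens well before 100% of reads.
curve = [(0.1, 8000), (0.25, 11000), (0.5, 11900), (0.75, 12050), (1.0, 12100)]
frac = minimal_fraction(curve)  # 0.5, since 11900 >= 0.95 * 12100
```

Transferring only that fraction is what yields the reported savings in total transfer time while preserving the biological signal of interest.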
APA, Harvard, Vancouver, ISO, and other styles
42

Burgin, Mark. "Information Processing by Symmetric Inductive Turing Machines." Proceedings 47, no. 1 (May 13, 2020): 28. http://dx.doi.org/10.3390/proceedings2020047028.

Full text
Abstract:
Traditional models of computations, such as Turing machines or partial recursive functions, perform computations of functions using a definite program controlling these computations. This approach detaches data, which are processed, and the permanent program, which controls this processing. Physical computers often process not only data but also their software (programs). To reflect this peculiarity of physical computers, symmetric models of computations and automata were introduced. In this paper, we study information processing by symmetric models, which are called symmetric inductive Turing machines and reflexive inductive Turing machines.
APA, Harvard, Vancouver, ISO, and other styles
43

Burgin, Mark. "Information Processing by Symmetric Inductive Turing Machines." Proceedings 47, no. 1 (May 13, 2020): 28. http://dx.doi.org/10.3390/proceedings47010028.

Full text
Abstract:
Traditional models of computations, such as Turing machines or partial recursive functions, perform computations of functions using a definite program controlling these computations. This approach detaches data, which are processed, and the permanent program, which controls this processing. Physical computers often process not only data but also their software (programs). To reflect this peculiarity of physical computers, symmetric models of computations and automata were introduced. In this paper, we study information processing by symmetric models, which are called symmetric inductive Turing machines and reflexive inductive Turing machines.
APA, Harvard, Vancouver, ISO, and other styles
44

Baykulov, Mikhail, and Dirk Gajewski. "Prestack seismic data enhancement with partial common-reflection-surface (CRS) stack." GEOPHYSICS 74, no. 3 (May 2009): V49—V58. http://dx.doi.org/10.1190/1.3106182.

Full text
Abstract:
We developed a new partial common-reflection-surface (CRS) stacking method to enhance the quality of sparse low-fold seismic data. For this purpose, we use kinematic wavefield attributes computed during the automatic CRS stack. We apply a multiparameter CRS traveltime formula to compute partial stacked CRS supergathers. Our algorithm allows us to generate NMO-uncorrected gathers without the application of inverse NMO/DMO. Gathers obtained by this approach are regularized and have better signal-to-noise ratio compared with original common-midpoint gathers. Instead of the original data, these improved prestack data can be used in many conventional processing steps, e.g., velocity analysis or prestack depth migration, providing enhanced images and better quality control. We verified the method on 2D synthetic data and applied it to low-fold land data from northern Germany. The synthetic examples show the robustness of the partial CRS stack in the presence of noise. Sparse land data became regularized, and the signal-to-noise ratio of the seismograms increased as a result of the partial CRS stack. Prestack depth migration of the generated partially stacked CRS supergathers produced significantly improved common-image gathers as well as depth-migrated sections.
APA, Harvard, Vancouver, ISO, and other styles
45

Wang, Juan. "Research on Key Technology of Data Mining for Volleyball Game Based on Service System." Applied Mechanics and Materials 543-547 (March 2014): 4698–701. http://dx.doi.org/10.4028/www.scientific.net/amm.543-547.4698.

Full text
Abstract:
During the machining of aircraft and other high-precision mechanical workpieces, traditional machining methods consume a large amount of machining cost, and the machining cycle is long. In this context, this paper designs a robotic intelligent machining system for high-precision machinery, realizing intelligent online control of the machining process by using high-precision intelligent online monitoring and numerical simulation prediction techniques. Finally, this system is introduced into the process of data mining for volleyball games, and a partial differential variational data mining model is designed, which realizes the mining of key parameter data of the volleyball service system and provides reliable parameters and technical support for the training of volleyball players.
APA, Harvard, Vancouver, ISO, and other styles
46

Aussal, Matthieu, Yosra Boukari, and Houssem Haddar. "Data completion method for the Helmholtz equation via surface potentials for partial Cauchy data." Inverse Problems 36, no. 5 (April 29, 2020): 055012. http://dx.doi.org/10.1088/1361-6420/ab730c.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Pereira, Fabio Henrique, Francisco Elânio Bezerra, Diego Oliva, Gilberto Francisco Martha de Souza, Ivan Eduardo Chabu, Josemir Coelho Santos, Shigueru Nagao Junior, and Silvio Ikuyo Nabeta. "Forecast Model Update Based on a Real-Time Data Processing Lambda Architecture for Estimating Partial Discharges in Hydrogenerator." Sensors 20, no. 24 (December 17, 2020): 7242. http://dx.doi.org/10.3390/s20247242.

Full text
Abstract:
The prediction of partial discharges in hydrogenerators depends on data collected by sensors and prediction models based on artificial intelligence. However, forecasting models are trained with a set of historical data that is not automatically updated due to the high cost to collect sensors’ data and insufficient real-time data analysis. This article proposes a method to update the forecasting model, aiming to improve its accuracy. The method is based on a distributed data platform with the lambda architecture, which combines real-time and batch processing techniques. The results show that the proposed system enables real-time updates to be made to the forecasting model, allowing partial discharge forecasts to be improved with each update with increasing accuracy.
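The lambda-architecture pattern described — a batch layer that periodically rebuilds the model from full history while a speed layer folds in real-time sensor readings — can be sketched with a toy running-mean "forecaster" (illustrative only; the real system trains an AI forecasting model on partial-discharge sensor data):

```python
class LambdaForecaster:
    """Toy lambda-style setup: a batch view rebuilt from the full history,
    plus a speed layer holding recent readings until the next rebuild."""

    def __init__(self):
        self.history = []
        self.batch_mean = 0.0   # batch layer: recomputed from all data
        self.recent = []        # speed layer: real-time increments

    def ingest(self, reading):
        """Real-time path: a new sensor reading lands in the speed layer."""
        self.recent.append(reading)

    def rebuild_batch(self):
        """Batch path: absorb recent data and retrain (here, a mean)."""
        self.history.extend(self.recent)
        self.recent = []
        if self.history:
            self.batch_mean = sum(self.history) / len(self.history)

    def forecast(self):
        """Serving layer: merge the batch view with the real-time view."""
        n_b, n_r = len(self.history), len(self.recent)
        if n_b + n_r == 0:
            return 0.0
        total = self.batch_mean * n_b + sum(self.recent)
        return total / (n_b + n_r)
```

Because the serving layer always folds in the speed layer, forecasts reflect the newest readings immediately, while periodic batch rebuilds keep the model consistent with the full history — the accuracy-improving update loop the abstract describes.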
APA, Harvard, Vancouver, ISO, and other styles
48

Zhang, Jinghua, Gordon R. Jones, Anthony G. Deakin, and Joe W. Spencer. "Chromatic processing of DGA data produced by partial discharges for the prognosis of HV transformer behaviour." Measurement Science and Technology 16, no. 2 (January 22, 2005): 556–61. http://dx.doi.org/10.1088/0957-0233/16/2/031.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Kasahara, Hironori, and Akimasa Yoshida. "A data-localization compilation scheme using partial-static task assignment for Fortran coarse-grain parallel processing." Parallel Computing 24, no. 3-4 (May 1998): 579–96. http://dx.doi.org/10.1016/s0167-8191(98)00026-x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Ji, Jinjie, Qing Chen, Lei Jin, Xiaotong Zhou, and Wei Ding. "Fault Diagnosis System of Power Grid Based on Multi-Data Sources." Applied Sciences 11, no. 16 (August 20, 2021): 7649. http://dx.doi.org/10.3390/app11167649.

Full text
Abstract:
To perform power grid fault diagnosis accurately, rapidly and comprehensively, a power grid fault diagnosis system based on multiple data sources is proposed. The integrated system uses accident-level information, warning-level information and fault recording documents, and outputs a complete diagnosis and tracking report. According to the timeliness of the three types of information transmission, the system is divided into three subsystems: a real-time processing system, a quasi-real-time processing system and a batch processing system. The complete task is realized through cooperation between them. While the real-time processing system completes fault diagnosis of elements, it also screens out incorrectly operating protections and circuit breakers and detects the loss of accident-level information. The quasi-real-time system outputs reasons for the incorrect actions of protections and circuit breakers under the premise that some warning-level information may be missing. The batch processing system corrects the diagnosis results of the real-time processing system and outputs fault details, including the fault phases, types, times and locations of faulty elements. Simulation results and tests show that the system meets practical engineering requirements in execution efficiency and in fault diagnosis and tracking, can serve as a reference for the self-healing and maintenance of power grids, and has considerable application value.
APA, Harvard, Vancouver, ISO, and other styles