Selection of scientific literature on the topic "Statistical processing of real data"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Choose a type of source:

Consult the lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Statistical processing of real data".

Next to every work in the bibliography there is an "Add to bibliography" option. Use it, and a bibliographic reference for the chosen work will be generated automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read its online abstract, provided these are available in the work's metadata.

Journal articles on the topic "Statistical processing of real data"

1. Parygin, D. S., V. P. Malikov, A. V. Golubev, N. P. Sadovnikova, T. M. Petrova, and A. G. Finogeev. "Categorical data processing for real estate objects valuation using statistical analysis." Journal of Physics: Conference Series 1015 (May 2018): 032102. http://dx.doi.org/10.1088/1742-6596/1015/3/032102.

2. Sun, Hong Feng, Ying Li, and Hong Lv. "Statistical Analysis of the Massive Traffic Data Based on Cloud Platform." Advanced Materials Research 717 (July 2013): 662–66. http://dx.doi.org/10.4028/www.scientific.net/amr.717.662.

Abstract:
Currently, with the rapid development of various geographic data acquisition technologies, data-intensive geographic computation is becoming more and more important. Urban motor vehicles equipped with GPS, namely transport vehicles, can collect large amounts of urban traffic information in real time. If these massive transport-vehicle data can be collected and analyzed in real time, accurate and timely basic information can be provided for monitoring traffic status over large areas as well as for intelligent traffic management. Based on the requirements of organizing, processing, and statistically analyzing massive urban traffic data, a new framework for massive data-intensive computation on a cloud platform is proposed, employing Bigtable, MapReduce, and other technologies.
3. Liu, Hua, and Nan Zhang. "Data Processing in the Key Factors Affecting China's Endowment Real Estate Enterprises Financing." Applied Mechanics and Materials 730 (January 2015): 349–52. http://dx.doi.org/10.4028/www.scientific.net/amm.730.349.

Abstract:
Financing is one of the main problems restricting the development of endowment real estate enterprises in China. By analyzing the present situation of endowment real estate enterprise financing and reviewing the relevant literature, we identify 20 general influence factors. Applying principal component analysis to these 20 factors with the help of the SPSS 19.0 statistical software, we determine the key factors affecting endowment real estate enterprise financing. For these key factors, we put forward specific measures to promote the smooth development of endowment real estate enterprise financing in China.
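The principal-component screening this abstract describes can be illustrated with a small, hedged sketch. The paper used SPSS 19.0 on survey data; here the data, the 85% variance cutoff, and all variable names are hypothetical stand-ins, and scikit-learn's PCA substitutes for SPSS:

```python
# Sketch only: PCA over 20 candidate "influence factors" (synthetic data),
# keeping the few components that carry most of the variance.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 3))            # 3 hidden drivers (assumed)
loadings = rng.normal(size=(3, 20))
X = latent @ loadings + 0.1 * rng.normal(size=(100, 20))  # 100 obs x 20 factors

pca = PCA(n_components=20)
pca.fit(X)

# Keep components until 85% of total variance is explained (cutoff is ours)
cum = np.cumsum(pca.explained_variance_ratio_)
n_key = int(np.searchsorted(cum, 0.85) + 1)
print(f"{n_key} components explain {cum[n_key - 1]:.1%} of variance")
```

Components retained this way play the role of the "key influence factors" kept for further interpretation.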
4. Liu, Mou Zhong, and Min Sun. "Application of Multidimensional Data Model in the Traffic Accident Data Warehouse." Applied Mechanics and Materials 548–549 (April 2014): 1857–61. http://dx.doi.org/10.4028/www.scientific.net/amm.548-549.1857.

Abstract:
The traffic administration records real-time accident information and updates the corresponding database in the course of its daily routines. Studying and analyzing these data is of great significance. In this paper, we propose a Multi-dimensional Data Warehouse Model (M-DWM) that combines the data warehouse concept with multi-dimensional data processing theory. The model can greatly improve the efficiency of statistical analysis and data mining.
5. Vollmar, Melanie, James M. Parkhurst, Dominic Jaques, Arnaud Baslé, Garib N. Murshudov, David G. Waterman, and Gwyndaf Evans. "The predictive power of data-processing statistics." IUCrJ 7, no. 2 (February 27, 2020): 342–54. http://dx.doi.org/10.1107/s2052252520000895.

Abstract:
This study describes a method to estimate the likelihood of success in determining a macromolecular structure by X-ray crystallography and experimental single-wavelength anomalous dispersion (SAD) or multiple-wavelength anomalous dispersion (MAD) phasing based on initial data-processing statistics and sample crystal properties. Such a predictive tool can rapidly assess the usefulness of data and guide the collection of an optimal data set. The increase in data rates from modern macromolecular crystallography beamlines, together with a demand from users for real-time feedback, has led to pressure on computational resources and a need for smarter data handling. Statistical and machine-learning methods have been applied to construct a classifier that displays 95% accuracy for training and testing data sets compiled from 440 solved structures. Applying this classifier to new data achieved 79% accuracy. These scores already provide clear guidance as to the effective use of computing resources and offer a starting point for a personalized data-collection assistant.
6. Zhao, Yu Qian, and Zhi Gang Li. "FPGA Implementation of Real-Time Adaptive Bidirectional Equalization for Histogram." Advanced Materials Research 461 (February 2012): 215–19. http://dx.doi.org/10.4028/www.scientific.net/amr.461.215.

Abstract:
A contrast enhancement algorithm tailored to the characteristics of infrared images is presented. The paper gives the principle of FPGA-based adaptive bidirectional plateau histogram equalization. The plateau value is obtained by finding local and global maxima in the statistical histogram. The histogram is then modified by the plateau value and equalized in both gray scale and gray spacing. Test data generated from a single-frame image were processed by the FPGA-based real-time implementation. The simulation results indicate that the scheme meets the requirements well in both image-processing quality and processing speed.
7. Liu, Yuxi, Yiping Zhu, and Mingzhe Wei. "Application of Point Cloud Data Processing in River Regulation." Marine Technology Society Journal 55, no. 2 (March 1, 2021): 198–204. http://dx.doi.org/10.4031/mtsj.55.2.15.

Abstract:
Geotextile materials are often used in river regulation projects to reduce sand loss caused by water erosion and thus ensure a stable and safe river bed. In order to measure the overlap width in the geotextile-laying procedure, we propose a point cloud data processing method, which uses point cloud data obtained by 3-D imaging sonar to perform automatic measurements. First, random-sample-consensus point cloud segmentation and density-based statistical outlier filtering are used to extract the upper and lower plane data of the geotextile. Second, cluster classification is used to obtain the edge point cloud. Last, edge characteristic parameters are extracted by linear fitting, and the overlap width of the geotextile laying is calculated. Results show that this measurement scheme is feasible, robust, and accurate enough to meet the requirements of real-life engineering.
8. Daume III, H., and D. Marcu. "Domain Adaptation for Statistical Classifiers." Journal of Artificial Intelligence Research 26 (June 21, 2006): 101–26. http://dx.doi.org/10.1613/jair.1872.

Abstract:
The most basic assumption used in statistical learning theory is that training data and test data are drawn from the same underlying distribution. Unfortunately, in many applications, the "in-domain" test data is drawn from a distribution that is related, but not identical, to the "out-of-domain" distribution of the training data. We consider the common case in which labeled out-of-domain data is plentiful, but labeled in-domain data is scarce. We introduce a statistical formulation of this problem in terms of a simple mixture model and present an instantiation of this framework to maximum entropy classifiers and their linear chain counterparts. We present efficient inference algorithms for this special case based on the technique of conditional expectation maximization. Our experimental results show that our approach leads to improved performance on three real world tasks on four different data sets from the natural language processing domain.
9. Majumdar, Chitradeep, Miguel Lopez-Benitez, and Shabbir N. Merchant. "Real Smart Home Data-Assisted Statistical Traffic Modeling for the Internet of Things." IEEE Internet of Things Journal 7, no. 6 (June 2020): 4761–76. http://dx.doi.org/10.1109/jiot.2020.2969318.

10. Lytvynenko, T. I. "Problem of data analysis and forecasting using decision trees method." PROBLEMS IN PROGRAMMING, no. 2-3 (June 2016): 220–26. http://dx.doi.org/10.15407/pp2016.02-03.220.

Abstract:
This study describes an application of the decision tree approach to the problem of data analysis and forecasting. Data processing is based on real observations representing sales levels in the period from 2006 to 2009. R (a programming language and software environment) is used as the tool for statistical computing. The paper compares the method with well-known approaches and solutions in order to improve the accuracy of the obtained results.
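As a rough sketch of the kind of analysis this abstract outlines: fit a decision tree to historical sales observations and use it to forecast. The study used R on 2006–2009 sales data; everything below (the synthetic series, the depth, scikit-learn in place of R) is an assumption for illustration:

```python
# Sketch only: decision-tree regression on a synthetic monthly sales series
# (trend + yearly seasonality + noise), then a short-horizon forecast.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
months = np.arange(48).reshape(-1, 1)          # 48 months, i.e. 2006-2009
t = months.ravel()
sales = 100 + 2 * t + 10 * np.sin(t * 2 * np.pi / 12) + rng.normal(0, 3, 48)

tree = DecisionTreeRegressor(max_depth=4).fit(months, sales)
forecast = tree.predict(np.array([[48], [49], [50]]))   # next three months
print(np.round(forecast, 1))
```

A plain tree extrapolates with the last leaf's mean, which is one reason such studies compare it against other forecasting approaches.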

Dissertations on the topic "Statistical processing of real data"

1. Ha, Jin-cheol. "Real-time visual tracking using image processing and filtering methods." Diss., Atlanta, Ga.: Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/28177.

Abstract:
Thesis (M. S.)--Aerospace Engineering, Georgia Institute of Technology, 2008.
Committee Chair: Eric N. Johnson; Committee Co-Chair: Allen R. Tannenbaum; Committee Member: Anthony J. Calise; Committee Member: Eric Feron; Committee Member: Patricio A. Vela.
2. Park, Chang Yun. "Predicting deterministic execution times of real-time programs." Thesis, 1992. http://hdl.handle.net/1773/6978.

3. Zamazal, Petr. "Statistická analýza rozsáhlých dat z průmyslu" [Statistical analysis of large-scale data from industry]. Master's thesis, Vysoké učení technické v Brně, Fakulta strojního inženýrství, 2021. http://www.nusl.cz/ntk/nusl-445466.

Abstract:
This thesis deals with the processing of real data on waste collection. It describes selected topics in statistical testing, outlier identification, correlation analysis, and linear regression. This theoretical basis is applied, using the Python programming language, to process the data into a form suitable for building linear models. The final models explain between 70% and 85% of the variability. Finally, the information obtained through this analysis is used to formulate recommendations for the waste management company.
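A minimal sketch of the pipeline this abstract describes: screen outliers, then fit a linear model whose R² reports explained variability. The waste-collection data is replaced by synthetic numbers, and the 3-sigma residual rule is an illustrative assumption, not necessarily the thesis's method:

```python
# Sketch only: first-pass fit, z-score outlier screen on residuals,
# then a refit and R^2 on the cleaned data (all data synthetic).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 200)
y = 3.0 * x + 5 + rng.normal(0, 2, 200)
y[:5] += 40                                    # inject a few gross outliers

first = LinearRegression().fit(x.reshape(-1, 1), y)
resid = y - first.predict(x.reshape(-1, 1))
keep = np.abs((resid - resid.mean()) / resid.std()) < 3   # 3-sigma screen

model = LinearRegression().fit(x[keep].reshape(-1, 1), y[keep])
r2 = model.score(x[keep].reshape(-1, 1), y[keep])
print(f"kept {keep.sum()} of {len(y)} points, R^2 = {r2:.2f}")
```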
4. Fernandez, Noemi. "Statistical information processing for data classification." FIU Digital Commons, 1996. http://digitalcommons.fiu.edu/etd/3297.

Abstract:
This thesis introduces new algorithms for analysis and classification of multivariate data. Statistical approaches are devised for the objectives of data clustering, data classification and object recognition. An initial investigation begins with the application of fundamental pattern recognition principles. Where such fundamental principles meet their limitations, statistical and neural algorithms are integrated to augment the overall approach for an enhanced solution. This thesis provides a new dimension to the problem of classification of data as a result of the following developments: (1) application of algorithms for object classification and recognition; (2) integration of a neural network algorithm which determines the decision functions associated with the task of classification; (3) determination and use of the eigensystem using newly developed methods with the objectives of achieving optimized data clustering and data classification, and dynamic monitoring of time-varying data; and (4) use of the principal component transform to exploit the eigensystem in order to perform the important tasks of orientation-independent object recognition, and dimensionality reduction of the data such as to optimize the processing time without compromising accuracy in the analysis of this data.
5. Macias, Filiberto. "Real Time Telemetry Data Processing and Data Display." International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/611405.

Abstract:
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California
The Telemetry Data Center (TDC) at White Sands Missile Range (WSMR) is now beginning to modernize its existing telemetry data processing system. Modern networking and interactive graphical displays are now being introduced. This infusion of modern technology will allow the TDC to provide our customers with enhanced data processing and display capability. The intent of this project is to outline this undertaking.
6. White, Allan P., and Richard K. Dean. "Real-Time Test Data Processing System." International Foundation for Telemetering, 1989. http://hdl.handle.net/10150/614650.

Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1989 / Town & Country Hotel & Convention Center, San Diego, California
The U.S. Army Aviation Development Test Activity at Fort Rucker, Alabama needed a real-time test data collection and processing capability for helicopter flight testing. The system had to be capable of collecting and processing both FM and PCM data streams from analog tape and/or a telemetry receiver. The hardware and software were to be off the shelf whenever possible. The integration was to result in a stand-alone telemetry collection and processing system.
7. Clapp, T. C. "Statistical methods for the processing of communications data." Thesis, University of Cambridge, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.597697.

Abstract:
This thesis describes the use of methods derived from Bayesian statistics on the problem of blind equalisation of communications channels, although much of this work is applicable to the more general problem of blind deconvolution. In order to allow general models to be incorporated, numerical methods are used; the focus is on Markov chain Monte Carlo (MCMC) methods for processing of blocks of data and the use of particle filters for sequential processing. In order to obtain the best performance using MCMC, the choice of the Markov chain needs tailoring to the application in hand. The use of joint sampling of all the states (transmitted data sequence) and reversible jump moves to combat delay ambiguity are proposed. The use of particle filters is still in its infancy, and much of the focus is on the development of strategies to improve its applicability to real problems. It is well known that fixed-lag methods may be used to great effect on Markovian models where later observations can provide information about states in the recent past. Methods of performing fixed-lag simulation for incorporation into particle filters are described. The use of data windowing on fixed parameter systems allows regeneration of the parameters at each time-step without having excessive demands on storage requirements. In certain cases it is difficult to perform the updating when a new data point is received in a single step. The novel concept of introducing intermediate densities in a manner akin to simulated annealing between time steps is described. This improves robustness and provides a natural method for initialisation. All of these techniques are demonstrated in simulations based upon standard models of communications systems, along with favourable comparisons to more conventional techniques.
8. Dowling, Jason, John Welling, Loral Aerosys, Kathy Nanzetta, Toby Bennett, and Jeff Shi. "Accelerating Real-Time Space Data Packet Processing." International Foundation for Telemetering, 1995. http://hdl.handle.net/10150/608429.

Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada
NASA’s use of high bandwidth packetized Consultative Committee for Space Data Systems (CCSDS) telemetry in future missions presents a great challenge to ground data system developers. These missions, including the Earth Observing System (EOS), call for high data rate interfaces and small packet sizes. Because each packet requires a similar amount of protocol processing, high data rates and small packet sizes dramatically increase the real-time workload on ground packet processing systems. NASA’s Goddard Space Flight Center has been developing packet processing subsystems for more than twelve years. Implementations of these subsystems have ranged from mini-computers to single-card VLSI multiprocessor subsystems. The latter subsystem, known as the VLSI Packet Processor, was first deployed in 1991 for use in support of the Solar Anomalous & Magnetospheric Particle Explorer (SAMPEX) mission. An upgraded version of this VMEBus card, first deployed for Space Station flight hardware verification, has demonstrated sustained throughput of up to 50 Megabits per second and 15,000 packets per second. Future space missions including EOS will require significantly higher data and packet rate performance. A new approach to packet processing is under development that will not only increase performance levels by at least a factor of six but also reduce subsystem replication costs by a factor of five. This paper will discuss the development of a next generation packet processing subsystem and the architectural changes necessary to achieve a thirty-fold improvement in the performance/price of real-time packet processing.
9. Khondoker, Md Mizanur Rahman. "Statistical methods for pre-processing microarray gene expression data." Thesis, University of Edinburgh, 2006. http://hdl.handle.net/1842/12367.

Abstract:
A novel method is developed for combining multiple laser scans of microarrays to correct for “signal saturation” and “signal deterioration” effects in the gene expression measurement. A multivariate nonlinear functional regression model with Cauchy distributed errors having additive plus multiplicative scale is proposed as a model for combining multiple scan data. The model has been found to flexibly describe the nonlinear relationship in multiple scan data. The heavy tailed Cauchy distribution with additive plus multiplicative scale provides a basis for objective and robust estimation of gene expression from multiple scan data adjusting for censoring and deterioration bias in the observed intensity. Through combining multiple scans, the model reduces sampling variability in the gene expression estimates. A unified approach for nonparametric location and scale normalisation of log-ratio data is considered. A Generalised Additive Model for Location, Scale and Shape (GAMLSS) is proposed. GAMLSS uses a nonparametric approach for modelling both location and scale of log-ratio data, in contrast to the general tendency of using a parametric transformation, such as arcsinh, for variance stabilisation. Simulation studies demonstrate GAMLSS to be more powerful than the parametric method when a GAMLSS location and scale model, fitted to real data, is assumed correct. GAMLSS has been found to be as powerful as the parametric approach even when the parametric model is appropriate. Finally, we investigate the optimality of different estimation methods for analysing functional regression models. Alternative estimators are available in the literature to deal with the problems of identifiability and consistency. We investigated these estimators in terms of unbiasedness and efficiency for a specific case involving multiple laser scans of microarrays, and found that, in addition to being consistent, named methods are highly efficient and unbiased.
10. Wang, Yun, and Wenxuan Jiang. "Statistical Processing of IEEE 802.15.4 Data Collected in Industrial Environment." Thesis, Mittuniversitetet, Institutionen för informationsteknologi och medier, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-19619.

Abstract:
A wireless sensor network, which consists of autonomous sensors, is used for monitoring physical or environmental conditions such as temperature, sound, and pressure. The dispersed sensors or nodes each pass their data through the network to a main location. Currently, several standards for wireless sensor networks are ratified or in development, such as WirelessHART, ISA100.11a, WIA-PA, and IEEE 802.15.4. Among these, ZigBee is often used in industrial applications that require short-range, low-rate wireless transfer. In this research, all data were collected in an industrial environment using an IEEE 802.15.4-compliant physical layer; some packets were affected only by multi-path fading, while others also suffered Wi-Fi interference. The goal of the thesis is to find the dependence between the received power (RSS), correlation value (CORR), and bit error rate (BER) of the received message, and their distributions in cases where the packet is lost or not. In addition, the behavior of the bit error rate, such as its distribution and the characteristics of burst error lengths with and without Wi-Fi interference, is examined. All of this is based on precise statistical processing.
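The conditional statistics this thesis computes, e.g. the distribution of received power for lost versus delivered packets, can be sketched as follows. The dBm values below are invented placeholders, not the measured data:

```python
# Sketch only: per-outcome summary statistics of received signal strength,
# using synthetic RSS samples for delivered and lost packets.
import numpy as np

rng = np.random.default_rng(3)
rss_ok = rng.normal(-70, 4, 5000)      # RSS of delivered packets (dBm, assumed)
rss_lost = rng.normal(-85, 6, 500)     # RSS of lost packets (dBm, assumed)

for name, rss in [("delivered", rss_ok), ("lost", rss_lost)]:
    print(f"{name}: mean={rss.mean():.1f} dBm, std={rss.std():.1f} dBm")
```

Comparing such conditional distributions is what reveals how strongly packet loss depends on RSS (and, analogously, on CORR or BER).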

Books on the topic "Statistical processing of real data"

1. Berthold, M. Guide to Intelligent Data Analysis: How to Intelligently Make Sense of Real Data. London: Springer, 2010.

2. Hailperin, Max. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads. Stanford, Calif.: Dept. of Computer Science, Stanford University, 1993.

3. Data Driven Statistical Methods. London: Chapman & Hall, 1998.

4. Kiselev, I︠U︡. V. (I︠U︡riĭ Vasilʹevich), ed. Statistical Methods of Geophysical Data Processing. Singapore: World Scientific, 2010.

5. Multivariate Statistical Simulation. New York: Wiley, 1987.

6. McCuen, Richard H. Microcomputer Applications in Statistical Hydrology. Englewood Cliffs, N.J.: Prentice Hall, 1993.

7. Rizzardi, Fran, ed. BLSS, the Berkeley Interactive Statistical System. New York: Norton, 1988.

8. Hale, Robert L. MYSTAT: Statistical Applications. Cambridge, MA: Course Technology, 1992.

9

1947-, Carson John Hargadine, Hrsg. Multiple processor systems for real-time applications. Englewood Cliffs, N.J: Prentice-Hall, 1985.

10. Froeschl, Karl. Metadata Management in Statistical Information Processing: A Unified Framework for Metadata-Based Processing of Statistical Data Aggregates. Wien: Springer, 1997.


Book chapters on the topic "Statistical processing of real data"

1. Kundu, Debasis, and Swagata Nandi. "Real Data Example." In Statistical Signal Processing, 91–99. India: Springer India, 2012. http://dx.doi.org/10.1007/978-81-322-0628-6_6.

2. Nandi, Swagata, and Debasis Kundu. "Real Data Example Using Sinusoidal-Like Models." In Statistical Signal Processing, 143–61. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-6280-8_7.

3. Tavazzi, Erica, Camille L. Gerard, Olivier Michielin, Alexandre Wicky, Roberto Gatta, and Michel A. Cuendet. "A Process Mining Approach to Statistical Analysis: Application to a Real-World Advanced Melanoma Dataset." In Lecture Notes in Business Information Processing, 291–304. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72693-5_22.

Abstract:
Thanks to its ability to offer a time-oriented perspective on the clinical events that define the patient’s path of care, Process Mining (PM) is assuming an emerging role in clinical data analytics. PM’s ability to exploit time-series data and to build processes without any a priori knowledge suggests interesting synergies with the most common statistical analyses in healthcare, in particular survival analysis. In this work we demonstrate contributions of our process-oriented approach in analyzing a real-world retrospective dataset of patients treated for advanced melanoma at the Lausanne University Hospital. Addressing the clinical questions raised by our oncologists, we integrated PM in almost all the steps of a common statistical analysis. We show: (1) how PM can be leveraged to improve the quality of the data (data cleaning/pre-processing), (2) how PM can provide efficient data visualizations that support and/or suggest clinical hypotheses, also allowing to check the consistency between real and expected processes (descriptive statistics), and (3) how PM can assist in querying or re-expressing the data in terms of pre-defined reference workflows for testing survival differences among sub-cohorts (statistical inference). We exploit a rich set of PM tools for querying the event logs, inspecting the processes using statistical hypothesis testing, and performing conformance checking analyses to identify patterns in patient clinical paths and study the effects of different treatment sequences in our cohort.
4. Nigmatullin, Raoul R., Paolo Lino, and Guido Maione. "The Statistics of Fractional Moments and Its Application for Quantitative Reading of Real Data." In New Digital Signal Processing Methods, 87–139. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-45359-6_3.

5. Bingham, John. "On-Line and Real Time Systems." In Data Processing, 239–44. London: Macmillan Education UK, 1989. http://dx.doi.org/10.1007/978-1-349-19938-9_18.

6. Weik, Martin H. "real-time data processing." In Computer Science and Communications Dictionary, 1423. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/1-4020-0613-6_15596.

7. Fournier, Fabiana, and Inna Skarbovsky. "Real-Time Data Processing." In Big Data in Bioeconomy, 147–56. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-71069-9_11.

Abstract:
To remain competitive, organizations are increasingly taking advantage of the high volumes of data produced in real time for actionable insights and operational decision-making. In this chapter, we present basic concepts in real-time analytics, their importance in today’s organizations, and their applicability to the bioeconomy domains investigated in the DataBio project. We begin by introducing key terminology for event processing, and motivation for the growing use of event processing systems, followed by a market analysis synopsis. Thereafter, we provide a high-level overview of event processing system architectures, with its main characteristics and components, followed by a survey of some of the most prominent commercial and open source tools. We then describe how we applied this technology in two of the DataBio project domains: agriculture and fishery. The devised generic pipeline for IoT data real-time processing and decision-making was successfully applied to three pilots in the project from the agriculture and fishery domains. This event processing pipeline can be generalized to any use case in which data is collected from IoT sensors and analyzed in real-time to provide real-time alerts for operational decision-making.
8. Mühlbauer, Johann A. "Processing Outliers in Statistical Data." In ACS Symposium Series, 37–47. Washington, D.C.: American Chemical Society, 1985. http://dx.doi.org/10.1021/bk-1985-0284.ch004.

9. Ratkó, István. "Statistical Data Processing by Microcomputer." In Medical Informatics Europe 85, 786. Berlin, Heidelberg: Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/978-3-642-93295-3_159.

10. Panicheva, Polina, and Tatiana Litvinova. "Authorship Attribution in Russian in Real-World Forensics Scenario." In Statistical Language and Speech Processing, 299–310. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-31372-2_25.


Conference papers on the topic "Statistical processing of real data"

1. Sarnaglia, A. J. Q., V. A. Reisen, P. Bondou, and C. Levy-Leduc. "A robust estimation approach for fitting a PARMA model to real data." In 2016 IEEE Statistical Signal Processing Workshop (SSP). IEEE, 2016. http://dx.doi.org/10.1109/ssp.2016.7551740.

2. Babrauckas, Theresa L., and Dale J. Arpasi. "Integrated CFD and Experiments Real-Time Data Acquisition Development." In ASME 1993 International Gas Turbine and Aeroengine Congress and Exposition. American Society of Mechanical Engineers, 1993. http://dx.doi.org/10.1115/93-gt-097.

Full text of the source
Annotation:
The Computational Technologies Branch of NASA Lewis Research Center is developing an Integrated CFD and Experiments (ICE) system as part of the Multistage Compressor Flow Physics program. The ICE operating environment software is general and can be configured for different applications where CFD and experimental information must be managed. ICE currently consists of three powerful subsystems. The experiment support subsystem acquires data and uses parallel processing to generate statistical information and on-line graphical displays for flow physics experiments. The simulation subsystem allows the researcher to conveniently setup and run computational fluid dynamic (CFD) codes on different parallel computer architectures. The analysis subsystem provides tools to display and compare experimental data and CFD data. A consistent data management approach is used across all three subsystems. This paper provides an overview of the ICE design philosophy and discusses its significance in reducing experimental turnaround time and increasing the quality of turbomachinery research. The functional capabilities of the experiment support subsystem are then discussed in detail.
APA, Harvard, Vancouver, ISO and other citation styles
3

Dalton, L. A., and E. R. Dougherty. "Bayesian MMSE estimation of classification error and performance on real genomic data". In 2010 IEEE International Workshop on Genomic Signal Processing and Statistics (GENSIPS). IEEE, 2010. http://dx.doi.org/10.1109/gensips.2010.5719674.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
4

Hartman, Daniel A. "Real-Time Detection of Processing Flaws During Inertia Friction Welding of Critical Components". In ASME Turbo Expo 2012: Turbine Technical Conference and Exposition. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/gt2012-68014.

Full text of the source
Annotation:
Outside of statistical process control, current manufacturing quality control is founded upon a “make-then-inspect” mindset [1]. While this approach is an important part of the quality control process, post-process inspection is labor intensive, a bottleneck to continuous production throughput, costly, subject to human interpretation, and susceptible to missing subtle defects. This paper presents the application of in-process quality control (IPQC) during inertia friction welding of critical components. This paper is a follow-on to a preliminary investigation into a new sensing technique for real-time inspection of product quality during friction welding [2]. The previous effort explored the feasibility of modeling the approach that an experienced friction welding operator uses to distinguish anomalous process behavior during friction welding. In particular, a non-contact, audio-based sensor was used to capture the audible process dynamics during inertia friction welding of a dual-alloy component. The previous work employed a neural-network-based data mining technique to locate and identify features within the audio data that can be used to discriminate acceptable from unacceptable process behavior. This paper extends the previous work by providing a formal methodology for automatic, real-time, nondestructive, inspection of rotary friction welding.
APA, Harvard, Vancouver, ISO and other citation styles
5

Atluru, Sri, and Amit Deshpande. "Statistical Process Monitoring With MTConnect". In ASME 2012 International Manufacturing Science and Engineering Conference collocated with the 40th North American Manufacturing Research Conference and in participation with the International Conference on Tribology Materials and Processing. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/msec2012-7344.

Full text of the source
Annotation:
Statistical Process Control (SPC) techniques are used widely in the manufacturing industry. However, it is sometimes observed that a deviation that is within the acceptable range of inherent process variation does not necessarily conform to specifications. This is especially true in the case of low-volume, high-precision manufacturing that is customary in the aerospace and defense industries. In order to study the limitations posed by conventional SPC techniques in such manufacturing environments, a study was undertaken at TechSolve Inc., Cincinnati, to develop a standalone SPC tool. The SPC tool so developed effectively communicates with an on-machine probe and analyzes the collected data to carry out a statistical analysis. MTConnect, a new-generation machine tool communications protocol, was used in developing the communication interfaces with the on-machine probe on a Computer Numerical Control (CNC) machine. The XML (eXtensible Markup Language) code used to extend the MTConnect schema to include the data obtained from the probing routines is also presented. The statistical analysis was developed as a Graphical User Interface (GUI) in LabVIEW. The statistical analysis was carried out as a case study by producing a widget. Real machining was carried out to produce 48 of these widgets using a combination of end mills and face mills. The data obtained during the subsequent quality testing was used to carry out the statistical analysis. The limitations of conventional SPC techniques during the developmental and analytical phases of the study are discussed. The presence of a chip during an on-machine probing routine, the variations due to disparities in tool macro geometry, and the demand for conformance to requirements are studied from a statistical process monitoring standpoint. Various alternatives are also discussed that aim to correct and improve the quality of machined parts in these scenarios.
APA, Harvard, Vancouver, ISO and other citation styles
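For context on the conventional SPC techniques the abstract above refers to, a minimal X-bar control chart can be sketched as below; the function names and data are illustrative, not taken from the paper.

```python
import statistics

def xbar_limits(subgroup_means, sigma_within, n):
    """3-sigma control limits for an X-bar chart.

    subgroup_means: list of subgroup averages
    sigma_within:   estimated within-subgroup standard deviation
    n:              subgroup size
    """
    center = statistics.mean(subgroup_means)
    margin = 3 * sigma_within / n ** 0.5
    return center - margin, center, center + margin

def out_of_control(subgroup_means, lcl, ucl):
    """Indices of subgroups falling outside the control limits."""
    return [i for i, m in enumerate(subgroup_means) if m < lcl or m > ucl]
```

The paper's point is precisely that a subgroup mean can sit inside these limits (statistically "in control") and still violate the part specification, which is why on-machine probing data was brought into the analysis.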
6

Fair, Douglas C., and Jim Redifer. "Integration of Statistical Process Control Data Into Control and Information Systems for Production Optimization and Regulatory Compliance". In ASME 2004 Citrus Engineering Conference. American Society of Mechanical Engineers, 2004. http://dx.doi.org/10.1115/cec2004-5004.

Full text of the source
Annotation:
Data and information from the plant floor are proliferating as more sophisticated machinery, instrumentation, and computer-based devices are introduced into the manufacturing process. The use of Statistical Process Control to utilize this data goes beyond just measuring and analyzing product-specific information. Food processing and manufacturing operations need to integrate real-time data and SPC data analysis into their operation in order to be world-class manufacturers. To facilitate these goals, an organization must have a vision for an automation and information infrastructure as well as a data strategy. This discussion will touch on a number of key points in formulating that vision, as well as some specific functionalities using data and SPC that one would want to achieve. Paper published with permission.
APA, Harvard, Vancouver, ISO and other citation styles
7

Banerjee, Sauvik, Fabrizio Ricci and Ajit Mal. "A Vibration and Wave Propagation Based Methodology for Near Real-Time Damage Monitoring of Composite Structures". In ASME 2007 International Mechanical Engineering Congress and Exposition. ASMEDC, 2007. http://dx.doi.org/10.1115/imece2007-42095.

Full text of the source
Annotation:
This paper presents a novel approach to achieve near real-time damage monitoring whereby ultrasonic wave propagation and vibration data are analyzed to determine the location and degree of damage, requiring minimal operator intervention. An improved test setup, consisting of high fidelity sensor arrays, laser scanning vibrometer, data acquisition boards, signal conditioning and dedicated software has been implemented. The collected data are analyzed using a statistical damage index approach to determine the degree of damage to the structure as a function of time with a high confidence level. The statistical damage index approach is designed to overcome the complexity and variability of the signals in the presence of damage as well as the geometric complexity of the structure. It relies on the fact that the dynamical properties of a structure change with the initiation of new damage or growth of existing damage. Using measurements performed on an undamaged or partially damaged structure as baseline, the damage index is evaluated by comparing the changes in the frequency response of the monitored structure as a new damage occurs or an existing damage grows. The proposed algorithm does not require extensive rigorous signal processing, but it computes a single statistical damage parameter (statistic t) with a high confidence level (> 95%), which makes it very fast and automatic. The damage parameter vanishes if there is no change in the structure and its value increases with the severity and proximity of damage to the sensor locations. Thus if damage is initiated at a location within or near the sensor array, then its location and severity can be determined in real-time by the autonomous scheme. The method is applied to identify low velocity impact damage in stiffened composite panels for different arrangements of the source and the receivers.
APA, Harvard, Vancouver, ISO and other citation styles
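A common way to build a damage index of the kind described in the abstract above is a two-sample t statistic comparing baseline and current frequency-response magnitudes. The sketch below is one plausible reading under that assumption, not the authors' exact algorithm.

```python
import math

def damage_index(baseline, current):
    """Welch two-sample t statistic between baseline and current
    frequency-response magnitudes; |t| grows as the monitored
    structure's dynamics drift away from the baseline."""
    def mean_var(xs):
        m = sum(xs) / len(xs)
        v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
        return m, v
    m_b, v_b = mean_var(baseline)
    m_c, v_c = mean_var(current)
    return (m_c - m_b) / math.sqrt(v_b / len(baseline) + v_c / len(current))
```

Consistent with the behavior the abstract describes, this index vanishes when the structure is unchanged and grows with the size of the change in the measured response.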
8

Li, Hongfei, and Hendrik F. Hamann. "A Statistical Approach to Thermal Zone Mapping". In ASME 2011 Pacific Rim Technical Conference and Exhibition on Packaging and Integration of Electronic and Photonic Systems. ASMEDC, 2011. http://dx.doi.org/10.1115/ipack2011-52192.

Full text of the source
Annotation:
Although in most buildings the spatial allocation of cooling resources can be managed using multiple air handling units and an air ducting system, it can be challenging for an operator to leverage this capability, partially because of the complex interdependencies between the different control options. This is in particular important for data centers, where cooling is a major cost while the sufficient allocation of cooling resources has to ensure the reliable operation of mission-critical information processing equipment. It has been shown that thermal zones can provide valuable decision support for optimizing cooling. Such Thermal zones are generally defined as the region of influence of a particular cooling unit or cooling “source” (such as an air condition unit (ACU)). In this paper we show results using a statistical approach, where we leverage real-time sensor data to obtain thermal zones in realtime. Specifically, we model the correlations between temperatures observed from sensors located at the discharge of an ACU and the other sensors located in the room. Outputs from the statistical solution can be used to optimize the placement of equipment in a data center, investigate failure scenarios, and make sure that a proper cooling solution has been achieved.
APA, Harvard, Vancouver, ISO and other citation styles
9

Peeters, Cédric, Timothy Verstraeten, Ann Nowé, Pieter-Jan Daems and Jan Helsen. "Advanced Vibration Signal Processing Using Edge Computing to Monitor Wind Turbine Drivetrains". In ASME 2019 2nd International Offshore Wind Technical Conference. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/iowtc2019-7622.

Full text of the source
Annotation:
This paper illustrates an integrated monitoring approach for wind turbines exploiting the Industry 4.0 context. Our combined edge-cloud processing approach is documented. We show edge processing of vibration data captured on a wind turbine gearbox to extract diagnostic features. Focus is on statistical indicators. Real-life signals collected on an offshore turbine are used to illustrate the concept of local processing. The NVIDIA Jetson platform serves as the edge computation medium. Furthermore, we show an integrated failure detection and fault severity assessment at the cloud level. Health assessment and fault localization combine state-of-the-art vibration signal processing on high-frequency data (10 kHz and higher) with machine learning models to allow anomaly detection for each processing pipeline. Again, this is illustrated using data from an offshore wind farm. Additionally, the fact that data of similar wind turbines in the farm is collected allows for exploiting system similarity over the fleet.
APA, Harvard, Vancouver, ISO and other citation styles
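One of the standard statistical indicators typically extracted from gearbox vibration data at the edge is sample kurtosis; the sketch below is a generic illustration of such an indicator, not drawn from the paper's pipeline.

```python
def kurtosis(signal):
    """Sample kurtosis of a vibration signal: near 3 for Gaussian
    (healthy) signals, markedly higher when impulsive faults such
    as bearing or gear-tooth damage are present."""
    n = len(signal)
    m = sum(signal) / n
    m2 = sum((x - m) ** 2 for x in signal) / n  # second central moment
    m4 = sum((x - m) ** 4 for x in signal) / n  # fourth central moment
    return m4 / m2 ** 2
```

In an edge deployment, indicators like this one would be computed locally on each high-frequency record and only the scalar values shipped to the cloud for fleet-level anomaly detection.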
10

Uehara, Kenyu, and Takashi Saito. "Experimental Identification of Model Parameters and the Statistical Processing Using a Nonlinear Oscillator Applied to EEG Analysis". In ASME 2018 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/imece2018-88112.

Full text of the source
Annotation:
The nonlinear analysis may help to reveal the complex behavior of the Electroencephalogram (EEG) signal. In order to analyze the EEG in real time, we have proposed an EEG analysis model using a nonlinear oscillator with one degree of freedom and minimum required parameters. Our method identifies EEG model parameters experimentally. The purpose of this study is to examine the specific characteristic of model parameters. Validation of the method and investigation of characteristic of model parameters were conducted based on alpha frequency EEG data in both relax state and stress state. The results of the parameter identification with the time sliding window for 1 second show almost all of the identified parameters have a normal distribution spread around the average. The model outputs can closely match the complicated experimental EEG data. The results also showed that the existence of nonlinear term in the EEG analysis is crucial and the linearity parameter shows a certain tendency as the nonlinearity increases. Furthermore, the activities of EEG become linear on the mathematical model when suddenly change from the relax state to the stress state. The results indicate that our method may provide useful information in various field including the quantification of human mental or psychological state, diagnosis of brain disease such as epilepsy and design of brain machine interface.
APA, Harvard, Vancouver, ISO and other citation styles

Organization reports on the topic "Statistical processing of real data"

1

Owechko, Yuri, and Bernard Soffer. Real-Time Implementation of Nonlinear Optical Data Processing Functions. Fort Belvoir, VA: Defense Technical Information Center, November 1990. http://dx.doi.org/10.21236/ada233521.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
2

Beer, Randall D. Neural Networks for Real-Time Sensory Data Processing and Sensorimotor Control. Fort Belvoir, VA: Defense Technical Information Center, December 1992. http://dx.doi.org/10.21236/ada259120.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
3

Beer, Randall D. Neural Networks for Real-Time Sensory Data Processing and Sensorimotor Control. Fort Belvoir, VA: Defense Technical Information Center, June 1992. http://dx.doi.org/10.21236/ada251567.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
4

Ranjan, Niloo, Jibonananda Sanyal and Joshua Ryan New. In-Situ Statistical Analysis of Autotune Simulation Data using Graphical Processing Units. Office of Scientific and Technical Information (OSTI), August 2013. http://dx.doi.org/10.2172/1093099.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
5

Koopman, D. STATISTICAL EVALUATION OF PROCESSING DATA FROM THE RH RU HG MATRIX STUDY. Office of Scientific and Technical Information (OSTI), April 2009. http://dx.doi.org/10.2172/952445.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
6

Roth, Christopher J., Nelson A. Bonito, Maurice F. Tautz and Eugene C. Courtney. CHAWS Data Processing and Analysis Tools in Real-Time and Postflight Environments. Fort Belvoir, VA: Defense Technical Information Center, September 1998. http://dx.doi.org/10.21236/ada381118.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
7

Boyd, Thomas J., and Richard B. Coffin. Isotope Ratio Spectrometry Data Processing Software: Multivariate Statistical Methods for Hydrocarbon Source Identification and Comparison. Fort Belvoir, VA: Defense Technical Information Center, April 2004. http://dx.doi.org/10.21236/ada422798.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
8

Beedgen, R. Statistical near-real-time accountancy procedures applied to AGNS (Allied General Nuclear Services) minirun data using PROSA. Office of Scientific and Technical Information (OSTI), March 1988. http://dx.doi.org/10.2172/5534189.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
9

Davis, Benjamin. Applying Machine Learning to the Classification of DC-DC Converters: Real-world data collection processing & Validation. Office of Scientific and Technical Information (OSTI), September 2020. http://dx.doi.org/10.2172/1670255.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
10

Bates, C. Richards, Melanie Chocholek, Clive Fox, John Howe and Neil Jones. Scottish Inshore Fisheries Integrated Data System (SIFIDS): Work package (3) final report development of a novel, automated mechanism for the collection of scallop stock data. Edited by Mark James and Hannah Ladd-Jones. Marine Alliance for Science and Technology for Scotland (MASTS), 2019. http://dx.doi.org/10.15664/10023.23449.

Full text of the source
Annotation:
[Extract from Executive Summary] This project, aimed at the development of a novel, automated mechanism for the collection of scallop stock data, was a sub-part of the Scottish Inshore Fisheries Integrated Data Systems (SIFIDS) project. The project reviewed the state-of-the-art remote sensing (geophysical and camera-based) technologies available from industry and compared these to inexpensive, off-the-shelf equipment. Sea trials were conducted on scallop dredge sites and also hand-dived scallop sites. Data was analysed manually, and tests conducted with automated processing methods. It was concluded that geophysical acoustic technologies cannot presently detect individual scallops, but the remote sensing technologies can be used for broad-scale habitat mapping of scallop harvest areas. Further, the techniques allow for monitoring these areas in terms of scallop dredging impact. Camera (video and still) imagery is effective for scallop counts and provides data that compare favourably with diver-based ground-truth information for recording scallop density. Deployment of cameras is possible through inexpensive drop-down camera frames, which it is recommended be deployed on a wide-area basis for further trials. In addition, implementation of a ‘citizen science’ approach to wide-area recording is suggested to increase the stock assessment across the widest possible variety of seafloor types around Scotland. Armed with such data, a full statistical analysis could be completed and the data used with automated processing routines for future long-term monitoring of stock.
APA, Harvard, Vancouver, ISO and other citation styles