Scientific literature on the topic "Real data model"

Create a correct reference in APA, MLA, Chicago, Harvard, and several other citation styles

Select a source type:

Browse the topical lists of journal articles, books, theses, conference papers, and other academic sources on the topic "Real data model".

Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the selected source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication as a PDF and read its abstract online whenever this information is included in the metadata.

Journal articles on the topic "Real data model"

1

AbdElHamid, Amr, Peng Zong, and Bassant Abdelhamid. "Advanced UAV Hybrid Simulator Model Based-on Dynamic Real Weather Data." International Journal of Modeling and Optimization 5, no. 4 (2015): 246–56. http://dx.doi.org/10.7763/ijmo.2015.v5.470.

2

Mueller, Conrad S. M. "A New Computational Model for Real Gains in Big Data Processing Power." Advances in Cyber-Physical Systems 2, no. 1 (March 28, 2017): 11–21. http://dx.doi.org/10.23939/acps2017.01.011.

3

Lasič, S. "Geyser model with real-time data collection." European Journal of Physics 27, no. 4 (June 19, 2006): 995–1005. http://dx.doi.org/10.1088/0143-0807/27/4/031.

4

Charpentier, Philippe. "The LHCb Computing Model and Real Data." Journal of Physics: Conference Series 331, no. 7 (December 23, 2011): 072008. http://dx.doi.org/10.1088/1742-6596/331/7/072008.

5

Artuso, Paola, Rupert Gammon, Fabio Orecchini, and Simon J. Watson. "Alkaline electrolysers: Model and real data analysis." International Journal of Hydrogen Energy 36, no. 13 (July 2011): 7956–62. http://dx.doi.org/10.1016/j.ijhydene.2011.01.094.

6

Naik, Kevin, and Anton Ianakiev. "Heat demand prediction: A real-life data model vs simulated data model comparison." Energy Reports 7 (October 2021): 380–88. http://dx.doi.org/10.1016/j.egyr.2021.08.093.

7

Yoshikawa, Kogo, and Gen Nakamura. "Model Independent MRE Data Analysis." Computational and Mathematical Methods in Medicine 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/912920.

Abstract:
For the diagnosing modality called MRE (magnetic resonance elastography), the displacement vector of a wave propagating in a human tissue can be measured. The average of the local wavelength from this measured data could be an index for the diagnosing, because the local wave length becomes larger when the tissue is stiffer. By assuming that the local form of the wave is given approximately as multiple complex plane waves, we identify the real part of the complex linear phase of the strongest plane wave of this multiple complex plane waves, by first applying the FBI transform (Fourier-Bros-Iagolnitzer transform) with an appropriate size of Gaussian window and then taking the maximum of the modulus of the transform with respect to the Fourier variable. The real part of the linear phase is nothing but the real inner product of the wave vector and the position vector. Similarly the imaginary part of the linear phase describes the attenuation of the wave and it is given as a real inner product of a real vector and the position vector. This vector can also be recovered by our method. We also apply these methods to design some denoising and filtering for noisy MRE data.
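The core step this abstract describes (window the measured displacement with a Gaussian, Fourier-transform it, and read the dominant wave vector off the modulus maximum) can be sketched generically. Below is a minimal 1-D Python illustration of that windowed-Fourier idea on synthetic data; it is not the authors' FBI-transform implementation, and the window width, grid, and signal are assumptions chosen only for demonstration.

```python
import numpy as np

# Synthetic 1-D "displacement" signal: a damped plane wave plus noise.
n, dx = 512, 1e-3                      # samples, spatial step [m]
x = np.arange(n) * dx
true_wavelength = 0.02                 # 2 cm
u = np.exp(-5.0 * x) * np.cos(2 * np.pi * x / true_wavelength) + 0.05 * np.random.randn(n)

def local_wavelength(u, dx, center, sigma):
    """Estimate the local wavelength around `center` (in samples) by windowing
    with a Gaussian and locating the modulus maximum of the Fourier transform
    (a Gabor/windowed-Fourier estimate)."""
    idx = np.arange(len(u))
    window = np.exp(-0.5 * ((idx - center) / sigma) ** 2)
    spectrum = np.fft.rfft(u * window)
    freqs = np.fft.rfftfreq(len(u), d=dx)                # spatial frequencies [1/m]
    k_hat = freqs[np.argmax(np.abs(spectrum[1:])) + 1]   # skip the DC bin
    return 1.0 / k_hat

print(local_wavelength(u, dx, center=256, sigma=40))     # close to 0.02 expected
```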
8

Dutta, Hiren. "Graph Based Data Governance Model for Real Time Data Ingestion." International Journal of Information Technology and Computer Science 8, no. 10 (October 8, 2016): 56–62. http://dx.doi.org/10.5815/ijitcs.2016.10.07.

9

Srisaila, A., D. Rajani, M. V. D. N. S. Madhavi, G. Jaya Lakshmi, K. Amarendra, and Narasimha Rao Dasari. "An Improved Data Generalization Model for Real-Time Data Analysis." Scientific Programming 2022 (August 9, 2022): 1–9. http://dx.doi.org/10.1155/2022/4118371.

Abstract:
This research proposes a maximum likelihood-Weibull distribution (WD) model for the generalized data distribution family. The distribution function of the anticipated maximum likelihood-Weibull distribution is defined and its statistical properties are derived. The data distribution is capable of modelling monotonically decreasing, increasing, and constant hazard rates. The proposed maximum likelihood-Weibull distribution is used to estimate these parameters. Experimentation is carried out to evaluate the potential of the maximum likelihood-Weibull distribution estimator. Here, an openly available dataset is adopted for computing the performance of the anticipated maximum likelihood-Weibull distribution. The outcomes show that the anticipated model is well suited for computation and compares favourably with other distributions, as it attains the maximal or least values of several statistical criteria.
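As a generic illustration of the maximum-likelihood Weibull fitting that this abstract builds on (not the authors' generalized distribution itself), a two-parameter Weibull can be fitted with SciPy and its hazard-rate behaviour inspected. The simulated lifetimes below are an assumption used only to keep the sketch self-contained.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.weibull(1.7, size=500) * 3.0          # simulated lifetimes, shape 1.7, scale 3.0

# Maximum-likelihood fit of a two-parameter Weibull (location fixed at 0).
shape, loc, scale = stats.weibull_min.fit(data, floc=0)
print(f"shape={shape:.3f}, scale={scale:.3f}")

# Hazard rate h(t) = f(t) / (1 - F(t)): increasing when shape > 1,
# decreasing when shape < 1, constant (exponential case) when shape == 1.
t = np.linspace(0.1, 10, 50)
hazard = stats.weibull_min.pdf(t, shape, loc, scale) / stats.weibull_min.sf(t, shape, loc, scale)
```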
10

Dutta, Hiren. "Graph based data governance model for real time data ingestion." CSI Transactions on ICT 3, no. 2–4 (December 2015): 119–25. http://dx.doi.org/10.1007/s40012-016-0079-y.


Theses on the topic "Real data model"

1

Frafjord, Christine. "Friction Factor Model and Interpretation of Real Time Data." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for petroleumsteknologi og anvendt geofysikk, 2013. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-21772.

Abstract:
Interest in torque and drag issues has increased over recent years as the complexity of the wells being drilled has grown. The petroleum industry continually expands the extended-reach drilling envelope, which forces the industry to find new methods and tools to better interpret friction because of the central role it plays in achieving successful wells. The importance of modelling friction is further elaborated in chapter 2. The present report contains an investigation of the theory behind Aadnoy's friction model in terms of possibilities and limitations. The theory has been used as the foundation for an Excel calculation tool to model friction. To study the model, three field cases have been run through the friction model. The Excel program applied in this work can be provided by the author of the present report to readers on request. Even though the limitations listed in chapter 6 do exist, the model has been shown to be applicable for getting an indication of the downhole situation. By comparing the results from two different well sections in the same well, it was possible to evaluate high borehole friction caused by hole cleaning issues. However, one important issue detected during the present work is that the modelled friction factor is highly sensitive to changes in hook load measurements in shallow sections with small inclination and little drillpipe downhole. This issue demands awareness during application of the model because it is a source of misleading results. For future work, the model should be implemented in a more powerful tool than Excel, with added features to reduce the constraints in the model mentioned in chapter 6. As a continuation of the present work, the analytical model should also be further investigated using real wells with more quantitative and qualitative measured data.
2

Granholm, George Richard, 1976. "Near-real time atmospheric density model correction using space catalog data." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/44899.

Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2000.
Includes bibliographical references (p. 179-184).
Several theories have been presented in regard to creating a neutral density model that is corrected or calibrated in near-real time using data from space catalogs. These theories are usually limited to a small number of frequently tracked "calibration satellites" about which information such as mass and cross-sectional area is known very accurately. This work, however, attempts to validate a methodology by which drag information from all available low-altitude space objects is used to update any given density model on a comprehensive basis. The basic update and prediction algorithms and a technique to estimate true ballistic factors are derived in detail. A full simulation capability is independently verified. The process is initially demonstrated using simulated range, azimuth, and elevation observations so that issues such as required number and types of calibration satellites, density of observations, and susceptibility to atmospheric conditions can be examined. Methods of forecasting the density correction models are also validated under different atmospheric conditions.
by George Richard Granholm.
S.M.
3

Bloodsworth, Peter Charles. "A generic model for real-time scheduling based on dynamic heterogeneous data." Thesis, Oxford Brookes University, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.432716.

4

Larsson, Daniel. "ARAVQ for discretization of radar data: An experimental study on real world sensor data." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-11114.

Abstract:
The aim of this work was to investigate if interesting patterns could be found in time series radar data that had been discretized by the algorithm ARAVQ into symbolic representations and if the ARAVQ thus might be suitable for use in the radar domain. An experimental study was performed where the ARAVQ was used to create symbolic representations of data sets with radar data. Two experiments were carried out that used a Markov model to calculate probabilities used for discovering potentially interesting patterns. Some of the most interesting patterns were then investigated further. Results have shown that the ARAVQ was able to create accurate representations for several time series and that it was possible to discover patterns that were interesting and represented higher level concepts. However, the results also showed that the ARAVQ was not able to create accurate representations for some of the time series.
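The pattern-scoring step sketched in this abstract generalizes well: once a discretizer such as the ARAVQ has turned a time series into a symbol string, a first-order Markov model can assign probabilities to transitions, and low-probability windows can be flagged as potentially interesting. The short symbol sequence and window length in the Python sketch below are illustrative assumptions, not data from the thesis.

```python
from collections import Counter

symbols = list("AAABABBCCCBACCCAAAB")      # e.g. discretizer output for one time series

# Estimate first-order Markov transition probabilities from the sequence.
pair_counts = Counter(zip(symbols, symbols[1:]))
state_counts = Counter(symbols[:-1])
trans = {(a, b): c / state_counts[a] for (a, b), c in pair_counts.items()}

def window_probability(seq):
    """Probability of a symbol window under the fitted Markov chain."""
    p = 1.0
    for a, b in zip(seq, seq[1:]):
        p *= trans.get((a, b), 1e-9)       # unseen transitions get a small floor
    return p

# Flag the least likely length-4 windows as candidate "interesting" patterns.
windows = [symbols[i:i + 4] for i in range(len(symbols) - 3)]
print(sorted(windows, key=window_probability)[:3])
```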
5

Robidoux, Jeff. "Real-Time Spatial Monitoring of Vehicle Vibration Data as a Model for TeleGeoMonitoring Systems." Thesis, Virginia Tech, 2005. http://hdl.handle.net/10919/32426.

Abstract:
This research presents the development and proof of concept of a TeleGeoMonitoring (TGM) system for spatially monitoring and analyzing, in real-time, data derived from vehicle-mounted sensors. In response to the concern for vibration related injuries experienced by equipment operators in surface mining and construction operations, the prototype TGM system focuses on spatially monitoring vehicle vibration in real-time. The TGM vibration system consists of 3 components: (1) Data Acquisition Component, (2) Data Transfer Component, and (3) Data Analysis Component. A GPS receiver, laptop PC, data acquisition hardware, triaxial accelerometer, and client software make up the Data Acquisition Component. The Data Transfer Component consists of a wireless data network and a data server. The Data Analysis Component provides tools to the end user for spatially monitoring and analyzing vehicle vibration data in real-time via the web or GIS workstations. Functionality of the prototype TGM system was successfully demonstrated in both lab and field tests. The TGM vibration system presented in this research demonstrates the potential for TGM systems as a tool for research and management projects, which aim to spatially monitor and analyze data derived from mobile sensors in real-time.
Master of Science
6

Hu, Xiaoxiang. "Analysis of Time-related Properties in Real-time Data Aggregation Design." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-39046.

Abstract:
Data aggregation is extensively used in data management systems nowadays. Based on a data aggregation taxonomy named DAGGTAX, we propose an analytic process to evaluate the run-time platform and time-related parameters of Data Aggregation Processes (DAP) in a real-time system design, which can help designers to eliminate infeasible design decisions at early stage. The process for data aggregation design and analysis mainly includes the following outlined steps. Firstly, the user needs to specify the variation of the platform and DAP by figuring out the features of the system and time-related parameters respectively. Then, the user can choose one combination of the variations between the features of the platform and DAP, which forms the initial design of the system. Finally, apply the analytic method for feasibility analysis by schedulability analysis techniques. If there are no infeasibilities found in the process, then the design can be finished. Otherwise, the design must be altered from the run-time platform and DAP design stage, and the schedulability analysis will be applied again for the revised design until all the infeasibilities are fixed. In order to help designers to understand and describe the system and do feasibility analysis, we propose a new UML (Unified Modeling Language) profile that extends UML with concepts related to real-time data aggregation design. These extensions aim to accomplish the conceptual modeling of a real-time data aggregation. In addition, the transferring method based on UML profile to transfer the data aggregation design into a task model is proposed as well. In the end of the thesis, a case study, which applies the analytic process to analyze the architecture design of an environmental monitoring sensor network, is presented as a demonstration of our research.
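The abstract refers to "schedulability analysis techniques" without naming one; a common choice for fixed-priority task sets is the classical response-time analysis, sketched below under that assumption. The task set (worst-case execution times, periods, priority by index, deadlines equal to periods) is hypothetical.

```python
import math

# Hypothetical periodic tasks, highest priority first: (C, T) with deadline = period.
tasks = [(1, 5), (2, 10), (4, 20)]   # worst-case execution time, period

def response_time(i, tasks):
    """Classical fixed-priority response-time iteration:
    R_i = C_i + sum over higher-priority j of ceil(R_i / T_j) * C_j,
    iterated to a fixed point."""
    C_i, T_i = tasks[i]
    R = C_i
    while True:
        R_new = C_i + sum(math.ceil(R / T_j) * C_j for C_j, T_j in tasks[:i])
        if R_new == R:
            return R
        if R_new > T_i:              # deadline (= period) missed: not schedulable
            return None
        R = R_new

for i, (C, T) in enumerate(tasks):
    R = response_time(i, tasks)
    print(f"task {i}: R = {R}, schedulable = {R is not None and R <= T}")
```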
7

Hwang, Yuan-Chun. "Local and personalised models for prediction, classification and knowledge discovery on real world data modelling problems." Thesis, 2009. http://hdl.handle.net/10292/776.

Abstract:
This thesis presents several novel methods to address some of the real world data modelling issues through the use of local and individualised modelling approaches. A set of real world data modelling issues such as modelling evolving processes, defining unique problem subspaces, identifying and dealing with noise, outliers, missing values, imbalanced data and irrelevant features, are reviewed and their impact on the models are analysed. The thesis has made nine major contributions to information science, includes four generic modelling methods, three real world application systems that apply these methods, a comprehensive review of the real world data modelling problems and a data analysis and modelling software. Four novel methods have been developed and published in the course of this study. They are: (1) DyNFIS – Dynamic Neuro-Fuzzy Inference System, (2) MUFIS – A Fuzzy Inference System That Uses Multiple Types of Fuzzy Rules, (3) Integrated Temporal and Spatial Multi-Model System, (4) Personalised Regression Model. DyNFIS addresses the issue of unique problem subspaces by identifying them through a clustering process, creating a fuzzy inference system based on the clusters and applies supervised learning to update the fuzzy rules, both antecedent and consequent part. This puts strong emphasis on the unique problem subspaces and allows easy to understand rules to be extracted from the model, which adds knowledge to the problem. MUFIS takes DyNFIS a step further by integrating a mixture of different types of fuzzy rules together in a single fuzzy inference system. In many real world problems, some problem subspaces were found to be more suitable for one type of fuzzy rule than others and, therefore, by integrating multiple types of fuzzy rules together, a better prediction can be made. The type of fuzzy rule assigned to each unique problem subspace also provides additional understanding of its characteristics. The Integrated Temporal and Spatial Multi-Model System is a different approach to integrating two contrasting views of the problem for better results. The temporal model uses recent data and the spatial model uses historical data to make the prediction. By combining the two through a dynamic contribution adjustment function, the system is able to provide stable yet accurate prediction on real world data modelling problems that have intermittently changing patterns. The personalised regression model is designed for classification problems. With the understanding that real world data modelling problems often involve noisy or irrelevant variables and the number of input vectors in each class may be highly imbalanced, these issues make the definition of unique problem subspaces less accurate. The proposed method uses a model selection system based on an incremental feature selection method to select the best set of features. A global model is then created based on this set of features and then optimised using training input vectors in the test input vector’s vicinity. This approach focus on the definition of the problem space and put emphasis the test input vector’s residing problem subspace. The novel generic prediction methods listed above have been applied to the following three real world data modelling problems: 1. Renal function evaluation which achieved higher accuracy than all other existing methods while allowing easy to understand rules to be extracted from the model for future studies. 2. 
Milk volume prediction system for Fonterra achieved a 20% improvement over the method currently used by Fonterra. 3. Prognoses system for pregnancy outcome prediction (SCOPE), achieved a more stable and slightly better accuracy than traditional statistical methods. These solutions constitute a contribution to the area of applied information science. In addition to the above contributions, a data analysis software package, NeuCom, was primarily developed by the author prior and during the PhD study to facilitate some of the standard experiments and analysis on various case studies. This is a full featured data analysis and modelling software that is freely available for non-commercial purposes (see Appendix A for more details). In summary, many real world problems consist of many smaller problems. It was found beneficial to acknowledge the existence of these sub-problems and address them through the use of local or personalised models. The rules extracted from the local models also brought about the availability of new knowledge for the researchers and allowed more in-depth study of the sub-problems to be carried out in future research.
8

Bergstrom, Sarah Elizabeth, 1979. "An algorithm for reducing atmospheric density model errors using satellite observation data in real-time." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/17537.

Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2002.
Vita.
Includes bibliographical references (p. 233-240).
Atmospheric density mismodeling is a large source of errors in satellite orbit determination and prediction in the 200-600 kilometer range. Algorithms for correcting or "calibrating" an existing atmospheric density model to improve accuracy have been seen as a major way to reduce these errors. This thesis examines one particular algorithm, which does not require launching special "calibration satellites" or new sensor platforms. It relies solely on the large quantity of observations of existing satellites, which are already being made for space catalog maintenance. By processing these satellite observations in near real-time, a linear correction factor can be determined and forecasted into the near future. As a side benefit, improved estimates of the ballistic coefficients of some satellites are also produced. Also, statistics concerning the accuracy of the underlying density model can also be extracted from the correction. This algorithm had previously been implemented and the implementation had been partially validated using simulated data. This thesis describes the completion of the validation process using simulated data and the beginning of the real data validation process. It is also intended to serve as a manual for using and modifying the implementation of the algorithm.
by Sarah Elizabeth Bergstrom.
S.M.
9

Byrnes, Denise Dianne. "Static scheduling of hard real-time control software using an asynchronous data-driven execution model." The Ohio State University, 1992. http://rave.ohiolink.edu/etdc/view?acc_num=osu14877799148243.

10

Hajjam, Sohrab. "Real-time flood forecasting model intercomparison and parameter updating rain gauge and weather radar data." Thesis, University of Salford, 1997. http://usir.salford.ac.uk/43019/.

Abstract:
This thesis describes the development of real-time flood forecasting models at selected catchments in the three countries, using rain gauge and radar derived rainfall estimates and time-series analysis. An extended inter-comparison of real-time flood forecasting models has been carried out and an attempt has been made to rank the flood forecasting models. It was found that an increase in model complexity does not necessarily lead to an increase in forecast accuracy. An extensive analysis of group calibrated transfer function (TF) models on the basis of antecedent conditions of the catchment and storm characteristics has revealed that the use of group model resulted in a significant improvement in the quality of the forecast. A simple model to calculate the average pulse response has also been developed. The development of a hybrid genetic algorithm (HGA), applied to a physically realisable transfer function model is described. The techniques of interview selection and fitness scaling as well as random bit mutation and multiple crossover have been included, and both binary and real number encoding technique have been assessed. The HGA has been successfully applied for the identification and simulation of the dynamic TF model. Four software packages have been developed and extensive development and testing has proved the viability of the approach. Extensive research has been conducted to find the most important adjustment factor of the dynamic TF model. The impact of volume, shape and time adjustment factors on forecast quality has been evaluated. It has been concluded that the volume adjustment factor is the most important factor of the three. Furthermore, several attempts have been made to relate the adjustment factors to different elements. The interaction of adjustment factors has also been investigated. An autoregressive model has been used to develop a new updating technique for the dynamic TF model by the updating of the B parameters through the prediction of future volume adjustment factors over the forecast lead-time. An autoregressive error prediction model has also been combined with a static TF model. Testing has shown that the performance of both new TF models is superior to conventional procedures.
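One reusable piece of this work is the final step: correcting a transfer-function forecast with an autoregressive prediction of its recent errors. The sketch below applies a simple AR(1) error correction to an arbitrary base forecast; the flow values, the base forecasts, and the AR order are assumptions for illustration, not the thesis's calibrated model.

```python
import numpy as np

# Observed flows and the base (e.g. transfer-function) one-step forecasts so far.
observed = np.array([12.0, 14.1, 15.8, 18.2, 21.0, 24.3])
base_forecast = np.array([11.5, 13.0, 15.0, 17.0, 20.0, 23.0])
errors = observed - base_forecast

# Fit an AR(1) model to the forecast errors by least squares: e_t = phi * e_{t-1}.
phi = np.dot(errors[1:], errors[:-1]) / np.dot(errors[:-1], errors[:-1])

# Correct the next base forecast with the predicted error.
next_base = 26.0                              # hypothetical next base forecast
corrected = next_base + phi * errors[-1]
print(f"phi={phi:.2f}, corrected forecast={corrected:.2f}")
```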

Books on the topic "Real data model"

1

Henzinger, T. A. An interleaving model for real time. Stanford, Calif.: Dept. of Computer Science, Stanford University, 1990.

2

Dailey, Daniel J. A cellular automata model for use with real freeway data. [Olympia, Wash.]: Washington State Dept. of Transportation, 2002.

3

Carroll, Jeremy J. A deterministic model of time for distributed systems. Palo Alto, CA: Hewlett-Packard Laboratories, Technical Publications Department, 1996.

4

Savola, Reijo. The model execution kernel of a real-time software prototyping environment. Espoo [Finland]: Technical Research Centre of Finland, 1992.

5

Verghese, Gilbert. Perspective alignment back-projection for real-time monocular three-dimensional model-based computer vision. Toronto: Dept. of Computer Science, University of Toronto, 1995.

6

Galí, Jordi. Technology shocks and aggregate fluctuations: How well does the real business cycle model fit postwar U.S. data? [Washington, D.C.]: International Monetary Fund, Western Hemisphere Dept., 2004.

7

Johnson, Steven A. A simple dynamic engine model for use in a real-time aircraft simulation with thrust vectoring. [Washington, D.C.]: National Aeronautics and Space Administration, Office of Management, Scientific and Technical Information Division, 1990.

8

Crawford, Jason A. Modal emissions modeling with real traffic data. College Station, Tex.: Texas Transportation Institute, Texas A&M University System, 1999.

9

United States. National Aeronautics and Space Administration, ed. Real-time sensor data validation. [Washington, DC]: National Aeronautics and Space Administration, 1994.

10

United States. National Aeronautics and Space Administration, ed. Real-time sensor data validation. [Washington, DC]: National Aeronautics and Space Administration, 1994.


Book chapters on the topic "Real data model"

1

Yoon, Sung-eui, Enrico Gobbetti, David Kasik, and Dinesh Manocha. "Cache-Coherent Data Management." In Real-Time Massive Model Rendering, 55–83. Cham: Springer International Publishing, 2008. http://dx.doi.org/10.1007/978-3-031-79531-2_5.

2

Kaddes, Mourad, Majed Abdouli, Laurent Amanton, Mouez Ali, Rafik Bouaziz, and Bruno Sadeg. "F-RT-ETM: Toward Analysis and Formalizing Real Time Transaction and Data in Real-Time Database." In Model and Data Engineering, 85–96. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-24443-8_11.

3

Ouhammou, Yassine, Emmanuel Grolleau, Michael Richard, and Pascal Richard. "Towards a Simple Meta-model for Complex Real-Time and Embedded Systems." In Model and Data Engineering, 226–36. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-24443-8_24.

4

Cuzzocrea, Alfredo, Giuseppe Psaila, and Maurizio Toccu. "Knowledge Discovery from Geo-Located Tweets for Supporting Advanced Big Data Analytics: A Real-Life Experience." In Model and Data Engineering, 285–94. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-23781-7_23.

5

Sen, Soumya, Agostino Cortesi, and Nabendu Chaki. "Applications of Hyper-lattice in Real Life." In Hyper-lattice Algebraic Model for Data Warehousing, 25–33. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-28044-8_2.

6

Landau, I. D. "On the Use of Real Data for Controller Reduction." In Model Identification and Adaptive Control, 75–93. London: Springer London, 2001. http://dx.doi.org/10.1007/978-1-4471-0711-8_4.

7

Cluckie, Ian D., Pao-shan Yu, and Kevin A. Tiford. "Real-Time Forecasting: Model Structure and Data Resolution." In Weather Radar Networking, 449–61. Dordrecht: Springer Netherlands, 1990. http://dx.doi.org/10.1007/978-94-009-0551-1_49.

8

Rastogi, Namrata, Parul Verma, and Pankaj Kumar. "Ontological Design of Information Retrieval Model for Real Estate Documents." In Microservices in Big Data Analytics, 73–85. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-15-0128-9_7.

9

Babau, Jean-Philippe, Philippe Dhaussy, and Pierre-Yves Pillain. "Model Integration for Formal Qualification of Timing-Aware Software Data Acquisition Components." In Model-Driven Engineering for Distributed Real-Time Systems, 167–200. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118558096.ch7.

10

Chen, Lucheng, Haiqin Xie, Wen Yang, and Lin Xiao. "Data Space Based on Mass Customization Model." In Designing Data Spaces, 437–50. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-93975-5_26.

Abstract:
The industrial Internet has brought the biggest economic opportunity since the mobile Internet. Haier started the exploration of intelligentization, networking, and informationalization as early as 2005 and gradually launched an industrial Internet platform COSMOPlat with independent intellectual property rights. In COSMOPlat, traditional mass manufacturing model is replaced by mass customization model (MCM), in which production is driven by users’ order rather than inventory. What the model achieved is not simply automation but also real intelligent manufacturing with high efficiency driven by high precision. With users’ participation in the whole process, manufacturing is precisely made according to the users’ dynamic needs, which will better meet actual requirements. The article provides an overview of the mass customization model of COSMOPlat and how it facilitates enterprises for digital transformation and upgrade. Also indicated are the successful use cases in application.

Conference papers on the topic "Real data model"

1

Cowdale, Alan. "Real world data collection for model validation." In 2009 Winter Simulation Conference (WSC 2009). IEEE, 2009. http://dx.doi.org/10.1109/wsc.2009.5429176.

2

Mecheva, Teodora A., and Nikolay R. Kakanakov. "Traffic flow model based on real data." In 2021 XXX International Scientific Conference Electronics (ET). IEEE, 2021. http://dx.doi.org/10.1109/et52713.2021.9579596.

3

Jiang, Jingyan, Ziyue Luo, Chenghao Hu, Zhaoliang He, Zhi Wang, Shutao Xia, and Chuan Wu. "Joint Model and Data Adaptation for Cloud Inference Serving." In 2021 IEEE Real-Time Systems Symposium (RTSS). IEEE, 2021. http://dx.doi.org/10.1109/rtss52674.2021.00034.

4

Campbell, David K., and David K. Towner. "A Magneto-optic Polarization Readout Model." In Optical Data Storage. Washington, D.C.: Optica Publishing Group, 1985. http://dx.doi.org/10.1364/ods.1985.tubb2.

Abstract:
Information stored on magneto-optic disks is typically read using a linearly polarized laser beam whose state of polarization is altered by the Faraday and/or polar Kerr effects upon reflection from the recording medium. Polarization sensitive optics are used to convert these media induced polarization changes into irradiance variations at photodetectors. Because the magneto-optic polarization effects are small (typically less than one degree of polarization rotation in the reflected beam) it is essential that the optical system introduce little additional polarization change if the recorded signal is to be recovered faithfully. This paper describes a polarization model that is used to predict the effects that real optical elements will have on the readout signals, noise, and ultimately, the signal to noise ratio of a magneto-optic recording system.
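The polarization bookkeeping such a readout model performs can be sketched with standard Jones calculus, used here as a stand-in for the authors' specific formulation: a linearly polarized beam picks up a small Kerr rotation on reflection, and a differential detector pair behind analyzers at plus and minus 45 degrees converts that rotation into a signal. The numerical values below are assumptions.

```python
import numpy as np

def rotation(theta):
    """Jones rotation matrix."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer at angle theta."""
    R = rotation(theta)
    return R @ np.array([[1.0, 0.0], [0.0, 0.0]]) @ R.T

E_in = np.array([1.0, 0.0])                      # incident beam, x-polarized
theta_k = np.deg2rad(0.5)                        # small Kerr rotation from the medium
E_refl = rotation(theta_k) @ E_in                # reflected beam (losses ignored)

# Differential detection behind analyzers at +/- 45 degrees.
I_plus  = np.linalg.norm(polarizer(np.deg2rad(45))  @ E_refl) ** 2
I_minus = np.linalg.norm(polarizer(np.deg2rad(-45)) @ E_refl) ** 2
signal = I_plus - I_minus                        # equals sin(2 * theta_k) for this ideal case
print(signal, np.sin(2 * theta_k))
```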
5

Gabrielian, Ana B., Tejas G. Puranik, Mayank V. Bendarkar, Michelle Kirby, Dimitri Mavris, and Dylan Monteiro. "Noise Model Validation using Real World Operations Data." In AIAA AVIATION 2021 FORUM. Reston, Virginia: American Institute of Aeronautics and Astronautics, 2021. http://dx.doi.org/10.2514/6.2021-2136.

6

Feller, Christian, Juergen Wuenschmann, Raimar Wagner, and Albrecht Rothermel. "Model-based video compression for real world data." In 2013 IEEE Third International Conference on Consumer Electronics – Berlin (ICCE-Berlin). IEEE, 2013. http://dx.doi.org/10.1109/icce-berlin.2013.6697976.

7

Capobianco, Giovanni, Umberto Di Giacomo, Tommaso Di Tusa, Francesco Mercaldo, and Antonella Santone. "A Methodology for Real-Time Data Verification exploiting Deep Learning and Model Checking." In 2019 IEEE International Conference on Big Data (Big Data). IEEE, 2019. http://dx.doi.org/10.1109/bigdata47090.2019.9005994.

8

Shetye, Komal S., Wonhyeok Jang, and Thomas J. Overbye. "System Dynamic Model Validation using Real-Time Models and PMU Data." In 2018 Clemson University Power Systems Conference (PSC). IEEE, 2018. http://dx.doi.org/10.1109/psc.2018.8664078.

9

Zhao, Li, Zhang Chuang, Xu Ke-Fu, and Chen Meng-Meng. "A Computing Model for Real-Time Stream Processing." In 2014 International Conference on Cloud Computing and Big Data (CCBD). IEEE, 2014. http://dx.doi.org/10.1109/ccbd.2014.26.

10

Feller, Christian, Juergen Wuenschmann, Raimar Wagner, and Albrecht Rothermel. "Model optimization for model-based compression of real world video data." In 2014 IEEE Fourth International Conference on Consumer Electronics – Berlin (ICCE-Berlin). IEEE, 2014. http://dx.doi.org/10.1109/icce-berlin.2014.7034314.


Reports from organizations on the topic "Real data model"

1

Dahlke, Garland R. Revalidation of a REA, IMF and BF Projection Model Using Real-time Ultrasound Imaging and Feeding Data in Cattle. Ames (Iowa): Iowa State University, January 2012. http://dx.doi.org/10.31274/ans_air-180814-111.

2

Selvaraju, Ragul, Shabariraj Siddeswaran, and Hariharan Sankarasubramanian. The Validation of Auto Rickshaw Model for Frontal Crash Studies Using Video Capture Data. SAE International, September 2020. http://dx.doi.org/10.4271/2020-28-0490.

Abstract:
Although auto rickshaws are the most important public transportation around Asian countries, and especially in India, the safety standards and regulations have not been established as much as for the car segment. Crash simulations have evolved to analyze vehicle crashworthiness since crash experimentation is costly. The work intends to provide the validation for an Auto rickshaw model by comparing a frontal crash simulation with a random head-on crash video. The MATLAB video processing tool has been used to process the crash video, and the impact velocity of the frontal crash is obtained. The vehicle modelled in CATIA is imported into the LS-DYNA software simulation environment to perform the frontal crash simulation at the captured speed. The simulation is compared with the crash video at 5, 25, and 40 milliseconds respectively. The comparison shows that the crash patterns of the simulation and the real crash video are similar in detail. Thus the modelled Auto-rickshaw can be used in the future to validate real-time crashes, providing scope for improvement in three-wheeler safety.
3

Selvaraju, Ragul, Shabariraj Siddeswaran, and Hariharan Sankarasubramanian. The Validation of Auto Rickshaw Model for Frontal Crash Studies Using Video Capture Data. SAE International, September 2020. http://dx.doi.org/10.4271/2020-28-0490.

Abstract:
Although auto rickshaws are the most important public transportation around Asian countries, and especially in India, the safety standards and regulations have not been established as much as for the car segment. Crash simulations have evolved to analyze vehicle crashworthiness since crash experimentation is costly. The work intends to provide the validation for an Auto rickshaw model by comparing a frontal crash simulation with a random head-on crash video. The MATLAB video processing tool has been used to process the crash video, and the impact velocity of the frontal crash is obtained. The vehicle modelled in CATIA is imported into the LS-DYNA software simulation environment to perform the frontal crash simulation at the captured speed. The simulation is compared with the crash video at 5, 25, and 40 milliseconds respectively. The comparison shows that the crash patterns of the simulation and the real crash video are similar in detail. Thus the modelled Auto-rickshaw can be used in the future to validate real-time crashes, providing scope for improvement in three-wheeler safety.
4

Salazar-Díaz, Andrea, Aaron Levi Garavito Acosta, Sergio Restrepo-Ángel, and Leidy Viviana Arcila-Agudelo. Real Equilibrium Exchange Rate in Colombia: Thousands of VEC Models Approach. Banco de la República Colombia, December 2022. http://dx.doi.org/10.32468/be.1221.

Abstract:
Behavioral Equilibrium Exchange Rate (BEER) models suggest many variables as potential drivers of equilibrium real exchange rates (ERER). This gives rise to model uncertainty issues, as ERER depends and varies, often drastically, on a particular set of chosen variables. We address this issue by estimating thousands of Vector Error Correction (VEC) specifications for Colombian data between 2000Q1-2019Q4. According to an extensive literature review, we employ thirty-five proxies categorized among five fixed groups of economic fundamentals that underlie the ERER: Indebtedness, Fiscal sector, Productivity, Terms-of-Trade, and Interest Rate Differentials. Our approach derives an empirical distribution of ERER that allows us to state with greater certainty, among hundreds of plausible economic specifications, whether the real exchange rate is either misaligned or in equilibrium.
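The single-specification building block of the exercise described here (fit one vector error-correction model on a chosen set of fundamentals and read off the long-run relation) can be illustrated with statsmodels. The snippet below runs on simulated series; the variable names, lag order, and cointegration rank are assumptions, and the paper's actual specifications and data are not reproduced.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

# Simulated quarterly data standing in for the real exchange rate and two fundamentals.
rng = np.random.default_rng(1)
n = 80
common = np.cumsum(rng.normal(size=n))           # shared stochastic trend
df = pd.DataFrame({
    "rer": common + rng.normal(scale=0.5, size=n),
    "terms_of_trade": 0.8 * common + rng.normal(scale=0.5, size=n),
    "productivity": 0.5 * common + rng.normal(scale=0.5, size=n),
})

# One VECM specification: 1 cointegrating relation, 2 lagged differences, and a
# constant inside the cointegration relation. The long-run (beta) vector defines
# the "equilibrium" relation between the exchange rate and the fundamentals.
res = VECM(df, k_ar_diff=2, coint_rank=1, deterministic="ci").fit()
print(res.beta)                                   # cointegrating vector
forecast_rer = res.predict(steps=4)[:, 0]         # short-horizon forecast of the first series
```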
5

Passner, Jeffrey E., Stephen Kirby, and Terry Jameson. Using Real-Time Weather Data from an Unmanned Aircraft System to Support the Advanced Research Version of the Weather Research and Forecast Model. Fort Belvoir, VA: Defense Technical Information Center, April 2012. http://dx.doi.org/10.21236/ada561959.

6

Kuropiatnyk, D. I. Actuality of the problem of parametric identification of a mathematical model. [N.p.], December 2018. http://dx.doi.org/10.31812/123456789/2885.

Abstract:
The purpose of the article is to study the possibilities of increasing the efficiency of a mathematical model by identifying the parameters of an object. A key factor for parametrization can be called the consideration of properties of the values of the model at a specific time point, which allows a deeper analysis of data dependencies and correlation between them. However, such a technique does not always work, because in advance it is impossible to predict that the parameters can be substantially optimized. In addition, it is necessary to take into account the fact that minimization reduces the values of parameters without taking into account their real physical properties. The correctness of the final values will be based on dynamically selected parameters, which allows you to modify the terms of use of the system in real time. In the development process, the values of experimentally obtained data with the model are compared, which allows you to understand the accuracy of minimization. When choosing the most relevant parameters, various minimization functions are used, which provides an opportunity to cover a wide range of theoretical initial situations. Verification of the correctness of the decision is carried out with the help of a quality function, which can identify the accuracy and correctness of the optimized parameters. It is possible to choose different types of functional quality, depending on the characteristics of the initial data. The presence of such tools during parametrization allows for varied analysis of the model, testing it on various algorithms, data volumes and conditions of guaranteed convergence of functional methods.
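A hedged sketch of the identification loop this abstract describes: fit parameters by minimizing the misfit between model output and experimental data, then judge the result with a quality function. The exponential-relaxation model, the synthetic data, and the RMSE quality function below are assumptions chosen for illustration, not the report's actual system.

```python
import numpy as np
from scipy.optimize import least_squares

def model(t, k, a):
    """Assumed model: exponential relaxation toward a."""
    return a + (1.0 - a) * np.exp(-k * t)

# "Experimental" data generated from known parameters plus noise.
t = np.linspace(0, 5, 40)
rng = np.random.default_rng(2)
y_obs = model(t, k=1.3, a=0.2) + rng.normal(scale=0.02, size=t.size)

# Residuals between model output and data, minimized over the parameter vector p = (k, a).
residuals = lambda p: model(t, *p) - y_obs
fit = least_squares(residuals, x0=[0.5, 0.0], bounds=([0.0, -1.0], [10.0, 1.0]))

# Quality function: root-mean-square misfit between the identified model and the data.
rmse = np.sqrt(np.mean(residuals(fit.x) ** 2))
print(fit.x, rmse)
```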
7

Hevia, Constantino, and Juan Pablo Nicolini. Research Insights: Do Primary Commodity Prices Account for the Fluctuations of Exchange Rates? Inter-American Development Bank, December 2022. http://dx.doi.org/10.18235/0004605.

Abstract:
We explicitly derive a relationship between bilateral real exchange rates and primary commodity prices in a model that highlights the role of heterogeneity in production structures across countries. Fluctuations of just a few primary commodity prices account for between one third and one half of the volatility of the bilateral exchange rates of the United States against Germany, Japan, and the United Kingdom. Once we calibrate our quantitative model with data from input-output matrices and shocks to generate the observed commodity price fluctuations, our model delivers the same volatility and persistence of real exchange rates as in the data.
8

Moldovan, Paula, Sérgio Lagoa, and Diana Mendes. The impact of Economic Policy Uncertainty on the real exchange rate: Evidence from the UK. DINÂMIA'CET-Iscte, 2021. http://dx.doi.org/10.15847/dinamiacet-iul.wp.2021.06.

Abstract:
The world economy has been punctuated by uncertainty as a result of the 2008 subprime crisis, the European sovereign debt crisis, Brexit, and the 2016 US presidential elections, to mention but a few of the reasons. This study explores how the UK real exchange rate reacts to economic policy uncertainty (EPU) shocks using monthly data for the period 1998 to 2020. We contribute to the literature by identifying the long-run and short-run impacts of EPU using a cointegrated ARDL model, and by studying a country that has been through periods of both relatively low and high uncertainty. Results confirm that EPU has an important effect in the long run by depreciating the exchange rate. In addition to urging policymakers and regulators to concentrate on the sometimes difficult task of keeping policy uncertainty to a minimum as a way of sustaining exchange rate stability and thus promoting long-term economic growth, further evidence is provided on exchange rate fundamentals.
9

Del Valle, Sara Y. Real-time Social Internet Data to Guide Forecasting Models. Office of Scientific and Technical Information (OSTI), September 2016. http://dx.doi.org/10.2172/1325660.

10

Vecherin, Sergey, Derek Chang, Emily Wells, Benjamin Trump, Aaron Meyer, Jacob Desmond, Kyle Dunn, Maxim Kitsak, and Igor Linkov. Assessment of the COVID-19 infection risk at a workplace through stochastic microexposure modeling. Engineer Research and Development Center (U.S.), March 2022. http://dx.doi.org/10.21079/11681/43740.

Abstract:
The COVID-19 pandemic has a significant impact on economy. Decisions regarding the reopening of businesses should account for infection risks. This paper describes a novel model for COVID-19 infection risks and policy evaluations. The model combines the best principles of the agent-based, microexposure, and probabilistic modeling approaches. It takes into account specifics of a workplace, mask efficiency, and daily routines of employees, but does not require specific interagent rules for simulations. Likewise, it does not require knowledge of microscopic disease related parameters. Instead, the risk of infection is aggregated into the probability of infection, which depends on the duration and distance of every contact. The probability of infection at the end of a workday is found using rigorous probabilistic rules. Unlike previous models, this approach requires only a few reference data points for calibration, which are more easily collected via empirical studies. The application of the model is demonstrated for a typical office environment and for a real-world case. The proposed model allows for effective risk assessment and policy evaluation when there are large uncertainties about the disease, making it particularly suitable for COVID-19 risk assessments.
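The aggregation rule the abstract describes (each contact contributes an infection probability that depends on distance and duration, and the end-of-day risk combines them by the complement-product rule) can be sketched as follows. The per-contact probability function, its coefficients, and the mask factor are placeholders; the paper calibrates these from reference data that is not reproduced here.

```python
import math

def contact_infection_prob(distance_m, duration_min, mask_factor=0.5):
    """Placeholder per-contact probability: decays with distance, grows with duration.
    The functional form and coefficients are illustrative, not the paper's calibration."""
    base = 1.0 - math.exp(-0.02 * duration_min)        # longer contact -> higher risk
    return mask_factor * base * math.exp(-distance_m)  # farther apart -> lower risk

# One employee's contacts during a workday: (distance in metres, duration in minutes).
contacts = [(1.0, 15), (2.0, 45), (0.5, 5)]

# End-of-day probability of infection: 1 minus the probability of escaping every contact.
p_day = 1.0 - math.prod(1.0 - contact_infection_prob(d, t) for d, t in contacts)
print(round(p_day, 4))
```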