Journal articles on the topic 'Robust model validation'

To see the other types of publications on this topic, follow the link: Robust model validation.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'Robust model validation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Ronchetti, Elvezio, Christopher Field, and Wade Blanchard. "Robust Linear Model Selection by Cross-Validation." Journal of the American Statistical Association 92, no. 439 (September 1997): 1017–23. http://dx.doi.org/10.1080/01621459.1997.10474057.

2

Wan, Li, and Ying Jin. "Assessment of model validation outcomes of a new recursive spatial equilibrium model for the Greater Beijing." Environment and Planning B: Urban Analytics and City Science 46, no. 5 (September 27, 2017): 805–25. http://dx.doi.org/10.1177/2399808317732575.

Abstract:
Robust calibration and validation of applied urban models are prerequisites for their successful, policy-cogent use. This is particularly important today when expert assessment is questioned and closely scrutinized. This paper proposes a new model calibration-validation strategy based on a spatial equilibrium model that incorporates multiple time horizons, such that the predictive capabilities of the model can be empirically tested. The model is implemented for the Greater Beijing city region and the model validation strategy is demonstrated over the Census years 2000 to 2010. Through forward/backward forecasting, the model validation helps to verify the stability of the model parameters as well as the predictive capabilities of the recursive equilibrium framework. The proposed modelling strategy sets a new standard for verifying and validating recursive equilibrium models. We also consider the wider implications of the approach.
3

Carvalho, Vinícius N., Arinan De P. Dourado, Bruno R. F. Rende, Aldemir Ap. Cavalini, and Valder Steffen. "Experimental validation of a robust model-based balancing approach." Journal of Vibration and Control 25, no. 2 (June 20, 2018): 423–34. http://dx.doi.org/10.1177/1077546318783552.

Abstract:
Balancing procedures are periodically applied to rotating machines to reduce vibration amplitudes, thus keeping the system operating within acceptable safety limits. Various methods have been developed to balance rotating machines, such as the so-called signal-based balancing techniques. This paper presents the experimental validation of a balancing approach dedicated to rotating machines. This approach is based on a representative mathematical model of the system and it first performs the model updating of the machine. The unbalance condition is obtained through the solution of an inverse problem by taking into account the uncertain parameters of the rotor that can affect the balancing result. This robust balancing methodology is performed by considering a mono-objective optimization method in which the uncertain parameters are represented by random variables. In the present contribution, only the unbalance distribution along the rotor is considered as uncertain information. For this aim, uncertainty was modeled as a Gaussian field and was represented by Monte Carlo simulations. The corresponding experimental investigation considers a rotor system composed of a horizontal steel shaft, three discs, and two self-alignment ball bearings. The obtained results demonstrate that the proposed approach is an effective alternative for the balancing of rotating machines.
4

Wittwer, J. W., M. S. Baker, and L. L. Howell. "Robust Design and Model Validation of Nonlinear Compliant Micromechanisms." Journal of Microelectromechanical Systems 15, no. 1 (February 2006): 33–41. http://dx.doi.org/10.1109/jmems.2005.859190.

5

Smith, R. S., and J. C. Doyle. "Model validation: a connection between robust control and identification." IEEE Transactions on Automatic Control 37, no. 7 (July 1992): 942–52. http://dx.doi.org/10.1109/9.148346.

6

Gevers, Michel, Xavier Bombois, Benoît Codrons, Franky De Bruyne, and Gérard Scorletti. "Model Validation for Robust Control and Controller Validation in a Prediction Error Framework." IFAC Proceedings Volumes 33, no. 15 (June 2000): 19–24. http://dx.doi.org/10.1016/s1474-6670(17)39720-3.

7

Stevenson, Samantha, Baylor Fox-Kemper, Markus Jochum, Balaji Rajagopalan, and Stephen G. Yeager. "ENSO Model Validation Using Wavelet Probability Analysis." Journal of Climate 23, no. 20 (October 15, 2010): 5540–47. http://dx.doi.org/10.1175/2010jcli3609.1.

Abstract:
A new method to quantify changes in El Niño–Southern Oscillation (ENSO) variability is presented, using the overlap between probability distributions of the wavelet spectrum as measured by the wavelet probability index (WPI). Examples are provided using long integrations of three coupled climate models. When subsets of Niño-3.4 time series are compared, the width of the confidence interval on WPI has an exponential dependence on the length of the subset used, with a statistically identical slope for all three models. This exponential relationship describes the rate at which the system converges toward equilibrium and may be used to determine the necessary simulation length for robust statistics. For the three models tested, a minimum of 250 model years is required to obtain 90% convergence for Niño-3.4, longer than typical Intergovernmental Panel on Climate Change (IPCC) simulations. Applying the same decay relationship to observational data indicates that measuring ENSO variability with 90% confidence requires approximately 240 years of observations, which is substantially longer than the modern SST record. Applying hypothesis testing techniques to the WPI distributions from model subsets and from comparisons of model subsets to the historical Niño-3.4 index then allows statistically robust comparisons of relative model agreement with appropriate confidence levels given the length of the data record and model simulation.
8

Smith, R. S. "Model Validation for Robust Control: An Experimental Process Control Application." IFAC Proceedings Volumes 26, no. 2 (July 1993): 133–36. http://dx.doi.org/10.1016/s1474-6670(17)48239-5.

9

Namvar, Mehrzad, Alina Voda, and I. D. Landau. "Approaches in identification and model validation for robust control design." IFAC Proceedings Volumes 32, no. 2 (July 1999): 4076–81. http://dx.doi.org/10.1016/s1474-6670(17)56695-1.

10

Smith, Roy S. "Model validation for robust control: an experimental process control application." Automatica 31, no. 11 (November 1995): 1637–47. http://dx.doi.org/10.1016/0005-1098(95)00093-c.

11

Zhou, Yixuan, Weimin Zhang, Yongliang Shi, Ziyu Wang, Fangxing Li, and Qiang Huang. "LPCF: Robust Correlation Tracking via Locality Preserving Tracking Validation." Sensors 20, no. 23 (November 30, 2020): 6853. http://dx.doi.org/10.3390/s20236853.

Abstract:
In visual tracking, the tracking model must be updated online, which often leads to the undesired inclusion of corrupted training samples and hence induces tracking failure. We present a locality preserving correlation filter (LPCF) integrating a novel and generic decontamination approach, which mitigates the model drift problem. Our decontamination approach maintains the local neighborhood feature-point structure of the bounding box center. This proposed tracking-result validation approach models not only the spatial neighborhood relationship but also the topological structure of the bounding box center. Additionally, a closed-form solution to our approach is derived, which allows the tracking-result validation process to be accomplished in only milliseconds. Moreover, a dimensionality reduction strategy is introduced to improve the real-time performance of our translation estimation component. Comprehensive experiments are performed on OTB-2015, LASOT, and TrackingNet. The experimental results show that our decontamination approach remarkably improves the overall performance by 6.2%, 12.6%, and 3%, while our complete algorithm improves the baseline by 27.8%, 34.8%, and 15%. Finally, our tracker achieves the best performance among most existing decontamination trackers under the real-time requirement.
12

Kellogg, Deborah L. "A customer contact measurement model: an extension." International Journal of Service Industry Management 11, no. 1 (March 1, 2000): 26–45. http://dx.doi.org/10.1108/09564230010310277.

Abstract:
This paper validates the customer contact measurement model by performing a replication using three different sample groups. The impact of customer, managerial, and cultural differences is examined. Findings indicate that all validation groups use similar variables when defining the customer contact construct. The measurement model is robust when compared to US customer and managerial validation groups. However, the applicability across culture is questioned.
13

Archer, Graeme, Michael Balls, Leon H. Bruner, Rodger D. Curren, Julia H. Fentem, Hermann-Georg Holzhütter, Manfred Liebsch, David P. Lovell, and Jacqueline A. Southee. "The Validation of Toxicological Prediction Models." Alternatives to Laboratory Animals 25, no. 5 (September 1997): 505–16. http://dx.doi.org/10.1177/026119299702500507.

Abstract:
An alternative method is shown to consist of two parts: the test system itself; and a prediction model for converting in vitro endpoints into predictions of in vivo toxicity. For the alternative method to be relevant and reliable, it is important that its prediction model component is of high predictive power and is sufficiently robust against sources of data variability. In other words, the prediction model must be subjected to criticism, leading successful models to the state of confirmation. It is shown that there are certain circumstances in which a new prediction model may be introduced without the necessity to generate new test system data.
14

Jiang, Hailei, Sirish L. Shah, and Biao Huang. "Control relevant on-line model validation criterion based on robust stability conditions." Canadian Journal of Chemical Engineering 86, no. 5 (September 28, 2008): 893–904. http://dx.doi.org/10.1002/cjce.20097.

15

Li, Zhixiong. "Robust global sliding model control for water-hull-propulsion unit interaction systems - Part 2: Model validation." Tehnicki vjesnik - Technical Gazette 22, no. 2 (2015): 465–73. http://dx.doi.org/10.17559/tv-20141208054604.

16

Ma, W., M. Yilmaz, M. Sznaier, and C. Lagoa. "Semi-Blind Robust Identification/Model (In)Validation with Applications to Macro-Economic Modelling." IFAC Proceedings Volumes 38, no. 1 (2005): 886–91. http://dx.doi.org/10.3182/20050703-6-cz-1902.00149.

17

Taunay, Pierre-Yves C. R., and Edgar Y. Choueiri. "Multi-wavelength pyrometry based on robust statistics and cross-validation of emissivity model." Review of Scientific Instruments 91, no. 11 (November 1, 2020): 114902. http://dx.doi.org/10.1063/5.0019847.

18

Savkin, Andrey V., and Ian R. Petersen. "Model validation for robust control of uncertain systems with an integral quadratic constraint." Automatica 32, no. 4 (April 1996): 603–6. http://dx.doi.org/10.1016/0005-1098(95)00167-0.

19

Ludwig, T., D. Gaida, C. Keysers, J. Pinnekamp, M. Bongards, P. Kern, C. Wolf, and A. L. Sousa Brito. "An advanced simulation model for membrane bioreactors: development, calibration and validation." Water Science and Technology 66, no. 7 (October 1, 2012): 1384–91. http://dx.doi.org/10.2166/wst.2012.249.

Abstract:
Membrane wastewater treatment plants (WWTPs) have several advantages compared with conventionally designed WWTPs with classical purification techniques. The filtration process is the key to their commercial success in Germany with respect to energy consumption and effectiveness, enabled by the optimization of filtration using a dynamic simulation model. This work is focused on the development of a robust, flexible and practically applicable membrane simulation model for submerged hollow-fibre and flat-sheet membrane modules. The model is based on standard parameters usually measured on membrane WWTPs. The performance of the model is demonstrated by successful calibration and validation for three different full-scale membrane WWTPs achieving good results. Furthermore, the model is combinable with Activated Sludge Models.
20

Tsang, Eric K. C., and Bertram E. Shi. "Normalization Enables Robust Validation of Disparity Estimates from Neural Populations." Neural Computation 20, no. 10 (October 2008): 2464–90. http://dx.doi.org/10.1162/neco.2008.05-07-532.

Abstract:
Binocular fusion takes place over a limited region smaller than one degree of visual angle (Panum's fusional area), which is on the order of the range of preferred disparities measured in populations of disparity-tuned neurons in the visual cortex. However, the actual range of binocular disparities encountered in natural scenes extends over tens of degrees. This discrepancy suggests that there must be a mechanism for detecting whether the stimulus disparity is inside or outside the range of the preferred disparities in the population. Here, we compare the efficacy of several features derived from the population responses of phase-tuned disparity energy neurons in differentiating between in-range and out-of-range disparities. Interestingly, some features that might be appealing at first glance, such as the average activation across the population and the difference between the peak and average responses, actually perform poorly. On the other hand, normalizing the difference between the peak and average responses results in a reliable indicator. Using a probabilistic model of the population responses, we improve classification accuracy by combining multiple features. A decision rule that combines the normalized peak to average difference and the peak location significantly improves performance over decision rules based on either measure in isolation. In addition, classifiers using normalized difference are also robust to mismatch between the image statistics assumed by the model and the actual image statistics.
21

Sendrowski, Alicia, Kazi Sadid, Ehab Meselhe, Wayne Wagner, David Mohrig, and Paola Passalacqua. "Transfer Entropy as a Tool for Hydrodynamic Model Validation." Entropy 20, no. 1 (January 12, 2018): 58. http://dx.doi.org/10.3390/e20010058.

Abstract:
The validation of numerical models is an important component of modeling to ensure reliability of model outputs under prescribed conditions. In river deltas, robust validation of models is paramount given that models are used to forecast land change and to track water, solid, and solute transport through the deltaic network. We propose using transfer entropy (TE) to validate model results. TE quantifies the information transferred between variables in terms of strength, timescale, and direction. Using water level data collected in the distributary channels and inter-channel islands of Wax Lake Delta, Louisiana, USA, along with modeled water level data generated for the same locations using Delft3D, we assess how well couplings between external drivers (river discharge, tides, wind) and modeled water levels reproduce the observed data couplings. We perform this operation through time using ten-day windows. Modeled and observed couplings compare well; their differences reflect the spatial parameterization of wind and roughness in the model, which prevents the model from capturing high frequency fluctuations of water level. The model captures couplings better in channels than on islands, suggesting that mechanisms of channel-island connectivity are not fully represented in the model. Overall, TE serves as an additional validation tool to quantify the couplings of the system of interest at multiple spatial and temporal scales.
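Transfer entropy as described in this abstract has a compact plug-in estimator for discretized series. The sketch below is a generic illustration of that estimator; the equal-width binning, lag-1 conditioning, and the synthetic coupled pair at the end are assumptions for demonstration, not the authors' Delft3D workflow:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=2):
    """Plug-in estimate of TE(X -> Y) in bits at lag 1:
        TE = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ]
    where y1 = y[t+1], y0 = y[t], x0 = x[t].
    Both series are discretized into equal-width bins first."""
    def disc(s):
        s = np.asarray(s, float)
        edges = np.linspace(np.min(s), np.max(s), bins + 1)[1:-1]
        return np.digitize(s, edges)
    x, y = disc(x), disc(y)
    y1, y0, x0 = y[1:], y[:-1], x[:-1]
    n = len(y1)
    c_triple = Counter(zip(y1, y0, x0))   # counts of (y_{t+1}, y_t, x_t)
    c_cond   = Counter(zip(y0, x0))       # counts of (y_t, x_t)
    c_pair   = Counter(zip(y1, y0))       # counts of (y_{t+1}, y_t)
    c_past   = Counter(y0)                # counts of y_t
    te = 0.0
    for (a, b, c), k in c_triple.items():
        p_joint = k / n
        p_cond_full = k / c_cond[(b, c)]          # p(y1 | y0, x0)
        p_cond_self = c_pair[(a, b)] / c_past[b]  # p(y1 | y0)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

# Hypothetical coupled pair: y copies x with a one-step delay,
# so information flows strongly x -> y but not y -> x.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000).astype(float)
y = np.concatenate(([0.0], x[:-1]))
print(transfer_entropy(x, y), transfer_entropy(y, x))
```

With the deterministic one-step coupling above, TE(x → y) comes out near 1 bit while TE(y → x) stays near zero, which is the directionality property the abstract relies on for distinguishing drivers from responses.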
22

Chrysos, Grigorios G., Jean Kossaifi, and Stefanos Zafeiriou. "RoCGAN: Robust Conditional GAN." International Journal of Computer Vision 128, no. 10-11 (July 14, 2020): 2665–83. http://dx.doi.org/10.1007/s11263-020-01348-5.

Abstract:
Conditional image generation lies at the heart of computer vision, and conditional generative adversarial networks (cGAN) have recently become the method of choice for this task, owing to their superior performance. The focus so far has largely been on performance improvement, with little effort in making cGANs more robust to noise. However, the regression (of the generator) might lead to arbitrarily large errors in the output, which makes cGANs unreliable for real-world applications. In this work, we introduce a novel conditional GAN model, called RoCGAN, which leverages structure in the target space of the model to address the issue. Specifically, we augment the generator with an unsupervised pathway, which promotes the outputs of the generator to span the target manifold, even in the presence of intense noise. We prove that RoCGAN shares similar theoretical properties with GAN and establish with both synthetic and real data the merits of our model. We perform a thorough experimental validation on large-scale datasets for natural scenes and faces and observe that our model outperforms existing cGAN architectures by a large margin. We also empirically demonstrate the performance of our approach in the face of two types of noise (adversarial and Bernoulli).
23

Guthrie, J. A., D. J. Reid, and K. B. Walsh. "Assessment of internal quality attributes of mandarin fruit. 2. NIR calibration model robustness." Australian Journal of Agricultural Research 56, no. 4 (2005): 417. http://dx.doi.org/10.1071/ar04299.

Abstract:
The robustness of multivariate calibration models, based on near infrared spectroscopy, for the assessment of total soluble solids (TSS) and dry matter (DM) of intact mandarin fruit (Citrus reticulata cv. Imperial) was assessed. TSS calibration model performance was validated in terms of prediction of populations of fruit not in the original population (different harvest days from a single tree, different harvest localities, different harvest seasons). Of these, calibration performance was most affected by validation across seasons (signal to noise statistic on root mean squared error of prediction of 3.8, compared with 20 and 13 for locality and harvest day, respectively). Procedures for sample selection from the validation population for addition to the calibration population (‘model updating’) were considered for both TSS and DM models. Random selection from the validation group worked as well as more sophisticated selection procedures, with approximately 20 samples required. Models that were developed using samples at a range of temperatures were robust in validation for TSS and DM.
24

Shadan, Fariba, Faramarz Khoshnoudian, Daniel J. Inman, and Akbar Esfandiari. "Experimental validation of a FRF-based model updating method." Journal of Vibration and Control 24, no. 8 (August 16, 2016): 1570–83. http://dx.doi.org/10.1177/1077546316664675.

Abstract:
In this paper, a finite element model updating method using frequency response functions is experimentally validated. The method is a sensitivity-based model updating approach which utilizes a pseudo-linear sensitivity equation. The method is robust against the adverse effects of incomplete measurement, measurement errors and modeling errors. The experimental setup consists of a free-free aluminum beam, where changes are introduced by reducing the stiffness and attaching lumped mass at certain parts of the beam. The method is applied to identify the location and amount of the changes in structural parameters. The results indicate that the location and the size of different level of changes in the structure can be properly identified by the method. In addition, a study is done on the influence of the number of impacts and sensors on the quality of the identified parameters.
25

Crispoltoni, Michele, Mario Fravolini, Fabio Balzano, Stephane D’Urso, and Marcello Napolitano. "Interval Fuzzy Model for Robust Aircraft IMU Sensors Fault Detection." Sensors 18, no. 8 (August 1, 2018): 2488. http://dx.doi.org/10.3390/s18082488.

Abstract:
This paper proposes a data-based approach for a robust fault detection (FD) of the inertial measurement unit (IMU) sensors of an aircraft. Fuzzy interval models (FIMs) have been introduced for coping with the significant modeling uncertainties caused by poorly modeled aerodynamics. The proposed FIMs are used to compute robust prediction intervals for the measurements provided by the IMU sensors. Specifically, a nonlinear neural network (NN) model is used as central prediction of the sensor response while the uncertainty around the central estimation is captured by the FIM model. The uncertainty has been also modelled using a conventional linear Interval Model (IM) approach; this allows a quantitative evaluation of the benefits provided by the FIM approach. The identification of the IMs and of the FIMs was formalized as a linear matrix inequality (LMI) optimization problem using as cost function the (mean) amplitude of the prediction interval and as optimization variables the parameters defining the amplitudes of the intervals of the IMs and FIMs. Based on the identified models, FD validation tests have been successfully conducted using actual flight data of a P92 Tecnam aircraft by artificially injecting additive fault signals on the fault free IMU readings.
26

Evans, Katherine J., Joseph H. Kennedy, Dan Lu, Mary M. Forrester, Stephen Price, Jeremy Fyke, Andrew R. Bennett, et al. "LIVVkit 2.1: automated and extensible ice sheet model validation." Geoscientific Model Development 12, no. 3 (March 22, 2019): 1067–86. http://dx.doi.org/10.5194/gmd-12-1067-2019.

Abstract:
A collection of scientific analyses, metrics, and visualizations for robust validation of ice sheet models is presented using the Land Ice Verification and Validation toolkit (LIVVkit), version 2.1, and the LIVVkit Extensions repository (LEX), version 0.1. This software collection targets stand-alone ice sheet or coupled Earth system models, and handles datasets and analyses that require high-performance computing and storage. LIVVkit aims to enable efficient and fully reproducible workflows for postprocessing, analysis, and visualization of observational and model-derived datasets in a shareable format, whereby all data, methodologies, and output are distributed to users for evaluation. Extending from the initial LIVVkit software framework, we demonstrate Greenland ice sheet simulation validation metrics using the coupled Community Earth System Model (CESM) as well as an idealized stand-alone high-resolution Community Ice Sheet Model, version 2 (CISM2), coupled to the Albany/FELIX velocity solver (CISM-Albany or CISM-A). As one example of the capability, LIVVkit analyzes the degree to which models capture the surface mass balance (SMB) and identifies potential sources of bias, using recently available in situ and remotely sensed data as comparison. Related fields within atmosphere and land surface models, e.g., surface temperature, radiation, and cloud cover, are also diagnosed. Applied to the CESM1.0, LIVVkit identifies a positive SMB bias that is focused largely around Greenland's southwest region that is due to insufficient ablation.
27

Pan, H., and Z. Zou. "Penalized Spline: A General Robust Trajectory Model for ZiYuan-3 Satellite." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLI-B1 (June 3, 2016): 365–72. http://dx.doi.org/10.5194/isprsarchives-xli-b1-365-2016.

Abstract:
Owing to the dynamic imaging system, the trajectory model plays a very important role in the geometric processing of high-resolution satellite imagery. However, establishing a trajectory model is difficult when only discrete and noisy data are available. In this manuscript, we propose a general robust trajectory model, the penalized spline model, which fits trajectory data well and smooths noise. The penalty parameter λ, which controls the trade-off between smoothness and fitting accuracy, can be estimated by generalized cross-validation. Five other trajectory models, including third-order polynomials, Chebyshev polynomials, linear interpolation, Lagrange interpolation and cubic splines, are compared with the penalized spline model. Both the sophisticated ephemeris and the on-board ephemeris are used to compare the orbit models. The penalized spline model smooths part of the noise, and its accuracy decreases as the orbit length increases. The band-to-band misregistration of ZiYuan-3 Dengfeng and Faizabad multispectral images is used to evaluate the proposed method. With the Dengfeng dataset, the third-order polynomials and Chebyshev approximation could not model the oscillation, introducing misregistration of 0.57 pixels in the across-track direction and 0.33 pixels in the along-track direction. With the Faizabad dataset, the linear interpolation, Lagrange interpolation and cubic spline models suffer from noise, introducing larger misregistration than the approximation models. Experimental results suggest that the penalized spline model can model the oscillation and smooth noise.
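Choosing a penalty parameter λ by generalized cross-validation, as this abstract describes, can be sketched in a few lines. The following is a generic Whittaker-style second-difference smoother, not the authors' ZiYuan-3 implementation; the λ grid and the synthetic trajectory-like signal are illustrative assumptions:

```python
import numpy as np

def penalized_smooth_gcv(y, lambdas):
    """Fit z minimizing ||y - z||^2 + lam * ||D2 z||^2 for each lam
    and pick lam by generalized cross-validation:
        GCV(lam) = n * RSS / (n - tr(H))^2,  H = (I + lam * D2'D2)^(-1).
    Returns (best_lam, smoothed_series)."""
    y = np.asarray(y, float)
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)   # second-difference operator, (n-2) x n
    P = D2.T @ D2
    best_gcv, best_lam, best_z = np.inf, None, None
    for lam in lambdas:
        H = np.linalg.inv(np.eye(n) + lam * P)   # hat matrix mapping y -> z
        z = H @ y
        rss = float(np.sum((y - z) ** 2))
        gcv = n * rss / (n - np.trace(H)) ** 2
        if gcv < best_gcv:
            best_gcv, best_lam, best_z = gcv, lam, z
    return best_lam, best_z

# Hypothetical noisy trajectory-like signal.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
truth = np.sin(2 * np.pi * t)
noisy = truth + 0.1 * rng.standard_normal(t.size)
lam, smooth = penalized_smooth_gcv(noisy, [0.1, 1.0, 10.0, 100.0, 1000.0])
```

GCV approximates leave-one-out prediction error without refitting n times, which is why it is a common way to balance the fidelity and roughness terms of a penalized spline.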
28

Pan, H., and Z. Zou. "Penalized Spline: A General Robust Trajectory Model for ZiYuan-3 Satellite." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLI-B1 (June 3, 2016): 365–72. http://dx.doi.org/10.5194/isprs-archives-xli-b1-365-2016.

Abstract:
Owing to the dynamic imaging system, the trajectory model plays a very important role in the geometric processing of high-resolution satellite imagery. However, establishing a trajectory model is difficult when only discrete and noisy data are available. In this manuscript, we propose a general robust trajectory model, the penalized spline model, which fits trajectory data well and smooths noise. The penalty parameter λ, which controls the trade-off between smoothness and fitting accuracy, can be estimated by generalized cross-validation. Five other trajectory models, including third-order polynomials, Chebyshev polynomials, linear interpolation, Lagrange interpolation and cubic splines, are compared with the penalized spline model. Both the sophisticated ephemeris and the on-board ephemeris are used to compare the orbit models. The penalized spline model smooths part of the noise, and its accuracy decreases as the orbit length increases. The band-to-band misregistration of ZiYuan-3 Dengfeng and Faizabad multispectral images is used to evaluate the proposed method. With the Dengfeng dataset, the third-order polynomials and Chebyshev approximation could not model the oscillation, introducing misregistration of 0.57 pixels in the across-track direction and 0.33 pixels in the along-track direction. With the Faizabad dataset, the linear interpolation, Lagrange interpolation and cubic spline models suffer from noise, introducing larger misregistration than the approximation models. Experimental results suggest that the penalized spline model can model the oscillation and smooth noise.
29

Mukkamala, Ramakrishna, and Richard J. Cohen. "A forward model-based validation of cardiovascular system identification." American Journal of Physiology-Heart and Circulatory Physiology 281, no. 6 (December 1, 2001): H2714–H2730. http://dx.doi.org/10.1152/ajpheart.2001.281.6.h2714.

Abstract:
We present a theoretical evaluation of a cardiovascular system identification method that we previously developed for the analysis of beat-to-beat fluctuations in noninvasively measured heart rate, arterial blood pressure, and instantaneous lung volume. The method provides a dynamical characterization of the important autonomic and mechanical mechanisms responsible for coupling the fluctuations (inverse modeling). To carry out the evaluation, we developed a computational model of the cardiovascular system capable of generating realistic beat-to-beat variability (forward modeling). We applied the method to data generated from the forward model and compared the resulting estimated dynamics with the actual dynamics of the forward model, which were either precisely known or easily determined. We found that the estimated dynamics corresponded to the actual dynamics and that this correspondence was robust to forward model uncertainty. We also demonstrated the sensitivity of the method in detecting small changes in parameters characterizing autonomic function in the forward model. These results provide confidence in the performance of the cardiovascular system identification method when applied to experimental data.
30

Salama, M. S., R. Van der Velde, H. J. van der Woerd, J. C. Kromkamp, C. J. M. Philippart, A. T. Joseph, P. E. O'Neill, et al. "Technical Notes: Calibration and validation of geophysical observation models." Biogeosciences Discussions 9, no. 1 (January 11, 2012): 311–27. http://dx.doi.org/10.5194/bgd-9-311-2012.

Abstract:
We present a method to calibrate and validate observational models that interrelate remotely sensed energy fluxes to geophysical variables of land and water surfaces. Coincident sets of remote sensing observations of visible and microwave radiation and geophysical data are assembled and subdivided into calibration (Cal) and validation (Val) data sets. Each Cal/Val pair is used to derive the coefficients (from the Cal set) and the accuracy (from the Val set) of the observation model. Combining the results from all Cal/Val pairs provides probability distributions of the model coefficients and model errors. The method is generic and is demonstrated using comprehensive matchup sets from two very different disciplines, soil moisture and water quality. The results demonstrate that the method provides robust model coefficients and a quantitative measure of the model uncertainty. This approach can be adopted for the calibration/validation of satellite products of land and water surfaces, and the resulting uncertainty can be used as input to data assimilation schemes.
APA, Harvard, Vancouver, ISO, and other styles
31

Salama, M. S., R. Van der Velde, H. J. van der Woerd, J. C. Kromkamp, C. J. M. Philippart, A. T. Joseph, P. E. O'Neill, et al. "Technical Note: Calibration and validation of geophysical observation models." Biogeosciences 9, no. 6 (June 18, 2012): 2195–201. http://dx.doi.org/10.5194/bg-9-2195-2012.

Full text
Abstract:
Abstract. We present a method to calibrate and validate observational models that interrelate remotely sensed energy fluxes to geophysical variables of land and water surfaces. Coincident sets of remote sensing observations of visible and microwave radiation and geophysical data are assembled and subdivided into calibration (Cal) and validation (Val) data sets. Each Cal/Val pair is used to derive the coefficients (from the Cal set) and the accuracy (from the Val set) of the observation model. Combining the results from all Cal/Val pairs provides probability distributions of the model coefficients and model errors. The method is generic and is demonstrated using comprehensive matchup sets from two very different disciplines: soil moisture and water quality. The results demonstrate that the method provides robust model coefficients and a quantitative measure of the model uncertainty. This approach can be adopted for the calibration/validation of satellite products of land and water surfaces, and the resulting uncertainty can be used as input to data assimilation schemes.
APA, Harvard, Vancouver, ISO, and other styles
32

Asiamah, Nestor, Henry Kofi Mensah, and Emelia Danquah. "An assessment of the emotional intelligence of health workers." Journal of Global Responsibility 9, no. 2 (May 8, 2018): 141–59. http://dx.doi.org/10.1108/jgr-03-2017-0014.

Full text
Abstract:
Purpose This study aims to assess health workers’ level of emotional intelligence (EI) in Accra North and recommend a simple but robust statistical technique for compulsorily validating EI measurement scales. Design/methodology/approach The researchers used a self-reported questionnaire to collect data from 1,049 randomly selected health workers. Two non-nested models, BNK MODEL and CMODEL, were compared to see which of them better fits the study population and yields a better level of EI. The one-sample and independent-samples t-tests, exploratory factor analysis and confirmatory factor analysis were used to present results. Findings The study found that health workers were appreciably emotionally intelligent for both models at the 5 per cent significance level. However, EI was higher for the CMODEL. The CMODEL also better fits the study population (χ2 = 132.2, p = 0.487, Akaike information criterion = 124.932) and thus better underlies EI in it. This study recommends proper validation of the two EI scales evaluated in this study, and possibly other scales, before the use of their data in research, as failure to do so could lead to unrealistic results. Originality/value Apart from its contribution to the literature, this study provides a robust statistical approach for assessing health workers’ EI and validating EI scales. By comparing two models of EI in the validation process, this paper suggests that the researcher’s choice of a measurement scale can influence his/her results.
APA, Harvard, Vancouver, ISO, and other styles
33

Krämer, Dominik, Terrance Wilms, and Rudibert King. "Model-Based Process Optimization for the Production of Macrolactin D by Paenibacillus polymyxa." Processes 8, no. 7 (June 28, 2020): 752. http://dx.doi.org/10.3390/pr8070752.

Full text
Abstract:
In this study, we show the successful application of different model-based approaches for maximizing macrolactin D production by Paenibacillus polymyxa. After four initial cultivations, a family of nonlinear dynamic biological models was determined automatically and ranked by their respective Akaike Information Criterion (AIC). The best models were then used in a multi-model setup for robust product maximization. The experimental validation shows the highest product yield attained so far compared with the identification runs. In subsequent fermentations, the online measurements of CO2 concentration, base consumption, and near-infrared spectroscopy (NIR) were used for model improvement. After model extension using expert knowledge, a single superior model could be identified. Model-based state estimation with a sigma-point Kalman filter (SPKF) was based on online measurement data, and this improved model enabled nonlinear real-time product maximization. The optimization increased macrolactin D production by a further 28% compared with the initial robust multi-model offline optimization.
APA, Harvard, Vancouver, ISO, and other styles
34

Riccardi, Matteo, Alessandro d’Adamo, Andrea Vaini, Marcello Romagnoli, Massimo Borghi, and Stefano Fontanesi. "Experimental Validation of a 3D-CFD Model of a PEM Fuel Cell." E3S Web of Conferences 197 (2020): 05004. http://dx.doi.org/10.1051/e3sconf/202019705004.

Full text
Abstract:
The growing energy demand is inevitably accompanied by a strong increase in greenhouse gas emissions, primarily carbon dioxide. The adoption of new energy vectors is therefore seen as the most promising countermeasure. In this context, hydrogen is an extremely interesting energy carrier, since it can be used as a fuel both in conventional energy systems (internal combustion engines, turbines) and in Fuel Cells (FC). In particular, PEM (Polymeric Electrolyte Membrane) FCs are receiving growing attention in the transportation sector as a Life-Cycle viable solution for sustainable mobility. The use of 3D CFD analysis for the development of efficient FC architectures is extremely interesting, since it can provide a fast development tool for design exploration and optimization. The designer can therefore take advantage of robust and accurate modelling to define and develop fuel cell systems in a more time- and cost-efficient way, to optimize their performance and to lower their production costs. So far, studies available in the scientific literature lack quantitative validation of CFD simulations of complete PEM fuel cells against experimental evidence. The proposed study presents a quantitative validation of a multi-physics model of a Clearpak PEM cell. The chemistry and physics implemented in the methodology allow the authors to obtain both thermal and electrical results, characterizing the performance of each component of the PEM. The results obtained, compared with the experimental polarization curve, show that the model is not only numerically stable and robust in terms of boundary conditions, but also capable of accurately characterizing the performance of the PEM cell over almost its entire polarization range.
APA, Harvard, Vancouver, ISO, and other styles
35

Eady, Matthew, and Bosoon Park. "An Unsupervised Prediction Model for Salmonella Detection with Hyperspectral Microscopy: A Multi-Year Validation." Applied Sciences 11, no. 3 (January 20, 2021): 895. http://dx.doi.org/10.3390/app11030895.

Full text
Abstract:
Hyperspectral microscope images (HMIs) have been previously explored as a tool for the early and rapid detection of common foodborne pathogenic bacteria. A robust unsupervised classification approach to differentiate bacterial species with the potential for single cell sensitivity is needed for real-world application, in order to confirm the identity of pathogenic bacteria isolated from a food product. Here, a one-class soft independent modelling of class analogy (SIMCA) was used to determine if individual cells are Salmonella positive or negative. The model was constructed and validated with a spectral library built over five years, containing 13 Salmonella serotypes and 14 non-Salmonella foodborne pathogens. An image processing method designed to take less than one minute paired with the one-class Salmonella prediction algorithm resulted in an overall classification accuracy of 95.4%, with a Salmonella sensitivity of 0.97, and specificity of 0.92. SIMCA’s prediction accuracy was only achieved after a robust model incorporating multiple serotypes was established. These results demonstrate the potential for HMI as a sensitive and unsupervised presumptive screening method, moving towards the early (<8 h) and rapid (<1 h) identification of Salmonella from food matrices.
APA, Harvard, Vancouver, ISO, and other styles
36

Dretzke, Janine, Naomi Chuchu, Ridhi Agarwal, Clare Herd, Winnie Chua, Larissa Fabritz, Susan Bayliss, et al. "Predicting recurrent atrial fibrillation after catheter ablation: a systematic review of prognostic models." EP Europace 22, no. 5 (March 30, 2020): 748–60. http://dx.doi.org/10.1093/europace/euaa041.

Full text
Abstract:
Abstract Aims We assessed the performance of models (risk scores) for predicting recurrence of atrial fibrillation (AF) in patients who have undergone catheter ablation. Methods and results Systematic searches of bibliographic databases were conducted (November 2018). Studies were eligible for inclusion if they reported the development, validation, or impact assessment of a model for predicting AF recurrence after ablation. Model performance (discrimination and calibration) measures were extracted. The Prediction Study Risk of Bias Assessment Tool (PROBAST) was used to assess risk of bias. Meta-analysis was not feasible due to clinical and methodological differences between studies, but c-statistics were presented in forest plots. Thirty-three studies developing or validating 13 models were included; eight studies compared two or more models. Common model variables were left atrial parameters, type of AF, and age. Model discriminatory ability was highly variable and no model had consistently poor or good performance. Most studies did not assess model calibration. The main risk-of-bias concern was the lack of internal validation, which may have resulted in overly optimistic and/or biased model performance estimates. No model impact studies were identified. Conclusion Our systematic review suggests that clinical risk prediction of AF after ablation has potential, but there remains a need for robust evaluation of risk factors and development of risk scores.
APA, Harvard, Vancouver, ISO, and other styles
37

Lee, K.-H. "A robust structural design method using the Kriging model to define the probability of design success." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 224, no. 2 (February 1, 2010): 379–88. http://dx.doi.org/10.1243/09544062jmes1736.

Full text
Abstract:
In this study, a robust optimization method is proposed by introducing the Kriging approximation model and defining the probability of design success. A key problem in robust optimization is that the mean and the variation of a response cannot be calculated easily. This research presents an implementation of the approximate statistical moment method based on the Kriging metamodel. Furthermore, statistics using the second-order statistical approximation method are adopted to avoid local robust optima. Thus, the probability of design success, which is defined as the probability of satisfying the imposed design requirements, is represented as a function of the approximate mean and variance. The formulation for the robust optimization can then be defined in terms of the probability of design success of each response. A mathematical problem and the design problems of a two-bar structure and a microgyroscope are investigated for the validation of the proposed method.
APA, Harvard, Vancouver, ISO, and other styles
38

Chang, Liang-Wey, and K. P. Gannon. "A Dynamic Model on a Single-Link Flexible Manipulator." Journal of Vibration and Acoustics 112, no. 1 (January 1, 1990): 138–43. http://dx.doi.org/10.1115/1.2930089.

Full text
Abstract:
An enhanced Equivalent Rigid Link System (ERLS) model using natural-mode shape functions for flexible manipulators is developed. An experimental validation of the model is performed on a single-link manipulator. Lagrangian dynamics and the Finite Element Method are used to derive the equations of motion. Joint variables and nodal displacements are chosen as generalized coordinates. The model describes the dynamic behavior of manipulator systems well and allows for applications in designing robust motion control.
APA, Harvard, Vancouver, ISO, and other styles
39

Oomen, Tom, Robbert van Herpen, and Okko Bosgra. "Robust-Control-Relevant Coprime Factor Identification with Application to Model Validation of a Wafer Stage." IFAC Proceedings Volumes 42, no. 10 (2009): 1044–49. http://dx.doi.org/10.3182/20090706-3-fr-2004.00173.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Hamel, Owen S., Kevin R. Piner, and John R. Wallace. "A Robust Deterministic Model Describing the Bomb Radiocarbon Signal for Use in Fish Age Validation." Transactions of the American Fisheries Society 137, no. 3 (May 2008): 852–59. http://dx.doi.org/10.1577/t07-144.1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Savkin, Andrey V., and Ian R. Petersen. "Model Validation for Robust Control of Uncertain Continuous-Time Systems with Discrete and Continuous Measurements." IFAC Proceedings Volumes 29, no. 1 (June 1996): 4126–31. http://dx.doi.org/10.1016/s1474-6670(17)58327-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Savkin, Andrey V., and Ian R. Petersen. "Robust filtering and model validation for uncertain continuous-time systems with discrete and continuous measurements." International Journal of Control 69, no. 1 (January 1998): 163–74. http://dx.doi.org/10.1080/002071798222983.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Errouissi, Rachid, S. M. Muyeen, Ahmed Al-Durra, and Siyu Leng. "Experimental Validation of a Robust Continuous Nonlinear Model Predictive Control Based Grid-Interlinked Photovoltaic Inverter." IEEE Transactions on Industrial Electronics 63, no. 7 (July 2016): 4495–505. http://dx.doi.org/10.1109/tie.2015.2508920.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Caracciolo, Roberto, Dario Richiedei, and Alberto Trevisani. "Experimental validation of a model-based robust controller for multi-body mechanisms with flexible links." Multibody System Dynamics 20, no. 2 (July 23, 2008): 129–45. http://dx.doi.org/10.1007/s11044-008-9113-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Ninomiya, Kenta, Hidetaka Arimura, Wai Yee Chan, Kentaro Tanaka, Shinichi Mizuno, Nadia Fareeda Muhammad Gowdh, Nur Adura Yaakup, Chong-Kin Liam, Chee-Shee Chai, and Kwan Hoong Ng. "Robust radiogenomics approach to the identification of EGFR mutations among patients with NSCLC from three different countries using topologically invariant Betti numbers." PLOS ONE 16, no. 1 (January 11, 2021): e0244354. http://dx.doi.org/10.1371/journal.pone.0244354.

Full text
Abstract:
Objectives To propose a novel robust radiogenomics approach to the identification of epidermal growth factor receptor (EGFR) mutations among patients with non-small cell lung cancer (NSCLC) using Betti numbers (BNs). Materials and methods Contrast-enhanced computed tomography (CT) images of 194 multi-racial NSCLC patients (79 EGFR mutants and 115 wildtypes) were collected from three different countries using five manufacturers' scanners with a variety of scanning parameters. Ninety-nine cases obtained from the University of Malaya Medical Centre (UMMC) in Malaysia were used for training and validation procedures. Forty-one cases collected from the Kyushu University Hospital (KUH) in Japan and fifty-four cases obtained from The Cancer Imaging Archive (TCIA) in America were used for a test procedure. Radiomic features were obtained from BN maps, which represent topologically invariant heterogeneous characteristics of lung cancer on CT images, by applying histogram- and texture-based feature computations. A BN-based signature was determined using support vector machine (SVM) models with the best combination of features that maximized a robustness index (RI), defined to favor a higher total area under the receiver operating characteristic curves (AUCs) and a lower difference in AUC between the training and the validation sets. The SVM model was built using the signature and optimized in a five-fold cross-validation. The BN-based model was compared to conventional original-image (OI)- and wavelet-decomposition (WD)-based models with respect to the RI between the validation and the test. Results The BN-based model showed a higher RI of 1.51 compared with the models based on the OI (RI: 1.33) and the WD (RI: 1.29). Conclusion The proposed model showed higher robustness than the conventional models in the identification of EGFR mutations among NSCLC patients. The results suggest the robustness of the BN-based approach against variations in image scanners and scanning parameters.
APA, Harvard, Vancouver, ISO, and other styles
46

Butts, Christine A., John A. Monro, and Paul J. Moughan. "In vitro determination of dietary protein and amino acid digestibility for humans." British Journal of Nutrition 108, S2 (August 2012): S282–S287. http://dx.doi.org/10.1017/s0007114512002310.

Full text
Abstract:
The development, refinement and validation of in vitro digestibility assays for dietary protein and amino acids for single-stomached mammals are reviewed. The general principles of in vitro digestibility assays and their limitations are discussed. In vitro protein digestibility assays must be accurate, rapid, cheap, simple, robust, adaptable and relevant to the processes of digestion, absorption, and metabolism. Simple in vitro methods have the potential to give useful measures of in vivo amino acid and protein digestibility for humans. In vitro methods, including the complex multi-component models of digestion simulating the various physical and chemical processes, require independent validation with in vivo data from the target species or an acceptable animal model using the most appropriate in vivo measure of digestibility. For protein sources devoid of anti-nutritional factors or plant fibre, true ileal digestibility is the recommended in vivo baseline, while for plant proteins the recommended in vivo assay is real ileal digestibility. More published comparative studies are required to adequately validate in vitro digestibility assays.
APA, Harvard, Vancouver, ISO, and other styles
47

Järvinen, Mika, Ville Valtteri Visuri, Sauli Pisilä, Aki Kärnä, Petri Sulasalmi, Eetu Pekka Heikkinen, and Timo Fabritius. "Advanced Methods in Modelling of Metallurgical Unit Operations." Materials Science Forum 762 (July 2013): 236–41. http://dx.doi.org/10.4028/www.scientific.net/msf.762.236.

Full text
Abstract:
This paper summarizes and discusses our recent work on modelling of several steelmaking processes. The work started by developing a detailed sub-model for a single gas bubble reacting in liquid steel. The key feature in this model was an approach based on LOMA, the Law of Mass Action, which was employed for defining the chemical rate of a reaction in a robust way. The bubble reaction model was then coupled with a new simulator concept for the AOD process, Argon-Oxygen Decarburization. After a successful validation, the same approach was used to model chemical reactions and chemical heating of liquid steel in the CAS-OB process, Composition Adjustment by Sealed Argon Bubbling with Oxygen Blowing, using a supersonic lance. Finally, a new model was developed and implemented into the existing AOD process model for slag reduction with slag droplets. The purpose of this paper is to present a generalised framework for applying and validating the LOMA approach in the modelling of metallurgical unit operations. In addition, the use of Computational Fluid Dynamics (CFD) in the validation and verification work is discussed.
APA, Harvard, Vancouver, ISO, and other styles
48

Punzo, Vincenzo, and Fulvio Simonelli. "Analysis and Comparison of Microscopic Traffic Flow Models with Real Traffic Microscopic Data." Transportation Research Record: Journal of the Transportation Research Board 1934, no. 1 (January 2005): 53–63. http://dx.doi.org/10.1177/0361198105193400106.

Full text
Abstract:
The ever more widespread use of microscopic traffic simulation in the analysis of road systems has refocused attention on submodels, including car-following models. The difficulties of microscopic-level simulation models in accurately reproducing real traffic phenomena stem not only from the complexity of calibration and validation operations but also from the structural inadequacies of the submodels themselves. Both of these drawbacks originate from the scant information available on real phenomena because of the difficulty of gathering accurate field data. In this study, the use of kinematic differential Global Positioning System instruments allowed the trajectories of four vehicles in a platoon to be accurately monitored under real traffic conditions on both urban and extraurban roads. Some of these data were used to analyze the behaviors of four microscopic traffic flow models that differed greatly in both approach and complexity. The effect of the choice of performance measures on the model calibration results was first investigated, and intervehicle spacing was shown to be the most reliable measure. Model calibrations showed results similar to those obtained in other studies that used test track data. Validations, however, resulted in higher deviations than those reported in previous studies (with peaks in cross-validations between urban and extraurban experiments). This confirms the need for real traffic data. On comparison of the models, all showed similar performances (i.e., similar deviations in validation). Surprisingly, however, the simplest model performed better on average than the others, while the most complex one was the most robust, never reaching particularly high deviations.
APA, Harvard, Vancouver, ISO, and other styles
49

Shaheryar, Ahmad, Xu-Cheng Yin, Hong-Wei Hao, Hazrat Ali, and Khalid Iqbal. "A Denoising Based Autoassociative Model for Robust Sensor Monitoring in Nuclear Power Plants." Science and Technology of Nuclear Installations 2016 (2016): 1–17. http://dx.doi.org/10.1155/2016/9746948.

Full text
Abstract:
Sensor health monitoring is essential for the reliable functioning of safety-critical chemical and nuclear power plants. Autoassociative neural network (AANN) based empirical sensor models have been widely reported for sensor calibration monitoring. However, such ill-posed data-driven models may result in poor generalization and robustness. To address the above-mentioned issues, several regularization heuristics such as training with jitter, weight decay, and cross-validation are suggested in the literature. Apart from these regularization heuristics, traditional error-gradient-based supervised learning algorithms for multilayered AANN models are highly susceptible to being trapped in local optima. In order to address the poor regularization and robust learning issues, we propose here a denoised autoassociative sensor model (DAASM) based on a deep learning framework. The proposed DAASM comprises multiple hidden layers which are pretrained greedily in an unsupervised fashion under a denoising autoencoder architecture. In order to improve robustness, the dropout heuristic and domain-specific data corruption processes are exercised during the unsupervised pretraining phase. The proposed sensor model is trained and tested on sensor data from a PWR-type nuclear power plant. Accuracy, autosensitivity, spillover, and sequential probability ratio test (SPRT) based fault detectability metrics are used for performance assessment and comparison with the extensively reported five-layer AANN model by Kramer.
APA, Harvard, Vancouver, ISO, and other styles
50

Huang, Yingchun, and Andras Bardossy. "Impacts of Data Quantity and Quality on Model Calibration: Implications for Model Parameterization in Data-Scarce Catchments." Water 12, no. 9 (August 21, 2020): 2352. http://dx.doi.org/10.3390/w12092352.

Full text
Abstract:
The application of hydrological models in data-scarce catchments is usually limited by the amount of available data. It is of great significance to investigate the impacts of data quantity and quality on model calibration, as well as to further improve the understanding of the effective estimation of robust model parameters. How to make adequate use of external information to identify model parameters of data-scarce catchments is also worthy of further exploration. The HBV (Hydrologiska Byråns Vattenbalansavdelning) model was used to simulate streamflow at 15 catchments using input data of different lengths. The transferability of all calibrated model parameters was evaluated for two validation periods. A simultaneous calibration approach was proposed for data-scarce catchments that uses data from the spatially closest catchment. The results indicate that the transferability of model parameters increases with the amount of data used for calibration. The sensitivity to data length in calibration varies between the study catchments, while flood events have a key impact on surface runoff parameters. In general, ten years of data are relatively sufficient to obtain robust parameters. For data-scarce catchments, simultaneous calibration with a neighboring catchment may yield more reliable parameters than using only the limited data.
APA, Harvard, Vancouver, ISO, and other styles