Academic literature on the topic 'Random measurement error'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Random measurement error.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Random measurement error"

1

Kroc, Edward. "Generalized measurement error: Intrinsic and incidental measurement error." PLOS ONE 18, no. 6 (June 29, 2023): e0286680. http://dx.doi.org/10.1371/journal.pone.0286680.

Full text
Abstract:
In this paper, we generalize the notion of measurement error on deterministic sample datasets to accommodate sample data that are random-variable-valued. This leads to the formulation of two distinct kinds of measurement error: intrinsic measurement error, and incidental measurement error. Incidental measurement error will be recognized as the traditional kind that arises from a set of deterministic sample measurements, and upon which the traditional measurement error modelling literature is based, while intrinsic measurement error reflects some subjective quality of either the measurement tool or the measurand itself. We define calibrating conditions that generalize common and classical types of measurement error models to this broader measurement domain, and explain how the notion of generalized Berkson error in particular mathematicizes what it means to be an expert assessor or rater for a measurement process. We then explore how classical point estimation, inference, and likelihood theory can be generalized to accommodate sample data composed of generic random-variable-valued measurements.
APA, Harvard, Vancouver, ISO, and other styles
2

Madan, Hennadii, Franjo Pernuš, and Žiga Špiclin. "Reference-free error estimation for multiple measurement methods." Statistical Methods in Medical Research 28, no. 7 (January 31, 2018): 2196–209. http://dx.doi.org/10.1177/0962280217754231.

Abstract:
We present a computational framework to select the most accurate and precise method of measurement of a certain quantity when there is no access to the true value of the measurand. A typical use case is when several image analysis methods are applied to measure the value of a particular quantitative imaging biomarker from the same images. The accuracy of each measurement method is characterized by systematic error (bias), modeled as a polynomial in the true values of the measurand, and the precision by random error, modeled as a Gaussian random variable. In contrast to previous works, the random errors are modeled jointly across all methods, enabling the framework to analyze measurement methods based on similar principles, which may have correlated random errors. Furthermore, the posterior distribution of the error model parameters is estimated from samples obtained by Markov chain Monte Carlo and analyzed to estimate the parameter values and the unknown true values of the measurand. The framework was validated on six synthetic datasets and one clinical dataset containing measurements of total lesion load, a biomarker of neurodegenerative diseases, obtained with four automatic methods by analyzing brain magnetic resonance images. The estimates of bias and random error were in good agreement with the corresponding least-squares regression estimates against a reference.
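The error model in this abstract can be sketched in a few lines of Python. Everything below (sample size, bias coefficients, covariance) is illustrative, and the check uses plain least squares against the simulated truth rather than the paper's reference-free MCMC estimation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true values of the measurand.
x = rng.uniform(10, 50, size=500)

# Each method m reports y_m = a_m + b_m * x + e_m; the random errors e
# are drawn jointly, so methods 0 and 1 have correlated random errors.
a = np.array([1.0, -2.0, 0.5])           # bias intercepts
b = np.array([1.00, 1.05, 0.95])         # bias slopes
cov = np.array([[1.0, 0.6, 0.0],
                [0.6, 1.0, 0.0],
                [0.0, 0.0, 2.0]])        # joint random-error covariance
e = rng.multivariate_normal(np.zeros(3), cov, size=x.size)
y = a + b * x[:, None] + e               # 500 measurements by 3 methods

# With a reference (here the simulated truth), each method's bias is
# recoverable by ordinary least squares, as in the paper's validation.
for m in range(3):
    slope, intercept = np.polyfit(x, y[:, m], 1)
    print(f"method {m}: intercept ≈ {intercept:+.2f}, slope ≈ {slope:.2f}")
```

The reference-free setting replaces the known `x` with unknowns estimated jointly with the error model parameters.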
3

Müller, Andreas Michael, and Tino Hausotte. "Analysis of the random measurement error of areal 3D coordinate measurements exclusively based on measurement repetitions." tm - Technisches Messen 88, no. 2 (January 27, 2021): 71–77. http://dx.doi.org/10.1515/teme-2020-0087.

Abstract:
The measurement uncertainty characteristics of a measurement system are an important parameter when evaluating the suitability of a certain measurement system for a specific measurement task. The measurement uncertainty can be calculated from observed measurement errors, which consist of both systematic and random components. While the unfavourable influence of systematic components can be compensated by calibration, random components are inherently not correctable. Various measurement principles are affected by different measurement error characteristics depending on specific properties of the measurement task, e.g. the optical surface properties of the measurement object when using fringe projection, or the material properties when using industrial X-ray computed tomography. It can therefore be helpful in certain scenarios to determine the spatial distribution of the acquisition quality and the uncertainty characteristics across the captured surface for a given measurement task. This article demonstrates a methodology to determine the random measurement error solely from a series of measurement repetitions, without the need for additional information such as a reference measurement or the nominal geometry of the examined part.
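The core idea of the methodology, that random error is recoverable from repetitions alone because the systematic component is constant across repetitions and cancels out of the spread, can be illustrated with a hypothetical simulation (all dimensions and noise levels invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical surface of 1000 measured points, captured in 20 repetitions.
n_points, n_reps = 1000, 20
true_surface = rng.normal(0.0, 1.0, n_points)     # nominal geometry (unknown)
noise_sd = 0.05                                   # per-point random error
scans = true_surface + rng.normal(0.0, noise_sd, (n_reps, n_points))

# The per-point sample standard deviation across repetitions estimates the
# random error without any reference measurement: a constant systematic
# offset at a point would shift every repetition equally and drop out.
random_error = scans.std(axis=0, ddof=1)
print(f"median estimated random error: {np.median(random_error):.4f}")
```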
4

Hutcheon, J. A., A. Chiolero, and J. A. Hanley. "Random measurement error and regression dilution bias." BMJ 340, jun23 2 (June 23, 2010): c2289. http://dx.doi.org/10.1136/bmj.c2289.

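Regression dilution, the subject of this BMJ statistics note, is easy to reproduce in simulation: classical random error in a predictor attenuates the fitted slope by the reliability ratio var(x) / (var(x) + var(error)). A sketch with illustrative numbers:

```python
import numpy as np

rng = np.random.default_rng(2)

n = 200_000
x = rng.normal(0.0, 2.0, n)             # true exposure, variance 4
y = 3.0 * x + rng.normal(0.0, 1.0, n)   # true slope = 3
x_obs = x + rng.normal(0.0, 1.0, n)     # classical measurement error, variance 1

# Regressing y on the error-prone predictor attenuates the slope by the
# reliability ratio 4 / (4 + 1) = 0.8, so we expect roughly 3 * 0.8 = 2.4.
slope_true = np.polyfit(x, y, 1)[0]
slope_diluted = np.polyfit(x_obs, y, 1)[0]
print(f"true slope ≈ {slope_true:.2f}, diluted slope ≈ {slope_diluted:.2f}")
```

Note that random error in the outcome `y` would widen the confidence interval but not bias the slope; the dilution comes specifically from error in the predictor.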
5

Leinonen, Jaakko, Eero Laakkonen, and Leila Laatikainen. "Random measurement error in visual acuity measurement in clinical settings." Acta Ophthalmologica Scandinavica 83, no. 3 (June 8, 2005): 328–32. http://dx.doi.org/10.1111/j.1600-0420.2005.00469.x.

6

Zhai, Xiaochun, Songhua Wu, Bingyi Liu, Xiaoquan Song, and Jiaping Yin. "Shipborne Wind Measurement and Motion-induced Error Correction of a Coherent Doppler Lidar over the Yellow Sea in 2014." Atmospheric Measurement Techniques 11, no. 3 (March 5, 2018): 1313–31. http://dx.doi.org/10.5194/amt-11-1313-2018.

Abstract:
Shipborne wind observations by a coherent Doppler lidar (CDL) were conducted to study the structure of the marine atmospheric boundary layer (MABL) during the 2014 Yellow Sea campaign. This paper evaluates uncertainties associated with ship motion and presents a correction methodology for the lidar velocity measurement based on a modified 4-Doppler-beam-swing (DBS) solution. The errors of the calibrated measurements, for both anchored and cruising shipborne observations, are comparable to those of ground-based measurements. Comparison between the lidar and radiosonde yields a bias of −0.23 m s−1 and a standard deviation of 0.87 m s−1 for the wind speed measurement, and a bias of 2.48° and a standard deviation of 8.84° for the wind direction. The biases of horizontal wind speed and the random errors of vertical velocity are also estimated using error propagation theory and frequency spectrum analysis, respectively. The results show that the biases are mainly related to the measuring error of the ship velocity and the lidar pointing error, while the random errors are mainly determined by the signal-to-noise ratio (SNR) of the lidar backscattering spectrum signal. This allows the retrieval of vertical wind from a single measurement with random error below 0.15 m s−1 and bias below 0.02 m s−1 for an appropriate SNR threshold. The combination of the CDL attitude correction system and the accurate motion correction process has the potential to provide continuous, long-term, high temporal and spatial resolution measurement of MABL thermodynamic and turbulence processes.
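For orientation, the textbook 4-beam Doppler-beam-swing retrieval that the paper's motion-corrected solution modifies can be written out as follows. The beam tilt and wind values are illustrative, and no ship-motion terms are included:

```python
import numpy as np

# Four beams tilted off-zenith by theta toward E, W, N and S.
theta = np.deg2rad(15.0)
u, v, w = 5.0, -3.0, 0.1                  # true wind components (m/s)

def radial_velocities(u_, v_, w_):
    s, c = np.sin(theta), np.cos(theta)
    return np.array([ u_ * s + w_ * c,    # east beam
                     -u_ * s + w_ * c,    # west beam
                      v_ * s + w_ * c,    # north beam
                     -v_ * s + w_ * c])   # south beam

vr = radial_velocities(u, v, w)

# Opposite-beam differences isolate the horizontal components; the mean
# of all four radial velocities isolates the vertical component.
u_hat = (vr[0] - vr[1]) / (2 * np.sin(theta))
v_hat = (vr[2] - vr[3]) / (2 * np.sin(theta))
w_hat = vr.mean() / np.cos(theta)
print(u_hat, v_hat, w_hat)                # recovers (5.0, -3.0, 0.1)
```

On a ship, the platform's translation and attitude add extra terms to each radial velocity, which is what the paper's correction removes before this inversion.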
7

Gertner, George Z. "The sensitivity of measurement error in stand volume estimation." Canadian Journal of Forest Research 20, no. 6 (June 1, 1990): 800–804. http://dx.doi.org/10.1139/x90-105.

Abstract:
A method is given for approximating and evaluating the consequences of random and nonrandom errors in the independent variables of a nonlinear tree volume function that is used in the estimation of stand volume based on a simple random sample of plots. Sampling error, regression function error, and measurement error are accounted for with the method presented. An application is given where relatively moderate amounts of measurement error in the independent variables of a tree volume function can cause a relatively large reduction in the accuracy of estimated stand volume.
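A hypothetical Monte Carlo sketch of the effect the paper quantifies analytically: random error in the independent variables of a nonlinear volume function propagates into, and can even bias, the volume estimate. The allometric model and all constants below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented allometric tree volume function v = a * dbh^b * h^c.
a, b, c = 5e-5, 2.0, 1.0

def volume(dbh, height):
    return a * dbh**b * height**c

true_dbh, true_h = 30.0, 20.0                    # cm, m
n = 100_000
dbh_meas = true_dbh + rng.normal(0.0, 2.0, n)    # random error in diameter
h_meas = true_h + rng.normal(0.0, 1.0, n)        # random error in height

v = volume(dbh_meas, h_meas)
bias = v.mean() - volume(true_dbh, true_h)
# The function is convex in dbh, so zero-mean random error in dbh alone
# inflates the mean estimated volume (Jensen's inequality).
print(f"true {volume(true_dbh, true_h):.3f} m^3, "
      f"mean estimate {v.mean():.3f} m^3, bias {bias:+.4f} m^3")
```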
8

Zhou, Jun Wei, Lin He, and Rong Wu Xu. "Typical Errors Analysis in Frequency Response Function Measurement." Applied Mechanics and Materials 419 (October 2013): 470–76. http://dx.doi.org/10.4028/www.scientific.net/amm.419.470.

Abstract:
FRF measurements can suffer from various errors, and the effect of deterministic errors can become more prominent than that of random errors. Excitation and sensor misalignment is the most common source of deterministic error, so a mathematical model is established and its effect on FRF estimation is analyzed for sensor and excitation misalignment situations. Finite element simulation reveals that misalignment error has the least effect on the dominant FRFs and a stronger effect on lesser FRFs; besides that, it also results in the appearance of false peaks in the measured FRFs.
9

Rigdon, Edward E. "Demonstrating the effects of unmodeled random measurement error." Structural Equation Modeling: A Multidisciplinary Journal 1, no. 4 (January 1994): 375–80. http://dx.doi.org/10.1080/10705519409539986.

10

Hasebe, Satoshi. "Random Measurement Error in Assessing Compensatory Ocular Countertorsion." Archives of Ophthalmology 121, no. 12 (December 1, 2003): 1805. http://dx.doi.org/10.1001/archopht.121.12.1805.


Dissertations / Theses on the topic "Random measurement error"

1

Zhou, Tingxian, Xiaohua Yin, and Xianming Zhao. "A New Error Control Scheme for Remote Control System." International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/611658.

Abstract:
International Telemetering Conference Proceedings / October 17-20, 1994 / Town & Country Hotel and Conference Center, San Diego, California
How to raise the reliability of data transmission is one of the main problems faced by modern digital communication designers. This paper studies error-correcting codes suitable for channels exhibiting both random and burst errors. A new error control scheme is given. The scheme is a concatenated coding system using an interleaved Reed-Solomon code with symbols over GF(2^4) as the outer code and a Viterbi-decoded convolutional code as the inner code. Computer simulation shows that the concatenated coding system achieves a very low bit error rate (BER) and can correct many compound error patterns. It is suitable for severely disturbed channels exhibiting both random and burst errors. This scheme will be adopted for a remote control system.
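The role of interleaving in such a concatenated scheme can be shown with a toy block interleaver (a sketch of the general idea, not the thesis design): a burst on the channel is spread out so that each outer codeword sees at most one symbol error, which a Reed-Solomon code corrects easily.

```python
import numpy as np

rows, cols = 4, 8                       # 4 outer codewords of 8 symbols each
symbols = np.arange(rows * cols)

# Interleave: write the codewords row by row, transmit column by column.
interleaved = symbols.reshape(rows, cols).T.ravel()

# A burst of 4 consecutive channel errors hits transmit positions 8..11.
burst = range(8, 12)
hit_rows = {int(interleaved[i]) // cols for i in burst}

# De-interleave: the receiver inverts the write/read order exactly.
deinterleaved = interleaved.reshape(cols, rows).T.ravel()

print("burst hits codewords:", sorted(hit_rows))   # one error per codeword
print("order restored:", (deinterleaved == symbols).all())
```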
2

Häggström, Lundevaller Erling. "Tests of random effects in linear and non-linear models." Doctoral thesis, Umeå universitet, Statistik, 2002. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-15.

3

Varughese, Suma. "A Study On Effects Of Phase - Amplitude Errors In Planar Near Field Measurement Facility." Thesis, Indian Institute of Science, 2000. https://etd.iisc.ac.in/handle/2005/231.

Abstract:
The antenna is an indispensable part of a radar or free-space communication system, and different applications impose different stringent specifications on it. Once designed and fabricated for an intended application, an antenna or antenna array has to be evaluated for its far-field characteristics in a real free-space environment, which requires setting up a far-field test site. Maintaining the site to keep stray reflection levels low, and the cost of the real estate, are some of the disadvantages. Near-field measurements are compact and can be used to test antennas by exploiting the relationship between the near field and the far field. It is shown that the far-field patterns of an antenna can be predicted sufficiently accurately provided the near-field measurements are accurate. Due to limitations in near-field measurement systems, errors creep in and corrupt the measured near-field data, causing errors in the predicted far field. All these errors ultimately corrupt the phase and amplitude data. This thesis discusses one such facility, the Planar Near Field Measurement facility, its limitations, and the errors that arise from them. The investigations aim at a detailed study of these phase and amplitude errors and their effect on the far-field patterns of the antenna. Depending on the source, the errors are classified as spike, pulse, and random errors. The locations at which these errors occur in the measurement plane and their effects on the far field are studied for both phase and amplitude errors. The studies show that the near-field phase and amplitude data are more tolerant to random errors, as the far-field patterns are not affected even for low-sidelobe cases.
Spike errors, though they occur as a wedge at a single point in the measurement plane, have a more pronounced effect on the far-field patterns; the lower the taper value of the antenna, the more pronounced the error. It is also noticed that the far-field pattern is affected only in the plane where the error occurred and not in the orthogonal plane. Pulse-type errors, which occur even over a short length of the measurement, affect both principal-plane far-field patterns. This study can be used extensively as a tool to determine the level to which various errors, such as mechanical and RF errors, need to be controlled to make useful and correct pattern predictions on a particular facility. Thereby, the study can be used to economise the budget of the facility, since the parameters required for building it need not be over-specified beyond the requirement. In general, though this is a limited study, it is certainly a trendsetter in this direction.
5

Bručas, Domantas. "Geodezinių kampų matavimo prietaisų kalibravimo įrangos kūrimas ir tyrimas." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2008. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2008~D_20080620_092018-51641.

Abstract:
The main idea of this PhD thesis is an accuracy analysis of the testing and calibration of geodetic instruments. The object of investigation is an analysis of means and methods for testing and calibrating geodetic instruments for plane angle measurement, the development of such calibration equipment, the investigation of its accuracy, and research into the possibilities of increasing that accuracy. These objects are important for the successful testing or calibration of geodetic angle-measuring instruments, which is essential in ensuring the precision of measurements taken in surveying, construction, mechanical engineering, etc. The work has several main goals. The first is an analysis of the angle measuring methods and devices suitable for the testing and calibration of geodetic instruments; based on its results, the second task is the creation of multi-reference plane angle testing and calibration equipment at the Institute of Geodesy, Vilnius Gediminas Technical University, and the investigation of its accuracy parameters. The third task is to investigate the possibilities of increasing the accuracy of the equipment and to implement some of them in practice. The thesis consists of four chapters, an introduction, conclusions, a list of references, and appendixes. The introduction presents the problem and its topicality; it also formulates the purposes and tasks of the work, the methods used, and the novelty of... [to full text]
6

Olson, Brent. "Evaluating the error of measurement due to categorical scaling with a measurement invariance approach to confirmatory factor analysis." Thesis, University of British Columbia, 2008. http://hdl.handle.net/2429/332.

Abstract:
It has previously been determined that using 3 or 4 points on a categorized response scale will fail to produce a continuous distribution of scores. However, there is no evidence, thus far, revealing the number of scale points that may indeed possess an approximate or sufficiently continuous distribution. This study provides the evidence to suggest the level of categorization in discrete scales that makes them directly comparable to continuous scales in terms of their measurement properties. To do this, we first introduced a novel procedure for simulating discretely scaled data that was both informed and validated through the principles of the Classical True Score Model. Second, we employed a measurement invariance (MI) approach to confirmatory factor analysis (CFA) in order to directly compare the measurement quality of continuously scaled factor models to that of discretely scaled models. The simulated design conditions of the study varied with respect to item-specific variance (low, moderate, high), random error variance (none, moderate, high), and discrete scale categorization (number of scale points ranged from 3 to 101). A population analogue approach was taken with respect to sample size (N = 10,000). We concluded that there are conditions under which response scales with 11 to 15 scale points can reproduce the measurement properties of a continuous scale. Using response scales with more than 15 points may be, for the most part, unnecessary. Scales having from 3 to 10 points introduce a significant level of measurement error, and caution should be taken when employing such scales. The implications of this research and future directions are discussed.
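The simulation idea in this thesis can be sketched under the Classical True Score Model: generate observed = true + error, discretize onto a k-point scale, and see how much of the continuous score's information survives. The distributions and scale points below are illustrative, not the study's design conditions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Classical True Score Model: observed score = true score + random error.
n = 10_000
true_score = rng.normal(0.0, 1.0, n)
observed = true_score + rng.normal(0.0, 0.5, n)

def categorize(x, k):
    """Map a continuous score onto a k-point scale over its observed range."""
    edges = np.linspace(x.min(), x.max(), k + 1)
    return np.digitize(x, edges[1:-1]) + 1   # scale points 1..k

# Correlation with the underlying continuous score rises with the number
# of scale points; very coarse scales discard the most information.
for k in (3, 5, 11, 101):
    r = np.corrcoef(observed, categorize(observed, k))[0, 1]
    print(f"{k:>3}-point scale: r = {r:.3f}")
```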
7

Bručas, Domantas. "Development and research of the test bench for the angle calibration of geodetic instruments." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2008. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2008~D_20080620_092300-91259.

Abstract:
The main idea of this PhD thesis is an accuracy analysis of the testing and calibration of geodetic instruments. The object of investigation is an analysis of means and methods for testing and calibrating geodetic instruments for plane angle measurement, the development of such calibration equipment, the investigation of its accuracy, and research into the possibilities of increasing that accuracy. These objects are important for the successful testing or calibration of geodetic angle-measuring instruments, which is essential in ensuring the precision of measurements taken in surveying, construction, mechanical engineering, etc. The work has several main goals. The first is an analysis of the angle measuring methods and devices suitable for the testing and calibration of geodetic instruments; based on its results, the second task is the creation of multi-reference plane angle testing and calibration equipment at the Institute of Geodesy, Vilnius Gediminas Technical University, and the investigation of its accuracy parameters. The third task is to investigate the possibilities of increasing the accuracy of the equipment and to implement some of them in practice. The thesis consists of four chapters, an introduction, conclusions, a list of references, and appendixes. The introduction presents the problem and its topicality; it also formulates the purposes and tasks of the work, the methods used, and the novelty of... [to full text]
8

Wang, Ziyang. "Next Generation Ultrashort-Pulse Retrieval Algorithm for Frequency-Resolved Optical Gating: The Inclusion of Random (Noise) and Nonrandom (Spatio-Temporal Pulse Distortions) Error." Diss., Available online, Georgia Institute of Technology, 2005, 2005. http://etd.gatech.edu/theses/available/etd-04122005-224257/unrestricted/wang%5Fziyang%5F200505%5Fphd.pdf.

Abstract:
Thesis (Ph. D.)--Physics, Georgia Institute of Technology, 2005.
You, Li, Committee Member; Buck, John A., Committee Member; Kvam, Paul, Committee Member; Kennedy, Brian, Committee Member; Trebino, Rick, Committee Chair. Vita. Includes bibliographical references.
9

Zhong, Shan. "Measurement calibration/tuning & topology processing in power system state estimation." Texas A&M University, 2003. http://hdl.handle.net/1969.1/1595.

Abstract:
State estimation plays an important role in modern power systems. The errors in the telemetered measurements and the connectivity information of the network will greatly contaminate the estimated system state. This dissertation provides solutions to suppress the influences of these errors. A two-stage state estimation algorithm has been utilized in topology error identification in the past decade. Chapter II discusses the implementation of this algorithm. A concise substation model is defined for this purpose. A friendly user interface that incorporates the two-stage algorithm into the conventional state estimator is developed. The performances of the two-stage state estimation algorithms rely on accurate determination of suspect substations. A comprehensive identification procedure is described in chapter III. In order to evaluate the proposed procedure, a topology error library is created. Several identification methods are comparatively tested using this library. A remote measurement calibration method is presented in chapter IV. The un-calibrated quantities can be related to the true values by the characteristic functions. The conventional state estimation algorithm is modified to include the parameters of these functions. Hence they can be estimated along with the system state variables and used to calibrate the measurements. The measurements taken at different time instants are utilized to minimize the influence of the random errors. A method for auto tuning of measurement weights in state estimation is described in chapter V. Two alternative ways to estimate the measurement random error variances are discussed. They are both tested on simulation data generated based on IEEE systems. Their performances are compared. A comprehensive solution, which contains an initialization process and a recursively updating process, is presented. 
Chapter VI investigates the errors introduced in the positive sequence state estimation due to the usual assumptions of having fully balanced bus loads/generations and continuously transposed transmission lines. Several tests are conducted using different assumptions regarding the availability of single and multi-phase measurements. It is demonstrated that incomplete metering of three-phase system quantities may lead to significant errors in the positive sequence state estimates for certain cases. A novel sequence domain three-phase state estimation algorithm is proposed to solve this problem.
10

Bassani, N. P. "STATISTICAL METHODS FOR THE ASSESSMENT OF WITHIN-PLATFORM REPEATABILITY AND BETWEEN-PLATFORM AGREEMENT IN MICRORNA MICROARRAY TECHNOLOGIES." Doctoral thesis, Università degli Studi di Milano, 2013. http://hdl.handle.net/2434/215072.

Abstract:
Over the last years, miRNA microarray platforms have provided great insights into the biological mechanisms underlying the onset and development of several diseases such as tumours and chronic pathologies. However, only a few studies have evaluated the reliability and concordance of these technologies, mostly by using improper statistical methods. In this project we have studied the performance of three different miRNA microarray platforms (Affymetrix, Agilent and Illumina) in terms of within-platform repeatability by means of a random-effects model, assessing the contribution to overall variability from different technical sources and interpreting the quotas of explained variance as Intraclass Correlation Coefficients. Jointly, Concordance Correlation Coefficients between technical array replicates are estimated and assessed in a non-inferiority setting to further evaluate patterns of platform repeatability. Between-platform concordance has been studied using a modified version of the Bland-Altman plot which makes use of a linear measurement error model to build the agreement intervals. All analyses have been performed on unfiltered and non-normalized data, yet all results have been compared to those obtained after applying standard filtering procedures and quantile and loess normalization algorithms. In this project two biological samples were considered: one tumour cell line, A498, and a pool of healthy human tissues, hREF, with three technical replicates for each platform. Our results suggest a good degree of repeatability for all the technologies considered, whereas only Agilent and Illumina show good patterns of concordance. The proposed methods have the advantage of being very flexible and can be useful for performance assessment of other emerging genomic platforms beyond microarrays, such as RNA-Seq technologies.
APA, Harvard, Vancouver, ISO, and other styles
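The repeatability analysis described in this thesis abstract rests on the intraclass correlation from a random-effects decomposition of replicate measurements. A minimal numpy sketch of the one-way ICC on synthetic data (illustrative only; the thesis's actual model decomposes several technical sources, not just one):

```python
import numpy as np

def icc_oneway(data):
    """One-way random-effects ICC(1,1) from an (n_subjects, k_replicates) array."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)
    # Between-subject and within-subject mean squares (one-way ANOVA)
    msb = k * np.sum((row_means - grand) ** 2) / (n - 1)
    msw = np.sum((data - row_means[:, None]) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Synthetic example: 200 probes, three technical replicates, small technical noise
rng = np.random.default_rng(0)
truth = rng.normal(8.0, 2.0, size=200)                      # probe-level signal
reps = truth[:, None] + rng.normal(0, 0.3, size=(200, 3))   # replicate noise
icc = icc_oneway(reps)   # close to 1 => high repeatability
```

The ratio is the share of total variance attributable to true between-probe differences, which is exactly the "quota of explained variance" interpretation the abstract mentions.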

Books on the topic "Random measurement error"

1

Srivastava, M. S. Economical on-line quality control procedures based on normal random walk model with measurement error. Toronto, Ont: University of Toronto, Dept. of Statistics, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Fearon, James D., and David D. Laitin. Integrating Qualitative and Quantitative Methods. Edited by Janet M. Box-Steffensmeier, Henry E. Brady, and David Collier. Oxford University Press, 2009. http://dx.doi.org/10.1093/oxfordhb/9780199286546.003.0033.

Full text
Abstract:
This article describes how qualitative and quantitative tools can be used jointly to strengthen causal inference. It first outlines the findings of a statistical analysis of civil war onsets. It then addresses the different criteria for choosing which narratives to tell. A method for structuring narratives that is complementary to the statistical work is reported. Next, the article shows, in light of the narrative findings, the incompleteness of the statistical models that were initially run. One narrative is emphasized as an example of the approach's potential yield. It underlines some surprises and advantages of the random narrative approach. There are general lessons to be learned from the random narrative exercise as well. The random narrative method allows the assessment of measurement error for variables that are hard to code reliably across large numbers of cases.
APA, Harvard, Vancouver, ISO, and other styles
3

Halperin, Sandra, and Oliver Heath. 11. Surveys. Oxford University Press, 2017. http://dx.doi.org/10.1093/hepl/9780198702740.003.0011.

Full text
Abstract:
This chapter discusses the principles of survey research as well as the issues and problems associated with different stages of the research design process. In particular, it examines questionnaire design, sample design, and interviewing techniques, along with the common sources of error that affect survey research and what can be done to try to avoid or minimize them. Although surveys have several weaknesses, they are widely used in political research to investigate a wide range of political phenomena. They combine two things: obtaining information from people by asking questions, and random sampling. When done well, surveys provide an accurate and reliable insight into what ordinary people think about politics and how they participate in it. The chapter considers the elements of a survey that need to be addressed, namely: questionnaire design, measurement error, sampling design, sampling error, and interview mode.
APA, Harvard, Vancouver, ISO, and other styles
4

Ślusarski, Marek. Metody i modele oceny jakości danych przestrzennych. Publishing House of the University of Agriculture in Krakow, 2017. http://dx.doi.org/10.15576/978-83-66602-30-4.

Full text
Abstract:
The quality of data collected in official spatial databases is crucial for making strategic decisions as well as for the implementation of planning and design works. Awareness of the quality level of these data is also important for individual users of official spatial data. The author presents methods and models for describing and evaluating the quality of spatial data collected in public registers. Data describing space at the highest level of detail were analysed, collected in three databases: the land and buildings registry (EGiB), the geodetic registry of the land infrastructure network (GESUT), and the database of topographic objects (BDOT500). The research results concerned selected aspects of spatial data quality, including: assessment of the accuracy of data collected in official spatial databases; determination of the uncertainty of the area of registry parcels; analysis of the risk of damage to the underground infrastructure network due to the quality of spatial data; construction of a quality model for data collected in official databases; and visualization of uncertainty in spatial data. The evaluation of the accuracy of data collected in official, large-scale spatial databases was based on a representative sample of data. The test sample was a set of coordinate deviations with three variables, dX, dY and Dl: the deviations of the X and Y coordinates and the length of the offset vector of each test-sample point relative to its position regarded as error-free. The compatibility of the empirical accuracy distributions with models (theoretical distributions of random variables) was investigated, and the accuracy of the spatial data was also assessed by means of methods resistant to outliers.
In determining the accuracy of spatial data collected in public registers, the author's own solution was used: a resistant method based on relative frequency. Weight functions were proposed that modify, to varying degrees, the sizes of the vectors Dl, the lengths of the offset vectors of the test-sample points relative to their positions regarded as error-free. Regarding the uncertainty of the estimated area of registry parcels, the impact of errors in the geodetic network points (reference points and points of the higher-class networks) was determined, as was the effect of correlation between the coordinates of the same point on the accuracy of the determined plot area. The scope of corrections to plot areas in the EGiB database was determined, calculated on the basis of re-measurements performed using techniques equivalent in terms of accuracy. The analysis of the risk of damage to the underground infrastructure network due to low-quality spatial data is another research topic presented in the paper. Three main factors were identified that influence the value of this risk: incompleteness of spatial data sets and insufficient accuracy in determining the horizontal and vertical positions of underground infrastructure. A method for estimating project risk, both quantitative and qualitative, was developed, and the author's risk-estimation technique, based on fuzzy logic, was proposed. Maps (2D and 3D) of the risk of damage to the underground infrastructure network were developed as large-scale thematic maps presenting the design risk in qualitative and quantitative form. The data quality model is a set of rules used to describe the quality of these data sets. The proposed model defines a standardized approach for assessing and reporting the quality of the EGiB, GESUT and BDOT500 spatial databases.
Quantitative and qualitative rules (automatic, office and field) for data set control were defined. The minimum sample size and the number of admissible nonconformities in random samples were determined. The data quality elements were described using the following descriptors: range, measure, result, and type and unit of value. Data quality studies were performed according to users' needs. The values of impact weights were determined by the analytic hierarchy process (AHP) method. The harmonization of the conceptual models of the EGiB, GESUT and BDOT500 databases with the BDOT10k database was also analysed. It was found that the downloading and supply of information from the analysed registers in BDOT10k creation and update processes is limited. An effective approach to providing users of spatial data sets with information on data uncertainty is the use of cartographic visualization techniques. Based on the author's own experience and research on examining the quality of official spatial database data, a set of methods for visualizing the uncertainty of the EGiB, GESUT and BDOT500 databases was defined. This set includes visualization techniques designed to present three types of uncertainty: in location, in attribute values, and in time. Positional uncertainty was defined (for surface, line, and point objects) using several (three to five) visual variables. Uncertainty of attribute values and uncertainty in time, describing for example the completeness or timeliness of data sets, are presented by means of three graphical variables. The research problems presented in the paper are of cognitive and applied importance. They indicate the possibility of effectively evaluating the quality of spatial data collected in public registers and may be an important element of an expert system.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Random measurement error"

1

Mellenbergh, Gideon J. "Random Measurement Error." In Counteracting Methodological Errors in Behavioral Research, 83–106. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-12272-0_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Cooper, Colin. "Random Errors of Measurement." In An Introduction to Psychometrics and Psychological Assessment, 137–64. 2nd ed. London: Routledge, 2023. http://dx.doi.org/10.4324/9781003240181-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Zong, Siheng. "Similar Formation Problem with Biased Random Measurement Errors." In Proceedings of 2021 5th Chinese Conference on Swarm Intelligence and Cooperative Control, 372–83. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-3998-3_36.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Giovagnoli, A., and L. Martino. "Optimal Sampling Design with Random Size Clusters for a Mixed Model with Measurement Errors." In Nonconvex Optimization and Its Applications, 181–94. Boston, MA: Springer US, 2001. http://dx.doi.org/10.1007/978-1-4757-3419-5_17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Rivera-Campoverde, Néstor, José Muñoz Sanz, and Blanca Arenas-Ramirez. "Low-Cost Model for the Estimation of Pollutant Emissions Based on GPS and Machine Learning." In Proceedings of the XV Ibero-American Congress of Mechanical Engineering, 182–88. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-38563-6_27.

Full text
Abstract:
This paper presents a novel method for estimating pollutants emitted by vehicles powered by internal combustion engines in real driving, without the need for extensive measurement campaigns or long-term vehicle instrumentation; the method is based on the positioning and speed signals generated by GPS (Global Positioning System) and the application of machine learning. To obtain the training and validation data for the model, two road tests were carried out following the Euro 6 directives for the estimation of pollutants through RDE (Real Driving Emissions), using a portable emission measurement system and a recorder that stores data from OBD (On-Board Diagnostics) and GPS. Based on the data obtained on the first route, the vehicle's performance is determined and, through machine learning, the model that estimates polluting emissions is generated; it is validated with the data from the second route. Comparing the results generated by the model against those measured in the RDE yields relative errors (%) of 0.0976, -0.2187, 0.2249 and -0.1379 in the emission factors of CO2, CO, HC and NOx, respectively. Finally, the model is fed with data obtained over 1218 km of random driving, obtaining results similar to models based on OBD and closer to the real driving conditions captured by models such as the IVE (International Vehicle Emissions) model.
APA, Harvard, Vancouver, ISO, and other styles
6

Albert, Paul S., Rajeshwari Sundaram, and Alexander C. McLain. "Innovative Applications of Shared Random Parameter Models for Analyzing Longitudinal Data Subject to Dropout." In ISS-2012 Proceedings Volume On Longitudinal Data Analysis Subject to Measurement Errors, Missing Values, and/or Outliers, 139–56. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4614-6871-4_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Pavlopoulos, Dimitris, Paulina Pankowska, Bart Bakker, and Daniel Oberski. "Modelling Error Dependence in Categorical Longitudinal Data." In Measurement Error in Longitudinal Data, 173–94. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198859987.003.0008.

Full text
Abstract:
Hidden Markov models (HMMs) offer an attractive way of accounting for and correcting measurement error in longitudinal data, as they do not require a 'gold standard' data source as a benchmark. However, while the standard HMM assumes the errors to be independent or random, some common situations in survey and register data cause measurement error to be systematic. HMMs can correct for systematic error as well if the local independence assumption is relaxed. In this chapter, we present several (mixed) HMMs that relax this assumption with the use of two independent indicators for the variable of interest. Finally, we illustrate the results of some of these HMMs with an example of employment mobility. For this purpose, we use linked survey-register data from the Netherlands.
APA, Harvard, Vancouver, ISO, and other styles
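As context for the chapter's setup, the standard HMM treats the recorded category as the true state filtered through a misclassification (emission) matrix, and the likelihood of an observed sequence is computed with the forward recursion. A minimal numpy sketch under assumed toy parameters (two employment states, 2% random misclassification; illustrative only, not the chapter's mixed HMMs with two indicators):

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Scaled forward algorithm: log-likelihood of a discrete observation
    sequence under an HMM with initial distribution pi (S,), transition
    matrix A (S x S), and misclassification matrix B (S x O), where
    B[s, o] = P(category o is recorded | true state is s)."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()                      # rescale to avoid underflow
    loglik = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # predict state, weight by error model
        c = alpha.sum()
        loglik += np.log(c)
        alpha = alpha / c
    return loglik

pi = np.array([0.5, 0.5])                   # employed / unemployed
A = np.array([[0.95, 0.05], [0.05, 0.95]])  # sticky true mobility
B = np.array([[0.98, 0.02], [0.02, 0.98]])  # 2% random misclassification
stable = [0] * 12                           # plausible stable career
churn = [0, 1] * 6                          # implausible monthly flip-flopping
print(forward_loglik(stable, pi, A, B) > forward_loglik(churn, pi, A, B))  # True
```

Under these parameters the model judges the flip-flopping record far less likely, which is exactly how an HMM "explains away" spurious observed transitions as measurement error.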
8

Biemer, Paul P., Kathleen Mullan Harris, Dan Liao, Brian J. Burke, and Carolyn Tucker Halpern. "Modelling Mode Effects for a Panel Survey in Transition." In Measurement Error in Longitudinal Data, 63–88. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198859987.003.0004.

Full text
Abstract:
Funding reductions combined with increasing data-collection costs required that Wave V of the USA's National Longitudinal Study of Adolescent to Adult Health (Add Health) abandon its traditional approach of in-person interviewing and adopt a more cost-effective method. This approach used the mail/web mode in Phase 1 of data collection and in-person interviewing for a random sample of nonrespondents in Phase 2. In addition, to facilitate the comparison of modes, a small random subsample served as the control and received the traditional in-person interview. We show that concerns about reduced data quality as a result of the redesign were unfounded, based on an analysis of the survey data. In several important respects, the new two-phase, mixed-mode design outperformed the traditional design, with greater measurement accuracy, improved weighting adjustments for mitigating the risk of nonresponse bias, reduced residual (post-adjustment) nonresponse bias, and substantially reduced total mean squared error of the estimates. This good news was largely unexpected, given the preponderance of literature suggesting that data quality could be adversely affected by the transition to a mixed mode. The bad news is that the transition comes with a high risk of mode effects when comparing Wave V and prior-wave estimates. Analytical results suggest that significant differences can occur in longitudinal change estimates about 60% of the time purely as an artifact of the redesign. This raises the question: how, then, should a data analyst interpret significant findings in a longitudinal analysis in the presence of mode effects? This chapter presents the analytical results and attempts to address this question.
APA, Harvard, Vancouver, ISO, and other styles
9

Ahmad, Rauf, and Silvelyn Zwanzig. "On Total Least Squares Estimation for Longitudinal Errors-in-Variables Models." In Measurement Error in Longitudinal Data, 359–80. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198859987.003.0015.

Full text
Abstract:
The objective of this study is to evaluate the total least squares (TLS) estimator for the linear mixed model when the design matrix is subject to measurement errors, with special focus on models for longitudinal or repeated-measures data. We consider measurement errors only in the design matrix of the fixed part of the model and estimate its corresponding parameter vector under the TLS setup. After treating two variants of the general case, the random coefficient model is discussed as a special case. We evaluate the conditions, on the design matrices as well as on the variance component parameters, under which a reasonable TLS estimator can be expected in such models. An analysis of a real data example is also provided.
APA, Harvard, Vancouver, ISO, and other styles
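The TLS estimator evaluated in this chapter can be illustrated, in its simplest cross-sectional form, by the classical SVD solution for a noisy design matrix. A sketch on synthetic data with equal error variances (not the chapter's longitudinal mixed-model setting), contrasting it with the attenuation bias of ordinary least squares:

```python
import numpy as np

def tls(X, y):
    """Total least squares fit of y ~ X b when X is also noisy (EIV model)."""
    X = np.asarray(X, float)
    y = np.asarray(y, float).reshape(-1, 1)
    Z = np.hstack([X, y])
    # Right singular vector for the smallest singular value of [X y]
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]
    return -v[:-1] / v[-1]

rng = np.random.default_rng(1)
x_true = rng.normal(size=(500, 1))
y_true = 2.0 * x_true[:, 0]                               # true slope = 2
X_obs = x_true + rng.normal(0, 0.3, size=x_true.shape)    # error in the design matrix
y_obs = y_true + rng.normal(0, 0.3, size=500)             # error in the response
b_ols = np.linalg.lstsq(X_obs, y_obs, rcond=None)[0][0]   # attenuated below 2
b_tls = tls(X_obs, y_obs)[0]                              # close to 2
```

With classical error in X, OLS is biased toward zero (attenuation), while TLS, which minimizes perpendicular rather than vertical residuals, remains consistent when the error variances in X and y are equal.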
10

Paccagnella, Omar. "Self-Evaluation, Differential Item Functioning, and Longitudinal Anchoring Vignettes." In Measurement Error in Longitudinal Data, 289–310. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198859987.003.0012.

Full text
Abstract:
Anchoring vignettes are a powerful instrument for detecting systematic differences in the use of self-reported ordinal survey responses. Not taking into account the (non-random) heterogeneity in reporting styles across respondents may systematically bias the measurement of the variables of interest. The presence of such individual heterogeneity leads respondents to interpret, understand, or use the response categories for the same question differently. This phenomenon is called differential item functioning (DIF) in the psychometric literature. A growing number of cross-sectional studies apply the anchoring vignette approach to tackle this issue, but its use is still limited in the longitudinal context. This chapter introduces longitudinal anchoring vignettes for DIF correction, the statistical approaches available when working with such data, and how to investigate the stability of individual response scales over time.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Random measurement error"

1

Yun, Xu, Zhu Xinhua, and Wang Yu. "Error Compensation for MIMU under Random Vibration." In 2013 Third International Conference on Instrumentation, Measurement, Computer, Communication and Control (IMCCC). IEEE, 2013. http://dx.doi.org/10.1109/imccc.2013.237.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Wallis, T. Mitch, Atif Imtiaz, Sang-Hyun Lim, Pavel Kabos, Kichul Kim, Paul Rice, and Dejan Filipovic. "Broadband measurements of nanofiber devices: Repeatability and random error analysis." In 2010 76th ARFTG Microwave Measurement Conference. IEEE, 2010. http://dx.doi.org/10.1109/arftg76.2010.5700058.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Wang, Zezhou, Yunxiang Chen, Zhongyi Cai, and Tao Wang. "Remaining Useful Lifetime Estimation with Random Failure Threshold and Measurement Error." In 2018 Prognostics and System Health Management Conference (PHM-Chongqing). IEEE, 2018. http://dx.doi.org/10.1109/phm-chongqing.2018.00089.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Loboda, Igor, Sergiy Yepifanov, and Yakov Feldshteyn. "A More Realistic Scheme of Deviation Error Representation for Gas Turbine Diagnostics." In ASME Turbo Expo 2012: Turbine Technical Conference and Exposition. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/gt2012-69368.

Full text
Abstract:
Gas turbine diagnostic algorithms widely use fault simulation schemes in which measurement errors are usually drawn from theoretical random number distributions, such as the Gaussian probability density function. The scatter of the simulated noise is determined on the basis of known information on maximum errors for every sensor type. Such simulation differs from real diagnosis because the diagnostic algorithms work not with the measurements themselves but with their deviations from an engine baseline. In addition to simulated measurement inaccuracy, the deviations computed for real data have other error components. Thus, simulated and real deviation errors differ in amplitude and distribution. As a result, simulation-based investigations might lead to overly optimistic conclusions about gas turbine diagnosis reliability. To understand these error features, deviations of real measurements are analysed in the present paper. To make the error representation more realistic, it is proposed to extract an error component from the real deviations and to integrate it into the fault description. Finally, the effect of the new noise representation on diagnostic reliability is estimated. It is shown that the change in reliability due to inexact error simulation can be significant.
APA, Harvard, Vancouver, ISO, and other styles
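The mismatch this paper highlights, simulated Gaussian noise versus differently distributed real deviation errors, can be sketched by comparing the tail shape of a Gaussian error model against a heavier-tailed stand-in. The Laplace draw and the 1% noise level here are purely illustrative assumptions, not the paper's empirical error:

```python
import numpy as np

def excess_kurtosis(x):
    """Sample excess kurtosis: ~0 for Gaussian noise, positive for heavy tails."""
    x = np.asarray(x, float)
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

rng = np.random.default_rng(42)
n = 50_000
sigma = 0.01                                      # assumed 1% deviation noise level
gauss = rng.normal(0.0, sigma, n)                 # textbook simulated noise
heavy = rng.laplace(0.0, sigma / np.sqrt(2), n)   # same variance, heavier tails
# Equal scatter but very different shape: detection thresholds tuned on the
# Gaussian model will misjudge how often large real deviations occur.
print(excess_kurtosis(gauss), excess_kurtosis(heavy))
```

Matching only the noise amplitude, as maximum-error specifications do, leaves the distributional component of the deviation error unconstrained, which is the gap the paper's extracted empirical error component is meant to close.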
5

Rahman, Nahid, Ana I. Medina Ayala, Konstantin A. Korolev, Mohammed N. Afsar, Rudy Cheung, and Maurice Aghion. "Random and systematic error analysis in the complex permittivity measurements of high dielectric strength thermoplastics." In 2009 IEEE Intrumentation and Measurement Technology Conference (I2MTC). IEEE, 2009. http://dx.doi.org/10.1109/imtc.2009.5168705.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Goncharov, Nikita V., and Yuri V. Filatov. "Influence of the rotor rotation rate instability on goniometer's random error of measurement." In SPIE Proceedings, edited by Vadim E. Privalov. SPIE, 2007. http://dx.doi.org/10.1117/12.725595.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Peng, Zhang. "A method for the measurement and analysis of SAR signal random phase error." In 2009 2nd Asian-Pacific Conference on Synthetic Aperture Radar (APSAR). IEEE, 2009. http://dx.doi.org/10.1109/apsar.2009.5374317.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Cai, Zhongyi, Zezhou Wang, Xiaoxiong Dong, Lili Wang, and Huachun Xiang. "Remaining Useful Lifetime Prediction of Nonlinear System with Random Effect and Measurement Error." In 2018 Prognostics and System Health Management Conference (PHM-Chongqing). IEEE, 2018. http://dx.doi.org/10.1109/phm-chongqing.2018.00087.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Zhang, Yanshun, and Jiancheng Fang. "Research on the random error modeling and compensation method for MEMS gyro." In Sixth International Symposium on Instrumentation and Control Technology: Sensors, Automatic Measurement, Control, and Computer Simulation, edited by Jiancheng Fang and Zhongyu Wang. SPIE, 2006. http://dx.doi.org/10.1117/12.717648.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Etebari, Ali, and Pavlos P. Vlachos. "Application of Higher-Order Compact Finite Difference Schemes to Out of Plane Vorticity Measurements." In ASME 2003 International Mechanical Engineering Congress and Exposition. ASMEDC, 2003. http://dx.doi.org/10.1115/imece2003-55344.

Full text
Abstract:
The accuracy of out-of-plane vorticity computed from in-plane experimental velocity measurements is investigated, with particular application to Digital Particle Image Velocimetry (DPIV). Higher-order compact finite difference schemes are proposed as alternatives for the vorticity estimation. Simulations of known flow fields are used to quantify the errors associated with the different estimation methods. The effect of spatial sampling resolution, relative to the range of flow scales, on the bias error of each method is explored. In addition, an estimate of the propagation of random velocity-measurement error into the vorticity measurement is presented. The higher-order compact schemes deliver improved accuracy compared with previously used methods, even when the domain is spatially undersampled or in the presence of strong velocity gradients. The compact schemes demonstrated less than 0.3% bias error throughout the core of the vortex, resolving flow structures as small as the Nyquist sampling limit of the system, while the error in the conventional methods increased as the spatial sampling and the range of wave numbers present in the flow field were reduced. The bias error is an irrecoverable loss due to the truncation error of the method and thus poses significant limitations on the system, whereas the random error propagation can be reduced for the higher-order schemes by applying a simple Gaussian smoothing to the flow field. Thus, the reduction in bias error is of greater importance to the accuracy of the system, particularly as modern global flow measurement technologies achieve higher spatial and temporal resolutions, as well as higher accuracy in velocity measurements. Overall, the compact schemes provide an improved approach to vorticity evaluation compared with conventional algorithms.
APA, Harvard, Vancouver, ISO, and other styles
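For orientation, the conventional baseline this paper improves upon, second-order central differencing of the in-plane velocities to obtain out-of-plane vorticity, can be sketched on an analytic Taylor-Green-type field. The implicit compact schemes themselves are not reproduced here; this shows only the truncation (bias) error of the standard method:

```python
import numpy as np

def vorticity_central(u, v, dx, dy):
    """omega_z = dv/dx - du/dy at interior points via second-order central
    differences; u and v are indexed as [y, x]."""
    dvdx = (v[1:-1, 2:] - v[1:-1, :-2]) / (2.0 * dx)
    dudy = (u[2:, 1:-1] - u[:-2, 1:-1]) / (2.0 * dy)
    return dvdx - dudy

# Analytic field with known vorticity: omega_exact = -2 cos(x) cos(y)
n = 64
x = np.linspace(0.0, 2.0 * np.pi, n)
dx = x[1] - x[0]
X, Y = np.meshgrid(x, x)              # X varies along axis 1, Y along axis 0
u = np.cos(X) * np.sin(Y)
v = -np.sin(X) * np.cos(Y)
omega = vorticity_central(u, v, dx, dx)
omega_exact = -2.0 * np.cos(X[1:-1, 1:-1]) * np.cos(Y[1:-1, 1:-1])
err = np.max(np.abs(omega - omega_exact))
# Truncation error shrinks as dx**2; higher-order schemes shrink it faster,
# which is the bias-error advantage the paper quantifies.
print(err)
```

Random velocity error, by contrast, enters through the difference stencil and can be damped by smoothing, which is why the paper treats the bias (truncation) component as the harder limit.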

Reports on the topic "Random measurement error"

1

Haines, Tomar. PR-218-103608-R01 ILI Tool Calibration based on In-ditch Measurement with Related Uncertainty. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), September 2013. http://dx.doi.org/10.55274/r0010825.

Full text
Abstract:
Kiefner and Associates was awarded Tasks 2 and 3 of project EC-4-2, "ILI Tool Calibration based on In-ditch Measurement with Related Uncertainty". The project comprised three tasks: (1) Develop a process for either confirming the adequacy of the vendor-claimed random error component, or recalibrating the random error component of ILI measurement uncertainty; Task 1 was completed by ApplusRTD and a report was generated under a separate contract, no. PR-366-103606. (2) Develop a process for examining errors, and determine methods to correct for them, recognizing that lack of knowledge about the accuracy of in-ditch measurement will limit error correction. (3) Develop comments and recommendations on the number of confirmation measurements required to provide a statistically defensible basis for adjusting the vendor-claimed tool error. Kiefner reported on the latter two tasks as its part of the work for this project.
APA, Harvard, Vancouver, ISO, and other styles
2

Almås, Ingvild, Orazio Attanasio, Jyotsna Jalan, Francisco Oteiza, and Marcella Vigneri. Using data differently and using different data. Centre for Excellence and Development Impact and Learning (CEDIL), 2018. http://dx.doi.org/10.51744/cip8.

Full text
Abstract:
The lack of adequate measures to capture relevant factors, and the prevalence of measurement error in existing ones, often constitute the main impediment to robust policy evaluation. Random assignment of a given treatment, when feasible, may allow for the identification of causal effects, given that the necessary measurements are available. Measurement challenges include: (a) adequately measuring outcomes of interest; (b) measuring factors that relate to the mechanisms of estimated impacts; and (c) conducting a robust evaluation in areas where the RCT methodology is not feasible. In this paper, we discuss three categories of related approaches to innovation in the use of data and measurements relevant for evaluation: the creation of new measures, the use of multiple measures, and the use of machine learning algorithms. We motivate the relevance of each of the categories by providing a series of detailed examples of cases where each approach has proved useful in impact evaluations. We discuss the challenges and risks involved in each strategy and conclude with an outline of promising directions for future work.
APA, Harvard, Vancouver, ISO, and other styles
3

Lines, Lisa M., Marque C. Long, Jamie L. Humphrey, Crystal T. Nguyen, Suzannah Scanlon, Olivia K. G. Berzin, Matthew C. Brown, and Anupa Bir. Artificially Intelligent Social Risk Adjustment: Development and Pilot Testing in Ohio. RTI Press, September 2022. http://dx.doi.org/10.3768/rtipress.2022.rr.0047.2209.

Full text
Abstract:
Prominent voices have called for a better way to measure, predict, and adjust for social factors in healthcare and population health. Local area characteristics are sometimes framed as a proxy for patient characteristics, but they are often independently associated with health outcomes. We have developed an “artificially intelligent” approach to risk adjustment for local social determinants of health (SDoH) using random forest models to understand life expectancy at the Census tract level. Our Local Social Inequity score draws on more than 150 neighborhood-level variables across 10 SDoH domains. As piloted in Ohio, the score explains 73 percent of the variation in life expectancy by Census tract, with a mean squared error of 4.47 years. Accurate multidimensional, cross-sector, small-area social risk scores could be useful in understanding the impact of healthcare innovations, payment models, and SDoH interventions in communities at higher risk for serious illnesses and diseases; identifying neighborhoods and areas at highest risk of poor outcomes for better targeting of interventions and resources; and accounting for factors outside of providers’ control for more fair and equitable performance/quality measurement and reimbursement.
APA, Harvard, Vancouver, ISO, and other styles
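The fit statistics quoted in this abstract, share of variation explained and mean squared error in years, are standard regression metrics. A minimal sketch of how they are computed, on synthetic tract-level numbers invented for illustration (not the Ohio data or the report's random forest):

```python
import numpy as np

def r2_and_mse(y_true, y_pred):
    """Explained variance (R^2) and mean squared error of model predictions."""
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    resid = y_true - y_pred
    mse = np.mean(resid ** 2)
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot, mse

# Hypothetical tract-level life expectancies vs. imperfect model predictions
rng = np.random.default_rng(7)
life = rng.normal(77.0, 4.0, 1000)            # "true" tract values, years
pred = life + rng.normal(0.0, 2.0, 1000)      # model output with 2-year error
r2, mse = r2_and_mse(life, pred)
# For this noise level, r2 lands near 0.75 and mse near 4 squared years
```

Reporting both matters: R^2 depends on how much the outcome varies across tracts, while MSE gives the error in interpretable units (years of life expectancy).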
4

Koduru, Smitha, and Jason Skow. PR-244-153719-R01 Quantification of ILI Sizing Uncertainties and Improving Correction Factors. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), August 2018. http://dx.doi.org/10.55274/r0011518.

Full text
Abstract:
Operators routinely perform verification digs to assess whether an inline inspection (ILI) tool meets the performance specified by the ILI vendors. Characterizing the actual ILI tool performance using available field and ILI data is a difficult problem due to uncertainties associated with measurements and geometric classification of features. The focus of this project is to use existing ILI and excavation data to develop better approaches for assessing ILI tool performance. For corrosion features, operators are primarily interested in quantifying magnetic flux leakage (MFL) ILI tool sizing error and its relationship to burst pressure estimates. In previously completed PRCI research, a limited MFL ILI dataset was used to determine the corrosion feature depth sizing bias and random error using principles published in API 1163 (2013). The research demonstrated the tendency for ILI predictions to be slightly lower than field measurements (i.e., under-call) for the dataset studied, and it provided a framework for characterizing this bias. The goal of this project was to expand on previous work by increasing the number and type of feature morphologies available for analysis, and by estimating the sizing error of ILI measured external corrosion features. New geometric classification criteria, complementing the current criteria suggested by the Pipeline Operator Forum (POF 2009), were also investigated. Lastly, correction factors based on burst pressure prediction accuracy were developed to account for the effect of adopting various feature interaction rules. This report has a related webinar (member login required).
APA, Harvard, Vancouver, ISO, and other styles
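The decomposition this report builds on, a constant sizing bias plus a random scatter in ILI depth calls relative to in-ditch measurement, can be sketched as follows. The depths and the 2 %WT under-call with 3 %WT scatter are invented for illustration, not the project's data:

```python
import numpy as np

def sizing_error_stats(ili_depth, field_depth):
    """Bias (mean) and random component (std) of ILI-minus-field depth error."""
    e = np.asarray(ili_depth, float) - np.asarray(field_depth, float)
    return e.mean(), e.std(ddof=1)

rng = np.random.default_rng(3)
field = rng.uniform(10.0, 60.0, 400)            # in-ditch depths, % wall thickness
ili = field - 2.0 + rng.normal(0.0, 3.0, 400)   # slight under-call plus tool noise
bias, spread = sizing_error_stats(ili, field)
# bias recovers the under-call (near -2 %WT); spread recovers the tool scatter
```

In practice the field measurement has its own uncertainty, so the estimated spread overstates the ILI tool's random error alone, which is one reason the report examines how many verification digs are needed for a defensible adjustment.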
5

Hodgdon, Taylor, Anthony Fuentes, Brian Quinn, Bruce Elder, and Sally Shoop. Characterizing snow surface properties using airborne hyperspectral imagery for autonomous winter mobility. Engineer Research and Development Center (U.S.), October 2021. http://dx.doi.org/10.21079/11681/42189.

Full text
Abstract:
With changing conditions in northern climates, it is crucial for the United States to have assured mobility in these high-latitude regions. Winter terrain conditions adversely affect vehicle mobility and, as such, must be accurately characterized to ensure mission success. Previous studies have attempted to characterize snow properties remotely using varied sensors. However, these studies have primarily used satellite-based products that provide coarse spatial and temporal resolution, which is unsuitable for autonomous mobility. Our work employs an Unmanned Aerial Vehicle (UAV)-mounted hyperspectral camera in tandem with machine learning frameworks to predict snow surface properties at finer scales. Several machine learning models were trained using hyperspectral imagery together with in-situ snow measurements. The results indicate that the random forest and k-nearest neighbors models had the lowest mean absolute error for all surface snow properties. A Pearson correlation matrix showed that density, grain size, and moisture content all had significant positive correlations with one another. Mechanically, density and grain size had slightly positive correlations with compressive strength, while moisture had a much weaker negative correlation. This work provides preliminary insight into the efficacy of using hyperspectral imagery to characterize snow properties for autonomous vehicle mobility.
APA, Harvard, Vancouver, ISO, and other styles