Academic literature on the topic 'Form Error Evaluation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Form Error Evaluation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Form Error Evaluation"

1

Cui, Changcai, Bing Li, Fugui Huang, and Rencheng Zhang. "Genetic algorithm-based form error evaluation." Measurement Science and Technology 18, no. 7 (May 9, 2007): 1818–22. http://dx.doi.org/10.1088/0957-0233/18/7/004.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Namboothiri, V. N. Narayanan, and M. S. Shunmugam. "Form error evaluation using L1-approximation." Computer Methods in Applied Mechanics and Engineering 162, no. 1-4 (August 1998): 133–49. http://dx.doi.org/10.1016/s0045-7825(97)00338-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Jiang, Qimi, Hsi-Yung Feng, Yujun Wang, and Can Fang. "Batch circular form error characterization and evaluation." Precision Engineering 47 (January 2017): 223–30. http://dx.doi.org/10.1016/j.precisioneng.2016.08.007.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Weber, Timothy, Saeid Motavalli, Behrooz Fallahi, and S. Hossein Cheraghi. "A unified approach to form error evaluation." Precision Engineering 26, no. 3 (July 2002): 269–78. http://dx.doi.org/10.1016/s0141-6359(02)00105-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Kase, K., A. Makinouchi, T. Nakagawa, H. Suzuki, and F. Kimura. "Shape error evaluation method of free-form surfaces." Computer-Aided Design 31, no. 8 (July 1999): 495–505. http://dx.doi.org/10.1016/s0010-4485(99)00046-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Yang, Tai-Hung, and J. Jackman. "A Probabilistic View of Problems in Form Error Estimation." Journal of Manufacturing Science and Engineering 119, no. 3 (August 1, 1997): 375–82. http://dx.doi.org/10.1115/1.2831116.

Full text
Abstract:
Form error estimation techniques based on discrete point measurements can lead to significant errors in form tolerance evaluation. By modeling surface profiles as random variables, we show how sample size and fitting techniques affect form error estimation. Depending on the surface characteristics, typical sampling techniques can result in estimation errors of as much as 50 percent. Another issue raised in the fitting approach is the selection of the metric p for the fitting objective. We show that for p = 2 and p = ∞, the selection does not appear to significantly affect the estimation of form errors.
APA, Harvard, Vancouver, ISO, and other styles
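The sampling effect described in the abstract above is easy to reproduce. The sketch below is a generic illustration of the point, not the authors' model; the profile (a 20-period waviness with peak-to-valley 0.02) and the 9-point sample spacing are invented for the demo. When the sparse sample aliases the waviness, the estimated straightness collapses:

```python
import numpy as np

def straightness(x, z):
    """Peak-to-valley residual after removing a least-squares reference line."""
    a, b = np.polyfit(x, z, 1)       # least-squares line fit
    r = z - (a * x + b)
    return r.max() - r.min()

# Dense "true" profile: 20 waviness periods, PV = 0.02.
x_dense = np.linspace(0.0, 1.0, 2001)
z_dense = 0.01 * np.sin(40 * np.pi * x_dense)

# Sparse sample: 9 points that happen to fall near the zero crossings,
# so the waviness is invisible to the estimator.
x_sparse, z_sparse = x_dense[::250], z_dense[::250]
```

With this spacing the sparse estimate is essentially zero while the dense estimate recovers the full 0.02 band, an extreme case of the underestimation the paper quantifies probabilistically.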
7

Xie, Jin, Jia Long Guo, and Jing Xu. "Evaluation and Measurement of 3D Form Errors of Ground Curve Surface." Key Engineering Materials 359-360 (November 2007): 513–17. http://dx.doi.org/10.4028/www.scientific.net/kem.359-360.513.

Full text
Abstract:
Evaluation and measurement of form errors distributed on a 3D ground curved surface are proposed, because the measured points of a 3D ground curved surface are harder to process than those of an axisymmetric curved surface. First, form curved-surface grinding is conducted with an arc-shaped diamond grinding wheel; second, the ground curved surface is measured by contact measurement to obtain 3D measuring data; next, a transfer mode from the CNC grinding reference frame to the measuring reference frame is established; then an effective and applicable 3D compensation algorithm for the probe sphere error is introduced; finally, the 3D form errors are investigated in connection with reference frame transfer and probe sphere compensation. It is confirmed that the form error PV (203 µm over an 8 cm² area) is improved by using reference frame transfer and probe sphere compensation, enhancing the measuring accuracy by about 29%.
APA, Harvard, Vancouver, ISO, and other styles
8

Zhu, Xiangyang, Han Ding, and Michael Y. Wang. "Form Error Evaluation: An Iterative Reweighted Least Squares Algorithm*." Journal of Manufacturing Science and Engineering 126, no. 3 (August 1, 2004): 535–41. http://dx.doi.org/10.1115/1.1765144.

Full text
Abstract:
This paper establishes the equivalence between the solution to a linear Chebyshev approximation problem and that of a weighted least squares (WLS) problem with the weighting parameters appropriately defined. On this basis, we present an algorithm for form error evaluation of geometric features. The algorithm is implemented as an iterative procedure. At each iteration, a WLS problem is solved and the weighting parameters are updated. The proposed algorithm is general-purpose: it can be used to evaluate the exact minimum zone error of various geometric features, including flatness, circularity, sphericity, cylindricity, and spatial straightness. Numerical examples are presented to show the effectiveness and efficiency of the algorithm.
APA, Harvard, Vancouver, ISO, and other styles
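The iterative reweighted least squares idea summarized above can be sketched for flatness. The following is a minimal illustration using a Lawson-style multiplicative weight update, not the authors' exact algorithm; the iteration count and weight floor are arbitrary choices for the demo:

```python
import numpy as np

def minmax_plane_flatness(pts, iters=50):
    """Approximate the minimum-zone (Chebyshev) plane fit z ~ a*x + b*y + c
    by iteratively reweighted least squares, and return the flatness value
    (width of the residual band)."""
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    z = pts[:, 2]
    w = np.full(len(z), 1.0 / len(z))          # start from uniform weights
    for _ in range(iters):
        sw = np.sqrt(w)[:, None]
        coef, *_ = np.linalg.lstsq(A * sw, z * sw[:, 0], rcond=None)
        r = z - A @ coef
        w *= np.abs(r) + 1e-15                 # Lawson's multiplicative update
        w /= w.sum()                           # renormalize the weights
    r = z - A @ coef
    return r.max() - r.min()
```

As the weights concentrate on the extreme residuals, the WLS solution approaches the Chebyshev (minimum-zone) fit, which is the equivalence the paper proves.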
9

Wang, Meng, Lifeng Xi, and Shichang Du. "3D surface form error evaluation using high definition metrology." Precision Engineering 38, no. 1 (January 2014): 230–36. http://dx.doi.org/10.1016/j.precisioneng.2013.08.008.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Su, Na, and Hui Guo. "Research on the Profile Error Evaluation and Visualization of Free-Form Surface." Applied Mechanics and Materials 43 (December 2010): 560–64. http://dx.doi.org/10.4028/www.scientific.net/amm.43.560.

Full text
Abstract:
To solve the problem of evaluating the profile error of a surface, a theoretical surface was built by interpolating design points using bicubic Non-Uniform Rational B-Splines (NURBS). Measuring points were obtained by laser measurement, and a mathematical model was built for computing the error. Particle swarm optimization (PSO) was applied to compute the minimum distance from the measuring points to the design surface, which allows the profile error of the surface to be evaluated accurately. In addition, MATLAB software was used to visualize the profile-error evaluation of the free-form surface. Experiments show that the proposed optimization obtains precise results, that the method is feasible, and that the visualization makes the geometric features more intuitive to observe, which is of practical significance.
APA, Harvard, Vancouver, ISO, and other styles
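The PSO step described above, finding the minimum distance from a measured point to the design surface, can be sketched as follows. This is a generic PSO over a parametric surface `surf(u, v)` with (u, v) in the unit square; the swarm size, inertia, and acceleration constants are illustrative assumptions, not values from the paper:

```python
import numpy as np

def pso_min_distance(point, surf, n_particles=30, iters=100, seed=0):
    """Particle swarm search for the (u, v) on surf(u, v) minimizing the
    Euclidean distance to `point`; u, v are clipped to [0, 1]."""
    rng = np.random.default_rng(seed)
    pos = rng.random((n_particles, 2))         # particle positions in (u, v)
    vel = np.zeros_like(pos)

    def cost(p):
        return np.linalg.norm(surf(p[:, 0], p[:, 1]) - point, axis=1)

    best_pos = pos.copy()                      # per-particle best positions
    best_val = cost(pos)
    g = best_pos[best_val.argmin()].copy()     # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (best_pos - pos) + 1.5 * r2 * (g - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        val = cost(pos)
        improved = val < best_val
        best_pos[improved], best_val[improved] = pos[improved], val[improved]
        g = best_pos[best_val.argmin()].copy()
    return best_val.min()
```

The profile error would then be obtained by evaluating this minimum distance for every measured point and taking the extreme deviations.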

Dissertations / Theses on the topic "Form Error Evaluation"

1

WANG, ZHUO. "MODELING AND SAMPLING OF WORK PIECE PROFILES FOR FORM ERROR EVALUATION." University of Cincinnati / OhioLINK, 2000. http://rave.ohiolink.edu/etdc/view?acc_num=ucin975356333.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

SARAVANAN, SHANKAR. "EVALUATION OF SPHERICITY USING MODIFIED SEQUENTIAL LINEAR PROGRAMMING." University of Cincinnati / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1132343760.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

YUSUPOV, JAMBUL. "On the assessment of the form error using Probabilistic Approach based on Symmetry Classes." Doctoral thesis, Politecnico di Torino, 2015. http://hdl.handle.net/11583/2588833.

Full text
Abstract:
Nowadays, the Coordinate Measuring Machine (CMM) is one of the essential tools used in the product verification process. Measurement points provided by a CMM are conveyed to the CMM data analysis software. This software can contribute significantly to the measurement uncertainty, which is very important from the metrological point of view. The contribution is mainly related to the association algorithm used in the software, which is intended to find the optimum fitting solution necessary to ensure that the calculations performed satisfy functional requirements. Various association methods can be used in these algorithms (such as least squares, minimum zone, etc.); however, the current standards do not specify which of the methods has to be used. Moreover, there are different techniques for the evaluation of uncertainty (such as experimental resampling, Monte Carlo simulation, theoretical approaches based on gradients, etc.) that can be combined with the association methods for further processing. The uncertainty evaluated by a combination of an association method and an uncertainty evaluation technique constitutes the implementation uncertainty, which in turn is a contributor to measurement uncertainty according to the Geometrical Product Specification and Verification (GPS) project. This work focuses on the analysis of the impact of the association method on the implementation uncertainty, assuming that all other factors (such as the sampling strategy, the measurement equipment parameters, etc.) are fixed and chosen according to standards, within the GPS framework. The object of the study is the Probabilistic method (PM), which is based on the classification of continuous subgroups of rigid motions (a mathematical principle of the GPS language) and on non-parametric density estimation techniques. The method was essentially developed to decompose complex surfaces and has shown promise in shape partitioning.
However, it comprises geometric fitting procedures, which are considered in this work in more detail. The methodology of the research is based on the comparison of PM with another statistical association method, the least squares method (LS), by means of parameter estimation and uncertainty evaluation. For the uncertainty evaluation, two different techniques, the gradient-based and bootstrap methods, are used in combination with both association methods, PM and LS. The comparison is performed through analysis of the results obtained by parameter estimation and through analysis of variance. The variances of the estimated parameters and of the estimated form error are considered as the response variables in the analysis of variance. The case study is restricted to the evaluation of the roundness geometric tolerance. Although the measurement process was simulated, the methodology can be applied to real measurement data. The results obtained are of interest from both theoretical and practical points of view.
APA, Harvard, Vancouver, ISO, and other styles
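The combination studied in this thesis, a least-squares association method with bootstrap uncertainty evaluation for roundness, can be sketched roughly as follows. This is a simplified stand-in (a Kasa-style LS circle fit with a plain bootstrap), not the thesis's Probabilistic method or gradient-based machinery, and the bootstrap size is an arbitrary choice:

```python
import numpy as np

def ls_circle(x, y):
    """Least-squares (Kasa) circle fit: returns center (cx, cy) and radius.
    Uses the linearization 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2) = x^2 + y^2."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return cx, cy, np.sqrt(c + cx ** 2 + cy ** 2)

def roundness_ls(x, y):
    """Roundness (form error) as the radial band width about the LS circle."""
    cx, cy, _ = ls_circle(x, y)
    d = np.hypot(x - cx, y - cy)
    return d.max() - d.min()

def bootstrap_std(x, y, n_boot=300, seed=0):
    """Bootstrap standard uncertainty of the LS roundness estimate:
    refit on resampled point sets and take the spread of the estimates."""
    rng = np.random.default_rng(seed)
    est = [roundness_ls(x[i], y[i])
           for i in (rng.integers(0, len(x), len(x)) for _ in range(n_boot))]
    return float(np.std(est))
```

Comparing such a bootstrap spread across different association methods is, in miniature, the kind of implementation-uncertainty comparison the thesis carries out.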
4

PANCIANI, GIUSY DONATELLA. "Intelligent procedures for workpiece inspection: the role of uncertainties in design, production and measurement phases." Doctoral thesis, Politecnico di Torino, 2013. http://hdl.handle.net/11583/2543345.

Full text
Abstract:
The actors involved in the manufacturing process need a common technical language from ideation to verification. The Geometrical Product Specification and Verification standards, defined by ISO/TC 213, lay the groundwork for a new operator-based language (ISO/TS 17450-2), able to manage verification coherent with specifications and, therefore, to reduce the total uncertainty arising during the product lifecycle. Manufactured parts are necessarily affected by size and form errors, whose control relies on the analysis of the manufacturing technological signature and the evaluation of the associated uncertainty. Since the actual shape of a workpiece cannot be known before it is measured, five roundness profiles affected by systematic deviation have been simulated. The impact of simplified verification operators, the relative uncertainty contribution to measurement uncertainty, and the ability to properly assess roundness deviation have been evaluated. Then, since the literature proposes different approaches for the evaluation of implementation uncertainty but a standardized method has not yet been agreed, an analysis of the various elements to be considered when choosing a specific approach has been carried out. The most common manufacturing signatures have been considered, together with the number of points necessary for a reliable estimation of implementation uncertainty [2]. Finally, in the case of flatness, a statistical predictive model combined with adaptive sampling strategies has been implemented as the best "simplified verification operator," in order to obtain accuracy in the estimation of the flatness error of a clamp for an industrial air cushion guide [1]. Only the most relevant points have been measured, instead of inspecting the whole surface as required by the "perfect verification operator" defined in the ISO standards.
This considerable reduction is due to the effectiveness of the adaptive methods in the presence of technological signatures.
APA, Harvard, Vancouver, ISO, and other styles
5

Zhang, Zhi. "Error-rate evaluation and optimization for space-time codes." Click to view the E-thesis via HKUTO, 2007. http://sunzi.lib.hku.hk/hkuto/record/B39634218.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Zhang, Zhi, and 張治. "Error-rate evaluation and optimization for space-time codes." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2007. http://hub.hku.hk/bib/B39634218.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Dawson, Phillip Eng. "Evaluation of human error probabilities for post-initiating events." Thesis, Massachusetts Institute of Technology, 2007. http://hdl.handle.net/1721.1/42339.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Nuclear Science and Engineering, 2007.
Includes bibliographical references (leaves 84-85).
The United States Nuclear Regulatory Commission is responsible for the safe operation of the United States nuclear power plant fleet, and human reliability analysis forms an important portion of the probabilistic risk assessment that demonstrates the safety of sites. The treatment of post-initiating event human error probabilities by three human reliability analysis methods is compared to determine the strengths and weaknesses of the methodologies and to identify how they may best be used. A Technique for Human Event Analysis (ATHEANA) has a unique approach because it searches and screens for deviation scenarios in addition to the nominal failure cases that most methodologies concentrate on. The quantification method of ATHEANA also differs from most methods because it depends on expert elicitation to produce data instead of relying on a database or set of nominal values. The Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H) method uses eight performance shaping factors to modify nominal values in order to represent the quantification of the specifics of a situation. The Electric Power Research Institute Human Reliability Analysis Calculator is a software package that uses a combination of five methods to calculate human error probabilities. Each model is explained before comparing aspects such as the scope, treatment of time available, performance shaping factors, recovery, and documentation. Recommendations for future work include creating a database of values based on the nuclear data and emphasizing the documentation of human reliability analysis methods in the future to improve traceability of the process.
by Phillip E. Dawson.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
8

Hassanien, Mohamed A. M. "Error rate performance metrics for digital communications systems." Thesis, Swansea University, 2011. https://cronfa.swan.ac.uk/Record/cronfa42497.

Full text
Abstract:
In this thesis, novel error rate performance metrics and transmission solutions are investigated for delay-limited communication systems and for co-channel interference scenarios. Four research problems in particular were considered. The first research problem is devoted to the analysis of the higher-order ergodic moments of error rates for digital communication systems with time-unlimited ergodic transmissions, and the statistics of the conditional error rates of digital modulations over fading channels are considered. The probability density function and the higher-order moments of the conditional error rates are obtained. Non-monotonic behavior of the moments of the conditional bit error rates versus some channel model parameters is observed for a Ricean distributed channel fading amplitude at the detector input. Properties and possible applications of the second central moments are proposed. The second research problem is the non-ergodic error rate analysis and signaling design for communication systems processing a single finite-length received sequence. A framework to analyze the error rate properties of non-ergodic transmissions is established. Bayesian credible intervals are used to estimate the instantaneous bit error rate. A novel degree-of-ergodicity measure is introduced using the credible interval estimates to quantify the level of ergodicity of the received sequence with respect to the instantaneous bit error rate and to describe the transition of the data detector from the non-ergodic to the ergodic zone of operation. The developed non-ergodic analysis is used to define adaptive forward error correction control and adaptive power control policies that can guarantee, with a given probability, the worst-case instantaneous bit error rate performance of the detector in its transition from the non-ergodic to the ergodic zone of operation. In the third research problem, novel retransmission schemes are developed for delay-limited retransmissions.
The proposed scheme relies on a reliable reverse link for error-free feedback message delivery. Unlike conventional automatic repeat request schemes, the proposed scheme does not require cyclic redundancy check bits for error detection. In the proposed scheme, random permutations are exploited to locate the bits for retransmission in a predefined window within the packet. The retransmitted bits are combined using maximal-ratio combining. The complexity-performance trade-offs of the proposed scheme are investigated by mathematical analysis as well as computer simulations. The bit error rate of the proposed scheme is independent of the packet length, while the throughput depends on the packet length. Three practical techniques suitable for implementation are proposed. The performance of the proposed retransmission scheme was compared to the block repetition code corresponding to a conventional ARQ retransmission strategy. It was shown that, for the same number of retransmissions and the same packet length, the proposed scheme always outperforms such repetition coding, and in some scenarios the performance improvement is significant. Most of our analysis has been done for the AWGN channel; however, the case of a slow Rayleigh block-fading channel was also investigated. The proposed scheme appears to provide throughput and BER reduction gains only for medium to large SNR values. Finally, the last research problem investigates the link error rate performance with a single co-channel interferer. A novel metric is derived to assess whether the standard Gaussian approximation of a single interferer underestimates or overestimates the link bit error rate. This metric is a function of the interference channel fading statistics; however, it is otherwise independent of the statistics of the desired signal.
The key step in derivation of the proposed metric is to construct the standard Gaussian approximation of the interference by a non-linear transformation. A closed form expression of the metric is obtained for a Nakagami distributed interference fading amplitude. Numerical results for the case of Nakagami and lognormal distributed interference fading amplitude confirm the validity of the proposed metric. The higher moments, interval estimators and non-linear transformations were investigated to evaluate the error rate performance for different wireless communication scenarios. The synchronization channel is also used jointly with the communication link to form a transmission diversity and subsequently, to improve the error rate performance.
APA, Harvard, Vancouver, ISO, and other styles
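Error rate performance of the kind analyzed in this thesis is routinely checked by Monte Carlo simulation against a closed-form baseline. A minimal, generic example (BPSK over AWGN, not the thesis's retransmission schemes; the bit count and seed are arbitrary) compares a simulated bit error rate with the theoretical value Q(√(2·Eb/N0)) = 0.5·erfc(√(Eb/N0)):

```python
import numpy as np
from math import erfc, sqrt

def bpsk_ber(snr_db, n_bits=200_000, seed=0):
    """Monte Carlo bit error rate of BPSK over an AWGN channel."""
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, n_bits)
    symbols = 1.0 - 2.0 * bits             # map bit 0 -> +1, bit 1 -> -1
    snr = 10.0 ** (snr_db / 10.0)          # Eb/N0 as a linear ratio
    noise = rng.normal(0.0, sqrt(1.0 / (2.0 * snr)), n_bits)
    detected = (symbols + noise) < 0.0     # sign detector decides bit 1
    return float(np.mean(detected != bits))

def bpsk_ber_theory(snr_db):
    """Closed-form BPSK error probability over AWGN."""
    return 0.5 * erfc(sqrt(10.0 ** (snr_db / 10.0)))
```

The finite-length simulated estimate scatters around the ergodic closed form, which is precisely the non-ergodic estimation issue the thesis treats with credible intervals.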
9

Rymer, J. W. "ERROR DETECTION AND CORRECTION -- AN EMPIRICAL METHOD FOR EVALUATING TECHNIQUES." International Foundation for Telemetering, 2000. http://hdl.handle.net/10150/606802.

Full text
Abstract:
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California
This paper describes a method for evaluating error correction techniques for applicability to the flight testing of aircraft. No statistical or math assumptions about the channel or sources of error are used. An empirical method is shown which allows direct “with and without” comparative evaluation of correction techniques. A method was developed to extract error sequences from actual test data independent of the source of the dropouts. Hardware was built to allow a stored error sequence to be repetitively applied to test data. Results are shown for error sequences extracted from a variety of actual test data. The effectiveness of Reed-Solomon (R-S) encoding and interleaving is shown. Test bed hardware configuration is described. Criteria are suggested for worthwhile correction techniques and suggestions are made for future investigation.
APA, Harvard, Vancouver, ISO, and other styles
10

Wennbom, Marika. "Impact of error : Implementation and evaluation of a spatial model for analysing landscape configuration." Thesis, Stockholms universitet, Institutionen för naturgeografi och kvartärgeologi (INK), 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-79214.

Full text
Abstract:
Quality and error assessment is an essential part of spatial analysis, and it is especially important to address given the increasing number of applications enabled by today's extensive access to spatial data, such as satellite imagery, and to computing power. This study evaluates the impact of input errors associated with satellite sensor noise on a spatial method aimed at characterising aspects of landscapes associated with the historical village structure, called the Hybrid Characterisation Model (HCM), which was developed as a tool to monitor sub-goals of the Swedish environmental goal "A varied agricultural landscape". The method, and the error simulation method employed for generating random errors in the input data, are implemented and automated as a Python script enabling easy iteration of the procedure. The HCM is evaluated qualitatively (by visual analysis) and quantitatively by comparing kappa index values between the outputs affected by error. Comparing the results of the qualitative and quantitative evaluations shows that the kappa index is an applicable quality measure for the HCM. The qualitative analysis compares the impact of error at two different scales, the village scale and the landscape scale, and shows that the HCM performs well on the landscape scale for up to 30% error and on the village scale for up to 10%, and that the impact of error differs depending on the shape of the analysed feature. The Python script produced in this study could be further developed and modified to evaluate the HCM for other aspects of input error, such as classification errors, although for such studies to be motivated the potential errors associated with the model and its parameters must first be further evaluated.
APA, Harvard, Vancouver, ISO, and other styles
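The kappa index used above as a quality measure is Cohen's kappa: observed agreement between two classifications, corrected for the agreement expected by chance. A minimal sketch (generic, not the study's implementation; it assumes the degenerate case of perfect chance agreement does not occur):

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa between two equally sized label arrays:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    a, b = np.ravel(a), np.ravel(b)
    po = np.mean(a == b)                               # observed agreement
    pe = sum(np.mean(a == k) * np.mean(b == k)         # chance agreement
             for k in np.union1d(a, b))
    return (po - pe) / (1.0 - pe)
```

Kappa is 1 for identical classifications and near 0 when agreement is no better than chance, which is what makes it a usable scalar quality score for comparing error-degraded model outputs.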

Books on the topic "Form Error Evaluation"

1

Babeshko, Lyudmila, and Irina Orlova. Econometrics and econometric modeling in Excel and R. ru: INFRA-M Academic Publishing LLC., 2020. http://dx.doi.org/10.12737/1079837.

Full text
Abstract:
The textbook includes topics of modern econometrics that are often used in economic research. Some aspects of multiple regression models related to the problem of multicollinearity and models with a discrete dependent variable are considered, including methods for their estimation, analysis, and application. A significant place is given to the analysis of models of one-dimensional and multidimensional time series. Modern ideas about the deterministic and stochastic nature of the trend are considered. Methods of statistical identification of the trend type are studied. Attention is paid to the evaluation, analysis, and practical implementation of Box–Jenkins stationary time series models, as well as multidimensional time series models: vector autoregressive models and vector error correction models. It includes basic econometric models for panel data that have been widely used in recent decades, as well as formal tests for selecting models based on their hierarchical structure. Each section provides examples of evaluating, analyzing, and testing models in the R software environment. Meets the requirements of the Federal state educational standards of higher education of the latest generation. It is addressed to master's students studying in the field of economics whose curriculum includes the disciplines "Econometrics (advanced course)", "Econometric modeling", and "Econometric research", and to graduate students.
APA, Harvard, Vancouver, ISO, and other styles
2

Office, General Accounting. Social security: Pension data useful for detecting supplemental security payment errors : report to the Secretary of Health and Human Services. Washington, D.C: The Office, 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Office, General Accounting. Veterans' benefits: Improvements needed to measure the extent of errors in VA claims processing : report to congressional requesters. Washington, D.C: The Office, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Office, General Accounting. Social security: Pension data useful for detecting supplemental security payment errors : report to the Secretary of Health and Human Services. Washington, D.C: The Office, 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Office, General Accounting. Social security: Pension data useful for detecting supplemental security payment errors : report to the Secretary of Health and Human Services. Washington, D.C: The Office, 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Eisenberg, Melvin A. Mechanical Errors (“Unilateral Mistakes”). Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780199731404.003.0041.

Full text
Abstract:
Chapter 41 concerns unilateral mistakes. A unilateral mistake is a transient mental blunder that results from an error in the mechanics of an actor’s mental machinery, such as incorrectly adding a column of figures. Mistakes of these kinds are referred to in this book as mechanical errors. Mechanical errors differ from evaluative mistakes in several critical respects. For one thing, unlike evaluative mistakes the prospect that a counterparty will make a mechanical error normally is not a risk that is bargained for. And unlike evaluative mistakes, relief for mechanical errors would not undermine the very idea of promise: A promisor who seeks relief on the ground of a mechanical error does not assert that all things considered she doesn’t wish to perform. Rather, she asserts that she has a morally acceptable excuse that is well within the systemics of promise.
APA, Harvard, Vancouver, ISO, and other styles
7

Weisberg, Herb. Total Survey Error. Edited by Lonna Rae Atkeson and R. Michael Alvarez. Oxford University Press, 2016. http://dx.doi.org/10.1093/oxfordhb/9780190213299.013.22.

Full text
Abstract:
The total survey error (TSE) approach is a useful schema for organizing the planning and evaluation of surveys. It classifies the several possible types of errors in surveys, including in respondent selection, response accuracy, and survey administration. While the goal is to minimize these errors, the TSE approach emphasizes that this must be done within limitations imposed by several constraints: the cost of minimizing each type of error, the time requirements for the survey, and ethical standards. In addition to survey errors and constraints, there are several survey effects for which there are no error-free solutions; the size of these effects can be studied even though they cannot be eliminated. The total survey quality (TSQ) approach further emphasizes the need for survey organizations to maximize the quality of the product they deliver to their clients, within the context of TSE tradeoffs between survey errors and costs.
APA, Harvard, Vancouver, ISO, and other styles
8

Williams, Craig. Friends, Romans, Errors. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198803034.003.0004.

Full text
Abstract:
From Montaigne’s essay “On Friendship” to popular philosophy of the mid twentieth and early twenty-first centuries (C.S. Lewis’ The Four Loves and Joseph Epstein’s Friendship: An Exposé) to scholarship of the past few decades, this chapter shows how such texts rely to varying degrees upon the binarisms true/false, correct/incorrect, right/wrong in their descriptions and evaluations of Roman friendship. This chapter asks not whether this or that modern text gets ancient Rome wrong, but rather how Rome, evaluated as right or wrong, functions as a vehicle for these texts’ meanings, and also what might be some of the implications for a topic so freighted with larger social, cultural, and political significance over the course of its centuries-long reception. Finally, although its readings of amicitia avoid applying evaluative labels, this chapter asks to what extent its readings of later readings of Roman friendship do (must?) nonetheless appeal to some concept of error.
APA, Harvard, Vancouver, ISO, and other styles
9

Gugerty, Mary Kay, and Dean Karlan. The CART Principles for Impact Evaluation. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780199366088.003.0006.

Full text
Abstract:
The CART principles are essential for designing a right-fit impact evaluation. This chapter explains what it means to conduct credible, actionable, responsible, and transportable impact evaluation. To ensure that impact evaluations follow the CART principles, organizations ought to strive for bias-free data collection and analysis. Bias (systematic error that favors one measure over another) may come from the way data are collected (question wording influences responses), or the way they are analyzed (e.g., influence of external factors or how people are selected into programs). In many cases, a randomized controlled trial (RCT) helps generate a credible impact evaluation. In the simplest version of an RCT, individuals are randomly assigned to treatment and control groups. However, in special circumstances, other quasi-experimental evaluation methods can be successful. Actionable impact evaluation requires that organizations commit to learn from the undertaking and act on the results, even if they are disappointing. Organizations can make sure an evaluation is responsible by weighing how much it costs relative to its expected benefits. This chapter also addresses common criticisms about RCTs and identifies some strategies for reducing their cost. Finally, the chapter explains that the transportable principle mandates that evaluations produce useful evidence for others.
APA, Harvard, Vancouver, ISO, and other styles
10

Holzer, Jacob, Robert Kohn, James Ellison, and Patricia Recupero, eds. Geriatric Forensic Psychiatry. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780199374656.001.0001.

Full text
Abstract:
Geriatric Forensic Psychiatry: Principles and Practice is one of the first texts to provide a comprehensive review of important topics in the intersection of geriatric psychiatry, medicine, clinical neuroscience, forensic psychiatry, and law. It will speak to a broad audience among varied fields, including clinical and forensic psychiatry and mental health professionals, geriatricians and internists, attorneys and courts, regulators, and other professionals working with the older population. Topics addressed in this text, applied to the geriatric population, include clinical forensic evaluation, regulations and laws, civil commitment, different forms of capacity, guardianship, patient rights, medical-legal issues related to treatment, long term care and telemedicine, risk management, patient safety and error reduction, elder driving, sociopathy and aggression, offenders and the adjudication process, criminal evaluations, corrections, ethics, culture, cognitive impairment, substance abuse, trauma, older professionals, high risk behavior, and forensic mental health training and research. Understanding the relationship between clinical issues, laws and regulations, and managing risk and improving safety, will help to serve the growing older population.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Form Error Evaluation"

1

Kase, Kiwamu, Hiromasa Suzuki, and Fumihiko Kimura. "An Evaluation of Geometrical Errors by Segmentation with Fitting Form Error Features." In Computer-aided Tolerancing, 328–37. Dordrecht: Springer Netherlands, 1996. http://dx.doi.org/10.1007/978-94-009-1529-9_22.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Constantinides, George, Fredrik Dahlqvist, Zvonimir Rakamarić, and Rocco Salvia. "Rigorous Roundoff Error Analysis of Probabilistic Floating-Point Computations." In Computer Aided Verification, 626–50. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-81688-9_29.

Full text
Abstract:
We present a detailed study of roundoff errors in probabilistic floating-point computations. We derive closed-form expressions for the distribution of roundoff errors associated with a random variable, and we prove that roundoff errors are generally close to being uncorrelated with their generating distribution. Based on these theoretical advances, we propose a model of IEEE floating-point arithmetic for numerical expressions with probabilistic inputs and an algorithm for evaluating this model. Our algorithm provides rigorous bounds to the output and error distributions of arithmetic expressions over random variables, evaluated in the presence of roundoff errors. It keeps track of complex dependencies between random variables using an SMT solver, and is capable of providing sound but tight probabilistic bounds to roundoff errors using symbolic affine arithmetic. We implemented the algorithm in the PAF tool, and evaluated it on FPBench, a standard benchmark suite for the analysis of roundoff errors. Our evaluation shows that PAF computes tighter bounds than current state-of-the-art on almost all benchmarks.
APA, Harvard, Vancouver, ISO, and other styles
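The abstract above concerns the distribution of roundoff errors in floating-point arithmetic. As a minimal illustration (not the PAF tool or its algorithm), the sketch below measures the relative roundoff error of a single IEEE double-precision addition and checks it against the unit roundoff u = 2^-53:

```python
import sys
from fractions import Fraction

# Unit roundoff for IEEE 754 double precision: u = eps / 2 = 2**-53.
u = sys.float_info.epsilon / 2

def relative_roundoff_error(x, y):
    """Relative error of the computed sum fl(x + y) versus the exact sum."""
    exact = Fraction(x) + Fraction(y)   # exact rational sum of the two doubles
    computed = Fraction(x + y)          # the rounded floating-point result
    return abs((computed - exact) / exact)

# Round-to-nearest guarantees fl(x + y) = (x + y) * (1 + d) with |d| <= u.
print(relative_roundoff_error(0.1, 0.2) <= u)  # True
```

When the exact sum is representable (e.g., 0.5 + 0.25), the error is exactly zero; probabilistic analyses like the paper's characterize how such errors are distributed across random inputs.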
3

Jalid, Abdelilah, Mohammed Oubrek, and Abdelouahab Salih. "Evaluation of the Form Error of Partial Spherical Part on Coordinate Measuring Machine." In Lecture Notes in Mechanical Engineering, 269–75. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-62199-5_24.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Wang, Yan, Ping-yu Jiang, and Qi-quan An. "The Error Fluctuation Evaluation for Key Machining Form Feature of High-Value Difficult-to-Cut Part." In Proceedings of the 22nd International Conference on Industrial Engineering and Engineering Management 2015, 359–70. Paris: Atlantis Press, 2016. http://dx.doi.org/10.2991/978-94-6239-180-2_35.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Wellmer, Friedrich-Wilhelm. "Estimation of the Error." In Statistical Evaluations in Exploration for Mineral Deposits, 45–51. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/978-3-642-60262-7_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Leacock, Claudia, Martin Chodorow, Michael Gamon, and Joel Tetreault. "Evaluating Error Detection Systems." In Automated Grammatical Error Detection for Language Learners, Second Edition, 31–45. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-031-02153-4_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Krämer, Jonas, Lionel Blatter, Eva Darulova, and Mattias Ulbrich. "Inferring Interval-Valued Floating-Point Preconditions." In Tools and Algorithms for the Construction and Analysis of Systems, 303–21. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-99524-9_16.

Full text
Abstract:
Aggregated roundoff errors caused by floating-point arithmetic can make numerical code highly unreliable. Verified postconditions for floating-point functions can guarantee the accuracy of their results under specific preconditions on the function inputs, but how to systematically find an adequate precondition for a desired error bound has not been explored so far. We present two novel techniques for automatically synthesizing preconditions for floating-point functions that guarantee that user-provided accuracy requirements are satisfied. Our evaluation on a standard benchmark set shows that our approaches are complementary and able to find accurate preconditions in reasonable time.
APA, Harvard, Vancouver, ISO, and other styles
8

Wirnshofer, Martin. "Evaluation of the Pre-Error AVS Approach." In Variation-Aware Adaptive Voltage Scaling for Digital CMOS Circuits, 61–71. Dordrecht: Springer Netherlands, 2013. http://dx.doi.org/10.1007/978-94-007-6196-4_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Xie, Jin, Jia Long Guo, and Jing Xu. "Evaluation and Measurement of 3D Form Errors of Ground Curve Surface." In Advances in Grinding and Abrasive Technology XIV, 513–17. Stafa: Trans Tech Publications Ltd., 2007. http://dx.doi.org/10.4028/0-87849-459-6.513.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Zimpeck, Alexandra, Cristina Meinhardt, Laurent Artola, and Ricardo Reis. "Evaluation Methodology." In Mitigating Process Variability and Soft Errors at Circuit-Level for FinFETs, 73–79. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-68368-9_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Form Error Evaluation"

1

Jiang, Lin, Jingzhi Huang, Xiangshuai Ding, and Xiangzhang Chao. "Method for spherical form error evaluation using cuckoo search algorithm." In 10th International Symposium on Precision Engineering Measurements and Instrumentation (ISPEMI 2018), edited by Jiubin Tan and Jie Lin. SPIE, 2019. http://dx.doi.org/10.1117/12.2513585.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ghosh, Suhash, Chittaranjan Sahay, and Poorna Pruthvi Chandra Malempati. "Effect of Measuring Instrument Eccentricity and Tilt Error on Circularity Form Error." In ASME 2019 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/imece2019-11937.

Full text
Abstract:
From power stations to power tools, from the smallest watch to the largest car, all contain round components. In precision machining of cylindrical parts, the measurement and evaluation of roundness (also called circularity in ASME Geometric Dimensioning & Tolerancing Y14.5) and cylindricity are indispensable for quantifying form tolerance. Of all the methods of measuring these form errors, the most precise uses an accurate spindle/turntable-type measuring instrument. On the instrument, the component is rotated on a highly accurate spindle which provides an imaginary circular datum. The workpiece axis is aligned with the axis of the spindle by means of a centering and tilt adjustment leveling table. In this article, the authors have investigated the dependence of circularity form error on the instrument’s centering error (also known as eccentricity) and tilt error. It would be intriguing to map this nonlinear relationship within its effective boundaries and to investigate the limits beyond which measurement cost and time are no longer efficient. In this study, a test part with different circular and cylindrical features was studied at varying levels of predetermined instrument eccentricity and tilt error. Additionally, this article explores the significance of incorporating these parameters into undergraduate and graduate engineering curricula as an improved toolkit for aspiring engineers, process engineers, and quality control professionals.
APA, Harvard, Vancouver, ISO, and other styles
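The abstract describes how instrument centering error distorts a roundness measurement. A minimal geometric sketch (not the authors' test procedure) shows that a geometrically perfect circle measured about an eccentric spindle axis exhibits an apparent out-of-roundness of roughly twice the eccentricity:

```python
import math

def radial_readings(R, e, n=360):
    """Radial distances from an eccentric spindle axis to a perfect circle of
    radius R whose center is offset by eccentricity e along the x-axis."""
    readings = []
    for k in range(n):
        t = 2 * math.pi * k / n
        # Exact distance from the spindle axis to the circle in direction t.
        readings.append(e * math.cos(t) + math.sqrt(R * R - (e * math.sin(t)) ** 2))
    return readings

def peak_to_valley(readings):
    """Apparent out-of-roundness of the raw radial trace."""
    return max(readings) - min(readings)

# A perfect circle of 25 mm radius measured with 0.05 mm centering error
# shows an apparent out-of-roundness close to twice the eccentricity.
print(round(peak_to_valley(radial_readings(R=25.0, e=0.05)), 4))  # 0.1
```

This is why roundness instruments remove eccentricity (e.g., by least-squares center fitting) before evaluating circularity; the numbers here are hypothetical.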
3

Zhang, Xiangchao, Hong Xiao, Hao Zhang, Xiaoying He, and Min Xu. "Uncertainty estimation in form error evaluation of freeform surfaces for precision metrology." In Seventh International Symposium on Precision Mechanical Measurements, edited by Liandong Yu. SPIE, 2016. http://dx.doi.org/10.1117/12.2211994.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Guo, Shao-Wei, Hong-Li Fan, Fa-Ping Zhang, Ti-Guang Zhang, and Ge Wang. "Precision assembly oriented method to determine the evaluation parameters for surface form error." In The 3rd Annual International Conference on Design, Manufacturing and Mechatronics (ICDMM2016). WORLD SCIENTIFIC, 2017. http://dx.doi.org/10.1142/9789813208322_0015.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Mishima, Nozomu, and Kousuke Ishii. "Robustness Evaluation of a Miniaturized Machine Tool." In ASME 1999 Design Engineering Technical Conferences. American Society of Mechanical Engineers, 1999. http://dx.doi.org/10.1115/detc99/dac-8579.

Full text
Abstract:
This paper applies the method of robust design to machine tool design. The new design focuses on miniaturization, which provides significant energy and space savings. Our approach combines an analytical procedure representing the machining motions of a machine tool (form-shaping theory) with procedures for robust design. The effort identifies the design parameters of a machine tool that significantly influence the machining tolerance and leads to general design guidelines for robust miniaturization. Further, this research applies the Taguchi method to the form-shaping function of a prototype miniature lathe. The analysis addresses five machine tool dimensions as control factors, while treating local errors in the machine structure as noise factors. The robustness study seeks to identify the importance of each factor in improving the performance of the machine tool. The result shows that the thickness of the feed drive unit affects the performance most significantly. Among the local errors, the straightness error of the same feed drive unit is of critical importance.
APA, Harvard, Vancouver, ISO, and other styles
6

Bartkowiak, Tomasz, and Roman Staniek. "Application of Order Statistics in the Evaluation of Flatness Error: Sampling Problem." In ASME 2017 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/imece2017-71295.

Full text
Abstract:
The main purpose of this initial paper is to demonstrate the application of order statistics in the estimation of form error from a CMM measurement. Nowadays, modern industry sets high standards for geometrical precision, surface texture, and material properties. Many parameters can characterize a mechanical part, among which flatness error plays an important role in the assembly process and performance. Recently, due to greater availability and price reduction, coordinate measurement techniques have increased in popularity in the industry for on-line and off-line measurements, as they allow automated measurements at a relatively low uncertainty level. Data obtained from CMM measurements have to be processed and analyzed in order to evaluate component compliance with the required technical specification. The article presents an analysis of minimal sample selection for the evaluation of flatness error by means of coordinate measurement. In the paper, a statistical approach was presented, assuming that, in a repetitive manufacturing process, the distribution of deviations between surface points and the reference plane is stable. Based on the known statistical distribution, the order statistics theorem was applied to determine the maximal and minimal point deviation statistics, as these play a dominant role in flatness error estimation. A brief analysis of normally distributed deviations was described in the paper. Moreover, a case study was presented for a set of machined parts that were components of a machine tool mechanical structure. Empirical distributions were derived and minimal sample sizes were estimated for the given confidence levels using the proposed theorem. The estimation errors of flatness values for the derived sample sizes were analyzed and discussed in the paper.
APA, Harvard, Vancouver, ISO, and other styles
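Flatness evaluation from CMM points, as discussed above, typically fits a reference plane and takes the range of deviations about it. The sketch below uses a least-squares reference plane (the paper's order-statistics sampling analysis is not reproduced); the point set is hypothetical:

```python
import numpy as np

def flatness_error(points):
    """Fit a least-squares reference plane z = a*x + b*y + c to measured CMM
    points, then estimate flatness as the range of residuals about the plane."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    residuals = pts[:, 2] - A @ coeffs
    return residuals.max() - residuals.min()

# Hypothetical measurement: a slightly tilted surface with random deviations
# of at most +/- 0.005 mm; the fitted plane removes the tilt, so only the
# deviations contribute to the flatness estimate.
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(0.0, 100.0, 10), np.linspace(0.0, 50.0, 10))
z = 0.002 * x + 0.001 * y + rng.uniform(-0.005, 0.005, x.shape)
points = np.column_stack([x.ravel(), y.ravel(), z.ravel()])
print(round(flatness_error(points), 4))
```

The paper's question is how few of these points can be sampled while still bounding the extreme deviations, which is where the order statistics come in.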
7

Brown, Brandon A., Ray W. Daniel, Valeta Carol Chancey, and Tyler F. Rooks. "Parametric Evaluation of Head Center of Gravity Acceleration Error From Rigid Body Kinematics Assumptions Used in Environmental Sensors." In ASME 2021 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/imece2021-69334.

Full text
Abstract:
Environmental sensors (ES) are a proposed way to identify potentially concussive events using rigid body kinematics (RBK) to estimate motion at the head CG. This study systematically investigated the extent to which errors in RBK assumptions, including sensor orientation (SO), head CG position (HCGP), and exposure severity, contribute to errors in sensor readings of predicted peak resultant linear acceleration (PRLA) at the head CG. Simulated sensor readings were defined by idealized representations of head motion (extension, lateral bending, and axial rotation) using a half sine pulse for linear and angular acceleration. Peak magnitudes of linear acceleration ranged from 12.5 to 100 Gs and peak magnitudes of angular acceleration ranged from 1250 to 10000 rad/s/s. Durations of linear and angular accelerations ranged between 5 and 30 ms. Simulated HCGP variations ranged from −10% to 10% of the radius of the head (assumed to be a sphere) in each direction, and SO variations ranged from −20 to 20 degrees about each axis. True head CG response was calculated using zero error for SO and HCGP. The mean (+/− standard deviation) of calculated errors for maximum percent error (MaxPE) of a given head exposure was 30.3% (+/−9.71). 50% and 38% of all simulated exposures had MaxPE associated with maximum SO and HCGP offset, respectively. MaxPE was likely due to user error, ES form factor, and anthropometric variation.
APA, Harvard, Vancouver, ISO, and other styles
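The RBK assumption discussed in the abstract translates a sensor-mounted acceleration to the head CG through the rigid-body relation a_cg = a_s + α × r + ω × (ω × r). A minimal sketch, with hypothetical values for the sensor-to-CG offset vector r:

```python
import numpy as np

def cg_acceleration(a_sensor, omega, alpha, r):
    """Linear acceleration at the head CG from a sensor measurement, using
    a_cg = a_s + alpha x r + omega x (omega x r), where r is the vector from
    the sensor to the assumed head CG position."""
    a_s, w, al, r = (np.asarray(v, dtype=float) for v in (a_sensor, omega, alpha, r))
    return a_s + np.cross(al, r) + np.cross(w, np.cross(w, r))

# With zero angular motion the CG sees the sensor acceleration unchanged, so
# an error in the assumed CG position (r) only affects the result when omega
# or alpha are nonzero.
print(cg_acceleration([0.0, 0.0, 9.81], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.05, 0.0, 0.0]))
```

Errors in SO rotate the measured vectors before this transformation, and errors in HCGP perturb r, which is how the study's simulated offsets propagate into PRLA error.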
8

Esmaeili, Mehdi, Mohammad Durali, and Nader Jalili. "Ring Microgyroscope Modeling and Performance Evaluation." In ASME 2005 International Mechanical Engineering Congress and Exposition. ASMEDC, 2005. http://dx.doi.org/10.1115/imece2005-80635.

Full text
Abstract:
This paper discusses the effects of substrate motions on the performance of a microgyroscope modeled as a ring structure. Using Extended Hamilton’s Principle, the equations of motion are derived. The natural frequency equation and response of gyroscope are then extracted in closed-form for the case where substrate undergoes normal rotation. The Galerkin approximation is used for discretizing the partial differential equations of motion into ordinary differential equations. In these equations, the effects of angular accelerations, centripetal and coriolis accelerations are well apparent. The response of the system to different inputs is studied and the system sensitivity to input parameter changes is examined. Finally, the sources of error in the measurement of input rotational rate are recognized. The study demonstrates the importance of errors caused by cross axes inputs on the gyroscope output measurements.
APA, Harvard, Vancouver, ISO, and other styles
9

Zhang, Shuaiyin, Zhongyu Wang, Yinbao Cheng, and Xuejiao Dai. "An evaluation method for free-form curve profile based on the distribution characteristics of directed profile error." In 2022 4th International Conference on Intelligent Control, Measurement and Signal Processing (ICMSP). IEEE, 2022. http://dx.doi.org/10.1109/icmsp55950.2022.9859035.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Jbira, Ibtissem, Antoine Tahan, Mohamed Ali Mahjoub, and Borhen Louhichi. "Evaluation of the Algorithmic Error of New Specification Tools for an ISO 14405-1:2016 Size." In ASME 2018 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/detc2018-85669.

Full text
Abstract:
Due to machine tool imprecisions during manufacturing, the actual product cannot be the same as the nominal model. The product’s geometric variations influence the geometrical requirements of functionality and assembly [6, 8]; this remains a problem of industrial performance and plays a major role in the quality and cost of products, hence the need for a reliable strategy to evaluate errors in the final inspection of part quality. Among all geometric characteristics, the circular characteristic is very common on most parts. Therefore, the measurement and evaluation of circularity with a high degree of accuracy is of utmost importance. Size, form, and orientation are the basic descriptors of the geometric quality of objects. The recent publication of ISO 14405-1:2016 defines size as the fundamental geometric descriptor; it describes a new set of specification tools for the size of part characteristics that apply directly to the ideal geometry of the component [13]. These tools present new challenges for an inspector using a coordinate metrology system. The study of the influence of form defects on the identification of dimensional and geometrical requirements therefore seems necessary. This paper studies four ISO 14405-1:2016 modifiers: minimum circumscribed size (GN), maximum recorded size (GX), least squares size (GG), and minimum area (MZ). It presents simple and effective algorithms for evaluating the circularity error of a large number of points using these four specification modifiers, together with a study of the influence of measurement system strategies on the different evaluation algorithms. Analysis software was developed to compare the sensitivity of different parameters (number of points, noise amplitude, and circularity defect) on the ISO 14405-1:2016 modifiers.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Form Error Evaluation"

1

McKay, S., Nate Richards, and Todd Swannack. Ecological model development : evaluation of system quality. Engineer Research and Development Center (U.S.), September 2022. http://dx.doi.org/10.21079/11681/45380.

Full text
Abstract:
Ecological models are used throughout the US Army Corps of Engineers (USACE) to inform decisions related to ecosystem restoration, water operations, environmental impact assessment, environmental mitigation, and other topics. Ecological models are typically developed in phases of conceptualization, quantification, evaluation, application, and communication. Evaluation is a process for assessing the technical quality, reliability, and ecological basis of a model and includes techniques such as calibration, verification, validation, and review. In this technical note (TN), we describe an approach for evaluating system quality, which generally includes the computational integrity, numerical accuracy, and programming of a model or modeling system. Methods are presented for avoiding computational errors during development, detecting errors through model testing, and updating models based on review and use. A formal structure is proposed for model test plans and subsequently demonstrated for a hypothetical habitat suitability model. Overall, this TN provides ecological modeling practitioners with a rapid guide for evaluating system quality.
APA, Harvard, Vancouver, ISO, and other styles
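The technical note above demonstrates a model test plan for a hypothetical habitat suitability model. In that spirit, a minimal sketch of system-quality checks (known values and boundary behavior) for an invented two-variable suitability index; the index and its variables are illustrative, not from the report:

```python
def habitat_suitability(depth_si, velocity_si):
    """Hypothetical habitat suitability index: geometric mean of two
    suitability indices, each required to lie in [0, 1]."""
    for si in (depth_si, velocity_si):
        if not 0.0 <= si <= 1.0:
            raise ValueError("suitability indices must lie in [0, 1]")
    return (depth_si * velocity_si) ** 0.5

# Minimal system-quality tests in the spirit of a model test plan:
# known values and boundary behavior expose computational errors early.
assert habitat_suitability(1.0, 1.0) == 1.0   # ideal habitat scores 1
assert habitat_suitability(0.0, 1.0) == 0.0   # one unsuitable variable zeroes the index
assert abs(habitat_suitability(0.25, 1.0) - 0.5) < 1e-12  # known intermediate value
print("system-quality checks passed")
```

Such checks correspond to the TN's recommendation of detecting errors through model testing rather than only through review.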
2

Collins, Clarence O., and Tyler J. Hesser. altWIZ : A System for Satellite Radar Altimeter Evaluation of Modeled Wave Heights. Engineer Research and Development Center (U.S.), February 2021. http://dx.doi.org/10.21079/11681/39699.

Full text
Abstract:
This Coastal and Hydraulics Engineering Technical Note (CHETN) describes the design and implementation of a wave model evaluation system, altWIZ, which uses wave height observations from operational satellite radar altimeters. The altWIZ system utilizes two recently released altimeter databases: Ribal and Young (2019) and European Space Agency Sea State Climate Change Initiative v.1.1 level 2 (Dodet et al. 2020). The system facilitates model evaluation against 1 Hz altimeter data or a product created by averaging altimeter data in space and time around model grid points. The system allows, for the first time, quantitative analysis of spatial model errors within the U.S. Army Corps of Engineers (USACE) Wave Information Study (WIS) 30+ year hindcast for coastal United States. The system is demonstrated on the WIS 2017 Atlantic hindcast, using a 1/2° basin scale grid and a 1/4° regional grid of the East Coast. Consistent spatial patterns of increased bias and root-mean-square-error are exposed. Seasonal strengthening and weakening of these spatial patterns are found, related to the seasonal variation of wave energy. Some model errors correspond to areas known for high currents, and thus wave-current interaction. In conjunction with the model comparison, additional functions for pairing altimeter measurements with buoy data and storm tracks have been built. Appendices give information on the code access (Appendix I), organization and files (Appendix II), example usage (Appendix III), and demonstrating options (Appendix IV).
APA, Harvard, Vancouver, ISO, and other styles
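Bias and root-mean-square error, the metrics reported by altWIZ above, can be computed per grid point from matched model and altimeter wave heights. A minimal sketch with hypothetical significant wave heights (the altWIZ code itself is not reproduced here):

```python
import math

def bias(model, observed):
    """Mean difference between modeled and observed values (model minus obs)."""
    return sum(m - o for m, o in zip(model, observed)) / len(model)

def rmse(model, observed):
    """Root-mean-square error of modeled versus observed values."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, observed)) / len(model))

# Hypothetical significant wave heights (m): model output paired with
# altimeter observations matched in space and time.
hs_model = [2.1, 1.8, 3.0, 2.5]
hs_altimeter = [2.0, 1.9, 2.8, 2.4]
print(round(bias(hs_model, hs_altimeter), 3), round(rmse(hs_model, hs_altimeter), 3))  # 0.075 0.132
```

Computing these statistics at every grid point, as the system does, is what exposes the consistent spatial error patterns described in the abstract.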
3

Berzofsky, Marcus E., Andrew Moore, G. Lance Couzens, Lynn Langton, and Chris Krebs. Potential Survey Error Due to a Panel Design: A Review and Evaluation of the National Crime Victimization Survey. RTI Press, July 2020. http://dx.doi.org/10.3768/rtipress.2020.rr.0039.2007.

Full text
Abstract:
We use a total survey error approach to examine and make recommendations on how to adjust for non-sampling error in longitudinal, mixed-mode surveys. Using data from the National Crime Victimization Survey (NCVS), we examine three major sources of non-sampling error: telescoping, mode effects, and fatigue. We present an assessment of each source of error from a total survey error perspective and propose alternative adjustments to adjust better for this error. Findings suggest that telescoping and fatigue are likely sources of error in the NCVS, but the use of mixed-modes is not. Furthermore, both telescoping and fatigue are present in longitudinal surveys and accounting for one but not the other results in estimates that under- or overestimate the measures of interest—in this case, the rate of crime in the United States.
APA, Harvard, Vancouver, ISO, and other styles
4

Rymer, J. W. Error Detection and Correction -- An Emperical Method for Evaluating Techniques. Fort Belvoir, VA: Defense Technical Information Center, January 2000. http://dx.doi.org/10.21236/ada377970.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Gunay, Selim, Fan Hu, Khalid Mosalam, Arpit Nema, Jose Restrepo, Adam Zsarnoczay, and Jack Baker. Blind Prediction of Shaking Table Tests of a New Bridge Bent Design. Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/svks9397.

Full text
Abstract:
Considering the importance of the transportation network and bridge structures, the associated seismic design philosophy is shifting from the basic collapse prevention objective to maintaining functionality on the community scale in the aftermath of moderate to strong earthquakes (i.e., resiliency). In addition to performance, the associated construction philosophy is also being modernized, with the utilization of accelerated bridge construction (ABC) techniques to reduce impacts of construction work on traffic, society, economy, and on-site safety during construction. Recent years have seen several developments towards the design of low-damage bridges and ABC. According to the results of conducted tests, these systems have significant potential to achieve the intended community resiliency objectives. Taking advantage of such potential in the standard design and analysis processes requires proper modeling that adequately characterizes the behavior and response of these bridge systems. To evaluate the current practices and abilities of the structural engineering community to model this type of resiliency-oriented bridges, the Pacific Earthquake Engineering Research Center (PEER) organized a blind prediction contest of a two-column bridge bent consisting of columns with enhanced response characteristics achieved by a well-balanced contribution of self-centering, rocking, and energy dissipation. The parameters of this blind prediction competition are described in this report, and the predictions submitted by different teams are analyzed. In general, forces are predicted better than displacements. The post-tension bar forces and residual displacements are predicted with the best and least accuracy, respectively. Some of the predicted quantities are observed to have coefficient of variation (COV) values larger than 50%; however, in general, the scatter in the predictions amongst different teams is not significantly large. 
Applied ground motions (GM) in shaking table tests consisted of a series of naturally recorded earthquake acceleration signals, where GM1 is found to be the largest contributor to the displacement error for most of the teams, and GM7 is the largest contributor to the force (hence, the acceleration) error. The large contribution of GM1 to the displacement error is due to the elastic response in GM1 and the errors stemming from the incorrect estimation of the period and damping ratio. The contribution of GM7 to the force error is due to the errors in the estimation of the base-shear capacity. Several teams were able to predict forces and accelerations with only moderate bias. Displacements, however, were systematically underestimated by almost every team. This suggests that there is a general problem either in the assumptions made or the models used to simulate the response of this type of bridge bent with enhanced response characteristics. Predictions of the best-performing teams were consistently and substantially better than average in all response quantities. The engineering community would benefit from learning details of the approach of the best teams and the factors that caused the models of other teams to fail to produce similarly good results. Blind prediction contests provide: (1) very useful information regarding areas where current numerical models might be improved; and (2) quantitative data regarding the uncertainty of analytical models for use in performance-based earthquake engineering evaluations. Such blind prediction contests should be encouraged for other experimental research activities and are planned to be conducted annually by PEER.
APA, Harvard, Vancouver, ISO, and other styles
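The scatter amongst teams is summarized above with the coefficient of variation (COV). A minimal sketch of that statistic, using hypothetical predictions of a single response quantity:

```python
import statistics

def coefficient_of_variation(predictions):
    """COV of team predictions for one response quantity: sample standard
    deviation divided by the mean, expressed as a percentage."""
    return 100.0 * statistics.stdev(predictions) / statistics.mean(predictions)

# Hypothetical peak-drift predictions (in %) from five teams.
drifts = [2.0, 2.4, 1.8, 2.2, 2.6]
print(round(coefficient_of_variation(drifts), 1))  # 14.4
```

A COV above 50% for some quantities, as reported in the abstract, indicates that the teams' predictions spread by more than half their mean value.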
6

Ruosteenoja, Kimmo. Applicability of CMIP6 models for building climate projections for northern Europe. Finnish Meteorological Institute, September 2021. http://dx.doi.org/10.35614/isbn.9789523361416.

Full text
Abstract:
In this report, we have evaluated the performance of nearly 40 global climate models (GCMs) participating in Phase 6 of the Coupled Model Intercomparison Project (CMIP6). The focus is on the northern European area, but the ability to simulate southern European and global climate is discussed as well. Model evaluation was started with a technical control; completely unrealistic values in the GCM output files were identified by seeking the absolute minimum and maximum values. In this stage, one GCM was rejected totally, and furthermore individual output files from two other GCMs. In evaluating the remaining GCMs, the primary tool was the Model Climate Performance Index (MCPI) that combines RMS errors calculated for the different climate variables into one index. The index takes into account both the seasonal and spatial variations in climatological means. Here, MCPI was calculated for the period 1981—2010 by comparing GCM output with the ERA-Interim reanalyses. Climate variables explored in the evaluation were the surface air temperature, precipitation, sea level air pressure and incoming solar radiation at the surface. Besides MCPI, we studied RMS errors in the seasonal course of the spatial means by examining each climate variable separately. Furthermore, the evaluation procedure considered model performance in simulating past trends in the global-mean temperature, the compatibility of future responses to different greenhouse-gas scenarios and the number of available scenario runs. Daily minimum and maximum temperatures were likewise explored in a qualitative sense, but owing to the non-existence of data from multiple GCMs, these variables were not incorporated in the quantitative validation. 
Four of the 37 GCMs that had passed the initial technical check were regarded as wholly unusable for scenario calculations: in two GCMs the responses to the different greenhouse gas scenarios were contradictory and in two other GCMs data were missing from one of the four key climate variables. Moreover, to reduce inter-GCM dependencies, no more than two variants of any individual GCM were included; this led to an abandonment of one GCM. The remaining 32 GCMs were divided into three quality classes according to the assessed performance. The users of model data can utilize this grading to select a subset of GCMs to be used in elaborating climate projections for Finland or adjacent areas. Annual-mean temperature and precipitation projections for Finland proved to be nearly identical regardless of whether they were derived from the entire ensemble or by ignoring models that had obtained the lowest scores. Solar radiation projections were somewhat more sensitive.
APA, Harvard, Vancouver, ISO, and other styles
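The MCPI described above combines RMS errors for several climate variables into a single index. The report's exact formula is not given in this abstract, so the sketch below uses a simple normalized-RMSE average as a stand-in; all field values and normalization constants are hypothetical:

```python
import math

def rms_error(model_field, reference_field):
    """RMS error between a model field and a reanalysis reference field."""
    n = len(model_field)
    return math.sqrt(sum((m - r) ** 2 for m, r in zip(model_field, reference_field)) / n)

def performance_index(model, reference, typical_error):
    """Combine per-variable RMS errors into one index by normalizing each with
    a typical error for that variable; lower is better. The actual MCPI
    definition may differ -- this is only a stand-in."""
    scores = [rms_error(model[v], reference[v]) / typical_error[v] for v in model]
    return sum(scores) / len(scores)

# Hypothetical fields: near-surface temperature (K) and precipitation (mm/day).
reference = {"tas": [270.0, 275.0, 280.0], "pr": [2.0, 3.0, 1.5]}
model = {"tas": [271.0, 274.0, 281.0], "pr": [2.5, 2.5, 2.0]}
typical_error = {"tas": 1.0, "pr": 0.5}
print(performance_index(model, reference, typical_error))  # 1.0
```

Normalizing per variable keeps temperature errors (in K) from swamping precipitation errors (in mm/day) when the scores are averaged into one index.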
7

Mei, Sun. The Use of GPS for Evaluating Inertial Measurement Unit Errors,. Fort Belvoir, VA: Defense Technical Information Center, March 1996. http://dx.doi.org/10.21236/ada306521.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Nantung, Tommy E., Jusang Lee, John E. Haddock, M. Reza Pouranian, Dario Batioja Alvarez, Jongmyung Jeon, Boonam Shin, and Peter J. Becker. Structural Evaluation of Full-Depth Flexible Pavement Using APT. Purdue University, 2021. http://dx.doi.org/10.5703/1288284317319.

Full text
Abstract:
The fundamentals of rutting behavior for thin full-depth flexible pavements (i.e., asphalt thickness less than 12 inches) are investigated in this study. The scope incorporates an experimental study using full-scale Accelerated Pavement Tests (APTs) to monitor the evolution of each pavement structural layer's transverse profiles. The findings were then employed to verify the local rutting model coefficients used in the current pavement design method, the Mechanistic-Empirical Pavement Design Guide (MEPDG). Four APT sections were constructed using two thin typical pavement structures (seven and ten inches thick) and two types of surface course material (dense-graded and SMA). Mid-depth rut monitoring and automated laser profile systems were designed to reconstruct the transverse profiles at each pavement layer interface throughout the process of accelerated pavement deterioration produced during the APT. The contributions of each pavement structural layer to rutting and the evolution of layer deformation were derived. This study found that the permanent deformation within full-depth asphalt concrete significantly depends upon the pavement thickness. However, once the pavement reaches sufficient thickness (more than 12.5 inches), increasing the thickness does not significantly affect the permanent deformation. Additionally, for thin full-depth asphalt pavements with a dense-graded Hot Mix Asphalt (HMA) surface course, most pavement rutting is caused by the deformation of the asphalt concrete, with about half the rutting amount observed within the top four inches of the pavement layers. However, for thin full-depth asphalt pavements with an SMA surface course, most pavement rutting comes from the closest sublayer to the surface, i.e., the intermediate layer.

The accuracy of the MEPDG’s prediction models for thin full-depth asphalt pavement was evaluated using statistical parameters, including bias, the sum of squared errors, and the standard error of estimates between the predicted and actual measurements. Based on the statistical analysis (at the 95% confidence level), no significant difference was found between the version 2.3-predicted and measured rutting of the total asphalt concrete layer and subgrade for thick and thin pavements.
APA, Harvard, Vancouver, ISO, and other styles
9

Arhin, Stephen, Babin Manandhar, Kevin Obike, and Melissa Anderson. Impact of Dedicated Bus Lanes on Intersection Operations and Travel Time Model Development. Mineta Transportation Institute, June 2022. http://dx.doi.org/10.31979/mti.2022.2040.

Full text
Abstract:
Over the years, public transit agencies have been trying to improve their operations by continuously evaluating best practices to better serve patrons. The Washington Metropolitan Area Transit Authority (WMATA) oversees transit bus operations in the Washington Metropolitan Area (the District of Columbia and parts of Maryland and Virginia). One practice attempted by WMATA to improve bus travel time and transit reliability has been the implementation of designated bus lanes (DBLs). The District Department of Transportation (DDOT) implemented a bus priority program on selected corridors in the District of Columbia, leading to the installation of red-painted DBLs on corridors of H Street, NW, and I Street, NW. This study evaluates the impact of DBLs on transit bus performance, along with general traffic performance at intersections, on corridors in Washington, DC using a "before" and "after" approach. The team used non-intrusive video data to perform vehicular turning movement counts and assessed traffic flow and delays (the measures of effectiveness) with traffic simulation software. Furthermore, the team analyzed the Automatic Vehicle Locator (AVL) data provided by WMATA for buses operating on the study segments to evaluate bus travel time. The statistical analysis showed that vehicles traveling on H Street and I Street (NW) experienced significantly lower delays during both the AM (7:00–9:30 AM) and PM (4:00–6:30 PM) peak hours after the installation of the bus lanes. The approximation error metric (normalized squared error) for the testing dataset was 0.97, indicating that the model predicted bus travel times from unseen data with great accuracy. WMATA can apply this research to other segments with busy bus schedules and multiple routes to evaluate the need for DBLs. Neural network models can also be used to approximate bus travel times on segments by simulating scenarios with DBLs to obtain accurate bus travel times.
Such implementation could not only improve WMATA’s bus service and reliability but also alleviate general traffic delays.
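The abstract reports a testing-set value of 0.97 without defining the normalization. Since a value near 1 is said to indicate high accuracy, one plausible reading is an R²-style fit index; the sketch below uses that interpretation, and both the formula choice and the travel-time data are assumptions, not taken from the report:

```python
def fit_index(predicted, observed):
    """R^2-style fit index: 1 - SSE/SST, where SST is the total sum of
    squares about the mean of the observations. Values near 1 indicate
    close agreement between model and data."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((p - o) ** 2 for p, o in zip(predicted, observed))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

# Hypothetical observed vs. model-predicted bus travel times (minutes)
obs = [12.0, 15.0, 10.0, 18.0]
pred = [11.8, 15.3, 10.2, 17.9]
score = fit_index(pred, obs)
```

Under this reading, a score of 0.97 on held-out data would indeed suggest the model generalizes well to unseen segments.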
APA, Harvard, Vancouver, ISO, and other styles
10

Mukungu, Andrew, Zita Ekeocha, Stephen Robert Byrn, and Kari L. Clase. Evaluating and Understanding the Reason for an Increase in Nonconformances in the Laboratory. Purdue University, November 2021. http://dx.doi.org/10.5703/1288284317430.

Full text
Abstract:
This is a study of nonconformances experienced by the laboratory of a pharmaceutical manufacturing facility in East Africa. Nonconformances have been increasing, from 216 in 2017 to 229 in 2018, and by September 2019, 306 nonconformances had already been logged. Increasing nonconformances delay the release of tested materials and waste many resources (e.g., chemicals, staff hours, and equipment). Analysts become frustrated, which may result in cursory investigations. Understanding the reason for the increase in nonconformances will enable the facility to derive effective solutions to the identified causes, thereby reducing the number of nonconformances and improving employee productivity and morale. This quantitative, nonexperimental, longitudinal survey study was intended to evaluate and understand the reason for the increasing nonconformances. Trends in the nonconformances, previous investigations, the investigation procedure, and the training given to analysts were reviewed. Laboratory incidents were the most frequently recurring nonconformances, and these were mainly caused by analyst errors. Corrective and Preventive Actions (CAPAs) were derived by cross-functional teams whenever root causes were identified. The procedure for investigating nonconformances references investigative tools, and identification of the root causes of nonconformances only recently became mandatory. Analysts have limited advanced industrial training in the investigation of nonconformances. A further study should be carried out to understand the cause of the analyst errors, and this study can be rolled out to other departments at the manufacturing facility to create similar improvements. Analysts should enroll in advanced industrial pharmacy courses to gain skills they can apply in investigations to find the root causes of nonconformances.
APA, Harvard, Vancouver, ISO, and other styles