To see the other types of publications on this topic, follow the link: Estimation par interval.

Journal articles on the topic 'Estimation par interval'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Estimation par interval.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Magnussen, Steen, and Johannes Breidenbach. "Retrieval of among-stand variances from one observation per stand." Journal of Forest Science 66, no. 4 (April 30, 2020): 133–49. http://dx.doi.org/10.17221/141/2019-jfs.

Full text
Abstract:
Forest inventories provide predictions of stand means on a routine basis from models with auxiliary variables from remote sensing as predictors and response variables from field data. Many forest inventory sampling designs do not afford a direct estimation of the among-stand variance. As a consequence, the confidence interval for a model-based prediction of a stand mean is typically too narrow. We propose a new method to compute (from empirical regression residuals) an among-stand variance under sample designs that stratify sample selections by an auxiliary variable but otherwise do not allow a direct estimation of this variance. We test the method in simulated sampling from a complex artificial population with an age class structure. Two sampling designs are used (one-per-stratum and quasi-systematic), neither of which recognizes stands. Among-stand estimates of variance obtained with the proposed method underestimated the actual variance by 30–50%, yet 95% confidence intervals for a stand mean achieved a coverage that was either slightly better than or on par with the coverage achieved with empirical linear best unbiased estimates obtained under less efficient two-stage designs.
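The abstract does not spell out the estimator itself, but the role of the among-stand variance in the interval is easy to illustrate. The sketch below is a minimal, hypothetical illustration (all names and numbers are ours, not the authors'): a model standard error alone gives the "too narrow" interval, and adding an among-stand variance component widens it.

```python
import numpy as np
from scipy import stats

def stand_mean_ci(y_hat, se_model, var_among_stand=0.0, level=0.95):
    """CI for a model-based stand-mean prediction; an among-stand
    variance component (if supplied) widens the interval."""
    z = stats.norm.ppf(0.5 + level / 2)
    half = z * np.sqrt(se_model ** 2 + var_among_stand)
    return y_hat - half, y_hat + half

# Illustrative numbers only: predicted stand mean 180 m3/ha, model SE 8.
print(stand_mean_ci(180.0, 8.0))                        # interval without the component
print(stand_mean_ci(180.0, 8.0, var_among_stand=150.0)) # widened interval
```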
APA, Harvard, Vancouver, ISO, and other styles
2

Gomez, Mayra, Roberta Cimmino, Dario Rossi, Gianluigi Zullo, Giuseppe Campanile, Gianluca Neglia, and Stefano Biffani. "The present of Italian Mediterranean buffalo: precision breeding based on multi-omics data." Acta IMEKO 12, no. 4 (December 5, 2023): 1–4. http://dx.doi.org/10.21014/actaimeko.v12i4.1692.

Full text
Abstract:
Genetic evaluation in the Italian Mediterranean Buffalo (IMB) traditionally relied on the BLUP method (best linear unbiased predictor), a mixed model system incorporating both random and fixed effects simultaneously. However, recent advancements in genome sequencing technologies have opened up the opportunity to incorporate genomic information into genetic evaluations. The ssGBLUP (single-step best linear unbiased predictor) has become the method par excellence. It replaces the traditional relationship matrix with one that combines pedigree and genomic relationships, allowing for the estimation of genetic values for non-genotyped animals. The findings of this study highlight how genomic selection enhances the precision of breeding values, facilitates greater genetic advancement and reduces the generation interval, ultimately enabling a rapid return on investment.
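For readers unfamiliar with ssGBLUP, the blending of pedigree and genomic information is usually written through the inverse of the combined relationship matrix H. The expression below is the commonly used single-step formulation; the exact specification applied in the Italian Mediterranean Buffalo evaluation may differ.

```latex
H^{-1} = A^{-1} +
\begin{pmatrix}
0 & 0 \\
0 & G^{-1} - A_{22}^{-1}
\end{pmatrix}
```

Here A is the pedigree relationship matrix, G the genomic relationship matrix, and A_{22} the pedigree block for genotyped animals; non-genotyped animals receive genomic information through their pedigree links, which is what allows genetic values to be estimated for them.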
APA, Harvard, Vancouver, ISO, and other styles
3

Bochenina, Marina V. "Price Forecasting in the Housing Market amid Changes in the Primary Trend." Теория и практика общественного развития, no. 8 (August 30, 2023): 137–42. http://dx.doi.org/10.24158/tipor.2023.8.16.

Full text
Abstract:
The development of digital technologies contributes to the growing use of nonparametric methods. The presented study proposes a methodology for forecasting prices in the residential real estate market, taking into account the possible determination of the direction of dynamics in the anticipation period based on the application of nonparametric Nadaraya–Watson estimation. The forecast model is constructed on the basis of the historically established price-level tendency of the primary or secondary housing market of Krasnodar Krai and does not take other factors into account. Particular attention is paid to the application of the Chow test to identify the point in time at which a structural shift occurred, which makes it possible to determine a period free of structural breaks for modeling the trend and, in turn, the confidence interval of the forecast. The existing housing problem reflects the relevance of developing methods for forecasting price dynamics in the housing market, and the absence of additional factors reduces the error and increases the forecast quality.
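As a concrete illustration of the estimator named in the abstract, Nadaraya–Watson regression is simply a kernel-weighted average of past observations. The sketch below uses a Gaussian kernel on a toy price series; the data, bandwidth and variable names are ours, not the study's.

```python
import numpy as np

def nadaraya_watson(x_grid, x, y, bandwidth):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    x_grid = np.asarray(x_grid, dtype=float).reshape(-1, 1)
    x = np.asarray(x, dtype=float).reshape(1, -1)
    weights = np.exp(-0.5 * ((x_grid - x) / bandwidth) ** 2)  # kernel weights
    return (weights * y).sum(axis=1) / weights.sum(axis=1)    # weighted mean

# Toy monthly price index: smooth it and read off the fitted trend.
t = np.arange(60, dtype=float)
price = 100 + 0.8 * t + np.random.normal(0, 5, size=60)
trend = nadaraya_watson(t, t, price, bandwidth=4.0)
```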
APA, Harvard, Vancouver, ISO, and other styles
4

Krishna, Hare, Madhulika Dube, and Renu Garg. "Estimation of Stress Strength Reliability of Inverse Weibull Distribution under Progressive First Failure Censoring." Austrian Journal of Statistics 48, no. 1 (December 17, 2018): 14–37. http://dx.doi.org/10.17713/ajs.v47i4.638.

Full text
Abstract:
In this article, estimation of the stress-strength reliability $\delta=P\left(Y<X\right)$ based on progressively first-failure censored data from two independent inverse Weibull distributions with different shape and scale parameters is studied. The maximum likelihood estimator and an asymptotic confidence interval of $\delta$ are obtained. The Bayes estimator of $\delta$ under a generalized entropy loss function using non-informative and gamma informative priors is derived. Also, the highest posterior density credible interval of $\delta$ is constructed. The Markov chain Monte Carlo (MCMC) technique is used for Bayes computation. The performance of the various estimation methods is compared by a Monte Carlo simulation study. Finally, a pair of real-life data sets is analyzed to illustrate the proposed methods of estimation.
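The quantity δ = P(Y < X) can be approximated by plain Monte Carlo once the two inverse Weibull distributions are fixed, which is a handy sanity check on MLE or Bayes estimates. The sketch below uses one common parameterization (an inverse Weibull variate as scale divided by a standard Weibull variate) and hypothetical parameter values.

```python
import numpy as np

rng = np.random.default_rng(0)

def rinvweibull(shape, scale, size, rng):
    """Draw inverse Weibull variates as scale over standard Weibull variates."""
    return scale / rng.weibull(shape, size)

# Hypothetical parameters for strength X and stress Y.
x = rinvweibull(shape=2.0, scale=1.5, size=200_000, rng=rng)
y = rinvweibull(shape=1.5, scale=1.0, size=200_000, rng=rng)

delta_hat = np.mean(y < x)  # Monte Carlo estimate of P(Y < X)
print(round(delta_hat, 4))
```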
APA, Harvard, Vancouver, ISO, and other styles
5

Radović, Dunja, and Mirko Stojčić. "Predictive modeling of critical headway based on machine learning techniques." Tehnika 77, no. 3 (2022): 354–59. http://dx.doi.org/10.5937/tehnika2203354r.

Full text
Abstract:
Because the critical headway cannot be measured directly, numerous methods and procedures have been developed for its estimation. This paper uses the maximum likelihood method to estimate it at five roundabouts and, based on the obtained results and pairs of accepted and maximum rejected headways, several predictive models based on machine learning techniques were trained and tested. The main goal of the research is therefore to create a model for the prediction (classification) of the critical headway that uses pairs of accepted and maximum rejected headways as inputs, i.e. independent variables. The basic task of the model is to associate one of the previously estimated values of the critical headway with a given input pair of headways. The final predictive model is chosen from several offered alternatives based on prediction accuracy. The results of training and testing various models based on machine learning techniques in IBM SPSS Modeler software indicate that the highest prediction accuracy is achieved by the C5 decision tree model (73.266%), which was trained and tested on an extended data set obtained by data augmentation (DA).
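scikit-learn has no C5.0 implementation, so the sketch below uses a CART decision tree as a stand-in to show the shape of the task: classify a pair (accepted headway, maximum rejected headway) into one of the previously estimated critical-headway classes. The data and the labelling rule are synthetic placeholders, not the study's.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Hypothetical data: each row is (accepted headway, max rejected headway)
# in seconds; the label stands for the critical-headway class assigned
# to that observation.
X = np.column_stack([np.random.uniform(2, 8, 500),
                     np.random.uniform(0, 6, 500)])
y = (X[:, 0] - X[:, 1] > 2.5).astype(int)  # placeholder labelling rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```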
APA, Harvard, Vancouver, ISO, and other styles
6

Queiroga, F., J. Epstein, M. L. Erpelding, L. King, M. Soudant, E. Spitz, J. F. Maillefert, et al. "AB1678 CONSTRUCTION OF A COMPOSITE SCORE FOR PATIENT SELF-REPORT OF FLARE IN OSTEOARTHRITIS: A COMPARISON OF METHODS WITH THE FLARE-OA-16 QUESTIONNAIRE." Annals of the Rheumatic Diseases 82, Suppl 1 (May 30, 2023): 2076.1–2077. http://dx.doi.org/10.1136/annrheumdis-2023-eular.1380.

Full text
Abstract:
Background: Having a score to assess the occurrence and severity of flares of knee or hip osteoarthritis (OA) to guide interventions is essential. Objectives: To compare different methods of constructing a composite score for the Flare-OA-16 self-reported questionnaire for measuring knee and hip OA flare, defined as a cluster of symptoms of sufficient duration and intensity to require initiation, change or increase in therapy [1]. Methods: Participants with a physician diagnosis of knee and hip OA completed a validated 16-item questionnaire [2,3] assessing five dimensions of flare in OA: pain, swelling, stiffness, psychological aspects, and consequences of symptoms, endorsed by OMERACT. Three estimation methods were compared: the score obtained i) by second-order confirmatory factor analysis (CFA) weighting the factor loadings in a linear combination of the five dimensions; ii) by logistic regression, modeling the probability of having a flare according to the participant’s self-report (yes/no); and iii) by the Rasch method, using the average of the weighted scores from a Rasch model in each dimension. For the scores obtained by the three methods, the disordered items were modified, and then the scores were standardized on a scale from 0 to 10. The distribution (floor effect without flare (FF) and ceiling effect with flare (CF)) of the scores in each model was compared. The similarity between the scores was analyzed by intraclass correlation coefficient (ICC) and their performance was compared by areas under the ROC curves (AUC) with 95% confidence intervals. The intra-score test-retest reliability at 15 days was assessed by ICC. Results: In a sample of 381 participants with complete questionnaires, 247 reported having a flare. With CFA, good fit indices (CFI=0.94; RMSEA=.08) justified the estimation of an overall score mean=3.90 (SD=2.79), with FF effect 27.6% and CF 2.0%. For the logistic regression estimation, the overall score was mean=6.48 (SD=3.13), with FF 0% and CF 34.0% effect. With the Rasch model, the composite score was mean=4.15 (SD=2.45), with FF 18.7% and CF 0% effect. Similarity analysis indicated a greater concordance between the CFA and Rasch scores (ICC=.99) than between the logistic regression score and the two others (ICC=.87 for each). The ROC curve indicated similar performance of the overall scores estimated by the logistic model (AUC=.88 [.85–.92]), by CFA (AUC=.86 [.82–.90]) and by the Rasch model (AUC=.86 [.82–.90]). The performance in terms of reproducibility was ICC=.84 [.95–.90] for the Rasch and CFA scores and ICC=.78 [.66–.86] for the logistic model. Conclusion: This comparison of methods for constructing a global score for knee and hip OA flare explored three satisfactory alternatives. The second-order CFA confirmed the uniqueness of the flare construct measure, the logistic model had a slight superiority explained by the anchor variable used (patient-reported flare), and the Rasch model ensured that an interval scale was obtained for each dimension. The distribution of scores with the lowest combination of floor and ceiling effects was in favor of the Rasch model. The next step will be to document their respective performance in terms of sensitivity to change. A score obtained from the patient’s point of view can help increase adherence to the prescribed treatment and help physicians to optimize the scheduling and delivery of medical consultation. References: [1] Guillemin F., et al. Developing a Preliminary Definition and Domains of Flare in Knee and Hip Osteoarthritis (OA): Consensus Building of the Flare-in-OA OMERACT Group. J Rheumatol. 2019 Sep;46(9):1188–91. [2] Traore Y., et al. Development and validation of the Flare-OA questionnaire for measuring flare in knee and hip osteoarthritis. Osteoarthritis Cartilage. 2022 May;30(5):689–96. [3] Queiroga F., et al. Validation et réduction d’une échelle de mesure des poussées dans l’arthrose de la hanche et du genou par un modèle de Rasch. Rev Épidémiol Santé Publique. 2022 May;70:S70. Acknowledgements: We acknowledge the participants of the study samples that were used in the study, without whom this research could not have been possible. Financial support: This work was supported partly by the French PIA project “Lorraine Université d’Excellence”, reference ANR-15-IDEX-04-LUE. Disclosure of Interests: None declared.
APA, Harvard, Vancouver, ISO, and other styles
7

Mihailovic, Zoran, Tatjana Atanasijevic, Vesna Popovic, Miroslav B. Milosevic, and Jan P. Sperhake. "Estimation of the Postmortem Interval by Analyzing Potassium in the Vitreous Humor." American Journal of Forensic Medicine and Pathology 33, no. 4 (December 2012): 400–403. http://dx.doi.org/10.1097/paf.0b013e31826627d0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Youn Ta, Marc, Amandine Carine Njeugeut Mbiafeu, Jean-Robert Kamenan Satti, Tchimou Vincent Assoma, and Jean Patrice Jourda. "Cartographie Automatique des Zones Inondées et Evaluation des Dommages dans le District d’Abidjan depuis Google Earth Engine." European Scientific Journal, ESJ 19, no. 32 (November 30, 2023): 54. http://dx.doi.org/10.19044/esj.2023.v19n32p54.

Full text
Abstract:
L'objectif de cette étude est de générer automatiquement des cartes de l'étendue des zones inondées dans le district d'Abidjan et d’évaluer les dommages causés. L’approche méthodologique a consisté à cartographier l'étendue des zones inondées en utilisant une méthode de détection des changements basée sur les données Sentinel-1 (SAR) avant et après une crue spécifique. Ensuite, les différentes classes d'enjeux (telles que les cultures, les zones habitées, les bâtiments, les routes et la densité de la population) ont été extraites à partir de diverses sources de données gratuites. Puis la superficie des enjeux affectés a été évaluée, en superposant les classes d’enjeux sur les zones inondées. De plus, une interface web a été conçue à l'aide des packages de Google Earth Engine. Cette interface web offre à l'utilisateur la possibilité de visualiser l'étendue des zones inondées et les cartes des enjeux affectés, avec une estimation statistique, pour une date donnée dans l'intervalle allant de 2015 à la date actuelle. La cartographie des zones inondées à la date du 25 juin 2020 a révélé une superficie totale de 25219,23 hectares de zones inondées soit 11,50% de la superficie totale du District d’Abidjan. Une estimation des dégâts causés par cette crue indique que 22 307,53 hectares d'enjeux ont été affectés en moyenne, ce qui représente 88,45 % des zones inondées. Cette répartition se décompose en 13 538,49 hectares (soit 53,68 %) de terres agricoles touchées et 8 769,04 hectares (soit 34,77 %) de zones urbaines touchées, impactant en moyenne 35 065 personnes. Les résultats de cette étude ont permis de constater que la partie centrale de la zone d'étude, au-dessus de la lagune, présente le plus grand potentiel de risque d'inondation en raison de la morphologie du terrain et de la vulnérabilité élevée des zones construites qui occupent la plaine inondable. The objective of this study is to automatically generate maps of the extent of flooded areas in the Abidjan district and assess the resulting damages. The methodological approach involved mapping the extent of flooded areas using a change detection method based on Sentinel-1 (SAR) data before and after a specific flood event. Subsequently, various classes of assets, such as crops, residential areas, buildings, roads, and population density, were extracted from various free data sources. The affected asset areas were then evaluated by overlaying the asset classes on the flooded areas. Furthermore, a web interface was designed using Google Earth Engine packages. This web interface allows users to visualize the extent of flooded areas and maps of the affected assets, along with statistical estimates, for a specific date within the interval from 2015 to the current date. Mapping of the flooded areas as of June 25, 2020, revealed a total area of 25219.23 hectares of flooded areas, representing 11.50% of the total area of the Abidjan District. An estimation of the damages caused by this flood indicates that, on average, 22307.53 hectares of assets were affected, accounting for 88.45% of the flooded areas. This distribution breaks down into 13538.49 hectares (53.68%) of affected agricultural lands and 8769.04 hectares (34.77%) of affected urban areas, impacting an average of 35,065 people. The study results revealed that the central part of the study area, located above the lagoon, presents the highest flood risk potential due to the terrain's morphology and the high vulnerability of built-up areas occupying the floodplain.
APA, Harvard, Vancouver, ISO, and other styles
9

Nikulchev, Evgeny, and Alexander Chervyakov. "Prediction Intervals: A Geometric View." Symmetry 15, no. 4 (March 23, 2023): 781. http://dx.doi.org/10.3390/sym15040781.

Full text
Abstract:
This article provides a review of approaches to the construction of prediction intervals. To increase the reliability of a prediction, point predictions are replaced by interval predictions for many purposes. Interval prediction generates a pair of future values, namely an upper and a lower bound, for each prediction point. That is, from historical data, which may describe a continuous or a discrete function, two functions are obtained as the prediction: the upper and lower bounds of the estimate. The prediction boundaries should provide a guaranteed probability that the true values lie inside the boundaries found. The task of building a model from a time series is, by its very nature, ill-posed: there is an infinite set of equations whose solutions are close to the time series used for machine learning. In the interval setting, the inverse problem of dynamics allows us to choose from the entire range of modeling methods, using as solutions confidence intervals, intervals of a given width, or intervals chosen by multi-criteria optimization of the criteria for evaluating interval solutions. This article considers a geometric view of prediction intervals and presents a new approach.
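A minimal example of the kind of interval construction being reviewed: take the empirical quantiles of historical forecast residuals and place them around the point forecasts. This is only one of the many approaches the article surveys; the names and data below are illustrative.

```python
import numpy as np

def empirical_prediction_interval(residuals, point_forecasts, coverage=0.95):
    """Build lower/upper bound series around point forecasts from the
    empirical quantiles of historical forecast residuals."""
    alpha = 1.0 - coverage
    lo = np.quantile(residuals, alpha / 2)
    hi = np.quantile(residuals, 1 - alpha / 2)
    point_forecasts = np.asarray(point_forecasts, dtype=float)
    return point_forecasts + lo, point_forecasts + hi

# Hypothetical residuals from a back-test and a short point forecast.
resid = np.random.normal(0, 2, size=500)
lower, upper = empirical_prediction_interval(resid, [10.0, 10.5, 11.2])
```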
APA, Harvard, Vancouver, ISO, and other styles
10

Gubarev, Vyacheslav, Serhiy Melnychuk, and Nikolay Salnikov. "METHOD AND ALGORITHMS FOR CALCULATING HIGH-PRECISION ORIENTATION AND MUTUAL BINDING OF COORDINATE SYSTEMS OF SPACECRAFT STAR TRACKERS CLUSTER BASED ON INACCURATE MEASUREMENTS." Journal of Automation and Information sciences 1 (January 1, 2022): 74–92. http://dx.doi.org/10.34229/1028-0979-2022-1-8.

Full text
Abstract:
The problem of increasing the accuracy of determining the orientation of a spacecraft (SC) using a system of star trackers (ST) is considered. Methods are proposed that make it possible to use a joint field of view and refine the relative position of ST to improve the accuracy of orientation determination. The use of several star trackers leads to an increase in the angle between the directions to the stars into the joint field of view, which makes it possible to reduce the condition number of the matrices used in calculating the orientation parameters. The paper develops a combinatorial method for interval estimation of the SC orientation with an arbitrary number of star trackers. To calculate the ST orientation, a linear problem of interval estimation of the orthogonal orientation matrix for a sufficiently large number of stars is solved. The orientation quaternion is determined under the condition that the corresponding orientation matrix belongs to the obtained interval estimates. The case is considered when the a priori estimate of the mutual binding of star trackers can have an error comparable to or greater than the error in measuring the angular coordinates of stars. With inaccurately specified matrices of the mutual orientation of the star trackers, the errors in the mutual orientations of the STs are added to the errors of measuring the directions to the stars, which leads to an expansion of the uncertainty intervals of the right-hand sides of the system of linear algebraic equations used to determine the orientation parameters. A method is proposed for solving the problem of refining the mutual reference of the internal coordinate systems of a pair of ST as an independent task, after which the main problem of increasing the accuracy of spacecraft orientation is solved. The developed method and algorithms for solving such a complex problem are based on interval estimates of orthogonal orientation matrices. For additional narrowing of the intervals, the property of orthogonality of orientation matrices is used. The numerical simulation carried out made it possible to evaluate the advantages and disadvantages of each of the proposed methods.
APA, Harvard, Vancouver, ISO, and other styles
11

Park, Keon-woo, Yoo-Jeong Shim, Myeong-jin Lee, and Heejune Ahn. "Multi-Frame Based Homography Estimation for Video Stitching in Static Camera Environments." Sensors 20, no. 1 (December 22, 2019): 92. http://dx.doi.org/10.3390/s20010092.

Full text
Abstract:
In this paper, a multi-frame based homography estimation method is proposed for video stitching in static camera environments. A homography that is robust against spatio-temporally induced noise can be estimated by intervals, using feature points extracted during a predetermined time interval. The feature point with the largest blob response in each quantized location bin, a representative feature point, is used for matching a pair of video sequences. After matching representative feature points from each camera, the homography for the interval is estimated by random sample consensus (RANSAC) on the matched representative feature points, with their chances of being sampled proportional to their numbers of occurrences in the interval. The performance of the proposed method is compared with that of the per-frame method by investigating alignment distortion and stitching scores for daytime and noisy video sequence pairs. It is shown that alignment distortion in overlapping regions is reduced and the stitching score is improved by the proposed method. The proposed method can be used for panoramic video stitching with static video cameras and for panoramic image stitching with less alignment distortion.
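A bare-bones version of the per-interval estimation step can be written with OpenCV's RANSAC homography fitter. The sketch below omits the paper's occurrence-proportional sampling of representative feature points and simply fits one homography to points accumulated over an interval; the function and variable names are ours.

```python
import numpy as np
import cv2

def interval_homography(pts_cam1, pts_cam2, reproj_thresh=3.0):
    """Estimate a single homography from matched feature points gathered
    over a time interval, using RANSAC to reject mismatches.

    pts_cam1, pts_cam2: (N, 2) arrays of matched point coordinates, N >= 4.
    """
    src = np.asarray(pts_cam1, dtype=np.float32).reshape(-1, 1, 2)
    dst = np.asarray(pts_cam2, dtype=np.float32).reshape(-1, 1, 2)
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, reproj_thresh)
    return H, inlier_mask

# Usage: warp frames of camera 1 into camera 2's frame for stitching, e.g.
# warped = cv2.warpPerspective(frame1, H, (width, height))
```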
APA, Harvard, Vancouver, ISO, and other styles
12

Yamaguchi, Hiromu, Daisuke Yasutake, Tomoyoshi Hirota, and Koichi Nomura. "Nondestructive Measurement Method of Leaf Area Index Using Near-infrared Radiation and Photosynthetically Active Radiation Transmitted through a Leafy Vegetable Canopy." HortScience 58, no. 1 (January 2023): 16–22. http://dx.doi.org/10.21273/hortsci16761-22.

Full text
Abstract:
Because the leaf area index (LAI) is an essential parameter for understanding the structure and growth status of plant canopies, nondestructive and continuous estimation methods have been required. Recently, an LAI estimation method using the ratio of near-infrared radiation (NIR; 700–1000 nm) to photosynthetically active radiation (PAR; 400–700 nm) (NIRin/PARin) transmitted through a canopy has been proposed. However, because previous studies on this NIRin/PARin-based LAI estimation method are limited to tall plants (e.g., forest and rice canopies), in this study, we applied this method to a short canopy (i.e., spinach) and investigated its validity. NIRin/PARin and three other traditional indices for indirect LAI estimation—relative PPF density (rPPFD), normalized difference vegetation index (NDVI), and simple ratio (SR)—were measured in 25 canopies with different LAI. NIRin/PARin showed better estimation sensitivity (R2 = 0.88) to the observed LAI than the other three indices, particularly when LAI was greater than 3 m2·m−2. In addition, the LAI estimated from NIRin/PARin measured at 10-min intervals in the entire growth period could capture an increasing trend in the measured LAI throughout the entire growth stage (mean absolute error = 0.87 m2·m−2). Errors in long-term LAI estimations may be caused by the sensor location and insufficient data due to unsuitable weather conditions for measuring NIRin/PARin. The current study demonstrates the merits and limitations of the NIRin/PARin-based LAI estimation method applied to low height canopies, thereby contributing to its practical use in horticultural crops.
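In practice such an index is used through a calibration curve from the measured ratio to LAI. The sketch below fits a simple linear calibration on made-up numbers; the published relationship may use a different functional form.

```python
import numpy as np

# Hypothetical calibration data: NIRin/PARin ratios measured below the
# canopy and destructively measured LAI (m2 m-2) for a set of plots.
ratio = np.array([0.8, 1.1, 1.5, 1.9, 2.4, 2.9, 3.3, 3.8])
lai   = np.array([0.5, 1.0, 1.6, 2.3, 3.0, 3.6, 4.1, 4.8])

slope, intercept = np.polyfit(ratio, lai, 1)   # linear calibration
lai_estimate = slope * 2.1 + intercept          # LAI for a new ratio of 2.1
print(round(lai_estimate, 2))
```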
APA, Harvard, Vancouver, ISO, and other styles
13

Vraka, Aikaterini, Vicente Bertomeu-González, Fernando Hornero, Aurelio Quesada, Raúl Alcaraz, and José J. Rieta. "Splitting the P-Wave: Improved Evaluation of Left Atrial Substrate Modification after Pulmonary Vein Isolation of Paroxysmal Atrial Fibrillation." Sensors 22, no. 1 (December 31, 2021): 290. http://dx.doi.org/10.3390/s22010290.

Full text
Abstract:
Atrial substrate modification after pulmonary vein isolation (PVI) of paroxysmal atrial fibrillation (pAF) can be assessed non-invasively by analyzing P-wave duration in the electrocardiogram (ECG). However, whether right (RA) and left atrium (LA) contribute equally to this phenomenon remains unknown. The present study splits fundamental P-wave features to investigate the different RA and LA contributions to P-wave duration. Recordings of 29 pAF patients undergoing first-ever PVI were acquired before and after PVI. P-wave features were calculated: P-wave duration (PWD), duration of the first (PWDon-peak) and second (PWDpeak-off) P-wave halves, estimating RA and LA conduction, respectively. P-wave onset (PWon-R) or offset (PWoff-R) to R-peak interval, measuring combined atrial/atrioventricular and single atrioventricular conduction, respectively. Heart-rate fluctuation was corrected by scaling. Pre- and post-PVI results were compared with Mann–Whitney U-test. PWD was correlated with the remaining features. Only PWD (non-scaling: Δ=−9.84%, p=0.0085, scaling: Δ=−17.96%, p=0.0442) and PWDpeak-off (non-scaling: Δ=−22.03%, p=0.0250, scaling: Δ=−27.77%, p=0.0268) were decreased. Correlation of all features with PWD was significant before/after PVI (p<0.0001), showing the highest value between PWD and PWon-R (ρmax=0.855). PWD correlated more with PWDon-peak (ρ= 0.540–0.805) than PWDpeak-off (ρ= 0.419–0.710). PWD shortening after PVI of pAF stems mainly from the second half of the P-wave. Therefore, noninvasive estimation of LA conduction time is critical for the study of atrial substrate modification after PVI and should be addressed by splitting the P-wave in order to achieve improved estimations.
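The pre/post comparison reported here is a standard two-sample Mann–Whitney U test, which SciPy provides directly. The values below are made-up P-wave durations, purely to show the call.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical P-wave durations (ms) before and after pulmonary vein isolation.
pwd_pre = np.array([128, 135, 142, 120, 131, 138, 127, 140, 133, 125])
pwd_post = np.array([118, 126, 130, 112, 124, 129, 119, 131, 122, 117])

stat, p_value = mannwhitneyu(pwd_pre, pwd_post, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
```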
APA, Harvard, Vancouver, ISO, and other styles
14

Kharisov, V. N., and D. A. Eremeev. "Optimal algorithm for distinguishing radio navigation signals by a set of “spaced” correlators." Radioengineering 8 (2021): 11–23. http://dx.doi.org/10.18127/j00338486-202108-02.

Full text
Abstract:
The classical algorithm for signal distinction, signal detection and estimation of signal parameters consists in analyzing discrete parameter values using a correlator. The parameter value with the maximum absolute correlator output is taken as the estimate. Obviously, this is accompanied by losses in sensitivity and noise immunity, since the specified discrete parameter values do not correspond exactly to the true parameter values of the real signal. In this case the accuracy of the parameter estimation, even at large signal-to-noise ratios, is limited by the correlator placement interval. It is therefore of interest to use the entire set of correlators optimally for parameter estimation and signal detection. The article presents the derivation of an algorithm for distinguishing signals by a given parameter using a set of "spaced" correlators. Unlike the classical algorithm, it forms its decision statistics not from one correlator but from a pair of neighboring correlators detuned by the correlation interval. First, the number of the interval between correlators is estimated according to the maximum of the decision statistics, and then the value of the parameter is refined within this interval. Additionally, the algorithm allows the signal amplitude to be estimated. The proposed algorithm is compared with the classical one. By means of simulation, the average probability of signal distinction is plotted as a function of the energy potential for both algorithms. It is shown that the proposed algorithm has a higher probability of correct distinction than the classical algorithm, and that the maximum and average energy losses of the distinction algorithm based on a set of "spaced" correlators are less than those of the classical algorithm. Thus, the proposed algorithm for distinguishing signals by a set of "spaced" correlators has greater noise immunity and accuracy in estimating the desired parameter than the classical distinction algorithm.
APA, Harvard, Vancouver, ISO, and other styles
15

Abo El-Noor, Mona Mohamed, Naema Mahmoud Elhosary, Naglaa Fathi Khedr, and Kareema Ibraheem El-Desouky. "Estimation of Early Postmortem Interval Through Biochemical and Pathological Changes in Rat Heart and Kidney." American Journal of Forensic Medicine and Pathology 37, no. 1 (March 2016): 40–46. http://dx.doi.org/10.1097/paf.0000000000000214.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Ratkovic, Marc, and Dustin Tingley. "Sparse Estimation and Uncertainty with Application to Subgroup Analysis." Political Analysis 25, no. 1 (January 2017): 1–40. http://dx.doi.org/10.1017/pan.2016.14.

Full text
Abstract:
We introduce a Bayesian method, LASSOplus, that unifies recent contributions in the sparse modeling literatures, while substantially extending pre-existing estimators in terms of both performance and flexibility. Unlike existing Bayesian variable selection methods, LASSOplus both selects and estimates effects while returning estimated confidence intervals for discovered effects. Furthermore, we show how LASSOplus easily extends to modeling repeated observations and permits a simple Bonferroni correction to control coverage on confidence intervals among discovered effects. We situate LASSOplus in the literature on how to estimate subgroup effects, a topic that often leads to a proliferation of estimation parameters. We also offer a simple preprocessing step that draws on recent theoretical work to estimate higher-order effects that can be interpreted independently of their lower-order terms. A simulation study illustrates the method’s performance relative to several existing variable selection methods. In addition, we apply LASSOplus to an existing study on public support for climate treaties to illustrate the method’s ability to discover substantive and relevant effects. Software implementing the method is publicly available in the R package sparsereg.
APA, Harvard, Vancouver, ISO, and other styles
17

Zhou, Qiang, and Xin Li. "Deep Homography Estimation and Its Application to Wall Maps of Wall-Climbing Robots." Applied Sciences 9, no. 14 (July 20, 2019): 2908. http://dx.doi.org/10.3390/app9142908.

Full text
Abstract:
When locating wall-climbing robots with vision-based methods, locating and controlling the wall-climbing robot in the pixel coordinate of the wall map is an effective alternative that eliminates the need to calibrate the internal and external parameters of the camera. The estimation accuracy of the homography matrix between the camera image and the wall map directly impacts the pixel positioning accuracy of the wall-climbing robot in the wall map. In this study, we focused on the homography estimation between the camera image and wall map. We proposed HomographyFpnNet and obtained a smaller homography estimation error for a center-aligned image pair compared with the state of the art. The proposed hierarchical HomographyFpnNet for a non-center-aligned image pair significantly outperforms the method based on artificially designed features + Random Sample Consensus. The experiments conducted with a trained three-stage hierarchical HomographyFpnNet model on wall images of climbing robots also achieved small mean corner pixel error and proved its potential for estimating the homography between the wall map and camera images. The three-stage hierarchical HomographyFpnNet model has an average processing time of 10.8 ms on a GPU. The real-time processing speed satisfies the requirements of wall-climbing robots.
APA, Harvard, Vancouver, ISO, and other styles
18

Aydın, Berna, Başar Çolak, Yasemin Balcı, and Canan Demirüstü. "Consistency of Postmortem Interval Estimations of Physicians Using Only Postmortem Changes of Putrefied Dead Bodies." American Journal of Forensic Medicine and Pathology 31, no. 3 (September 2010): 243–46. http://dx.doi.org/10.1097/paf.0b013e3181ee01d9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Wang, Steve C., Philip J. Everson, Heather Jianan Zhou, Dasol Park, and David J. Chudzicki. "Adaptive credible intervals on stratigraphic ranges when recovery potential is unknown." Paleobiology 42, no. 2 (February 19, 2016): 240–56. http://dx.doi.org/10.1017/pab.2015.37.

Full text
Abstract:
Numerous methods exist for estimating the true stratigraphic range of a fossil taxon based on the stratigraphic positions of its fossil occurrences. Many of these methods require the assumption of uniform fossil recovery potential—that fossils are equally likely to be found at any point within the taxon's true range. This assumption is unrealistic, because factors such as stratigraphic architecture, sampling effort, and the taxon's abundance and geographic range affect recovery potential. Other methods do not make this assumption, but they instead require a priori quantitative knowledge of recovery potential that may be difficult to obtain. We present a new Bayesian method, the Adaptive Beta method, for estimating the true stratigraphic range of a taxon that works for both uniform and non-uniform recovery potential. In contrast to existing methods, we explicitly estimate recovery potential from the positions of the occurrences themselves, so that a priori knowledge of recovery potential is not required. Using simulated datasets, we compare the performance of our method with existing methods. We show that the Adaptive Beta method performs well in that it achieves or nearly achieves nominal coverage probabilities and provides reasonable point estimates of the true extinction in a variety of situations. We demonstrate the method using a dataset of the Cambrian mollusc Anabarella.
APA, Harvard, Vancouver, ISO, and other styles
20

Betz, Timm. "Robust Estimation with Nonrandom Measurement Error and Weak Instruments." Political Analysis 21, no. 1 (2013): 86–96. http://dx.doi.org/10.1093/pan/mps037.

Full text
Abstract:
Two common problems in applications of two-stage least squares (2SLS) are nonrandom measurement error in the endogenous variable and weak instruments. In the presence of nonrandom measurement error, 2SLS yields inconsistent estimates. In the presence of weak instruments, confidence intervals and p-values can be severely misleading. This article introduces a rank-based estimator, grounded in randomization inference, which addresses both problems within a unified framework. Monte Carlo studies illustrate the deficiencies of 2SLS and the virtues of the rank-based estimator in terms of bias and efficiency. A replication of a study of the effect of economic shocks on democratic transitions demonstrates the practical implications of accounting for nonrandom measurement error and weak instruments.
APA, Harvard, Vancouver, ISO, and other styles
21

Cruz, Pedro Alexandre. "Modelagem matemática do padrão de crescimento do Tambaqui (Colossoma Macropomum) através do modelo de Gomertz." Revista Brasileira de Engenharia de Pesca 11, no. 2 (January 21, 2019): 18. http://dx.doi.org/10.18817/repesca.v11i2.1619.

Full text
Abstract:
An evaluation of the growth behavior of tambaqui was carried out using the Gompertz mathematical model. Projections of the parameters related to fish growth were constructed; these curves considered measurements taken from the fingerling/juvenile stage up to the fattening stage. The parameters considered were the weight of the fish, the size of the head, the height of the head and the length of the thorax. The model was built to provide an estimate of the growth pattern of the whole tambaqui population, without differentiating by sex. In addition, a better understanding of the growth of this fish can be of great help in developing new farming methods or genetic improvement, with the aim of optimizing tambaqui production. Finding new ways to optimize farming (so that the fish reaches the minimum dimensions required for slaughter in a shorter time interval) is important for the industry to be able to meet the growing market demand for heavier animals. Additionally, knowledge of the growth pattern of tambaqui makes it possible to implement new farming methodologies without causing damage or changes to the natural population of this fish.
APA, Harvard, Vancouver, ISO, and other styles
22

Ette, Ene I., Andrew W. Kelman, Catherine A. Howie, and Brian Whiting. "Interpretation of Simulation Studies for Efficient Estimation of Population Pharmacokinetic Parameters." Annals of Pharmacotherapy 27, no. 9 (September 1993): 1034–39. http://dx.doi.org/10.1177/106002809302700903.

Full text
Abstract:
OBJECTIVE: To develop new approaches for evaluating results obtained from simulation studies used to determine sampling strategies for efficient estimation of population pharmacokinetic parameters. METHODS: One-compartment kinetics with intravenous bolus injection was assumed and the simulated data (one observation made on each experimental unit [human subject or animal]), were analyzed using NONMEM. Several approaches were used to judge the efficiency of parameter estimation. These included: (1) individual and joint confidence intervals (CIs) coverage for parameter estimates that were computed in a manner that would reveal the influence of bias and standard error (SE) on interval estimates; (2) percent prediction error (%PE) approach; (3) the incidence of high pair-wise correlations; and (4) a design number approach. The design number (Φ) is a new statistic that provides a composite measure of accuracy and precision (using SE). RESULTS: The %PE approach is useful only in examining the efficiency of estimation of a parameter considered independently. The joint CI coverage approach permitted assessment of the accuracy and reliability of all model parameter estimates. The Φ approach is an efficient method of achieving an accurate estimate of parameter(s) with good precision. Both the Φ for individual parameter estimation and the overall Φ for the estimation of model parameters led to optimal experimental design. CONCLUSIONS: Application of these approaches to the analyses of the results of the study was found useful in determining the best sampling design (from a series of two sampling times designs within a study) for efficient estimation of population pharmacokinetic parameters.
APA, Harvard, Vancouver, ISO, and other styles
23

Rovira-Más, Francisco, Qi Wang, and Qin Zhang. "Bifocal Stereoscopic Vision for Intelligent Vehicles." International Journal of Vehicular Technology 2009 (March 29, 2009): 1–9. http://dx.doi.org/10.1155/2009/123231.

Full text
Abstract:
The numerous benefits of real-time 3D awareness for autonomous vehicles have motivated the incorporation of stereo cameras to the perception units of intelligent vehicles. The availability of the distance between camera and objects is essential for such applications as automatic guidance and safeguarding; however, a poor estimation of the position of the objects in front of the vehicle can result in dangerous actions. There is an emphasis, therefore, in the design of perception engines that can make available a rich and reliable interval of ranges in front of the camera. The objective of this research is to develop a stereo head that is capable of capturing 3D information from two cameras simultaneously, sensing different, but complementary, fields of view. In order to do so, the concept of bifocal perception was defined and physically materialized in an experimental bifocal stereo camera. The assembled system was validated through field tests, and results showed that each stereo pair of the head excelled at a singular range interval. The fusion of both intervals led to a more faithful representation of reality.
APA, Harvard, Vancouver, ISO, and other styles
24

Nguyen, Thi Tuyet Mai, Elaine Evans, and Meiting Lu. "Independent directors, ownership concentration and firm performance in listed companies." Pacific Accounting Review 29, no. 2 (April 3, 2017): 204–26. http://dx.doi.org/10.1108/par-07-2016-0070.

Full text
Abstract:
Purpose: The purpose of this paper is to investigate the impact of independent directors on firm performance in Vietnam and to identify how different types of ownership structure and the presence of controlling shareholders influence the relationship. Design/methodology/approach: For a sample of 217 non-financial Vietnam-listed companies during the period from 2010 to 2014, this study uses ordinary least squares regressions to estimate the relationship between independent directors and firm performance. Two econometric techniques – the fixed effects estimation and the difference-in-differences estimation – are used to control for endogeneity. The results are also robust to the lag variable of independent directors. Findings: The results reveal that independent directors have an overall negative effect on firm operating performance. This finding may be because of information asymmetry, expertise disadvantage and the dominance of ownership concentration that prevent independent directors from fulfilling their monitoring function in governance. The negative relationship between independent directors and firm performance is stronger in firms where the State is a controlling shareholder. Research limitations/implications: Findings suggest that changes relating to independent directors, as a response to the new corporate governance code in 2012, do not have a positive effect on the relationship between corporate governance and firm performance. Further reform is required to improve internal control mechanisms and corporate governance systems in Vietnam. Originality/value: This is the first study to provide robust evidence on the relationship between independent directors and firm performance in Vietnam as well as to explore the impact of the type of controlling shareholders on the relationship.
APA, Harvard, Vancouver, ISO, and other styles
25

Hu, Jiaochan, Liangyun Liu, Jian Guo, Shanshan Du, and Xinjie Liu. "Upscaling Solar-Induced Chlorophyll Fluorescence from an Instantaneous to Daily Scale Gives an Improved Estimation of the Gross Primary Productivity." Remote Sensing 10, no. 10 (October 21, 2018): 1663. http://dx.doi.org/10.3390/rs10101663.

Full text
Abstract:
Solar-induced chlorophyll fluorescence (SIF) is closely linked to the photosynthesis of plants and has the potential to estimate gross primary production (GPP) at different temporal and spatial scales. However, remotely sensed SIF at a ground or space level is usually instantaneous, which cannot represent the daily total SIF. The temporal mismatch between instantaneous SIF (SIFinst) and daily GPP (GPPdaily) impacts their correlation across space and time. Previous studies have upscaled SIFinst to the daily scale based on the diurnal cycle in the cosine of the solar zenith angle (cos(SZA)) to correct the effects of latitude and length of the day on the variations in the SIF-GPP correlation. However, the important effects of diurnal weather changes due to cloud and atmospheric scattering were not considered. In this study, we present a SIF upscaling method using photosynthetically active radiation (PAR) as a driving variable. First, a conversion factor (i.e., the ratio of the instantaneous PAR (PARinst) to daily PAR (PARdaily)) was used to upscale in-situ SIF measurements from the instantaneous to daily scale. Then, the performance of the SIF upscaling method was evaluated under changing weather conditions and different latitudes using continuous tower-based measurements at two sites. The results prove that our PAR-based method can reduce not only latitude-dependent but also the weather-dependent variations in the SIF-GPP model. Specifically, the PAR-based method gave a more accurate prediction of diurnal and daily SIF (SIFdaily) than the cos(SZA)-based method, with decreased relative root mean square error (RRMSE) values from 42.2% to 25.6% at half-hour intervals and from 25.4% to 13.3% at daily intervals. Moreover, the PAR-based upscaled SIFdaily had a stronger correlation with the daily absorbed PAR (APAR) than both the SIFinst and cos(SZA)-based upscaled SIFdaily, especially for cloudy days, with a coefficient of determination (R2) that increased from approximately 0.5 to 0.8. Finally, the PAR-based SIFdaily was linked to GPPdaily and compared to the SIFinst or cos(SZA)-based SIFdaily. The results indicate that the SIF-GPP correlation can obviously be improved, with an increased R2 from approximately 0.65 to 0.75. Our study confirms the importance of upscaling SIF from the instantaneous to daily scale when linking SIF with GPP and emphasizes the need to take diurnal weather changes into account for SIF temporal upscaling.
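The PAR-based conversion described in the abstract amounts to scaling the instantaneous SIF by the ratio of daily to instantaneous PAR. A minimal sketch, with illustrative units and numbers only:

```python
def upscale_sif(sif_inst, par_inst, par_daily):
    """PAR-based temporal upscaling of instantaneous SIF to a daily value:
    SIF_daily ~ SIF_inst * (PAR_daily / PAR_inst).
    Units must be consistent (daily PAR as the integral of the same
    quantity that PAR_inst samples)."""
    return sif_inst * (par_daily / par_inst)

# Illustrative values only.
print(upscale_sif(sif_inst=1.2, par_inst=1500.0, par_daily=38_000.0))
```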
APA, Harvard, Vancouver, ISO, and other styles
26

Cyrino Oliveira, Fernando Luiz, Pedro Guilherme Costa Ferreira, and Reinaldo Castro Souza. "A Parsimonious Bootstrap Method to Model Natural Inflow Energy Series." Mathematical Problems in Engineering 2014 (2014): 1–10. http://dx.doi.org/10.1155/2014/158689.

Full text
Abstract:
The Brazilian energy generation and transmission system is quite peculiar in its dimension and characteristics and can be considered unique in the world. It is a high-dimensional hydrothermal system with a huge share of hydro plants. Such strong dependency on hydrological regimes implies uncertainties in energy planning, requiring adequate modeling of the hydrological time series. This is carried out via stochastic simulations of monthly inflow series using the family of periodic autoregressive models, PAR(p), one for each period (month) of the year. This paper shows the problems in fitting these models under the current system, particularly the identification of the autoregressive order "p" and the corresponding parameter estimation. A new approach is then proposed to set both the model order and the parameter estimates of the PAR(p) models, using a nonparametric computational technique known as the bootstrap. This technique allows the estimation of reliable confidence intervals for the model parameters. The results obtained using the Parsimonious Bootstrap Method of Moments (PBMOM) produced not only more parsimonious model orders but also adherent stochastic scenarios and, in the long range, lead to a better use of water resources in energy operation planning.
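The bootstrap idea can be shown on a single AR(1) series, standing in for one period of a PAR(p) model: fit the coefficient, resample the residuals, rebuild the series, refit, and take percentile quantiles of the refitted coefficients as a confidence interval. This is a generic residual bootstrap, not the PBMOM estimator itself; the data and names are illustrative.

```python
import numpy as np

def ar1_fit(x):
    """Ordinary least squares estimate of an AR(1) coefficient."""
    return np.polyfit(x[:-1], x[1:], 1)[0]

def bootstrap_ar1_ci(x, n_boot=2000, level=0.95, seed=0):
    """Percentile bootstrap CI for the AR(1) coefficient via residual resampling."""
    rng = np.random.default_rng(seed)
    phi = ar1_fit(x)
    resid = x[1:] - phi * x[:-1]
    resid = resid - resid.mean()
    estimates = np.empty(n_boot)
    for b in range(n_boot):
        e = rng.choice(resid, size=len(resid), replace=True)
        xb = np.empty_like(x)
        xb[0] = x[0]
        for t in range(1, len(x)):
            xb[t] = phi * xb[t - 1] + e[t - 1]   # rebuild series from resampled residuals
        estimates[b] = ar1_fit(xb)
    alpha = 1 - level
    return np.quantile(estimates, [alpha / 2, 1 - alpha / 2])

# Toy series standing in for one month's inflow-energy sequence.
rng = np.random.default_rng(1)
x = np.empty(120); x[0] = 0.0
for t in range(1, 120):
    x[t] = 0.6 * x[t - 1] + rng.normal()
print(bootstrap_ar1_ci(x))
```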
APA, Harvard, Vancouver, ISO, and other styles
27

Dietterich, Thomas, Majid Alkaee Taleghan, and Mark Crowley. "PAC Optimal Planning for Invasive Species Management: Improved Exploration for Reinforcement Learning from Simulator-Defined MDPs." Proceedings of the AAAI Conference on Artificial Intelligence 27, no. 1 (June 29, 2013): 1270–76. http://dx.doi.org/10.1609/aaai.v27i1.8487.

Full text
Abstract:
Often the most practical way to define a Markov Decision Process (MDP) is as a simulator that, given a state and an action, produces a resulting state and immediate reward sampled from the corresponding distributions. Simulators in natural resource management can be very expensive to execute, so that the time required to solve such MDPs is dominated by the number of calls to the simulator. This paper presents an algorithm, DDV, that combines improved confidence intervals on the Q values (as in interval estimation) with a novel upper bound on the discounted state occupancy probabilities to intelligently choose state-action pairs to explore. We prove that this algorithm terminates with a policy whose value is within epsilon of the optimal policy (with probability 1-delta) after making only polynomially-many calls to the simulator. Experiments on one benchmark MDP and on an MDP for invasive species management show very large reductions in the number of simulator calls required.
APA, Harvard, Vancouver, ISO, and other styles
28

Lehtinen, Sonja, Peter Ashcroft, and Sebastian Bonhoeffer. "On the relationship between serial interval, infectiousness profile and generation time." Journal of The Royal Society Interface 18, no. 174 (January 2021): 20200756. http://dx.doi.org/10.1098/rsif.2020.0756.

Full text
Abstract:
The timing of transmission plays a key role in the dynamics and controllability of an epidemic. However, observing generation times—the time interval between the infection of an infector and an infectee in a transmission pair—requires data on infection times, which are generally unknown. The timing of symptom onset is more easily observed; generation times are therefore often estimated based on serial intervals—the time interval between symptom onset of an infector and an infectee. This estimation follows one of two approaches: (i) approximating the generation time distribution by the serial interval distribution or (ii) deriving the generation time distribution from the serial interval and incubation period—the time interval between infection and symptom onset in a single individual—distributions. These two approaches make different—and not always explicitly stated—assumptions about the relationship between infectiousness and symptoms, resulting in different generation time distributions with the same mean but unequal variances. Here, we clarify the assumptions that each approach makes and show that neither set of assumptions is plausible for most pathogens. However, the variances of the generation time distribution derived under each assumption can reasonably be considered as upper (approximation with serial interval) and lower (derivation from serial interval) bounds. Thus, we suggest a pragmatic solution is to use both approaches and treat these as edge cases in downstream analysis. We discuss the impact of the variance of the generation time distribution on the controllability of an epidemic through strategies based on contact tracing, and we show that underestimating this variance is likely to overestimate controllability.
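The mean/variance statement in the abstract follows from the usual decomposition: if the infector is infected at time 0 with incubation period I1, and the infectee is infected after the generation time G with incubation period I2, then the serial interval is S = G + I2 − I1. Under the simplifying assumption that I1 and I2 are identically distributed and independent of G and of each other (an assumption the authors themselves question),

```latex
S = G + I_2 - I_1, \qquad
\mathbb{E}[S] = \mathbb{E}[G], \qquad
\operatorname{Var}(S) = \operatorname{Var}(G) + 2\operatorname{Var}(I).
```

So approximating G by S (approach i) keeps the larger variance Var(S), while deriving G from S (approach ii) gives the smaller Var(G) = Var(S) − 2 Var(I), consistent with treating the two as upper and lower bounds.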
APA, Harvard, Vancouver, ISO, and other styles
29

Shapchenko, M. M., T. A. Shapchenko, S. E. Kligman, L. G. Maminov, O. V. Sasson, and V. M. Chernyak. "ALGORITHM OF LOCATION OF WATER INFLUX INTERVALS IN GAS WELLS." Oil and Gas Studies, no. 6 (December 30, 2015): 41–46. http://dx.doi.org/10.31660/0445-0108-2015-6-41-46.

Full text
Abstract:
A method is proposed for identifying the well that caused the pad flooding in the active operating well stock and for locating the interval of the water-influx interlayer in order to perform water isolation jobs, based on diagnostics of the current model of the producing formation. The search for the pad-flooding well is run in three steps: 1) locating the group of potential wells capable of flooding a pad, based on estimation of the vapor phase and dynamic characteristics (velocity, flow rate) and on collection of dropping-liquid samples for determination of the additional water salinity; 2) running monitoring studies of the selected group of wells under varying conditions (pressure drawdown) to identify the well in this group that produces additional water and affects the gas flow rates of the adjacent wells; the observation system offered is run without shutting in the wells, with fast coverage of the entire well stock and with higher reliability, ensuring safety and cost efficiency; 3) performing wide-range spectral neutron gamma-ray logging for diagnostics of the current model of the producing formation, on the basis of which the active water-influx intervals and their filtration-capacity properties can be identified in order to choose a way to perform the water isolation operations.
APA, Harvard, Vancouver, ISO, and other styles
30

Björk, Mats, Maria E. Asplund, Diana Deyanova, and Martin Gullström. "The amount of light reaching the leaves in seagrass (Zostera marina) meadows." PLOS ONE 16, no. 9 (September 21, 2021): e0257586. http://dx.doi.org/10.1371/journal.pone.0257586.

Full text
Abstract:
Seagrass meadows, and other submerged vegetated habitats, support a wide range of essential ecological services, but the true extents of these services are in many ways still not quantified. One important tool needed to assess and model many of these services is accurate estimation of the systems' primary productivity. Such productivity estimations require an understanding of the underwater light field, especially regarding the amount of light that actually reaches the plants' photosynthetic tissue. In this study, we tested a simple practical approach to estimate leaf light exposure, relative to incoming light at the canopy, by attaching light-sensitive film at different positions on leaves of Zostera marina, eelgrass, in four seagrass meadows composed of different shoot densities and at two different depths. We found that the light reaching the leaves decreased linearly down through the canopy. While the upper parts of the leaves received approximately the same level of light (photosynthetic photon flux density, PPFD) as recorded with a PAR meter at the canopy top, the average light that the seagrass leaves were exposed to varied between 40 and 60% of the light on top of the canopy, with an overall average of 48%. We recommend that actual light interception is measured when assessing or modelling light-dependent processes in submerged vegetation, but if this is not achievable, a rough estimation for vegetation similar to Z. marina would be to use a correction factor of 0.5 to compensate for the reduced light due to leaf orientation and internal shading.
APA, Harvard, Vancouver, ISO, and other styles
31

GU, W., A. R. VIEIRA, R. M. HOEKSTRA, P. M. GRIFFIN, and D. COLE. "Use of random forest to estimate population attributable fractions from a case-control study of Salmonella enterica serotype Enteritidis infections." Epidemiology and Infection 143, no. 13 (February 12, 2015): 2786–94. http://dx.doi.org/10.1017/s095026881500014x.

Full text
Abstract:
SUMMARY: To design effective food safety programmes we need to estimate how many sporadic foodborne illnesses are caused by specific food sources based on case-control studies. Logistic regression has substantive limitations for analysing structured questionnaire data with numerous exposures and missing values. We adapted random forest to analyse data of a case-control study of Salmonella enterica serotype Enteritidis illness for source attribution. For estimation of summary population attributable fractions (PAFs) of exposures grouped into transmission routes, we devised a counterfactual estimator to predict reductions in illness associated with removing grouped exposures. For the purpose of comparison, we fitted the data using logistic regression models with stepwise forward and backward variable selection. Our results show that the forward and backward variable selection of logistic regression models were not consistent for parameter estimation, with different significant exposures identified. By contrast, the random forest model produced estimated PAFs of grouped exposures consistent in rank order with results obtained from outbreak data, with egg-related exposures having the highest estimated PAF (22·1%, 95% confidence interval 8·5–31·8). Random forest might be structurally more coherent and efficient than logistic regression models for attributing Salmonella illnesses to sources involving many causal pathways.
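The counterfactual estimator described here can be sketched with scikit-learn: fit a random forest to exposures and case status, zero out a grouped set of exposures, and compare the mean predicted illness probability with and without them. The data, exposure grouping and labels below are synthetic, and the sketch ignores the case-control sampling fractions that a real attribution analysis must handle.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical case-control data: binary exposures (columns) and
# case status (1 = case). Columns 0-2 stand in for an exposure group.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(1000, 10)).astype(float)
y = rng.binomial(1, p=0.2 + 0.2 * X[:, 0])   # exposure 0 raises risk

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

# Counterfactual-style PAF: remove the grouped exposures and compare
# predicted illness probabilities with the observed exposures.
X_cf = X.copy()
X_cf[:, [0, 1, 2]] = 0.0
p_obs = rf.predict_proba(X)[:, 1].mean()
p_cf = rf.predict_proba(X_cf)[:, 1].mean()
paf = (p_obs - p_cf) / p_obs
print(f"estimated PAF for the grouped exposures: {paf:.2%}")
```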
APA, Harvard, Vancouver, ISO, and other styles
32

Asharjabi, Sami, Hefdhallah Sakran, and Azzam Al-nahari. "Time-Domain Channel Estimation Scheme for OFDM over Fast Fading Channels." Wireless Communications and Mobile Computing 2022 (February 27, 2022): 1–9. http://dx.doi.org/10.1155/2022/7839430.

Full text
Abstract:
In high-mobility scenarios, the time variation of mobile radio channels leads to a loss of orthogonality among subcarriers in orthogonal frequency division multiplexing (OFDM) systems, resulting in intercarrier interference (ICI) and performance deterioration. Conventional channel estimation schemes are usually based on pilot tones, which are distributed in each OFDM symbol to estimate the channel variation. Hence, the channel estimator itself suffers from ICI. In this study, a new estimation scheme, which does not suffer from ICI, is proposed to estimate the channel variation within OFDM symbols. The main idea is to zero-pad (ZP) the OFDM symbol in the time domain. Then, in the middle of the ZP interval, an impulse signal is inserted as a pilot sample, which is used to estimate the channel at the pilot signal in the OFDM symbol. Finally, a linear model is used to estimate the channel variation over an OFDM symbol. Additionally, we derive the mean squared error (MSE) of the proposed estimation technique under the constraint that the channel varies linearly within OFDM symbols. Simulation results show that our scheme can achieve a substantial improvement in the bit error rate (BER) performance of OFDM, in spite of the OFDM symbol length being increased. Moreover, in many cases, the new scheme can achieve the same BER performance as the perfect knowledge of channel state information (CSI). Theoretical analysis and numerical simulations show that our scheme achieves excellent performance with much lower computational complexity.
APA, Harvard, Vancouver, ISO, and other styles
33

Claassen, Christopher. "Estimating Smooth Country–Year Panels of Public Opinion." Political Analysis 27, no. 1 (July 4, 2018): 1–20. http://dx.doi.org/10.1017/pan.2018.32.

Full text
Abstract:
At the microlevel, comparative public opinion data are abundant. But at the macrolevel—the level where many prominent hypotheses in political behavior are believed to operate—data are scarce. In response, this paper develops a Bayesian dynamic latent trait modeling framework for measuring smooth country–year panels of public opinion even when data are fragmented across time, space, and survey item. Six models are derived from this framework, applied to opinion data on support for democracy, and validated using tests of internal, external, construct, and convergent validity. The best model is reasonably accurate, with predicted responses that deviate from the true response proportions in a held-out test dataset by 6 percentage points. In addition, the smoothed country–year estimates of support for democracy have both construct and convergent validity, with spatiotemporal patterns and associations with other covariates that are consistent with previous research.
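As a much-simplified stand-in for the dynamic latent trait framework, the sketch below filters a single country's yearly survey proportions with a random-walk (local-level) model; the normal approximation, the prior values and the function name are assumptions, and the paper's pooling across survey items and countries is not modelled.

```python
import numpy as np

def local_level_filter(y, obs_var, state_var=0.01):
    """Random-walk filter for one country's yearly support series.
    y: observed survey proportions with np.nan in years without a survey,
    obs_var: sampling variance of each observed proportion."""
    T = len(y)
    level = np.zeros(T)
    var = np.zeros(T)
    a, P = 0.5, 1.0                    # vague prior centred on 0.5
    for t in range(T):
        P += state_var                 # predict: latent support follows a random walk
        if np.isnan(y[t]):
            level[t], var[t] = a, P    # no survey this year: carry the prediction forward
        else:
            K = P / (P + obs_var[t])   # Kalman gain
            a = a + K * (y[t] - a)     # update toward the observed proportion
            P = (1.0 - K) * P
            level[t], var[t] = a, P
    return level, var                  # smooth country-year estimates and their uncertainty
```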
APA, Harvard, Vancouver, ISO, and other styles
34

Januel, Jean-Marie, Stephan Harbarth, Robert Allard, Nicolas Voirin, Alain Lepape, Bernard Allaouchiche, Claude Guerin, et al. "Estimating Attributable Mortality Due to Nosocomial Infections Acquired in Intensive Care Units." Infection Control & Hospital Epidemiology 31, no. 4 (April 2010): 388–94. http://dx.doi.org/10.1086/650754.

Full text
Abstract:
Background. The strength of the association between intensive care unit (ICU)-acquired nosocomial infections (NIs) and mortality might differ according to the methodological approach taken. Objective. To assess the association between ICU-acquired NIs and mortality using the concept of the population-attributable fraction (PAF) for patient deaths caused by ICU-acquired NIs in a large cohort of critically ill patients. Setting. Eleven ICUs of a French university hospital. Design. We analyzed surveillance data on ICU-acquired NIs collected prospectively during the period from 1995 through 2003. The primary outcome was mortality from ICU-acquired NI stratified by site of infection. A matched-pair, case-control study was performed. Each patient who died before ICU discharge was defined as a case patient, and each patient who survived to ICU discharge was defined as a control patient. The PAF was calculated after adjustment for confounders by use of conditional logistic regression analysis. Results. Among 8,068 ICU patients, a total of 1,725 deceased patients were successfully matched with 1,725 control patients. The adjusted PAF due to ICU-acquired NI for patients who died before ICU discharge was 14.6% (95% confidence interval [CI], 14.4%–14.8%). Stratified by the type of infection, the PAF was 6.1% (95% CI, 5.7%–6.5%) for pulmonary infection, 3.2% (95% CI, 2.8%–3.5%) for central venous catheter infection, 1.7% (95% CI, 0.9%–2.5%) for bloodstream infection, and 0.0% (95% CI, –0.4% to 0.4%) for urinary tract infection. Conclusions. ICU-acquired NI had an important effect on mortality. However, the statistical association between ICU-acquired NI and mortality tended to be less pronounced in findings based on the PAF than in study findings based on estimates of relative risk. Therefore, the choice of methods does matter when the burden of NI needs to be assessed.
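The arithmetic behind a case-control PAF can be illustrated with the standard formula PAF = p_c(OR − 1)/OR, where p_c is the proportion of case patients exposed. The inputs below are hypothetical and chosen only so that the result lands near the reported overall value; they are not taken from the study.

```python
# Standard case-control PAF with an adjusted odds ratio.
p_c = 0.25          # hypothetical share of deceased patients with an ICU-acquired NI
or_adj = 2.40       # hypothetical adjusted odds ratio from conditional logistic regression
paf = p_c * (or_adj - 1) / or_adj
print(f"PAF = {paf:.1%}")   # ~14.6% with these illustrative inputs
```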
APA, Harvard, Vancouver, ISO, and other styles
35

Han, Dong, Syed Khairul Bashar, Jesús Lázaro, Fahimeh Mohagheghian, Andrew Peitzsch, Nishat Nishita, Eric Ding, et al. "A Real-Time PPG Peak Detection Method for Accurate Determination of Heart Rate during Sinus Rhythm and Cardiac Arrhythmia." Biosensors 12, no. 2 (January 29, 2022): 82. http://dx.doi.org/10.3390/bios12020082.

Full text
Abstract:
Objective: We have developed a peak detection algorithm for accurate determination of heart rate, using photoplethysmographic (PPG) signals from a smartwatch, even in the presence of various cardiac rhythms, including normal sinus rhythm (NSR), premature atrial contraction (PAC), premature ventricle contraction (PVC), and atrial fibrillation (AF). Given the clinical need for accurate heart rate estimation in patients with AF, we developed a novel approach that reduces heart rate estimation errors when compared to peak detection algorithms designed for NSR. Methods: Our peak detection method is composed of a sequential series of algorithms that are combined to discriminate the various arrhythmias described above. Moreover, a novel Poincaré plot scheme is used to discriminate between basal heart rate AF and rapid ventricular response (RVR) AF, and to differentiate PAC/PVC from NSR and AF. Training of the algorithm was performed only with Samsung Simband smartwatch data, whereas independent testing data which had more samples than did the training data were obtained from Samsung’s Gear S3 and Galaxy Watch 3. Results: The new PPG peak detection algorithm provides significantly lower average heart rate and interbeat interval beat-to-beat estimation errors—30% and 66% lower—and mean heart rate and mean interbeat interval estimation errors—60% and 77% lower—when compared to the best of the seven other traditional peak detection algorithms that are known to be accurate for NSR. Our new PPG peak detection algorithm was also the overall best performer for the other arrhythmias. Conclusion: The proposed method for PPG peak detection automatically detects and discriminates between various arrhythmias among different waveforms of PPG data, delivers significantly lower heart rate estimation errors for participants with AF, and reduces the number of false negative peaks. Significance: By enabling accurate determination of heart rate despite the presence of AF with rapid ventricular response or PAC/PVCs, we enable clinicians to make more accurate recommendations for heart rate control from PPG data.
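A generic illustration of the building blocks mentioned above (peak detection, interbeat intervals, and a Poincaré-style descriptor); the refractory period is an assumption, and none of the arrhythmia-specific logic of the proposed algorithm is reproduced here.

```python
import numpy as np
from scipy.signal import find_peaks

def ibi_and_poincare(ppg, fs):
    """Detect PPG pulse peaks, compute interbeat intervals (IBIs) and the
    Poincare descriptors SD1/SD2 often used to separate regular from irregular rhythms."""
    peaks, _ = find_peaks(ppg, distance=int(0.3 * fs))   # ~0.3 s refractory period
    ibi = np.diff(peaks) / fs                            # seconds between consecutive beats
    x, y = ibi[:-1], ibi[1:]                             # Poincare plot coordinates
    sd1 = np.sqrt(np.var(y - x) / 2.0)                   # short-term (beat-to-beat) spread
    sd2 = np.sqrt(np.var(y + x) / 2.0)                   # long-term spread
    return ibi, sd1, sd2
```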
APA, Harvard, Vancouver, ISO, and other styles
36

Tomioka, Kimiko, Teruyo Kitahara, Midori Shima, and Keigo Saeki. "Fraction and Number of Unemployed Associated with Self-Reported Low Back Pain: A Nation-Wide Cross-Sectional Study in Japan." International Journal of Environmental Research and Public Health 18, no. 20 (October 13, 2021): 10760. http://dx.doi.org/10.3390/ijerph182010760.

Full text
Abstract:
This study examined a cross-sectional association between self-reported low back pain (LBP) and unemployment among working-age people, and estimated the impact of self-reported LBP on unemployment. We used anonymized data from a nationally representative survey (24,854 men and 26,549 women aged 20–64 years). The generalized estimating equations of the multivariable Poisson regression models stratified by gender were used to estimate the adjusted prevalence ratio (PR) and 95% confidence interval (CI) for unemployment. The population attributable fraction (PAF) was calculated using Levin’s method, with the substitution method for 95% CI estimation. The prevalence of self-reported LBP was 9.0% in men and 11.1% in women. The prevalence of unemployment was 9.3% in men and 31.7% in women. After adjusting for age, socio-economic status, lifestyle habits, and comorbidities, the PR (95% CI) for the unemployment of the LBP group was 1.32 (1.19–1.47) in men and 1.01 (0.96–1.07) in women, compared with the respective non-LBP group. The PAF (95% CI) of unemployment associated with self-reported LBP was 2.8% (1.6%, 4.2%) in men. Because the total population of Japanese men aged 20–64 in 2013 was 36,851 thousand, this corresponds to an estimated 1037 thousand Japanese men of working age whose unemployment was LBP-related.
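Levin's formula can be checked directly with the figures quoted above for men; small differences from the published count of 1037 thousand come from rounding of the inputs.

```python
# Levin's formula: PAF = p_e * (RR - 1) / (1 + p_e * (RR - 1))
p_e = 0.090                 # prevalence of self-reported LBP among men
rr = 1.32                   # adjusted prevalence ratio for unemployment (PR used as RR)
paf = p_e * (rr - 1) / (1 + p_e * (rr - 1))
print(f"PAF = {paf:.1%}")                     # ~2.8%, matching the reported value
print(f"{paf * 36851:.0f} thousand men")      # ~1032 thousand here; 1037 thousand reported from unrounded inputs
```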
APA, Harvard, Vancouver, ISO, and other styles
38

Zhan, Yiqiang, Jie Zhuang, Ying Dong, Hong Xu, Dayi Hu, and Jinming Yu. "Predicting the prevalence of peripheral arterial diseases: modelling and validation in different cohorts." Vasa 45, no. 1 (February 2016): 31–36. http://dx.doi.org/10.1024/0301-1526/a000492.

Full text
Abstract:
Abstract. Background: To develop models for prevalence estimation of peripheral arterial disease (PAD) and to validate them in an external cohort. Methods: The model training cohort was a population-based cross-sectional survey. Age, sex, smoking status, body mass index, total cholesterol (TC), high density lipoprotein (HDL), TC/HDL ratio, low density lipoprotein, fasting glucose, diabetes, hypertension, pulse pressure, and stroke history were considered candidate predicting variables. An ankle brachial index ≤ 0.9 was defined as the presence of peripheral arterial disease. The logistic regression method was used to build the prediction models. The likelihood ratio test was applied to select predicting variables. The bootstrap method was used for model internal validation. Model performance was validated in an external cohort. Results: The final models included age, sex, pulse pressure, TC/HDL ratio, smoking status, diabetes, and stroke history. The area under the receiver operating characteristic curve (AUC) with 95% confidence interval (CI) of the final model from the training cohort was 0.74 (0.70, 0.77). Model validation in another cohort revealed an AUC (95% CI) of 0.72 (0.70, 0.73). The P value of the Hosmer-Lemeshow goodness-of-fit test was 0.75, indicating good model calibration. Conclusions: The developed model yielded moderate usefulness for predicting the prevalence of PAD in the general population.
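A minimal sketch of the model-building and discrimination check described above, using synthetic data in place of the survey variables; the predictor effects, sample size and train/test split are assumptions, and the bootstrap validation and calibration testing are omitted.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# hypothetical predictors standing in for age, sex, pulse pressure, TC/HDL, smoking, diabetes, stroke
X = rng.normal(size=(n, 7))
logit = -2.5 + X @ np.array([0.8, 0.3, 0.5, 0.4, 0.4, 0.6, 0.5])
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))          # simulated PAD status

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])   # discrimination on held-out data
print(f"AUC = {auc:.2f}")
```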
APA, Harvard, Vancouver, ISO, and other styles
39

Orem, Caitlin A., and Jon D. Pelletier. "Constraining frequency–magnitude–area relationships for rainfall and flood discharges using radar-derived precipitation estimates: example applications in the Upper and Lower Colorado River basins, USA." Hydrology and Earth System Sciences 20, no. 11 (November 8, 2016): 4483–501. http://dx.doi.org/10.5194/hess-20-4483-2016.

Full text
Abstract:
Abstract. Flood-envelope curves (FECs) are useful for constraining the upper limit of possible flood discharges within drainage basins in a particular hydroclimatic region. Their usefulness, however, is limited by their lack of a well-defined recurrence interval. In this study we use radar-derived precipitation estimates to develop an alternative to the FEC method, i.e., the frequency–magnitude–area-curve (FMAC) method that incorporates recurrence intervals. The FMAC method is demonstrated in two well-studied US drainage basins, i.e., the Upper and Lower Colorado River basins (UCRB and LCRB, respectively), using Stage III Next-Generation-Radar (NEXRAD) gridded products and the diffusion-wave flow-routing algorithm. The FMAC method can be applied worldwide using any radar-derived precipitation estimates. In the FMAC method, idealized basins of similar contributing area are grouped together for frequency–magnitude analysis of precipitation intensity. These data are then routed through the idealized drainage basins of different contributing areas, using contributing-area-specific estimates for channel slope and channel width. Our results show that FMACs of precipitation discharge are power-law functions of contributing area with an average exponent of 0.82 ± 0.06 for recurrence intervals from 10 to 500 years. We compare our FMACs to published FECs and find that for wet antecedent-moisture conditions, the 500-year FMAC of flood discharge in the UCRB is on par with the US FEC for contributing areas of ∼10² to 10³ km². FMACs of flood discharge for the LCRB exceed the published FEC for the LCRB for contributing areas in the range of ∼10³ to 10⁴ km². The FMAC method retains the power of the FEC method for constraining flood hazards in basins that are ungauged or have short flood records, yet it has the added advantage that it includes recurrence-interval information necessary for estimating event probabilities.
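The reported power-law behaviour can be recovered from area–discharge pairs with an ordinary log-log fit; the numbers below are illustrative, not the study's data.

```python
import numpy as np

# Hypothetical (area, discharge) pairs for one recurrence interval; the paper reports
# that such curves are close to power laws Q = c * A**b with b ≈ 0.82 in the UCRB.
A = np.array([1e2, 3e2, 1e3, 3e3, 1e4])        # contributing area, km^2
Q = np.array([90., 220., 560., 1350., 3300.])  # flood discharge, m^3/s (illustrative)
b, log_c = np.polyfit(np.log10(A), np.log10(Q), 1)   # slope = exponent, intercept = log10(c)
print(f"exponent b = {b:.2f}, coefficient c = {10**log_c:.1f}")
```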
APA, Harvard, Vancouver, ISO, and other styles
40

Orem, C. A., and J. D. Pelletier. "Constraining frequency-magnitude-area relationships for precipitation and flood discharges using radar-derived precipitation estimates: example applications in the Upper and Lower Colorado River Basins, USA." Hydrology and Earth System Sciences Discussions 12, no. 11 (November 10, 2015): 11739–82. http://dx.doi.org/10.5194/hessd-12-11739-2015.

Full text
Abstract:
Abstract. Flood-envelope curves (FEC) are useful for constraining the upper limit of possible flood discharges within drainage basins in a particular hydroclimatic region. Their usefulness, however, is limited by their lack of a well-defined recurrence interval. In this study we use radar-derived precipitation estimates to develop an alternative to the FEC method, i.e. the frequency-magnitude-area-curve (FMAC) method, that incorporates recurrence intervals. The FMAC method is demonstrated in two well-studied U.S. drainage basins, i.e. the Upper and Lower Colorado River basins (UCRB and LCRB, respectively), using Stage III Next-Generation-Radar (NEXRAD) gridded products and the diffusion-wave flow-routing algorithm. The FMAC method can be applied worldwide using any radar-derived precipitation estimates. In the FMAC method, idealized basins of similar contributing area are grouped together for frequency-magnitude analysis of precipitation intensity. These data are then routed through the idealized drainage basins of different contributing areas, using contributing-area-specific estimates for channel slope and channel width. Our results show that FMACs of precipitation discharge are power-law functions of contributing area with an average exponent of 0.79 ± 0.07 for recurrence intervals from 10 to 500 years. We compare our FMACs to published FECs and find that for wet antecedent-moisture conditions, the 500-year FMAC of flood discharge in the UCRB is on par with the US FEC for contributing areas of ~10² to 10³ km². FMACs of flood discharge for the LCRB exceed the published FEC for the LCRB for contributing areas in the range of ~10² to 10⁴ km². The FMAC method retains the power of the FEC method for constraining flood hazards in basins that are ungauged or have short flood records, yet it has the added advantage that it includes recurrence interval information necessary for estimating event probabilities.
APA, Harvard, Vancouver, ISO, and other styles
41

Rajoriya, Deepika, and Diwakar Shukla. "Under military war weapon support the economic bond level estimation using generalized Petersen graph with imputation." Statistics in Transition new series 24, no. 1 (February 24, 2023): 295–320. http://dx.doi.org/10.59170/stattrans-2023-016.

Full text
Abstract:
Several countries of the world are involved in the mutual and collaborative business of military equipment and weapons, in terms of their production, sales, technical maintenance, training and services. As a consequence, the manufacturing of bombs, rockets, missiles and other ammunition has taken a structured and smooth shape to help other countries where and when needed. Often the military support among countries remains open to the media, but sometimes it remains secret due to national security concerns and international political pressure. Such a phenomenon (hidden or open support) is part of the military supply chain and can be modelled as a Petersen graph, with countries as vertices and economic bonds as edges. For a large graphical structure it is difficult, without sampling, to find the average economic bonding (open and secret) between any pair of countries involved in the military business or support. This paper presents a sample-based estimation methodology for estimating the mean economic bond value among countries involved in military support or business. Motivation for the problem is derived from the current Russia-Ukraine war situation and the kind of hidden support to the war provided by NATO countries. A node sampling procedure is proposed whose bias, mean squared error and other properties are derived. Results are supported by empirical studies. Findings are compared with particular cases, and confidence intervals are used as the basic tool of comparison. Pattern imputation is used together with a new proposal, the CI-imputation method, which proves useful for filling in missing values, especially when secret economic support data from the countries involved are missing. The ongoing war between Ukraine and Russia, with secret weapon and economic support from NATO countries, is an application of the methodology proposed in this paper. Key words: Graph, Petersen Graph, Estimator, Bias, Mean Squared Error (MSE), Optimum Choice, Confidence Intervals (CI), Nodes (vertices), Pattern Imputation, CI-Imputation (LL-imputation and UL-imputation), Economic Bonds, Military War, Weapon Support
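A generic illustration, not the paper's estimator: build a generalized Petersen graph GP(n, k), attach hypothetical economic-bond values to its edges, and estimate the mean bond value from the edges incident to a simple random sample of nodes. The bias and variance corrections and the imputation steps derived in the paper are not included.

```python
import numpy as np

def generalized_petersen_edges(n, k):
    """Edge list of GP(n, k): outer cycle 0..n-1, inner vertices n..2n-1,
    spokes i -- n+i, and inner edges n+i -- n+((i+k) % n)."""
    edges = []
    for i in range(n):
        edges.append((i, (i + 1) % n))          # outer cycle
        edges.append((i, n + i))                # spoke
        edges.append((n + i, n + (i + k) % n))  # inner 'star'
    return edges

rng = np.random.default_rng(1)
n, k, m = 10, 2, 6                                    # GP(10, 2) and a sample of 6 nodes (countries)
edges = generalized_petersen_edges(n, k)
weight = {e: rng.gamma(2.0, 50.0) for e in edges}     # hypothetical economic-bond values
true_mean = np.mean(list(weight.values()))

sample = set(rng.choice(2 * n, size=m, replace=False).tolist())   # simple random sample of nodes
incident = [w for e, w in weight.items() if e[0] in sample or e[1] in sample]
print(f"true mean bond: {true_mean:.1f}, node-sample estimate: {np.mean(incident):.1f}")
```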
APA, Harvard, Vancouver, ISO, and other styles
42

Jusko, Karen Long, and W. Phillips Shively. "Applying a Two-Step Strategy to the Analysis of Cross-National Public Opinion Data." Political Analysis 13, no. 4 (2005): 327–44. http://dx.doi.org/10.1093/pan/mpi030.

Full text
Abstract:
In recent years, large sets of national surveys with shared content have increasingly been used for cross-national opinion research. But scholars have not yet settled on the most flexible and efficient models for utilizing such data. We present a two-step strategy for such analysis that takes advantage of the fact that in such datasets each “cluster” (i.e., country sample) is large enough to sustain separate analysis of its internal variances and covariances. We illustrate the method by examining a puzzle of comparative electoral behavior—why does turnout decline rather than increase with the number of parties competing in an election (Blais and Dobrzynska 1998, for example)? This discussion demonstrates the ease with which a two-step strategy incorporates confounding variables operating at different levels of analysis. Technical appendices demonstrate that the two-step strategy does not lose efficiency of estimation as compared with a pooling strategy.
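A bare-bones version of the two-step logic, assuming per-country arrays of individual-level data and one macro covariate (e.g. the number of parties); proper implementations weight the second step by the sampling variances of the first-step estimates, which is omitted here for brevity.

```python
import numpy as np

def two_step(countries, y_by_c, x_by_c, z_by_c):
    """Step 1: estimate an individual-level slope separately within each country sample.
    Step 2: regress those country-level estimates on a macro covariate.
    y_by_c, x_by_c: dicts country -> individual-level arrays; z_by_c: dict country -> macro value."""
    slopes, zs = [], []
    for c in countries:
        X = np.column_stack([np.ones(len(x_by_c[c])), x_by_c[c]])
        beta, *_ = np.linalg.lstsq(X, y_by_c[c], rcond=None)      # within-country OLS
        slopes.append(beta[1])
        zs.append(z_by_c[c])
    Z = np.column_stack([np.ones(len(zs)), zs])
    gamma, *_ = np.linalg.lstsq(Z, np.array(slopes), rcond=None)  # macro-level regression
    return gamma   # gamma[1]: how the individual-level effect shifts with the macro variable
```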
APA, Harvard, Vancouver, ISO, and other styles
43

Holst, Karen, Hervé Liebgott, Jens E. Wilhjelm, Svetoslav Nikolov, Søren T. Torp-Pedersen, Philippe Delachartre, and Jørgen A. Jensen. "Internal strain estimation for quantification of human heel pad elastic modulus: A phantom study." Ultrasonics 53, no. 2 (February 2013): 439–46. http://dx.doi.org/10.1016/j.ultras.2012.08.009.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Wang, Lei, Yi Nong Li, Feng Zhang, and Qing Zhong Ding. "Experimental Study on Active Vibration Control of a Gear Pair System Based on FxLMS Algorithm." Advanced Materials Research 562-564 (August 2012): 532–35. http://dx.doi.org/10.4028/www.scientific.net/amr.562-564.532.

Full text
Abstract:
To treat gear pair vibration due to internal excitation, an active internal gearbox structure near the gear pair is developed, based on an active shaft transverse vibration control concept, in an effort to tackle the gear excitation problem more directly. A controller based on the FxLMS algorithm combined with a frequency estimation technique is designed and evaluated on the experimental setup of the active gearbox vibration control system, in contrast to a fuzzy-PD controller. The experimental results show that a 9.94 dB attenuation of the gearbox housing vibration response can be achieved at the first gear mesh frequency by the FxLMS controller, but only a 5.62 dB attenuation by the fuzzy-PD controller. In addition, the insufficient effect at 210.8 Hz and 216.2 Hz is caused by typical out-of-band overshoot.
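For reference, a minimal single-channel filtered-x LMS loop of the kind such controllers are built on; the filter length, step size and simulated secondary path are assumptions, and the frequency-estimation front end used in the experiment is not shown.

```python
import numpy as np

def fxlms(x, d, s, s_hat, L=32, mu=5e-4):
    """Minimal single-channel filtered-x LMS loop.
    x: reference (e.g. a gear-mesh-frequency tone), d: primary vibration at the sensor,
    s: secondary-path impulse response (used here only to simulate the plant),
    s_hat: its estimate, used to filter the reference."""
    w = np.zeros(L)                          # adaptive controller taps
    xbuf = np.zeros(L)                       # reference history
    fxbuf = np.zeros(L)                      # filtered-reference history
    ybuf = np.zeros(len(s))                  # controller-output history for the secondary path
    xf = np.convolve(x, s_hat)[:len(x)]      # filtered-x signal
    e = np.zeros(len(x))
    for n in range(len(x)):
        xbuf = np.r_[x[n], xbuf[:-1]]
        fxbuf = np.r_[xf[n], fxbuf[:-1]]
        y = w @ xbuf                         # anti-vibration control output
        ybuf = np.r_[y, ybuf[:-1]]
        e[n] = d[n] + s @ ybuf               # residual measured at the gearbox housing
        w -= mu * e[n] * fxbuf               # LMS update with the filtered reference
    return e, w
```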
APA, Harvard, Vancouver, ISO, and other styles
45

Waples, R. S. "Estimation of allele frequencies at isoloci." Genetics 118, no. 2 (February 1, 1988): 371–84. http://dx.doi.org/10.1093/genetics/118.2.371.

Full text
Abstract:
In some polyploid animals and plants, pairs of duplicated loci occur that share alleles encoding proteins with identical electrophoretic mobilities. Except in cases where these "isoloci" are known to be inherited tetrasomically, individual genotypes cannot be determined unambiguously, and there is no direct way to assign observed variation to a particular locus of the pair. For a pair of diallelic isoloci, nine genotypes are possible but only five phenotypes can be identified, corresponding to individuals with 0–4 doses of the variant allele. A maximum likelihood (ML) approach is used here to identify the set of allele frequencies (p, q) at the individual gene loci with the highest probability of producing the observed phenotypic distribution. A likelihood ratio test is used to generate the asymmetrical confidence intervals around ML estimates. Simulations indicate that the standard error of p is typically about twice the binomial sampling error associated with single-locus allele frequency estimates. ML estimates can be used in standard indices of genetic diversity and differentiation and in goodness-of-fit tests of genetic hypotheses. The noncentral χ² distribution is used to evaluate the power of a test of apparent heterozygote deficiency that results from attributing all variation to one locus when both loci are polymorphic.
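A compact numerical version of the ML idea, assuming Hardy-Weinberg proportions at each locus so that the dose-0 to dose-4 phenotype probabilities are the convolution of two binomials. The phenotype counts are hypothetical, p and q are interchangeable (label switching), and the likelihood-ratio confidence intervals described in the paper are not computed here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

def dose_probs(p, q):
    """Probability of carrying 0..4 doses of the variant allele across two diallelic isoloci."""
    a = binom.pmf(np.arange(3), 2, p)      # doses 0,1,2 contributed by locus 1
    b = binom.pmf(np.arange(3), 2, q)      # doses 0,1,2 contributed by locus 2
    return np.convolve(a, b)               # total doses 0..4

def neg_loglik(theta, counts):
    p, q = theta
    return -np.sum(counts * np.log(dose_probs(p, q) + 1e-12))

counts = np.array([52, 31, 12, 4, 1])      # hypothetical phenotype counts for doses 0..4
res = minimize(neg_loglik, x0=[0.10, 0.05], args=(counts,),
               bounds=[(1e-6, 1 - 1e-6)] * 2)
p_hat, q_hat = res.x
print(f"p = {p_hat:.3f}, q = {q_hat:.3f}")
```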
APA, Harvard, Vancouver, ISO, and other styles
46

Das, Ujjwal, and Ranojoy Basu. "Approximate confidence intervals for the difference in proportions for partially observed binary data." Statistical Methods in Medical Research 31, no. 3 (November 29, 2021): 488–509. http://dx.doi.org/10.1177/09622802211060528.

Full text
Abstract:
We consider partially observed binary matched-pair data. We assume that the incomplete subjects are missing at random. Within this missing framework, we propose an EM-algorithm based approach to construct an interval estimator of the proportion difference incorporating all the subjects. In conjunction with our proposed method, we also present two improvements to the interval estimator through some correction factors. The performances of the three competing methods are then evaluated through extensive simulation. Recommendation for the method is given based on the ability to preserve type-I error for various sample sizes. Finally, the methods are illustrated in two real-world data sets. An R-function is developed to implement the three proposed methods.
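A sketch of one EM iteration scheme for incomplete matched pairs under the missing-at-random assumption; the data layout and function name are assumptions, and the paper's interval estimators and correction factors are not reproduced (a bootstrap over pairs would be one simple way to turn this point estimate into an interval).

```python
import numpy as np

def em_matched_pairs(n, rx, ry, tol=1e-10, max_iter=500):
    """n = [n11, n10, n01, n00]: fully observed pairs (x, y);
    rx = (r1, r0): pairs with only x observed (x=1, x=0);
    ry = (c1, c0): pairs with only y observed (y=1, y=0).
    Returns cell probabilities and the difference of marginal proportions."""
    p = np.full(4, 0.25)                       # p11, p10, p01, p00
    n = np.asarray(n, float)
    r1, r0 = rx
    c1, c0 = ry
    N = n.sum() + r1 + r0 + c1 + c0
    for _ in range(max_iter):
        p11, p10, p01, p00 = p
        e = n.copy()                           # E-step: allocate incomplete pairs to cells
        e[0] += r1 * p11 / (p11 + p10) + c1 * p11 / (p11 + p01)
        e[1] += r1 * p10 / (p11 + p10) + c0 * p10 / (p10 + p00)
        e[2] += r0 * p01 / (p01 + p00) + c1 * p01 / (p11 + p01)
        e[3] += r0 * p00 / (p01 + p00) + c0 * p00 / (p10 + p00)
        p_new = e / N                          # M-step
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    delta = p[1] - p[2]                        # P(x=1) - P(y=1) = p10 - p01
    return p, delta
```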
APA, Harvard, Vancouver, ISO, and other styles
47

Nam, Jun-Mo. "Efficient interval estimation of a ratio of marginal probabilities in matched-pair data: Non-iterative method." Statistics in Medicine 28, no. 23 (October 15, 2009): 2929–35. http://dx.doi.org/10.1002/sim.3685.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Latella, Claudia, Silvio Traversaro, Diego Ferigo, Yeshasvi Tirupachuri, Lorenzo Rapetti, Francisco Javier Andrade Chavez, Francesco Nori, and Daniele Pucci. "Simultaneous Floating-Base Estimation of Human Kinematics and Joint Torques." Sensors 19, no. 12 (June 21, 2019): 2794. http://dx.doi.org/10.3390/s19122794.

Full text
Abstract:
The paper presents a stochastic methodology for the simultaneous floating-base estimation of the human whole-body kinematics and dynamics (i.e., joint torques, internal and external forces). The paper builds upon our former work, where a fixed-base formulation had been developed for the human estimation problem. The presented approach is validated by presenting experimental results of a healthy subject equipped with a wearable motion tracking system and a pair of shoes sensorized with force/torque sensors while performing different motion tasks, e.g., walking on a treadmill. The results show that joint torque estimates obtained by using floating-base and fixed-base approaches match satisfactorily, thus validating the present approach.
APA, Harvard, Vancouver, ISO, and other styles
49

DESPOTIS, DIMITRIS K., and DIMITRIS DERPANIS. "A MIN–MAX GOAL PROGRAMMING APPROACH TO PRIORITY DERIVATION IN AHP WITH INTERVAL JUDGEMENTS." International Journal of Information Technology & Decision Making 07, no. 01 (March 2008): 175–82. http://dx.doi.org/10.1142/s0219622008002867.

Full text
Abstract:
We deal with the problem of priority elicitation in the analytic hierarchy process (AHP) on the basis of imprecise pair-wise comparison judgements on decision elements. We propose a min–max goal programming formulation to derive the AHP priorities in the case that the decision maker provides preference judgements in the form of interval numbers. By applying variable transformations, we formulate a linear programming model that is capable of estimating the priorities from both consistent and inconsistent interval judgements. The proposed method is illustrated by numerical examples.
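One common way to write such a min-max problem as a linear programme is to work with logged weights and a single deviation variable; the formulation below is a sketch in that spirit, not necessarily the exact model of the paper, and the example judgement intervals are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def minmax_priorities(L_mat, U_mat):
    """Priorities from interval pair-wise judgements l_ij <= w_i/w_j <= u_ij.
    With x_i = ln w_i, each ratio constraint becomes x_i - x_j in [ln l_ij, ln u_ij]
    up to a deviation z, and the maximum deviation z is minimised (x_0 fixed to 0)."""
    n = L_mat.shape[0]
    nv = n + 1                                          # x_0..x_{n-1} and z
    A, b = [], []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            row = np.zeros(nv); row[i], row[j], row[-1] = 1.0, -1.0, -1.0
            A.append(row);  b.append(np.log(U_mat[i, j]))       # x_i - x_j - z <= ln u_ij
            row2 = np.zeros(nv); row2[i], row2[j], row2[-1] = -1.0, 1.0, -1.0
            A.append(row2); b.append(-np.log(L_mat[i, j]))      # -(x_i - x_j) - z <= -ln l_ij
    c = np.zeros(nv); c[-1] = 1.0                               # minimise z
    bounds = [(0, 0)] + [(None, None)] * (n - 1) + [(0, None)]  # fix x_0 = 0, z >= 0
    res = linprog(c, A_ub=np.array(A), b_ub=np.array(b), bounds=bounds)
    w = np.exp(res.x[:n])
    return w / w.sum(), res.x[-1]

# Hypothetical reciprocal interval judgements for three criteria.
L_j = np.array([[1.0, 2.0, 3.0], [1/3, 1.0, 1.0], [1/5, 1/2, 1.0]])
U_j = np.array([[1.0, 3.0, 5.0], [1/2, 1.0, 2.0], [1/3, 1.0, 1.0]])
w, z = minmax_priorities(L_j, U_j)
print(w, z)
```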
APA, Harvard, Vancouver, ISO, and other styles
50

Yan-Wei, Shi, Liu Xiao-Shan, Wang Hai-Yang, and Zhang Run-Jie. "Effects of Malathion on the Insect Succession and the Development of Chrysomya megacephala (Diptera: Calliphoridae) in the Field and Implications for Estimating Postmortem Interval." American Journal of Forensic Medicine and Pathology 31, no. 1 (March 2010): 46–51. http://dx.doi.org/10.1097/paf.0b013e3181c215b4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
