Journal articles on the topic "Estimation de normales"

Follow this link to see other types of publications on the topic: Estimation de normales.

Cite a source in APA, MLA, Chicago, Harvard, and many other styles.

Consult the top 50 journal articles for your research on the topic "Estimation de normales".

Next to each source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication as a .pdf and read the abstract (summary) of the work online if it is present in the metadata.

Browse journal articles from many scientific fields and compile a correct bibliography.

1

Bolduc, Denis, and Mustapha Kaci. "Estimation des modèles probit polytomiques : un survol des techniques". Articles 69, no. 3 (23 March 2009): 161–91. http://dx.doi.org/10.7202/602113ar.

Abstract:
Because it allows very general interdependence structures among alternatives, the multinomial probit (MNP) model provides one of the most attractive forms for modeling discrete choices that arise from random utility maximization. The major and well-known obstacle to estimating this type of model is the computational complexity involved when the number of alternatives considered is large. This situation is essentially due to the presence of multidimensional normal integrals that define the selection probabilities. Over the last two decades, considerable effort has been devoted to producing methods that circumvent the computational difficulties associated with estimating multinomial probit models. The objective of this paper is to provide a critical survey of the main methods put forward to date to make the MNP framework operational. We hope it will enlighten practitioners of these models as to the choice of estimation technique to favor in the coming years.
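
The computational bottleneck described in this abstract is easy to see in miniature: each MNP choice probability is a multivariate normal integral with no closed form once more than a few alternatives are involved, and simulation is the usual workaround. A minimal illustration with hypothetical utilities and covariance, by crude Monte Carlo rather than any specific estimator from the survey:

```python
import numpy as np

# Probability that a trivariate normal with covariance Sigma falls in the
# region where alternative 0 has the highest utility: u0 > u1 and u0 > u2.
# Crude Monte Carlo; the simulators surveyed in this literature (e.g.
# GHK-style importance samplers) are far more efficient.
rng = np.random.default_rng(0)
mu = np.array([0.5, 0.0, -0.3])            # hypothetical mean utilities
Sigma = np.array([[1.0, 0.4, 0.2],
                  [0.4, 1.0, 0.3],
                  [0.2, 0.3, 1.0]])        # hypothetical error covariance

u = rng.multivariate_normal(mu, Sigma, size=200_000)
p0 = np.mean((u[:, 0] > u[:, 1]) & (u[:, 0] > u[:, 2]))
print(f"P(choose alternative 0) ~= {p0:.4f}")
```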
2

Camero Jiménez, Carlos W., Erick A. Chacón Montalvan, Vilma S. Romero Romero, and Luisa E. Quispe Ortiz. "Metodología para la estimación de índices de capacidad en procesos para datos no normales". Revista Cientifica TECNIA 24, no. 1 (6 February 2017): 43. http://dx.doi.org/10.21754/tecnia.v24i1.32.

Abstract:
Globalization has intensified competition in many markets. To remain competitive, companies seek to satisfy the needs of their customers by meeting market requirements. In this context, Process Capability Indices (PCI) play a crucial role in assessing the quality of processes. In the case of non-normal data there are two general approaches, based on transformations (Box-Cox and Johnson transformations) and on percentiles (Pearson's and Burr's distribution systems). However, previous studies comparing these methods reach different conclusions, so the differences between them need to be clarified in order to implement a proper estimation of these indices. In this paper, a simulation study is carried out to compare the above methods and to propose an appropriate methodology for estimating the PCI for non-normal data. It is concluded that the best method depends on the type of distribution, its level of asymmetry, and the PCI value. Keywords: Approximation to frequency distributions, Process capability indices, Normality, Data transformations, Simulation.
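
As a rough sketch of the two families of methods compared in this abstract, with hypothetical specification limits and the empirical quantiles standing in for a fitted Pearson/Burr system:

```python
import numpy as np
from scipy import stats
from scipy.special import boxcox

rng = np.random.default_rng(1)
data = rng.lognormal(mean=0.0, sigma=0.4, size=2000)   # skewed process data
LSL, USL = 0.3, 3.5                                    # hypothetical spec limits

# Transformation approach: Box-Cox the data and the spec limits, then
# apply the usual normal-theory Cpk formula on the transformed scale.
z, lam = stats.boxcox(data)
lsl_t, usl_t = boxcox(LSL, lam), boxcox(USL, lam)
m, s = z.mean(), z.std(ddof=1)
cpk_boxcox = min(usl_t - m, m - lsl_t) / (3 * s)

# Percentile approach (Clements-style): replace mean +/- 3 sigma with the
# median and the 0.135% / 99.865% quantiles (empirical here, fitted
# distribution systems in the paper).
p_lo, med, p_hi = np.percentile(data, [0.135, 50.0, 99.865])
cpk_pct = min((USL - med) / (p_hi - med), (med - LSL) / (med - p_lo))

print(f"Cpk (Box-Cox): {cpk_boxcox:.2f}   Cpk (percentile): {cpk_pct:.2f}")
```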
3

Notton, Gilles, Ionut Caluianu, Iolanda Colda, and Sorin Caluianu. "Influence d’un ombrage partiel sur la production électrique d’un module photovoltaïque en silicium monocristallin". Journal of Renewable Energies 13, no. 1 (25 October 2023): 49–62. http://dx.doi.org/10.54966/jreen.v13i1.177.

Abstract:
The development of the photovoltaic market requires precise knowledge of the electricity production of these systems at various sites, in particular to estimate their economic profitability. Such an accurate estimate can only be made by taking into account shading effects, which have dramatic consequences on the delivered electrical power. In this article, we test a double-diode I-V curve model on a monocrystalline silicon photovoltaic module (BP585F) under normal illumination conditions; then, after a brief explanation of the electrical behavior of a totally or partially shaded cell, we present the experimental setup. The tests carried out validated this model of PV module behavior under partial illumination. Finally, the power losses induced by such shading could be estimated.
4

García, Jhon Jario. "Los estudios de acontecimiento y la importancia de la metodología de estimación". Lecturas de Economía, no. 70 (11 September 2009): 223–35. http://dx.doi.org/10.17533/udea.le.n70a2262.

Abstract:
This note reviews the literature on event studies. Its main focus is the importance of the methodology used for estimating long-run abnormal returns, since over this horizon event studies are sensitive to the return-generating process (Savickas, 2003; Aktas et al., 2007). We find that, in the long run, the impact of unrelated events must be controlled for when estimating the window for normal returns, and that the best estimation methodology, in terms of robustness, is the two-state market model. Keywords: event studies, abnormal returns. JEL classification: G14, G34, G38.
5

Wu, Zhaohao, Deyun Zhong, Zhaopeng Li, Liguan Wang, and Lin Bi. "Orebody Modeling Method Based on the Normal Estimation of Cross-Contour Polylines". Mathematics 10, no. 3 (1 February 2022): 473. http://dx.doi.org/10.3390/math10030473.

Abstract:
The normal estimation of cross-contour polylines largely determines the implicit orebody modeling result. However, traditional methods cannot estimate normals effectively due to the complex topological adjacency relationship of the cross-contour polylines manually interpreted in the process of exploration and production. In this work, we present an orebody implicit modeling method based on the normal estimation of cross-contour polylines. The improved method consists of three stages: (1) estimating the normals of cross-contour polylines by using the least square plane fitting method based on principal component analysis; (2) reorienting the normal directions by using the method based on the normal propagation; (3) using an implicit function to construct an orebody model. The innovation of this method is that it can automatically estimate the normals of the cross-contour polylines and reorient normal directions without manual intervention. Experimental results show that the proposed method has the advantages of a small amount of calculation, high efficiency and strong reliability. Moreover, this normal estimation method is useful to improve the automation of implicit orebody modeling.
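
The first stage named in this abstract, least-squares plane fitting via principal component analysis, reduces to an eigendecomposition of the local covariance matrix. A minimal sketch with a hypothetical neighborhood:

```python
import numpy as np

def pca_normal(points: np.ndarray) -> np.ndarray:
    """Normal of a local point set: eigenvector of the covariance matrix
    with the smallest eigenvalue (least-squares plane fit via PCA)."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, 0]                     # direction of least variance

# Hypothetical neighborhood sampled from the plane z = 0 with slight noise.
rng = np.random.default_rng(2)
pts = np.column_stack([rng.uniform(-1, 1, 50),
                       rng.uniform(-1, 1, 50),
                       rng.normal(0, 0.01, 50)])
print(pca_normal(pts))   # ~ (0, 0, +/-1)
```

The sign ambiguity that PCA leaves (the printed normal may point either way) is exactly what the paper's second, normal-propagation stage resolves.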
6

Mitra, Niloy J., An Nguyen, and Leonidas Guibas. "Estimating Surface Normals in Noisy Point Cloud Data". International Journal of Computational Geometry & Applications 14, no. 04n05 (October 2004): 261–76. http://dx.doi.org/10.1142/s0218195904001470.

Abstract:
In this paper we describe and analyze a method based on local least square fitting for estimating the normals at all sample points of a point cloud data (PCD) set, in the presence of noise. We study the effects of neighborhood size, curvature, sampling density, and noise on the normal estimation when the PCD is sampled from a smooth curve in ℝ² or a smooth surface in ℝ³, and noise is added. The analysis allows us to find the optimal neighborhood size using other local information from the PCD. Experimental results are also provided.
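
A small experiment in the spirit of the paper's analysis, assuming a synthetic noisy circle so that ground-truth normals are known, exposes the neighborhood-size trade-off the authors study: too few neighbors amplify noise, too many introduce curvature bias.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
t = rng.uniform(0, 2 * np.pi, 500)
pts = np.column_stack([np.cos(t), np.sin(t)]) + rng.normal(0, 0.02, (500, 2))
true_n = np.column_stack([np.cos(t), np.sin(t)])      # outward unit normals

tree = cKDTree(pts)
for k in (5, 15, 60, 200):
    _, idx = tree.query(pts, k=k)
    errs = []
    for i in range(len(pts)):
        nb = pts[idx[i]] - pts[idx[i]].mean(axis=0)
        # smallest-eigenvalue eigenvector of the 2x2 covariance = normal
        n = np.linalg.eigh(nb.T @ nb)[1][:, 0]
        errs.append(np.degrees(np.arccos(min(1.0, abs(n @ true_n[i])))))
    print(f"k={k:>3}: mean angular error {np.mean(errs):5.2f} deg")
```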
7

Sauvant, D., P. Chapoutot, and H. Archimede. "La digestion des amidons par les ruminants et ses conséquences". INRAE Productions Animales 7, no. 2 (24 April 1994): 115–24. http://dx.doi.org/10.20870/productions-animales.1994.7.2.4161.

Abstract:
It is now established that ruminal starch digestion varies widely with the nature of the feed and its technological processing. The in sacco measurement of the theoretical degradation of starch (X, %) makes it possible to predict the pre-intestinal digestibility (Y, %) of this constituent: Y = 0.483X + 45.62. Differences in starch digestion parameters can affect microbial protein synthesis, the fermentation profile, and the stability of the ruminal ecosystem. The digestibility of starch entering the small intestine is only partial and seems to reveal a limit of the amylolytic capacity of this segment. Starch not digested in the small intestine is partly degraded by the microorganisms of the large intestine. The partial digestive balance results published to date make it possible to estimate the amounts of volatile fatty acids and glucose potentially made available to the animal by the starch of a diet. When the ration leads to normal values of the ruminal fermentation profile and of milk fat content, there does not seem to be any important effect of the rate of starch digestion or of its distribution among the different digestive segments. A distinction appears, however, for diets richer in concentrates, which lead to marked propionic fermentation and a low milk fat content. In this case, choosing slowly degradable starch seems to provide a certain "fermentative safety" in the rumen and limits the drop in milk fat content. In general, further research is still needed to better understand the nutritional and zootechnical consequences of the rate of degradation of dietary starch in the reticulo-rumen and to develop reliable recommendations.
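
As a worked example of the quoted prediction equation (the input value is hypothetical): for a feed with an in sacco theoretical starch degradation of X = 80%, the predicted pre-intestinal digestibility is

```latex
Y = 0.483 \times 80 + 45.62 \approx 84.3\ \%
```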
8

Shi, Tiandong, Deyun Zhong, and Liguan Wang. "Geological Modeling Method Based on the Normal Dynamic Estimation of Sparse Point Clouds". Mathematics 9, no. 15 (31 July 2021): 1819. http://dx.doi.org/10.3390/math9151819.

Abstract:
The effect of geological modeling largely depends on the normal estimation results of geological sampling points. However, due to the sparse and uneven characteristics of geological sampling points, the results of normal estimation have great uncertainty. This paper proposes a geological modeling method based on the dynamic normal estimation of sparse point clouds. The improved method consists of three stages: (1) using an improved local plane fitting method to estimate the normals of the point clouds; (2) using an improved minimum spanning tree method to redirect the normals of the point clouds; (3) using an implicit function to construct a geological model. The innovation of this method is an iterative estimation of the point cloud normal. The geological engineer adjusts the normal direction of some point clouds according to the geological law, and then the method uses these correct point cloud normals as a reference to estimate the normals of all point clouds. By continuously repeating the iterative process, the normal estimation result will be more accurate. Experimental results show that compared with the original method, the improved method is more suitable for the normal estimation of sparse point clouds by adjusting normals, according to prior knowledge, dynamically.
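
The redirection stage in this abstract builds on minimum-spanning-tree normal propagation. Below is a simplified, fully automatic sketch of that baseline (the paper's contribution adds iterative, geologist-guided correction on top of it); the weight and traversal follow the classic Hoppe et al. scheme:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree, breadth_first_order
from scipy.spatial import cKDTree

def orient_normals(pts, normals, k=8):
    """Flip normals to a consistent orientation by propagating along a
    minimum spanning tree of the k-NN graph, with edge weight
    1 - |n_i . n_j| so that nearly parallel normals are linked first.
    Assumes a single connected component reachable from point 0."""
    n = len(pts)
    _, idx = cKDTree(pts).query(pts, k=k + 1)        # first neighbor is self
    rows = np.repeat(np.arange(n), k)
    cols = idx[:, 1:].ravel()
    w = 1.0 - np.abs(np.einsum('ij,ij->i', normals[rows], normals[cols])) + 1e-9
    mst = minimum_spanning_tree(csr_matrix((w, (rows, cols)), shape=(n, n)))
    graph = mst + mst.T                              # undirected traversal
    order, pred = breadth_first_order(graph, i_start=0, directed=False)
    for v in order[1:]:                              # flip against the parent
        if normals[v] @ normals[pred[v]] < 0:
            normals[v] = -normals[v]
    return normals
```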
9

Wu, Xianyu, Penghao Li, Xin Zhang, Jiangtao Chen, and Feng Huang. "Three Dimensional Shape Reconstruction via Polarization Imaging and Deep Learning". Sensors 23, no. 10 (9 May 2023): 4592. http://dx.doi.org/10.3390/s23104592.

Abstract:
Deep-learning-based polarization 3D imaging techniques, which train networks in a data-driven manner, are capable of estimating a target’s surface normal distribution under passive lighting conditions. However, existing methods have limitations in restoring target texture details and accurately estimating surface normals. Information loss can occur in the fine-textured areas of the target during the reconstruction process, which can result in inaccurate normal estimation and reduce the overall reconstruction accuracy. The proposed method enables extraction of more comprehensive information, mitigates the loss of texture information during object reconstruction, enhances the accuracy of surface normal estimation, and facilitates more comprehensive and precise reconstruction of objects. The proposed networks optimize the polarization representation input by utilizing the Stokes-vector-based parameter, in addition to separated specular and diffuse reflection components. This approach reduces the impact of background noise, extracts more relevant polarization features of the target, and provides more accurate cues for restoration of surface normals. Experiments are performed using both the DeepSfP dataset and newly collected data. The results show that the proposed model can provide more accurate surface normal estimates. Compared to the UNet architecture-based method, the mean angular error is reduced by 19%, calculation time is reduced by 62%, and the model size is reduced by 11%.
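
The Stokes-vector parameters used as network input here can be computed per pixel from intensities captured behind a linear polarizer at four orientations. A minimal sketch of that standard preprocessing step (not the paper's network):

```python
import numpy as np

def stokes_from_polarizer(i0, i45, i90, i135):
    """Linear Stokes parameters from intensities behind a linear polarizer
    at 0/45/90/135 degrees, plus the degree and angle of linear
    polarization often fed to polarization 3D imaging networks."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)    # total intensity
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
    aolp = 0.5 * np.arctan2(s2, s1)
    return s0, s1, s2, dolp, aolp
```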
10

Zandy, Moe, Vicky Chang, Deepa P. Rao, and Minh T. Do. "Exposition à la fumée du tabac et sommeil : estimation de l’association entre concentration de cotinine urinaire et qualité du sommeil". Promotion de la santé et prévention des maladies chroniques au Canada 40, no. 3 (March 2020): 77–89. http://dx.doi.org/10.24095/hpcdp.40.3.02f.

Abstract:
Introduction: Most studies on tobacco smoke exposure and sleep quality rely on self-reported smoking, which can lead to exposure misclassification and self-reporting bias. The objective of this study was to examine the associations between urinary cotinine concentration, a biomarker of tobacco smoke exposure, and various measures of sleep quality: sleep duration, sleep continuity or efficiency, sleep satisfaction, and alertness during normal waking hours. Methodology: Using data from a national sample of 10,806 adults (aged 18 to 79) from the Canadian Health Measures Survey (2007-2013), we performed binary logistic regression analyses to estimate the associations between urinary cotinine concentration and sleep quality measures, adjusting for potential confounders. In addition, we performed ordinal logistic regression to assess the associations between urinary cotinine concentration and a higher number of sleep disorders. Results: Overall, 28.7% of respondents to this survey of Canadian adults had urinary cotinine concentrations above the detection limit, and the prevalence of each sleep disorder ranged from 5.5% to 35.6%. A high urinary cotinine concentration (4th quartile versus below the detection limit) was associated with significantly higher odds of sleeping too little or too much (odds ratio [OR] = 1.41; 95% CI: 1.02 to 1.95; p-trend = 0.021), having difficulty falling asleep or staying asleep (OR = 1.71; 95% CI: 1.28 to 2.27; p-trend = 0.003), being dissatisfied with sleep (OR = 1.87; 95% CI: 1.21 to 2.89; p-trend = 0.011), and having a higher number of sleep disorders (OR = 1.64; 95% CI: 1.19 to 2.26; p-trend = 0.001). The observed relationships were stronger among women than among men. Conclusion: Our study, which used a biomarker of tobacco smoke exposure, contributes to the literature on the effect of exposure to environmental toxicants on sleep quality by showing a relationship between tobacco smoke exposure and poor sleep. Longitudinal studies are needed to address the limitations inherent in the cross-sectional study design and to better assess the long-term effects of tobacco smoke exposure on sleep quality.
11

Zhao, Ruibin, Mingyong Pang, Caixia Liu, and Yanling Zhang. "Robust Normal Estimation for 3D LiDAR Point Clouds in Urban Environments". Sensors 19, no. 5 (12 March 2019): 1248. http://dx.doi.org/10.3390/s19051248.

Abstract:
Normal estimation is a crucial first step for numerous light detection and ranging (LiDAR) data-processing algorithms, from building reconstruction, road extraction, and ground-cover classification to scene rendering. For LiDAR point clouds in urban environments, this paper presents a robust method to estimate normals by constructing an octree-based hierarchical representation for the data and detecting a group of sufficiently large consistent neighborhoods at multiple scales. Consistent neighborhoods are mainly determined based on the observation that an urban environment typically consists of regular objects, e.g., buildings, roads, and the ground surface, and irregular objects, e.g., trees and shrubs; the surfaces of most regular objects can be approximately represented by a group of local planes. Even in the frequent presence of heavy noise and anisotropic point sampling in LiDAR data, our method is capable of estimating robust normals for many kinds of objects in urban environments, and the estimated normals help to segment and identify the objects more accurately, as well as preserving their sharp features and complete outlines. The proposed method was experimentally validated on both synthetic and real urban LiDAR datasets and compared to state-of-the-art methods.
12

Liu, Guangshuai, Xurui Li, Si Sun, and Wenyu Yi. "Robust and Fast Normal Mollification via Consistent Neighborhood Reconstruction for Unorganized Point Clouds". Sensors 23, no. 6 (20 March 2023): 3292. http://dx.doi.org/10.3390/s23063292.

Abstract:
This paper introduces a robust normal estimation method for point cloud data that can handle both smooth and sharp features. Our method is based on the inclusion of neighborhood recognition into the normal mollification process in the neighborhood of the current point: First, the point cloud surfaces are assigned normals via a normal estimator of robust location (NERL), which guarantees the reliability of the smooth region normals, and then a robust feature point recognition method is proposed to identify points around sharp features accurately. Furthermore, Gaussian maps and clustering are adopted for feature points to seek a rough isotropic neighborhood for the first-stage normal mollification. In order to further deal with non-uniform sampling or various complex scenes efficiently, the second-stage normal mollification based on residual is proposed. The proposed method was experimentally validated on synthetic and real-world datasets and compared to state-of-the-art methods.
13

Abu-Hani, J., T. Zhang, and S. Filin. "High Fidelity Edge Aware Normal Estimation for Low Resolution and Noisy Point Clouds of Heritage Sites". International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B2-2022 (30 May 2022): 737–43. http://dx.doi.org/10.5194/isprs-archives-xliii-b2-2022-737-2022.

Abstract:
Surface normals play a pivotal role in most shape analysis applications. As such, their accurate computation has received a considerable amount of attention over the years. In heritage sites, where entities of complex shape are prevalent, the development of reliable means to estimate them becomes even more important. When approaching the computation of normals, not only the shape complexity becomes a factor, but also the nature of the data, specifically the point density and noise level. To handle both shape and data quality-related aspects, this paper proposes a new method for the computation of surface normals, focusing on entities of complex form. We consider portable laser scanners as our data acquisition source, as such scanners offer efficient site coverage. Nonetheless, as portable scans are characterized by relatively sparse point clouds and noisy responses, their processing becomes a challenge. While existing research focused on the development of robust estimation methods, their application scope exhibited some limitations in preserving sharp features. Here, we demonstrate how the use of the L0 norm optimization framework, which features an inherent ability to preserve sharp transitions, with no need to introduce assumptions about the scene structure, can accommodate such a problem. Results show improved performance compared to common normal computation schemes with more reliable and accurate estimations of their form.
14

Livezey, Robert E., Konstantin Y. Vinnikov, Marina M. Timofeyeva, Richard Tinker, and Huug M. van den Dool. "Estimation and Extrapolation of Climate Normals and Climatic Trends". Journal of Applied Meteorology and Climatology 46, no. 11 (1 November 2007): 1759–76. http://dx.doi.org/10.1175/2007jamc1666.1.

Abstract:
WMO-recommended 30-yr normals are no longer generally useful for the design, planning, and decision-making purposes for which they were intended. They not only have little relevance to the future climate, but are often unrepresentative of the current climate. The reason for this is rapid global climate change over the last 30 yr that is likely to continue into the future. It is demonstrated that simple empirical alternatives already are available that not only produce reasonably accurate normals for the current climate but also often justify their extrapolation to several years into the future. This result is tied to the condition that recent trends in the climate are approximately linear or have a substantial linear component. This condition is generally satisfied for the U.S. climate-division data. One alternative [the optimal climate normal (OCN)] is multiyear averages that are not fixed at 30 yr like WMO normals are but rather are adapted climate record by climate record based on easily estimated characteristics of the records. The OCN works well except with very strong trends or longer extrapolations with more moderate trends. In these cases least squares linear trend fits to the period since the mid-1970s are viable alternatives. An even better alternative is the use of “hinge fit” normals, based on modeling the time dependence of large-scale climate change. Here, longer records can be exploited to stabilize estimates of modern trends. Related issues are the need to avoid arbitrary trend fitting and to account for trends in studies of ENSO impacts. Given these results, the authors recommend that (a) the WMO and national climate services address new policies for changing climate normals using the results here as a starting point and (b) NOAA initiate a program for improved estimates and forecasts of official U.S. normals, including operational implementation of a simple hybrid system that combines the advantages of both the OCN and the hinge fit.
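
A compact sketch of the two alternatives discussed, on synthetic data with a hypothetical 1975 change point: the OCN as a short trailing average, and a hinge fit that is flat before the change point and linear after it.

```python
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(1950, 2007)
temp = 10 + 0.03 * np.maximum(0, years - 1975) + rng.normal(0, 0.5, years.size)

# Optimal climate normal (OCN): average of the most recent k years,
# with k chosen per record rather than fixed at 30.
k = 15
ocn = temp[-k:].mean()

# "Hinge fit" normal: flat before a fixed mid-1970s change point, linear
# afterwards; the fitted value at the last year serves as the normal.
hinge = np.maximum(0, years - 1975)
A = np.column_stack([np.ones_like(years), hinge])
coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
hinge_normal = coef[0] + coef[1] * (years[-1] - 1975)

print(f"30-yr mean: {temp[-30:].mean():.2f}  OCN(15): {ocn:.2f}  "
      f"hinge-fit normal: {hinge_normal:.2f}")
```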
15

Norets, Andriy, and Justinas Pelenis. "Adaptive Bayesian Estimation of Discrete‐Continuous Distributions Under Smoothness and Sparsity". Econometrica 90, no. 3 (2022): 1355–77. http://dx.doi.org/10.3982/ecta17884.

Abstract:
We consider nonparametric estimation of a mixed discrete‐continuous distribution under anisotropic smoothness conditions and a possibly increasing number of support points for the discrete part of the distribution. For these settings, we derive lower bounds on the estimation rates. Next, we consider a nonparametric mixture of normals model that uses continuous latent variables for the discrete part of the observations. We show that the posterior in this model contracts at rates that are equal to the derived lower bounds up to a log factor. Thus, Bayesian mixture of normals models can be used for (up to a log factor) optimal adaptive estimation of mixed discrete‐continuous distributions. The proposed model demonstrates excellent performance in simulations mimicking the first stage in the estimation of structural discrete choice models.
16

Hamdan, Hasan, and Ling Xu. "Estimating variance-mean mixtures of Normals". Journal of Statistical Research 52, no. 2 (11 March 2019): 129–50. http://dx.doi.org/10.47302/jsr.2018520202.

Abstract:
A new semi-nonparametric method for modeling data with Normal variance-mean mixtures (NVMM) is presented. This new method is based on the least-squares programming routine, UNMIX. Density estimates based on random samples of size n from two, three, and four components of NVMM are found using UNMIX. Graphical comparisons of the UNMIX fit found by the EM Algorithm and the Bayesian approaches are done using three real life examples. A quantitative comparison using the AIC and Chi-Square is done for one of the most commonly used examples, the Galaxy Data. The results are promising and the method has great potential for improvement.
17

Sun, Bomin, and Thomas C. Peterson. "Estimating temperature normals for USCRN stations". International Journal of Climatology 25, no. 14 (2005): 1809–17. http://dx.doi.org/10.1002/joc.1220.

18

Fourey, Sébastien, and Rémy Malgouyres. "Normals estimation for digital surfaces based on convolutions". Computers & Graphics 33, no. 1 (February 2009): 2–10. http://dx.doi.org/10.1016/j.cag.2008.11.003.

19

Roeder, Kathryn, and Larry Wasserman. "Practical Bayesian Density Estimation Using Mixtures of Normals". Journal of the American Statistical Association 92, no. 439 (September 1997): 894–902. http://dx.doi.org/10.1080/01621459.1997.10474044.

20

Rataj, Jan. "Estimation of oriented direction distribution of a planar body". Advances in Applied Probability 28, no. 2 (June 1996): 394–404. http://dx.doi.org/10.2307/1428064.

Abstract:
Methods of estimation of the oriented direction distribution (i.e. the distribution of unit outer normals over the boundary) of a planar set from the convex ring are proposed. The methods are based on an estimation of the area dilation of the investigated set by chosen test sets.
21

Rataj, Jan. "Estimation of oriented direction distribution of a planar body". Advances in Applied Probability 28, no. 02 (June 1996): 394–404. http://dx.doi.org/10.1017/s0001867800048540.

Abstract:
Methods of estimation of the oriented direction distribution (i.e. the distribution of unit outer normals over the boundary) of a planar set from the convex ring are proposed. The methods are based on an estimation of the area dilation of the investigated set by chosen test sets.
22

Wu, Yunlong, Qing Zhang, and Shuxuan Zhang. "Fast Cylindrical Fitting Method Using Point Cloud’s Normals Estimation". Mathematical Problems in Engineering 2018 (12 December 2018): 1–7. http://dx.doi.org/10.1155/2018/8904653.

Abstract:
Cylindrical fitting is an essential step in the measurement of large process pipelines, and the precision of the initial values is a key element in obtaining a correct fitting result. In order to obtain good initial values, covariance matrices of all points in the cylinder's three-dimensional laser-scanning point cloud are first established to estimate the normals of all points, and the cylinder's axis vector is then calculated using the least squares method. Next, the initial values of the cylinder's remaining parameters are obtained by coordinate transformation. Finally, the Levenberg-Marquardt algorithm is used in an iterative optimization process, with the above values as initial values, to obtain the fitting result. Experiments demonstrate that this method yields precise initial values for cylindrical fitting and improves the accuracy and speed of cylindrical fitting.
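
A condensed sketch of the pipeline this abstract describes, assuming per-point normals are already estimated (e.g., by the PCA fit shown earlier in this list): the axis from the normals' covariance, a projected circle fit for the remaining parameters, and Levenberg-Marquardt refinement. This is an illustration of the idea, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_cylinder(pts, normals):
    """Initialize and refine a cylinder fit from points and their normals."""
    # 1. Axis direction: normals of a cylinder are orthogonal to its axis,
    #    so the axis is the smallest-eigenvalue eigenvector of N^T N.
    _, vecs = np.linalg.eigh(normals.T @ normals)
    axis = vecs[:, 0]
    # 2. Project points onto the plane orthogonal to the axis.
    basis = np.linalg.svd(np.eye(3) - np.outer(axis, axis))[0][:, :2]
    uv = pts @ basis
    # 3. Levenberg-Marquardt refinement of the circle center (a, b) and
    #    radius r in that plane.
    def residuals(p):
        a, b, r = p
        return np.hypot(uv[:, 0] - a, uv[:, 1] - b) - r
    ca, cb = uv[:, 0].mean(), uv[:, 1].mean()
    p0 = np.array([ca, cb, np.hypot(uv[:, 0] - ca, uv[:, 1] - cb).mean()])
    sol = least_squares(residuals, p0, method='lm')
    return axis, sol.x   # axis direction; center and radius in plane coords
```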
23

Reichl, Johannes. "Estimating marginal likelihoods from the posterior draws through a geometric identity". Monte Carlo Methods and Applications 26, no. 3 (1 September 2020): 205–21. http://dx.doi.org/10.1515/mcma-2020-2068.

Abstract:
This article develops a new estimator of the marginal likelihood that requires only a sample of the posterior distribution as the input from the analyst. This sample may come from any sampling scheme, such as Gibbs sampling or Metropolis–Hastings sampling. The presented approach can be implemented generically in almost any application of Bayesian modeling and significantly decreases the computational burdens associated with marginal likelihood estimation compared to existing techniques. The functionality of this method is demonstrated in the context of probit and logit regressions, on two mixtures of normals models, and also on a high-dimensional random intercept probit. Simulation results show that the simple approach presented here achieves excellent stability in low-dimensional models, and also clearly outperforms existing methods when the number of coefficients in the model increases.
24

Kitanovski, Vlado, and Jon Yngve Hardeberg. "Objective evaluation of relighting models on translucent materials from multispectral RTI images". Electronic Imaging 2021, no. 5 (18 January 2021): 133–1. http://dx.doi.org/10.2352/issn.2470-1173.2021.5.maap-133.

Abstract:
In this paper, we evaluate the quality of reconstruction, i.e. relighting, from images obtained by a newly developed multispectral reflectance transformation imaging (MS-RTI) system. The captured MS-RTI images are of objects with different translucency and color. We use the most common methods for relighting the objects, polynomial texture mapping (PTM) and hemispherical harmonics (HSH), as well as the recent discrete model decomposition (DMD). The results show that all three models can reconstruct the images of translucent materials, with the reconstruction error varying with translucency but still in the range of what has been reported for other, non-translucent materials. DMD-relighted images are marginally better for the most transparent objects, while HSH- and PTM-relighted images appear to be better for the more opaque objects. The estimation of the surface normals of highly translucent objects using photometric stereo is not very accurate. Utilizing the peak of the fitted angular reflectance field, the relighting models, especially PTM, can provide a more accurate estimation of the surface normals.
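
PTM, the model this abstract singles out for normal estimation, fits a per-pixel biquadratic in the projected light direction by least squares. A minimal single-channel sketch of that fit (array shapes are assumptions for illustration):

```python
import numpy as np

def fit_ptm(intensities, light_dirs):
    """Per-pixel polynomial texture map: least-squares fit of the 6-term
    biquadratic I(lu, lv) = a0*lu^2 + a1*lv^2 + a2*lu*lv + a3*lu + a4*lv + a5,
    where (lu, lv) are the x/y components of the unit light direction.
    intensities: (n_lights,), light_dirs: (n_lights, 3)."""
    lu, lv = light_dirs[:, 0], light_dirs[:, 1]
    A = np.column_stack([lu**2, lv**2, lu * lv, lu, lv, np.ones_like(lu)])
    coeffs, *_ = np.linalg.lstsq(A, intensities, rcond=None)
    return coeffs

def relight(coeffs, lu, lv):
    """Evaluate the fitted PTM for a new light direction."""
    return coeffs @ np.array([lu**2, lv**2, lu * lv, lu, lv, 1.0])
```

The peak of this fitted angular reflectance field, mentioned at the end of the abstract, is simply the (lu, lv) maximizing the biquadratic, which has a closed form from its coefficients.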
25

Holroyd, Michael, Jason Lawrence, Greg Humphreys, and Todd Zickler. "A photometric approach for estimating normals and tangents". ACM Transactions on Graphics 27, no. 5 (December 2008): 1–9. http://dx.doi.org/10.1145/1409060.1409086.

26

Rigal, Alix, Jean-Marc Azaïs, and Aurélien Ribes. "Estimating daily climatological normals in a changing climate". Climate Dynamics 53, no. 1-2 (15 December 2018): 275–86. http://dx.doi.org/10.1007/s00382-018-4584-6.

27

Soto Molina, Victor Hugo, and Hugo Delgado Granados. "Estimación de la temperatura del aire en la alta montaña mexicana mediante un modelo de elevación del terreno: caso del volcán Nevado de Toluca (México) / Estimation of the air temperature in the Mexican high mountain environment by means of a model of elevation of the terrain, case of the Nevado de Toluca volcano (Mexico)". Ería 2, no. 2 (19 July 2020): 167–82. http://dx.doi.org/10.17811/er.2.2020.167-182.

Abstract:
The lack of climatological stations above 3,500 meters above sea level (m asl) in Mexico means that studies on high-mountain ecosystems are carried out with data from nearby stations, without considering the altitudinal temperature difference due to relief and the vertical temperature gradient. This work therefore develops a monthly and annual model of the spatial distribution of surface air temperature for the Nevado de Toluca volcano (4,680 m asl) and adjacent areas, using a Digital Elevation Model and the vertical gradient of tropospheric temperature, the latter obtained from the climatic normals of the stations located around the volcano at different altitudes. The accuracy of the model was verified against observations recorded at a weather station installed northwest of the summit at 4,283 m asl. In the resulting raster map, with a spatial resolution of 15 meters per pixel, the estimated values are statistically similar to those observed in situ. The model shows an acceptable degree of accuracy and can easily be implemented in any area and at any time scale where the lack of climatological stations limits or prevents the analysis of the relationship between air temperature and high-mountain ecosystems.
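
The core of such a model is a simple lapse-rate extrapolation. A minimal sketch with hypothetical values (the paper derives its gradient from the station network rather than using the standard tropospheric constant):

```python
def estimate_temperature(t_station: float, z_station: float, z_target: float,
                         lapse_rate: float = -6.5) -> float:
    """Extrapolate a station temperature (degrees C) to another elevation (m)
    with a constant vertical gradient, in degrees C per km; -6.5 is the
    standard tropospheric value used here only as a placeholder."""
    return t_station + lapse_rate * (z_target - z_station) / 1000.0

# Hypothetical example: 12 degrees C at a 3,000 m station, estimated at 4,283 m.
print(estimate_temperature(12.0, 3000.0, 4283.0))  # ~ 3.7 degrees C
```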
28

Kim, S., H. G. Kim, and T. Kim. "Mesh Modelling of 3D Point Cloud from UAV Images by Point Classification and Geometric Constraints". ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2 (30 May 2018): 507–11. http://dx.doi.org/10.5194/isprs-archives-xlii-2-507-2018.

Abstract:
The point cloud generated by multiple image matching is classified as an unstructured point cloud because it is not regularly spaced and has multiple viewpoints. Surface reconstruction techniques are used to generate a mesh model from unstructured point clouds, and in this process it is important to calculate correct surface normals. The point cloud extracted from multiple images contains the position and color information of each point as well as the geometric information of the images used in the point cloud generation step, so surface normal estimation based on geometric constraints is possible. However, the direction of a surface normal may be incorrectly estimated in noisy vertical areas of the point cloud. In this paper, we propose an improved method to estimate the surface normals of the vertical points within an unstructured point cloud. The proposed method detects the vertical points and adjusts their normal vectors by analyzing the surface normals of their nearest neighbors. As a result, we found almost all vertical points through point-type classification, detected the points with wrong normal vectors, and corrected the direction of those normal vectors. We compared the quality of mesh models generated with corrected and uncorrected surface normals; the comparison showed that our method successfully corrects the wrong surface normals of vertical points and improves the quality of the mesh model.
29

Verhoeven, J. T. M., and J. M. Thijssen. "Potential of Fractal Analysis for Lesion Detection in Echographic Images". Ultrasonic Imaging 15, no. 4 (October 1993): 304–23. http://dx.doi.org/10.1177/016173469301500403.

Abstract:
The application of fractal analysis to parametric imaging of B-mode echograms and to differentiation of echographic speckle textures was investigated. Echograms were obtained from realistic simulations and from a clinical study on diffuse liver disease. The simulations comprised tissue models with randomly positioned scatterers in a 3-D volume, in which the number density was varied over a range from 0.5 to 25 mm⁻³. The clinical echograms comprised both normals and patients with liver cirrhosis. Three methods of estimating the fractal dimension were investigated, two in the spatial image domain and one in the spatial frequency domain. The results of these methods are compared, and the applicability and limitations of texture differentiation using fractal analysis are discussed. The main conclusion is that fractal analysis offers no obvious advantage over statistical analysis of the texture of echographic images. Its use for parametric imaging is further limited by the need to use relatively large windows for local estimation of the fractal dimension.
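
Among spatial-domain estimators of fractal dimension, box counting is the textbook example; a minimal sketch on a toy binary texture (an illustration of the general technique, not the paper's exact estimators):

```python
import numpy as np

def box_counting_dimension(img: np.ndarray, sizes=(2, 4, 8, 16, 32)) -> float:
    """Box-counting estimate of the fractal dimension of a binary image:
    count occupied boxes N(s) at several box sizes s and take the slope
    of log N(s) against log(1/s)."""
    counts = []
    for s in sizes:
        h, w = img.shape[0] // s * s, img.shape[1] // s * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(5)
speckle = rng.random((256, 256)) > 0.5          # toy binary texture
print(box_counting_dimension(speckle))          # ~2 for a space-filling set
```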
30

Camero Jiménez, José W., and Jahaziel G. Ponce Sánchez. "Comparacion del promedio y la mediana como estimadores de la media para muestras normales dependiendo del tamaño de muestra". Revista Cientifica TECNIA 23, no. 2 (13 March 2017): 33. http://dx.doi.org/10.21754/tecnia.v23i2.73.

Abstract:
Currently, the methods for estimating the mean are based on the confidence interval of the average (sample mean). This paper aims to help choose which estimator (average or median) to use depending on the sample size. To this end, samples with normal distribution and confidence intervals for both estimators were generated via simulation in Excel, and hypothesis tests for the difference of proportions show which method is better depending on the sample size. Keywords: Sample size, Confidence interval, Average, Median.
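
A sketch of the same comparison in Python rather than Excel, with hypothetical population parameters: for normal samples the median's sampling variability is roughly 25% larger than the mean's, across sample sizes.

```python
import numpy as np

# Compare the sampling variability of the mean and the median as
# estimators of the center of a normal population. Theory: the median's
# standard error is about 1.253 * sigma / sqrt(n).
rng = np.random.default_rng(6)
for n in (10, 30, 100):
    samples = rng.normal(loc=50.0, scale=5.0, size=(20_000, n))
    se_mean = samples.mean(axis=1).std()
    se_median = np.median(samples, axis=1).std()
    print(f"n={n:>3}: SE(mean)={se_mean:.3f}  SE(median)={se_median:.3f}")
```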
31

Li, Shujuan, Junsheng Zhou, Baorui Ma, Yu-Shen Liu, and Zhizhong Han. "NeAF: Learning Neural Angle Fields for Point Normal Estimation". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 1 (26 June 2023): 1396–404. http://dx.doi.org/10.1609/aaai.v37i1.25224.

Abstract:
Normal estimation for unstructured point clouds is an important task in 3D computer vision. Current methods achieve encouraging results by mapping local patches to normal vectors or learning local surface fitting using neural networks. However, these methods are not generalized well to unseen scenarios and are sensitive to parameter settings. To resolve these issues, we propose an implicit function to learn an angle field around the normal of each point in the spherical coordinate system, which is dubbed as Neural Angle Fields (NeAF). Instead of directly predicting the normal of an input point, we predict the angle offset between the ground truth normal and a randomly sampled query normal. This strategy pushes the network to observe more diverse samples, which leads to higher prediction accuracy in a more robust manner. To predict normals from the learned angle fields at inference time, we randomly sample query vectors in a unit spherical space and take the vectors with minimal angle values as the predicted normals. To further leverage the prior learned by NeAF, we propose to refine the predicted normal vectors by minimizing the angle offsets. The experimental results with synthetic data and real scans show significant improvements over the state-of-the-art under widely used benchmarks. Project page: https://lisj575.github.io/NeAF/.
32

Nugroho, Didit Budi, Agus Priyono, and Bambang Susanto. "Skew Normal and Skew Student-t Distributions on GARCH(1,1) Model". Media Statistika 14, no. 1 (16 April 2021): 21–32. http://dx.doi.org/10.14710/medstat.14.1.21-32.

Abstract:
The Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) family of models has become an important tool in financial applications, given its ability to estimate the volatility of financial time series data. In the empirical financial literature, the presence of skewness and heavy tails affects how well GARCH-type models can capture financial market volatility. This study estimates the volatility of financial asset returns based on the GARCH(1,1) model, assuming Skew Normal and Skew Student-t distributions for the return errors. The models are applied to daily returns of the FTSE100 and IBEX35 stock indices from January 2000 to December 2017. The model parameters are estimated using the Generalized Reduced Gradient Non-Linear method in Excel's Solver and the Adaptive Random Walk Metropolis method implemented in Matlab. The estimation results from fitting the models to real data demonstrate that Excel's Solver is a promising way to estimate the parameters of GARCH(1,1) models with non-normal distributions, as indicated by the accuracy of its estimates. The fitting performance of the models is evaluated using a log-likelihood ratio test, which indicates that the GARCH(1,1) model with the Skew Student-t distribution provides the best fit, followed by the Student-t, Skew Normal, and Normal distributions.
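
For reference, the GARCH(1,1) variance recursion being fitted, sketched with Gaussian errors on a stand-in return series (the paper's skewed error distributions would change only the likelihood term):

```python
import numpy as np
from scipy.optimize import minimize

def garch11_neg_loglik(params, r):
    """Gaussian GARCH(1,1) negative log-likelihood with the recursion
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    omega, alpha, beta = params
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()                      # common initialization choice
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * sigma2) + r**2 / sigma2)

rng = np.random.default_rng(7)
r = rng.standard_t(df=7, size=1500) * 0.01   # stand-in daily return series
res = minimize(garch11_neg_loglik, x0=[1e-6, 0.05, 0.90], args=(r,),
               bounds=[(1e-10, None), (0, 1), (0, 1)], method='L-BFGS-B')
print(res.x)   # omega, alpha, beta
```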
33

Wilks, D. S. "Projecting “Normals” in a Nonstationary Climate". Journal of Applied Meteorology and Climatology 52, no. 2 (February 2013): 289–302. http://dx.doi.org/10.1175/jamc-d-11-0267.1.

Abstract:
Climate “normals” are statistical estimates of present and/or near-future climate means for such quantities as seasonal temperature or precipitation. In a changing climate, simply averaging a large number of previous years of data may not be the best method for estimating normals. Here eight formulations for climate normals, including the recently proposed “hinge” function, are compared in artificial- and real-data settings. Although the hinge function is attractive conceptually for representing accelerating climate changes simply, its use is in general not yet justified for divisional U.S. seasonal temperature or precipitation. Averages of the most recent 15 and 30 yr have performed better during the recent past for U.S. divisional seasonal temperature and precipitation, respectively; these averaging windows are longer than those currently employed for this purpose at the U.S. Climate Prediction Center.
34

Giordani, Paolo, and Robert Kohn. "Adaptive Independent Metropolis–Hastings by Fast Estimation of Mixtures of Normals". Journal of Computational and Graphical Statistics 19, no. 2 (January 2010): 243–59. http://dx.doi.org/10.1198/jcgs.2009.07174.

35

Stavropoulou, Eleni, Christophe Dano, Marc Boulon, Matthieu Briffaut, Ankit Sharma, and Alain Puech. "Résistance au cisaillement des interfaces roche / coulis représentatives de pieux offshore". Revue Française de Géotechnique, no. 158 (2019): 6. http://dx.doi.org/10.1051/geotech/2019012.

Abstract:
In the context of offshore wind projects, the optimized design of monopile foundations drilled into carbonate rock masses requires a realistic estimate of the mobilizable shear resistance. To this end, a campaign of rock/grout interface shear tests was conducted on the BCR3D machine, which makes it possible, in particular, to apply an imposed-stiffness condition in the direction normal to the interface, a condition considered representative of what happens along the pile shaft. In the first part, we present the experimental device, the preparation of the test specimens, and the choice of experimental parameters. Then, through various experimental results, we show the importance of normal stiffness in the development of shear resistance. Second, we show that the roughness, its evolution under loading, and the contrast in mechanical properties between the rock (soft limestone or calcarenite) and the grout govern the mobilizable friction. The element common to all the observations is the restrained dilation of the interface and the associated evolution of the normal stress at the interface.
36

Norton. "A Multivariate Technique for Estimating New Zealand Temperature Normals". Weather and Climate 5, no. 2 (1985): 64. http://dx.doi.org/10.2307/44279988.

37

Wu, Yingrui, Mingyang Zhao, Keqiang Li, Weize Quan, Tianqi Yu, Jianfeng Yang, Xiaohong Jia, and Dong-Ming Yan. "CMG-Net: Robust Normal Estimation for Point Clouds via Chamfer Normal Distance and Multi-Scale Geometry". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 6 (24 March 2024): 6171–79. http://dx.doi.org/10.1609/aaai.v38i6.28434.

Abstract:
This work presents an accurate and robust method for estimating normals from point clouds. In contrast to predecessor approaches that minimize the deviations between the annotated and the predicted normals directly, leading to direction inconsistency, we first propose a new metric termed Chamfer Normal Distance to address this issue. This not only mitigates the challenge but also facilitates network training and substantially enhances the network robustness against noise. Subsequently, we devise an innovative architecture that encompasses Multi-scale Local Feature Aggregation and Hierarchical Geometric Information Fusion. This design empowers the network to capture intricate geometric details more effectively and alleviate the ambiguity in scale selection. Extensive experiments demonstrate that our method achieves the state-of-the-art performance on both synthetic and real-world datasets, particularly in scenarios contaminated by noise. Our implementation is available at https://github.com/YingruiWoo/CMG-Net_Pytorch.
38

Aloqaili, Murtadha Jaafar, and Rahim Alhamzawi. "Bayesian Composite Quantile Regression with Composite Group Bridge Penalty". NeuroQuantology 20, no. 2 (28 February 2022): 173–79. http://dx.doi.org/10.14704/nq.2022.20.2.nq22269.

Abstract:
We study composite quantile regression (CQReg) with a composite group bridge penalty for model selection and estimation. Compared to conventional mean regression, composite quantile regression (CQR) is an efficient and robust estimation approach. A simple and efficient algorithm was developed for posterior inference using a pseudo composite asymmetric Laplace distribution, which can be formulated as a location-scale mixture of normals. The composite group bridge priors were formulated as a scale mixture of multivariate uniforms. We assess the performance of the proposed method using simulation studies and demonstrate it with air pollution data. The results indicate that our approach performs very well compared to existing approaches.
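
The location-scale mixture of normals mentioned here is, in its standard unit-scale form for quantile level p (as in Kozumi and Kobayashi's Gibbs sampler for quantile regression, on which this line of work builds), the representation of an asymmetric Laplace error as

```latex
y_i = x_i^{\top}\beta + \theta z_i + \tau \sqrt{z_i}\, u_i,
\qquad z_i \sim \mathrm{Exp}(1),\quad u_i \sim \mathcal{N}(0,1),
\qquad \theta = \frac{1-2p}{p(1-p)},\quad \tau^2 = \frac{2}{p(1-p)},
```

so that, conditionally on the latent z_i, the model is Gaussian and standard conjugate updates apply.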
39

Srivastava, A. K., P. Guhathakurta, and S. R. Kshirsagar. "Estimation of annual and seasonal temperatures over Indian stations using optimal normals". MAUSAM 54, no. 3 (18 January 2022): 615–22. http://dx.doi.org/10.54302/mausam.v54i3.1552.

40

Zhao, Can, and Xianglin Meng. "An improved algorithm for k-nearest-neighbor finding and surface normals estimation". Tsinghua Science and Technology 14, S1 (June 2009): 77–81. http://dx.doi.org/10.1016/s1007-0214(09)70071-9.

41

Yoon, Jong-In. "Estimation of the Mixture of Normals of Saving Rate Using Gibbs Algorithm". Journal of Digital Convergence 13, no. 10 (28 October 2015): 219–24. http://dx.doi.org/10.14400/jdc.2015.13.10.219.

42

Guo, Ruibin, Keju Peng, Weihong Fan, Yongping Zhai, and Yunhui Liu. "RGB-D SLAM Using Point–Plane Constraints for Indoor Environments". Sensors 19, no. 12 (17 June 2019): 2721. http://dx.doi.org/10.3390/s19122721.

Abstract:
Pose estimation and map reconstruction are basic requirements for robotic autonomous behavior. In this paper, we propose a point–plane-based method to simultaneously estimate the robot’s poses and reconstruct the current environment’s map using RGB-D cameras. First, we detect and track the point and plane features from color and depth images, and reliable constraints are obtained, even for low-texture scenes. Then, we construct cost functions from these features, and we utilize the plane’s minimal representation to minimize these functions for pose estimation and local map optimization. Furthermore, we extract the Manhattan World (MW) axes on the basis of the plane normals and vanishing directions of parallel lines for the MW scenes, and we add the MW constraint to the point–plane-based cost functions for more accurate pose estimation. The results of experiments on public RGB-D datasets demonstrate the robustness and accuracy of the proposed algorithm for pose estimation and map reconstruction, and we show its advantages compared with alternative methods.
43

Hopkinson, Ron F., Michael F. Hutchinson, Daniel W. McKenney, Ewa J. Milewska, and Pia Papadopol. "Optimizing Input Data for Gridding Climate Normals for Canada". Journal of Applied Meteorology and Climatology 51, no. 8 (August 2012): 1508–18. http://dx.doi.org/10.1175/jamc-d-12-018.1.

Abstract:
Spatial models of 1971–2000 monthly climate normals for daily maximum and minimum temperature and total precipitation are required for many applications. The World Meteorological Organization’s recommended standard for the calculation of a normal value is a complete 30-yr record with a minimal amount of missing data. Only 650 stations (~16%) in Canada meet this criterion for the period 1971–2000. Thin-plate smoothing-spline analyses, as implemented by the Australian National University Splines (ANUSPLIN) package, are used to assess the utility of differing amounts of station data in estimating nationwide monthly climate normals. The data include 1) only those stations (1169) with 20 or more years of data, 2) all stations (3835) with 5 or more years of data in at least one month, and 3) as in case 2 but with data adjusted through the most statistically significant linear-regression relationship with a nearby long-term station to 20 or more years (3983 stations). Withheld-station tests indicate that the regression-adjusted normals as in dataset 3 generally yield the best results for all three climatological elements, but the unadjusted normals as in dataset 2 are competitive with the adjusted normals in spring and autumn, reflecting the known longer spatial correlation scales in these seasons. The summary mean absolute differences between the ANUSPLIN estimates and the observations at 48 spatially representative withheld stations for dataset 3 are 0.36°C, 0.66°C, and 4.7 mm, respectively, for maximum temperature, minimum temperature, and precipitation. These are respectively 18%, 7%, and 18% smaller than the summary mean absolute differences for the long-term normals in dataset 1.
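
Thin-plate smoothing splines of the kind ANUSPLIN fits can be illustrated with scipy's RBF interpolator; a generic two-dimensional sketch on made-up station data, not the ANUSPLIN model itself:

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # Hypothetical stations: (lon, lat) plus a synthetic July temperature normal.
    rng = np.random.default_rng(0)
    xy = rng.uniform([-140.0, 42.0], [-52.0, 83.0], size=(200, 2))
    temp = 30.0 - 0.6 * (xy[:, 1] - 42.0) + rng.normal(0.0, 0.5, 200)

    # Thin-plate-spline surface with a small smoothing term, evaluated on a grid.
    spline = RBFInterpolator(xy, temp, kernel='thin_plate_spline', smoothing=1.0)
    gx, gy = np.meshgrid(np.linspace(-140, -52, 50), np.linspace(42, 83, 50))
    grid = spline(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
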
44

Zhang Jinghua, Zhang Yan, Shi Zhiguang, Li Biao, Zhang Yu, Liu Di, and Suo Yuchang. "基于法向量估计的透明物体表面反射光分离" [Separation of surface-reflected light from transparent objects based on normal vector estimation]. Acta Optica Sinica 41, no. 15 (2021): 1526001. http://dx.doi.org/10.3788/aos202141.1526001.

45

Bernton, Espen, Pierre E. Jacob, Mathieu Gerber, and Christian P. Robert. "On parameter estimation with the Wasserstein distance". Information and Inference: A Journal of the IMA 8, no. 4 (22 October 2019): 657–76. http://dx.doi.org/10.1093/imaiai/iaz003.

Abstract:
Statistical inference can be performed by minimizing, over the parameter space, the Wasserstein distance between model distributions and the empirical distribution of the data. We study asymptotic properties of such minimum Wasserstein distance estimators, complementing results derived by Bassetti, Bodini and Regazzini in 2006. In particular, our results cover the misspecified setting, in which the data-generating process is not assumed to be part of the family of distributions described by the model. Our results are motivated by recent applications of minimum Wasserstein estimators to complex generative models. We discuss some difficulties arising in the numerical approximation of these estimators. Two of our numerical examples (g-and-k and sum of log-normals) are taken from the literature on approximate Bayesian computation and have likelihood functions that are not analytically tractable. Two other examples involve misspecified models.
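
In one dimension, the Wasserstein distance between two samples reduces to comparing sorted values, so a minimum Wasserstein estimator can be sketched directly. A toy location-scale example (not the paper's g-and-k or sum-of-log-normals setups); fixing the generator noise keeps the objective deterministic:

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(1)
    data = rng.normal(3.0, 2.0, size=500)   # observed sample
    base = rng.standard_normal(2000)        # fixed generator noise

    def w1(params):
        mu, log_sigma = params
        synth = mu + np.exp(log_sigma) * base        # model sample
        return wasserstein_distance(data, synth)     # 1-Wasserstein

    est = minimize(w1, x0=[0.0, 0.0], method='Nelder-Mead')  # derivative-free
    print(est.x[0], np.exp(est.x[1]))       # roughly (3, 2)
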
46

Xu, Rui, Zhiyang Dou, Ningna Wang, Shiqing Xin, Shuangmin Chen, Mingyan Jiang, Xiaohu Guo, Wenping Wang, and Changhe Tu. "Globally Consistent Normal Orientation for Point Clouds by Regularizing the Winding-Number Field". ACM Transactions on Graphics 42, no. 4 (26 July 2023): 1–15. http://dx.doi.org/10.1145/3592129.

Abstract:
Estimating normals with globally consistent orientations for a raw point cloud has many downstream geometry processing applications. Despite tremendous efforts in the past decades, it remains challenging to deal with an unoriented point cloud with various imperfections, particularly in the presence of data sparsity coupled with nearby gaps or thin-walled structures. In this paper, we propose a smooth objective function to characterize the requirements of an acceptable winding-number field, which allows one to find the globally consistent normal orientations starting from a set of completely random normals. By taking the vertices of the Voronoi diagram of the point cloud as examination points, we consider the following three requirements: (1) the winding number is either 0 or 1, (2) the occurrences of 1 and the occurrences of 0 are balanced around the point cloud, and (3) the normals align with the outside Voronoi poles as much as possible. Extensive experimental results show that our method outperforms the existing approaches, especially in handling sparse and noisy point clouds, as well as shapes with complex geometry/topology.
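
The winding-number field being regularized can be evaluated, once normals and per-point area weights are available, with the generalized winding number of Barill et al. (2018); a direct, non-accelerated sketch (the area weights are left to the caller):

    import numpy as np

    def winding_number(query, points, normals, areas):
        """Generalized winding number of an oriented point cloud:
        w(q) = sum_i areas[i] * (p_i - q) . n_i / (4 * pi * |p_i - q|^3),
        roughly 1 at queries inside the underlying surface, 0 outside."""
        d = points[None, :, :] - query[:, None, :]        # (nq, np, 3)
        r3 = np.linalg.norm(d, axis=-1) ** 3 + 1e-12      # guard zero division
        dots = np.einsum('qpd,pd->qp', d, normals)        # dipole numerators
        return (areas * dots / r3).sum(axis=1) / (4.0 * np.pi)
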
47

Chapple, C., B. D. Bowen, R. K. Reed, S. L. Xie, and J. L. Bert. "A model of human microvascular exchange: parameter estimation based on normals and nephrotics". Computer Methods and Programs in Biomedicine 41, no. 1 (September 1993): 33–54. http://dx.doi.org/10.1016/0169-2607(93)90064-r.

48

Li, Can, Ping Chen, Xin Xu, Xinyu Wang, and Aijun Yin. "A Coarse-to-Fine Method for Estimating the Axis Pose Based on 3D Point Clouds in Robotic Cylindrical Shaft-in-Hole Assembly". Sensors 21, no. 12 (12 June 2021): 4064. http://dx.doi.org/10.3390/s21124064.

Abstract:
In this work, we propose a novel coarse-to-fine method for object pose estimation coupled with admittance control to promote robotic shaft-in-hole assembly. Considering that traditional approaches to locate the hole by force sensing are time-consuming, we employ 3D vision to estimate the axis pose of the hole. Thus, robots can locate the target hole in both position and orientation and enable the shaft to move into the hole along the axis orientation. In our method, first, the raw point cloud of a hole is processed to acquire the keypoints. Then, a coarse axis is extracted according to the geometric constraints between the surface normals and axis. Lastly, axis refinement is performed on the coarse axis to achieve higher precision. Practical experiments verified the effectiveness of the axis pose estimation. The assembly strategy composed of axis pose estimation and admittance control was effectively applied to the robotic shaft-in-hole assembly.
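
The geometric constraint the coarse step exploits is that every surface normal of an ideal cylinder is perpendicular to its axis, so the axis is the direction least represented in the normals' scatter matrix. A sketch under that assumption (not the authors' full coarse-to-fine pipeline):

    import numpy as np

    def coarse_cylinder_axis(normals):
        """Coarse cylinder axis direction: the eigenvector of N^T N with the
        smallest eigenvalue, since ideal cylinder normals are orthogonal
        to the axis."""
        n = normals / np.linalg.norm(normals, axis=1, keepdims=True)
        _, vecs = np.linalg.eigh(n.T @ n)   # eigenvalues in ascending order
        return vecs[:, 0]
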
49

Essama, Gervais, and Huu Minh Mai. "Représentation probabiliste des rendements des actifs financiers : difficultés d’estimation et résultats empiriques" [A probabilistic representation of financial asset returns: estimation difficulties and empirical results]. Économie appliquée 47, no. 4 (1994): 133–66. http://dx.doi.org/10.3406/ecoap.1994.1537.

Abstract:
Stable laws are used in finance to remedy the inadequacy of the normal distribution. Many empirical studies have shown that stock price variations deviate strongly from normality. To verify how well stock price variations fit this family of laws, one must estimate their parameters and be able to compute their distribution functions: many computation and estimation methods have been proposed, but the results remain unsatisfactory. In some cases, the absence of a variance makes it difficult to model the behavior of returns or to value certain financial instruments whose worth is largely determined by volatility, such as options. Moreover, the occurrence of accidental events is not sufficiently captured by stable laws; mixed processes, and processes incorporating the autoregressive component of certain economic series, have therefore been proposed. We propose alternatives to the normal distribution and highlight the particular difficulty in using stable laws, namely estimating, by various methods, the parameters that characterize them. Our tests lead us to reject the normal distribution; stable laws appear to describe the distribution of French stock market data better. These results carry a number of implications that we also discuss, in particular the definition of a measure of price dispersion for portfolio management.
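
As an illustration of the estimation difficulty discussed above: scipy ships a four-parameter stable distribution, but because the density has no closed form, a full maximum likelihood fit is slow, and quantile-based starting values are the usual workaround. A hedged sketch on synthetic returns (not the French stock data):

    import numpy as np
    from scipy.stats import levy_stable

    rng = np.random.default_rng(2)
    returns = levy_stable.rvs(1.7, 0.0, loc=0.0, scale=0.01,
                              size=2000, random_state=rng)

    # Generic ML fit with a numerically evaluated density; can be slow.
    alpha, beta, loc, scale = levy_stable.fit(returns)
    print(alpha, beta)   # alpha < 2 signals tails heavier than Gaussian
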
50

Kushwaha, S. K. P., K. R. Dayal, A. Singh, and K. Jain. "Building Facade and Rooftop Segmentation by Normal Estimation from UAV Derived RGB Point Cloud". ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2/W17 (29 November 2019): 173–77. http://dx.doi.org/10.5194/isprs-archives-xlii-2-w17-173-2019.

Abstract:
Point cloud segmentation is an important step in organising an unstructured point cloud. In this study, an RGB point cloud was generated from images acquired by an Unmanned Aerial Vehicle (UAV) over a dense urban area with varied planar features and buildings of different heights. First, the Cloth Simulation Filter (CSF) separated the Point Cloud Data (PCD) into ground points (roads, ground vegetation, and open land) and non-ground points (trees and buildings). The CANUPO classifier then separated tree points from building points, and noise filtering removed points in low-density clusters. Point cloud normals were computed for the building points, and the normal components along the X, Y, and Z directions were used to separate facade points from roof points, since surface normals on a roof contribute mainly to the Z component of the normal vector. The segmentation was validated against manually identified roof and facade points in the point cloud, giving overall accuracies of 90.86% for building roofs and 84.83% for building facades.
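
The roof/facade split described above reduces to thresholding the vertical component of each building point's unit normal; a sketch where the threshold value is our assumption, to be tuned per scene:

    import numpy as np

    def split_roof_facade(points, normals, z_thresh=0.7):
        """Roof points have near-vertical unit normals (|n_z| >= z_thresh);
        facade points have near-horizontal ones."""
        nz = np.abs(normals[:, 2]) / np.linalg.norm(normals, axis=1)
        roof = nz >= z_thresh
        return points[roof], points[~roof]   # (roof, facade)
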
