Academic literature on the topic 'Weakly calibrated'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Weakly calibrated.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Weakly calibrated"

1. Li, Guo-dong (李国栋), Guo-hui Tian (田国会), Hong-jun Wang (王洪君), and Jian-qin Yin (尹建芹). "Euclidean epipolar rectification frame of weakly calibrated stereo pairs." Optics and Precision Engineering 22, no. 7 (2014): 1955–61. http://dx.doi.org/10.3788/ope.20142207.1955.
2. Jarusirisawad, Songkran, Takahide Hosokawa, and Hideo Saito. "Diminished reality using plane-sweep algorithm with weakly-calibrated cameras." Progress in Informatics, no. 7 (March 2010): 11. http://dx.doi.org/10.2201/niipi.2010.7.3.
3. Robert, L., and O. D. Faugeras. "Relative 3D positioning and 3D convex hull computation from a weakly calibrated stereo pair." Image and Vision Computing 13, no. 3 (April 1995): 189–96. http://dx.doi.org/10.1016/0262-8856(95)90839-z.
4. Chen, Shengyong, and Y. F. Li. "Finding Optimal Focusing Distance and Edge Blur Distribution for Weakly Calibrated 3-D Vision." IEEE Transactions on Industrial Informatics 9, no. 3 (August 2013): 1680–87. http://dx.doi.org/10.1109/tii.2012.2221471.
5. Ralis, S. J., B. Vikramaditya, and B. J. Nelson. "Micropositioning of a weakly calibrated microassembly system using coarse-to-fine visual servoing strategies." IEEE Transactions on Electronics Packaging Manufacturing 23, no. 2 (April 2000): 123–31. http://dx.doi.org/10.1109/6104.846935.
6. Bian, Houqin, and Jianbo Su. "Feature matching based on geometric constraints in weakly calibrated stereo views of curved scenes." Journal of Systems Engineering and Electronics 19, no. 3 (June 2008): 562–70. http://dx.doi.org/10.1016/s1004-4132(08)60121-8.
7. Andreassen, L. M., M. Huss, K. Melvold, H. Elvehøy, and S. H. Winsvold. "Ice thickness measurements and volume estimates for glaciers in Norway." Journal of Glaciology 61, no. 228 (2015): 763–75. http://dx.doi.org/10.3189/2015jog14j161.
Abstract:
Glacier volume and ice thickness distribution are important variables for water resource management in Norway and the assessment of future glacier changes. We present a detailed assessment of thickness distribution and total glacier volume for mainland Norway based on data and modelling. Glacier outlines from a Landsat-derived inventory from 1999 to 2006 covering an area of 2692 ± 81 km² were used as input. We compiled a rich set of ice thickness observations collected over the past 30 years. Altogether, interpolated ice thickness measurements were available for 870 km² (32%) of the current glacier area of Norway, with a total ice volume of 134 ± 23 km³. Results indicate that mean ice thickness is similar for all larger ice caps, and weakly correlates with their total area. Ice thickness data were used to calibrate a physically based distributed model for estimating the ice thickness of unmeasured glaciers. The results were also used to calibrate volume–area scaling relations. The calibrated total volume estimates for all Norwegian glaciers ranged from 257 to 300 km³.
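The volume–area scaling relations calibrated in the abstract above take the standard power-law form V = c·A^γ. A minimal sketch of how such a relation is applied and how the constant c can be fitted against measured volumes (coefficient values and function names here are illustrative defaults, not the paper's calibrated values):

```python
def volume_area_scaling(area_km2, c=0.034, gamma=1.375):
    """Glacier volume (km^3) from area (km^2) via the power law V = c * A**gamma.

    The defaults are commonly cited illustrative values for valley glaciers;
    the paper calibrates them against Norwegian ice thickness data.
    """
    return c * area_km2 ** gamma

def calibrate_c(areas_km2, volumes_km3, gamma=1.375):
    """Least-squares fit of the scaling constant c for a fixed exponent gamma."""
    a = [x ** gamma for x in areas_km2]
    num = sum(ai * vi for ai, vi in zip(a, volumes_km3))
    den = sum(ai * ai for ai in a)
    return num / den
```

With the exponent fixed, fitting c is a one-dimensional least-squares problem, which is why a modest set of measured glaciers suffices to calibrate the relation for the unmeasured ones.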
8. Ma, Meiyi, John Stankovic, Ezio Bartocci, and Lu Feng. "Predictive Monitoring with Logic-Calibrated Uncertainty for Cyber-Physical Systems." ACM Transactions on Embedded Computing Systems 20, no. 5s (October 31, 2021): 1–25. http://dx.doi.org/10.1145/3477032.
Abstract:
Predictive monitoring—making predictions about future states and monitoring if the predicted states satisfy requirements—offers a promising paradigm in supporting the decision making of Cyber-Physical Systems (CPS). Existing works of predictive monitoring mostly focus on monitoring individual predictions rather than sequential predictions. We develop a novel approach for monitoring sequential predictions generated from Bayesian Recurrent Neural Networks (RNNs) that can capture the inherent uncertainty in CPS, drawing on insights from our study of real-world CPS datasets. We propose a new logic named Signal Temporal Logic with Uncertainty (STL-U) to monitor a flowpipe containing an infinite set of uncertain sequences predicted by Bayesian RNNs. We define STL-U strong and weak satisfaction semantics based on whether all or some sequences contained in a flowpipe satisfy the requirement. We also develop methods to compute the range of confidence levels under which a flowpipe is guaranteed to strongly (weakly) satisfy an STL-U formula. Furthermore, we develop novel criteria that leverage STL-U monitoring results to calibrate the uncertainty estimation in Bayesian RNNs. Finally, we evaluate the proposed approach via experiments with real-world CPS datasets and a simulated smart city case study; the results are very encouraging, with the STL-U based predictive monitoring approach outperforming baseline methods.
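The strong/weak satisfaction semantics described in the abstract reduce, for a finite sample of predicted sequences, to an all-versus-some check over the flowpipe. A toy sketch under that simplification (the paper reasons symbolically over an infinite set via confidence levels; the names and the atomic "always above threshold" requirement here are illustrative):

```python
from typing import Callable, Sequence

def satisfies(seq: Sequence[float], req: Callable[[float], bool]) -> bool:
    """A sequence satisfies an atomic 'always' requirement if every sample does."""
    return all(req(x) for x in seq)

def strong_satisfaction(flowpipe: Sequence[Sequence[float]],
                        req: Callable[[float], bool]) -> bool:
    """Strong semantics: every sequence in the flowpipe satisfies the requirement."""
    return all(satisfies(s, req) for s in flowpipe)

def weak_satisfaction(flowpipe: Sequence[Sequence[float]],
                      req: Callable[[float], bool]) -> bool:
    """Weak semantics: at least one sequence satisfies the requirement."""
    return any(satisfies(s, req) for s in flowpipe)
```

Note that strong satisfaction implies weak satisfaction, which is the monotonicity the paper exploits when computing the range of confidence levels for each semantics.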
9. Liu, Yuzhuo, Hangting Chen, Jian Wang, Pei Wang, and Pengyuan Zhang. "Confidence Learning for Semi-Supervised Acoustic Event Detection." Applied Sciences 11, no. 18 (September 15, 2021): 8581. http://dx.doi.org/10.3390/app11188581.
Abstract:
In recent years, the involvement of synthetic strongly labeled data, weakly labeled data, and unlabeled data has drawn much research attention in semi-supervised acoustic event detection (SAED). The classic self-training method carries out predictions for unlabeled data and then selects predictions with high probabilities as pseudo-labels for retraining. Such models have shown their effectiveness in SAED. However, probabilities are poorly calibrated confidence estimates, and samples with low probabilities are ignored. Hence, we introduce a confidence-based semi-supervised acoustic event detection (C-SAED) framework. The C-SAED method learns confidence deliberately and retrains all data distinctly by applying confidence as weights. Additionally, we apply a power pooling function whose coefficient can be trained automatically and use weakly labeled data more efficiently. The experimental results demonstrate that the generated confidence is proportional to the accuracy of the predictions. Our C-SAED framework achieves a relative error rate reduction of 34% in contrast to the baseline model.
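The core retraining idea above — keep every pseudo-labeled sample but scale its loss by a confidence weight instead of discarding low-probability predictions — can be sketched as follows (a hypothetical weighting scheme for illustration, not the paper's exact loss):

```python
import math

def weighted_bce(preds, pseudo_labels, confidences):
    """Confidence-weighted binary cross-entropy over pseudo-labeled samples.

    Every sample contributes to the loss, scaled by its confidence weight,
    so uncertain pseudo-labels are down-weighted rather than dropped.
    """
    eps = 1e-12
    total = 0.0
    for p, y, w in zip(preds, pseudo_labels, confidences):
        p = min(max(p, eps), 1.0 - eps)  # clamp to avoid log(0)
        total += -w * (y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / sum(confidences)
```

Assigning a low weight to a probably-wrong pseudo-label shrinks its contribution to the average, which is the behavior a learned, well-calibrated confidence is meant to produce.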
10. Papachristou, Christos, and Anastasios N. Delopoulos. "A method for the evaluation of projective geometric consistency in weakly calibrated stereo with application to point matching." Computer Vision and Image Understanding 119 (February 2014): 81–101. http://dx.doi.org/10.1016/j.cviu.2013.12.004.

Dissertations / Theses on the topic "Weakly calibrated"

1. Hardy, Clément. "Architectures multi-échelles de type encodeur-décodeur pour la stéréophotométrie." Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMC222.
Abstract:
Photometric stereo is a technique for 3D surface reconstruction of objects. This field has seen a surge in research interest due to its potential industrial applications: photometric stereo can be employed for tasks such as detecting machining defects in mechanical components or facial recognition. This thesis delves into deep learning methods for photometric stereo, with a particular focus on training data and network architectures. While neural network over-parameterization is often adequate, the training dataset plays a pivotal role in task adaptation. To generate a highly diverse and extensible training set, we propose a new synthetic dataset incorporating a broad spectrum of geometric, textural, lighting, and environmental variations, allowing for the creation of nearly infinite training instances. The second decisive point of a good reconstruction concerns the choice of architecture, which must generalize well to new data regardless of the application. In particular, for photometric stereo, the challenge is to reconstruct very high-resolution images without losing detail, and we therefore propose a multi-scale encoder-decoder architecture. We first introduce a convolutional neural network architecture for calibrated photometric stereo, where the lighting direction is known. To handle unconstrained environments, we then propose a Transformer-based approach for universal photometric stereo, capable of dealing with any environment or lighting direction without prior information. Lastly, for challenging materials such as translucent or shiny surfaces, we introduce a "weakly calibrated" approach that assumes only approximate knowledge of the lighting direction. The approaches we have investigated have consistently demonstrated strong performance on standard benchmarks, as evidenced by both quantitative metrics and visual assessments. Our results, particularly the improved accuracy of reconstructed normal maps, represent a significant advancement in photometric stereo.

Book chapters on the topic "Weakly calibrated"

1. Du, Yingkui, Panli He, Nan Wang, Xiaowei Han, and Zhonghu Yuan. "Pavement Transverse Profile Roughness via Weakly Calibrated Laser Triangulation." In Intelligent Computing Methodologies, 180–91. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-42297-8_18.
2. Wang, Jian, Tingwei Liu, Miao Zhang, and Yongri Piao. "To Be Critical: Self-calibrated Weakly Supervised Learning for Salient Object Detection." In Pattern Recognition and Computer Vision, 184–98. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-8552-4_15.
3. Yap, Po Jen. "Dialogic Judicial Review and First World Autocracies." In Redefining Comparative Constitutional Law, 274–92. Oxford: Oxford University Press, 2024. http://dx.doi.org/10.1093/9780191996344.003.0019.
Abstract:
Dialogic or weak-form review is the only viable and effective path for courts operating in First World autocracies. The judicial use of strong-form review to address problems posed by sedition laws and restrictions on the franchise—as Mark Tushnet suggests—would be counterproductive as this would only be to the detriment of the courts. At the same time, I argue that dialogic review is not judicial abdication. I will also show how weak-form review has enhanced rights protection in Singapore and Hong Kong, and has imposed soft but meaningful controls on state power in these autocracies. Precisely because these autocracies want to remain First World, the perceived independence of the courts must be preserved for their governments to retain talent and continued investments in the economy. Governments in First World autocracies are sensitive to global businesses’ perception of the regime’s commitment to the rule of law as that directly impacts the entity’s economic future. This is unlike military dictatorships and banana republics, where the rent-seeking behavior of autocrats is driven primarily by the self-interest of its cabal. Therefore, in First World autocracies, so long as the courts respect the regime’s plenary agenda-setting powers, the government will in turn acquiesce to the judiciary’s calibrated show of force to preserve rights.
4. "Advancing an Ecosystem Approach in the Gulf of Maine." In Advancing an Ecosystem Approach in the Gulf of Maine, edited by Stephen S. Hale. American Fisheries Society, 2012. http://dx.doi.org/10.47886/9781934874301.ch13.
Abstract:
Spatial patterns of subtidal benthic invertebrates and physical-chemical variables in the nearshore Gulf of Maine (Acadian biogeographic province) were studied to provide information to calibrate benthic indices of ecological condition, determine physical-chemical factors affecting species distributions, and compare recent data with historical biogeographic studies. Knowledge of the distribution of species and how they are affected by biotic, environmental, and anthropogenic factors is essential to the pursuit of ecosystem-based management. Five years (2000–2004) of data from 268 reference stations of the National Coastal Assessment were used. Multidimensional scaling done on Bray-Curtis similarity matrices of species’ relative abundance (367 species) showed faunal transitions around Cape Ann and Cape Elizabeth, with a weaker transition around Penobscot Bay. The southernmost area shared 41% of its species with the northernmost area. An ordination of environmental data (temperature, salinity, sediment percent silt-clay, depth) correlated well with the ordination of benthic relative abundance data (R = 0.75, p < 0.03). Temperature was the most important factor affecting broad species distribution patterns, followed by salinity. A multivariate regression tree first split the fauna at a temperature of 16°C. Species richness increased with increasing salinity but showed no relationship with latitude or percent silt-clay. Accuracy of benthic indices for the nearshore Gulf of Maine might be improved by taking biogeographical differences among subregions into account. These results provide a foundation for ecosystem-based management, valuation of ecosystem services, conservation, and ocean spatial planning.
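The Bray-Curtis similarity matrices mentioned in the abstract are built from pairwise dissimilarities between species abundance vectors; a minimal sketch of the underlying measure (function name illustrative):

```python
def bray_curtis(u, v):
    """Bray-Curtis dissimilarity between two abundance vectors.

    0 means identical communities, 1 means no shared abundance.
    """
    num = sum(abs(a - b) for a, b in zip(u, v))
    den = sum(a + b for a, b in zip(u, v))
    return num / den if den else 0.0
```

A similarity matrix is then 1 minus the pairwise dissimilarities, which is the input multidimensional scaling ordinates to reveal the faunal transitions described above.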

Conference papers on the topic "Weakly calibrated"

1. Pari, L., J. M. Sebastian, C. Gonzalez, and L. Angel. "Estimation of the image Jacobian using two weakly calibrated cameras." In European Control Conference 2007 (ECC). IEEE, 2007. http://dx.doi.org/10.23919/ecc.2007.7068870.
2. Tamadazte, Brahim, and Nicolas Andreff. "Weakly calibrated stereoscopic visual servoing for laser steering: Application to phonomicrosurgery." In 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014). IEEE, 2014. http://dx.doi.org/10.1109/iros.2014.6942641.
3. Remley, Kate A., Dylan F. Williams, Paul D. Hale, Chih-Ming Wang, Jeffrey A. Jargon, and Youngcheol Park. "Calibrated oscilloscope measurements for system-level characterization of weakly nonlinear sources." In 2014 International Workshop on Integrated Nonlinear Microwave and Millimetre-wave Circuits (INMMiC). IEEE, 2014. http://dx.doi.org/10.1109/inmmic.2014.6815075.
4. Zanne, P., G. Morel, and F. Plestan. "Robust control of 6 DOF displacements reconstructed from a weakly calibrated camera." In 2001 European Control Conference (ECC). IEEE, 2001. http://dx.doi.org/10.23919/ecc.2001.7075997.
5. Lin, Yuping, Gerard Medioni, and Jongmoo Choi. "Accurate 3D face reconstruction from weakly calibrated wide baseline images with profile contours." In 2010 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2010. http://dx.doi.org/10.1109/cvpr.2010.5539793.
6. Liang, Jin, and Jianbo Su. "Homography-based correspondence in weakly calibrated curved surface environment and its error analysis." In IEEE International Conference on Robotics and Automation, 2004. Proceedings. ICRA '04. IEEE, 2004. http://dx.doi.org/10.1109/robot.2004.1307988.
7. Haner, Sebastian, and Anders Heyden. "A step towards self-calibration in SLAM: Weakly calibrated on-line structure and motion estimation." In 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR Workshops). IEEE, 2010. http://dx.doi.org/10.1109/cvprw.2010.5543256.
8. Matsumura, Yutaka, and Yuji Sakamoto. "Synthesis of arbitrary viewpoint image from images of multiple weakly calibrated camera images using all in-focus rendering method." In Electronic Imaging 2008, edited by Brian D. Corner, Masaaki Mochimaru, and Robert Sitnik. SPIE, 2008. http://dx.doi.org/10.1117/12.767041.
9. Asadi, Sadegh, and Abbas Khaksar. "Analytical and Numerical Sand Production Prediction Calibrated with Field Data, Example from High-Rate Gas Wells." In SPE Asia Pacific Oil & Gas Conference and Exhibition. SPE, 2022. http://dx.doi.org/10.2118/210776-ms.
Abstract:
Sand production prediction is essential from the early stages of field development planning for well completion design and later for production management. Unconsolidated and weakly consolidated sandstones are prone to fail at low flowing bottom-hole pressures during hydrocarbon production. To predict the sand-free drawdown, a robust sand prediction model is required that integrates near-wellbore and in-situ stresses, rock mechanical properties, well trajectory, reservoir pressure, and production and depletion trends. Sanding prediction models should be calibrated with field data such as production and well test observations. In the absence of field data, numerical techniques can provide a reliable estimate of the potential onset and severity of sanding at various reservoir pressures. In this study, analytical and finite-element numerical models are independently used to predict the onset of sanding and the volume of produced sand from high-rate gas wells with weakly consolidated sandstone reservoirs in onshore Western Australia. The analytical method uses a poro-elastic model and core-calibrated log-derived rock strength profiles with an empirical effective rock strength factor (ESF). The ESF was calibrated against documented field sanding observations from a well test extended flow period at the initial reservoir pressure under a low drawdown pressure. The numerical method uses a poro-elasto-plastic model defined from triaxial core tests; its rock failure criterion is based on a critical strain limit (CSL) corresponding to failure of the inner wall in thick-walled cylinder core tests, which also satisfies the sanding observations from existing wells. To verify the onset and severity of sanding predicted by the analytical model, numerical simulations for an identical sandstone interval were developed to investigate the corresponding CSL. This combined analytical and numerical modelling, calibrated with field data, provided high confidence in the sanding evaluation and its application to future well completion and sand management decisions. The analytical model was ultimately used for sanding assessment over field-life pressure conditions because of its simplicity, speed, and flexibility in assessing various pressure and rock strength scenarios with sensitivity analysis over the whole production interval, compared with the numerical method, which is better suited to single-depth, single-pressure conditions and to well and perforation trajectory modelling.
10. Albakri, Mohammad I., Vijaya V. N. Sriram Malladi, and Pablo A. Tarazaga. "Acoustoelastic-Based Stress Measurement Utilizing Low-Frequency Flexural Waves." In ASME 2017 Conference on Smart Materials, Adaptive Structures and Intelligent Systems. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/smasis2017-3858.
Abstract:
Current acoustoelastic-based stress measurement techniques operate at the high-frequency, weakly-dispersive portions of the dispersion curves. The weak dispersive effects at such high frequencies allow the utilization of time-of-flight measurements to quantify the effects of stress on wave speed. However, this comes at the cost of lower sensitivity to the state-of-stress of the structure, and hence calibration at a known stress state is required to compensate for material and geometric uncertainties in the structure under test. In this work, the strongly-dispersive, highly stress-sensitive, low-frequency flexural waves are utilized for stress measurement in structural components. A new model-based technique is developed for this purpose, where the acoustoelastic theory is integrated into a numerical optimization algorithm to analyze dispersive waves propagating along the structure under test. The developed technique is found to be robust against material and geometric uncertainties. In the absence of calibration experiments, the robustness of this technique is inversely proportional to the excitation frequency. The capabilities of the developed technique are experimentally demonstrated on a long rectangular beam, where reference-free, un-calibrated stress measurements are successfully conducted.

Reports on the topic "Weakly calibrated"

1. Ávila-Montealegre, Oscar Iván, Anderson Grajales-Olarte, Juan J. Ospina-Tejeiro, and Mario A. Ramos-Veloza. Minimum Wage and Macroeconomic Adjustment: Insights from a Small Open, Emerging, Economy with Formal and Informal Labor. Banco de la República, December 2023. http://dx.doi.org/10.32468/be.1264.
Abstract:
We examine the adjustment of a small, open, emerging market economy (SOEME) to an unexpected increase in the minimum wage using an extended New-Keynesian SOE model that incorporates heterogeneous households, a flexible production structure, and a minimum wage rule. We calibrate the model for Colombia and find that an unexpected increase in the minimum wage has significant effects on the low-skilled labor market, and weaker impacts on inflation and the policy interest rate. The rise in the minimum wage increases production costs and prompts the substitution of formal low-skilled labor with informal workers and machinery, resulting in reduced output, increased inflation, and higher policy interest rates. We also observe that the minimum wage influences the transmission of productivity, demand, and monetary shocks, leading to a more persistent impact on macroeconomic variables, and a less efficient monetary policy to control inflation. Our findings suggest that the minimum wage has important macroeconomic implications, and affects emerging market economies through different channels than in developed economies.