Ready bibliography on the topic "3D real-time localization"
Lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "3D real-time localization".
Journal articles on the topic "3D real-time localization"
Liu, M. H., Sheng Zhang, and Yi Pan. "UWB-based Real-Time 3D High Precision Localization System". Journal of Physics: Conference Series 2290, no. 1 (June 1, 2022): 012082. http://dx.doi.org/10.1088/1742-6596/2290/1/012082.
Lynen, Simon, Bernhard Zeisl, Dror Aiger, Michael Bosse, Joel Hesch, Marc Pollefeys, Roland Siegwart, and Torsten Sattler. "Large-scale, real-time visual–inertial localization revisited". International Journal of Robotics Research 39, no. 9 (July 7, 2020): 1061–84. http://dx.doi.org/10.1177/0278364920931151.
LI, Wei, Yi WU, Chunlin SHEN, and Huajun GONG. "Robust 3D Surface Reconstruction in Real-Time with Localization Sensor". IEICE Transactions on Information and Systems E101.D, no. 8 (August 1, 2018): 2168–72. http://dx.doi.org/10.1587/transinf.2018edl8056.
Mair, Elmar, Klaus H. Strobl, Tim Bodenmüller, Michael Suppa, and Darius Burschka. "Real-time Image-based Localization for Hand-held 3D-modeling". KI - Künstliche Intelligenz 24, no. 3 (May 26, 2010): 207–14. http://dx.doi.org/10.1007/s13218-010-0037-z.
Będkowski, Janusz, Andrzej Masłowski, and Geert De Cubber. "Real time 3D localization and mapping for USAR robotic application". Industrial Robot: An International Journal 39, no. 5 (August 17, 2012): 464–74. http://dx.doi.org/10.1108/01439911211249751.
Baeck, P. J., N. Lewyckyj, B. Beusen, W. Horsten, and K. Pauly. "DRONE BASED NEAR REAL-TIME HUMAN DETECTION WITH GEOGRAPHIC LOCALIZATION". ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-3/W8 (August 20, 2019): 49–53. http://dx.doi.org/10.5194/isprs-archives-xlii-3-w8-49-2019.
Hauser, Fabian, and Jaroslaw Jacak. "Real-time 3D single-molecule localization microscopy analysis using lookup tables". Biomedical Optics Express 12, no. 8 (July 16, 2021): 4955. http://dx.doi.org/10.1364/boe.424016.
Li, Yiming, Markus Mund, Philipp Hoess, Joran Deschamps, Ulf Matti, Bianca Nijmeijer, Vilma Jimenez Sabinina, Jan Ellenberg, Ingmar Schoen, and Jonas Ries. "Real-time 3D single-molecule localization using experimental point spread functions". Nature Methods 15, no. 5 (April 9, 2018): 367–69. http://dx.doi.org/10.1038/nmeth.4661.
Zhu, Wenjun, Peng Wang, Rui Li, and Xiangli Nie. "Real-time 3D work-piece tracking with monocular camera based on static and dynamic model libraries". Assembly Automation 37, no. 2 (April 3, 2017): 219–29. http://dx.doi.org/10.1108/aa-02-2017-018.
Feng, Sheng, Chengdong Wu, Yunzhou Zhang, and Shigen Shen. "Collaboration calibration and three-dimensional localization in multi-view system". International Journal of Advanced Robotic Systems 15, no. 6 (November 1, 2018): 172988141881377. http://dx.doi.org/10.1177/1729881418813778.
Doctoral dissertations on the topic "3D real-time localization"
Lee, Young Jin. "Real-Time Object Motion and 3D Localization from Geometry". The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1408443773.
Uhercik, Marian. "Surgical tools localization in 3D ultrasound images". PhD thesis, INSA de Lyon, 2011. http://tel.archives-ouvertes.fr/tel-00735702.
Picard, Quentin. "Proposition de mécanismes d'optimisation des données pour la perception temps-réel dans un système embarqué hétérogène". Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG039.
The development of autonomous systems creates an increasing need for perception of the environment in embedded systems. Autonomous cars, drones, and mixed reality devices have a limited form factor and a restricted power budget for real-time performance: roughly 300 W–10 W, 15 W–10 W, and 10 W–10 mW respectively for those use cases. This thesis focuses on autonomous and mobile systems with a budget of 10 mW to 15 W, using image sensors and the inertial measurement unit (IMU). Simultaneous Localization And Mapping (SLAM) provides accurate and robust perception of the environment in real time, without prior knowledge, for autonomous and mobile systems. The thesis aims at real-time execution of the whole SLAM system, composed of advanced perception functions from localization to 3D reconstruction, with restricted hardware resources. In this context, two main questions arise from the challenges in the literature. How can the resource requirements of advanced perception functions be reduced? How should the SLAM pipeline be partitioned across a heterogeneous system that integrates several computing units, from the chip embedded in the imager, to near-sensor processing (FPGA), to the embedded platform (ARM, embedded GPU)? The first issue addressed in the thesis is the need to reduce the hardware resources used by the SLAM pipeline, from the sensor output to the 3D reconstruction. In this regard, the work described in the manuscript provides two main contributions. The first concerns processing in the embedded chip, affecting the image characteristics by reducing the dynamic range. The second affects the management of the image flow injected into the SLAM pipeline through near-sensor processing.
The first contribution aims at reducing the memory footprint of SLAM algorithms by evaluating the effect of pixel dynamic-range reduction on the accuracy and robustness of real-time localization and 3D reconstruction. The experiments show that the input data can be reduced by up to 75%, corresponding to 2 bits per pixel, while maintaining accuracy similar to the 8-bits-per-pixel baseline. These results were obtained by evaluating the accuracy and robustness of four SLAM algorithms on two databases. The second contribution aims at reducing the amount of data injected into SLAM with a decimation strategy that controls the input frame rate, called adaptive filtering. Data are initially injected at a constant rate (20 frames per second), which consumes energy, memory, and bandwidth and increases computational complexity. Can this amount of data be reduced? In SLAM, the accuracy and the number of operations depend on the movement of the system, so frames are injected based on that movement, as measured by the linear and angular accelerations from the IMU. These key images are injected with the adaptive filtering (AF) approach. Although the results depend on the difficulty of the chosen database, the experiments show that AF allows decimation of up to 80% of the images while keeping localization and reconstruction errors low, similar to the baseline. This study shows that in the embedded context, peak memory consumption is reduced by up to 92%.
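The adaptive filtering idea summarized in this abstract (inject a camera frame only when the IMU indicates the system is moving) can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the thesis implementation: the threshold values and the per-frame `(linear, angular)` IMU sample format are hypothetical.

```python
import math

def adaptive_filter(frames, imu_samples, lin_thresh=0.5, ang_thresh=0.3):
    """Decimate a constant-rate frame stream: keep a frame only when the
    paired IMU sample shows noticeable linear or angular motion.
    Thresholds here are illustrative placeholders, not thesis values."""
    kept = []
    for idx, (frame, (lin_acc, ang_acc)) in enumerate(zip(frames, imu_samples)):
        lin = math.sqrt(sum(a * a for a in lin_acc))  # |linear acceleration|
        ang = math.sqrt(sum(a * a for a in ang_acc))  # |angular acceleration|
        if idx == 0 or lin > lin_thresh or ang > ang_thresh:
            kept.append(frame)  # inject as a key image into the SLAM pipeline
    return kept
```

A stationary sequence thus collapses to its first frame, while motion bursts re-enable injection, which is what drives the memory and bandwidth savings reported above.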
Zarader, Pierre. "Transcranial ultrasound tracking of a neurosurgical microrobot". Electronic Thesis or Diss., Sorbonne université, 2024. http://www.theses.fr/2024SORUS054.
With the aim of treating brain tumors that are difficult to access with current surgical tools, Robeauté is developing an innovative microrobot to navigate deep brain areas with minimal invasiveness. The aim of this thesis was to develop and validate a transcranial ultrasound-based tracking system for the microrobot, in order to implement robotic commands and thus guarantee both the safety and the effectiveness of the intervention. The proposed approach consists of positioning three ultrasound emitters on the patient's head and embedding an ultrasound receiver on the microrobot. Knowing the speed of sound in biological tissue and the skull thickness crossed, it is possible to estimate the distances from the emitters to the receiver by time-of-flight measurements, and to deduce its 3D position by trilateration. A proof of concept was first carried out using a skull phantom of constant thickness, demonstrating submillimeter localization accuracy. The system was then evaluated using a calvaria phantom whose thickness and speed of sound in front of each emitter were deduced from a CT scan. The system demonstrated a mean localization accuracy of 1.5 mm, i.e. a degradation of 1 mm compared with tracking through the constant-thickness skull phantom, explained by the uncertainty introduced by the heterogeneous shape of the calvaria.
Finally, three preclinical tests, without the possibility of assessing localization error, were carried out: (i) a post-mortem test on a human, (ii) a post-mortem test on a ewe, and (iii) an in vivo test on a ewe. Further improvements to the tracking system have been proposed, such as (i) the use of CT scan-based transcranial ultrasound propagation simulation to account for skull heterogeneities, (ii) the miniaturization of the ultrasound sensor embedded in the microrobot, and (iii) the integration of ultrasound imaging to visualize local vascularization around the microrobot, thereby reducing the risk of lesions and detecting possible pathological angiogenesis.
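The trilateration step this abstract relies on (three emitters, one receiver, time-of-flight distances) can be illustrated with a small sketch. Each distance would come from r = c·t for a measured time of flight t and an assumed speed of sound c (about 1540 m/s in soft tissue); the function below then intersects the three spheres. The helper names and the handling of the two mirror solutions are illustrative assumptions, not taken from the thesis.

```python
import math

# Minimal 3-vector helpers on tuples (kept dependency-free on purpose).
def _sub(a, b):   return tuple(x - y for x, y in zip(a, b))
def _add(a, b):   return tuple(x + y for x, y in zip(a, b))
def _mul(a, s):   return tuple(x * s for x in a)
def _dot(a, b):   return sum(x * y for x, y in zip(a, b))
def _norm(a):     return math.sqrt(_dot(a, a))
def _cross(a, b): return (a[1] * b[2] - a[2] * b[1],
                          a[2] * b[0] - a[0] * b[2],
                          a[0] * b[1] - a[1] * b[0])

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Intersect three spheres centred on emitters p1, p2, p3 with radii
    r1, r2, r3 (time-of-flight distances). Returns the two mirror
    candidates; in practice one side is excluded by prior knowledge
    (e.g. the receiver is inside the skull)."""
    ex = _mul(_sub(p2, p1), 1.0 / _norm(_sub(p2, p1)))  # local x axis
    i = _dot(ex, _sub(p3, p1))
    ey_raw = _sub(_sub(p3, p1), _mul(ex, i))
    ey = _mul(ey_raw, 1.0 / _norm(ey_raw))              # local y axis
    ez = _cross(ex, ey)                                 # local z axis
    d = _norm(_sub(p2, p1))
    j = _dot(ey, _sub(p3, p1))
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = math.sqrt(max(r1**2 - x**2 - y**2, 0.0))        # clamp noise to 0
    base = _add(p1, _add(_mul(ex, x), _mul(ey, y)))
    return _add(base, _mul(ez, z)), _sub(base, _mul(ez, z))
```

The thesis's 1.5 mm figure suggests how sensitive this computation is to the assumed speed of sound and skull thickness, which is why the per-emitter CT-derived corrections matter.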
Huang, Liang-zheng, and 黃良正. "3D Feature-based Localization for Mobile Robot in Real-time Environment". Thesis, 2013. http://ndltd.ncl.edu.tw/handle/56540472759084742787.
National Yunlin University of Science and Technology (國立雲林科技大學), Master's Program, Department of Computer Science and Information Engineering, academic year 101.
In this thesis, 3D feature-based localization for a mobile robot in real-time environments is proposed. First, a 3D environment is obtained from the point cloud images of a LIDAR mounted on the robot. The proposed method then identifies predetermined 3D objects with the Fast Point Feature Histogram (FPFH) algorithm. Since the locations of the objects are known, the current pose of the robot can be computed from the geometric relation between the captured point cloud image and the 3D objects. The proposed method is more precise than methods based on 2D images, because 3D images carry more information. Furthermore, the proposed method also reduces image deformation and position displacement.
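The "pose from the geometric relation between matched points" step above is, in its simplest form, a rigid-body alignment between corresponding 3D points. A common way to solve it (a sketch of the standard SVD-based Kabsch method, not necessarily the thesis's exact procedure) is:

```python
import numpy as np

def estimate_pose(map_pts, scan_pts):
    """Rigid transform (R, t) mapping points observed in the sensor frame
    (scan_pts, N x 3) onto their known map coordinates (map_pts, N x 3),
    via the SVD-based Kabsch method. Point correspondences are assumed
    given, e.g. from an FPFH feature match."""
    P = np.asarray(scan_pts, dtype=float)
    Q = np.asarray(map_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # proper rotation, det(R) = +1
    t = cq - R @ cp                           # sensor origin in map frame
    return R, t
```

With the object locations fixed in the map, `t` gives the robot's position and `R` its orientation; the sign-correction matrix `D` guards against reflections when the correspondences are noisy or near-planar.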
Zhao, Shi. "3D real-time stockpile mapping and modelling with accurate quality calculation using voxels". Thesis, 2016. http://hdl.handle.net/2440/103494.
Thesis (Ph.D.) -- University of Adelaide, School of Mechanical Engineering, 2016.
Tefera, Yonas Teodros. "Perception-driven approaches to real-time remote immersive visualization". Doctoral thesis, 2022. http://hdl.handle.net/11562/1070226.
Book chapters on the topic "3D real-time localization"
Nagy, Balázs, and Csaba Benedek. "Real-Time Point Cloud Alignment for Vehicle Localization in a High Resolution 3D Map". In Lecture Notes in Computer Science, 226–39. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-11009-3_13.
Szklarski, Jacek, Cezary Ziemiecki, Jacek Szałtys, and Marian Ostrowski. "Real-Time 3D Mapping with Visual-Inertial Odometry Pose Coupled with Localization in an Occupancy Map". In Advances in Intelligent Systems and Computing, 388–97. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-13273-6_37.
Li, Ruijiang, Xun Jia, John H. Lewis, Xuejun Gu, Michael Folkerts, Chunhua Men, and Steve B. Jiang. "Single-Projection Based Volumetric Image Reconstruction and 3D Tumor Localization in Real Time for Lung Cancer Radiotherapy". In Medical Image Computing and Computer-Assisted Intervention – MICCAI 2010, 449–56. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15711-0_56.
Chen, Alvin I., Max L. Balter, Timothy J. Maguire, and Martin L. Yarmush. "3D Near Infrared and Ultrasound Imaging of Peripheral Blood Vessels for Real-Time Localization and Needle Guidance". In Lecture Notes in Computer Science, 388–96. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-46726-9_45.
Florido, Alberto Martín, Francisco Rivas Montero, and Jose María Cañas Plaza. "Robust 3D Visual Localization Based on RTABmaps". In Advancements in Computer Vision and Image Processing, 1–17. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-5628-2.ch001.
Lu, Qinghua, Guanhong Zhang, Xueqian Mao, and Xia Xu. "Research on High-Altitude Estimation Method Based on Fusion of Semantic Information with ElasticFusion". In Advances in Transdisciplinary Engineering. IOS Press, 2024. http://dx.doi.org/10.3233/atde240285.
Yang, Yingyi, Hao Wu, Fan Yang, Xiaoming Mai, and Hui Chen. "Design and Implementation of Substation Operation Safety Monitoring and Management System Based on Three-Dimensional Reconstruction". In Machine Learning and Artificial Intelligence. IOS Press, 2020. http://dx.doi.org/10.3233/faia200793.
Lou, E., A. Chan, B. Coutts, E. Parent, and J. Mahood. "Accuracy of pedicle localization using a 3D ultrasound navigator on vertebral phantoms for posterior spinal surgery". In Studies in Health Technology and Informatics. IOS Press, 2021. http://dx.doi.org/10.3233/shti210443.
Qize Yuan, Evan, and Calvin Sze Hang Ng. "Role of Hybrid Operating Room: Present and Future". In Immunosuppression. IntechOpen, 2020. http://dx.doi.org/10.5772/intechopen.91187.
Conference abstracts on the topic "3D real-time localization"
Jaworski, Wojciech, Pawel Wilk, Pawel Zborowski, Witold Chmielowiec, Andrew YongGwon Lee, and Abhishek Kumar. "Real-time 3D indoor localization". In 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN). IEEE, 2017. http://dx.doi.org/10.1109/ipin.2017.8115874.
Mouragnon, E., M. Lhuillier, M. Dhome, F. Dekeyser, and P. Sayd. "Real Time Localization and 3D Reconstruction". In 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06). IEEE, 2006. http://dx.doi.org/10.1109/cvpr.2006.236.
Zhu, Yilong, Bohuan Xue, Linwei Zheng, Huaiyang Huang, Ming Liu, and Rui Fan. "Real-Time, Environmentally-Robust 3D LiDAR Localization". In 2019 IEEE International Conference on Imaging Systems and Techniques (IST). IEEE, 2019. http://dx.doi.org/10.1109/ist48021.2019.9010305.
Li, Xuehong, and Shuhua Yang. "The indoor real-time 3D localization algorithm using UWB". In 2015 International Conference on Advanced Mechatronic Systems (ICAMechS). IEEE, 2015. http://dx.doi.org/10.1109/icamechs.2015.7287085.
Ruiz, Luis, and Zhidong Wang. "Real time multi robot 3D localization system using trilateration". In 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO). IEEE, 2016. http://dx.doi.org/10.1109/robio.2016.7866541.
Liu, Ruixu, Tao Peng, Vijayan K. Asari, and John S. Loomis. "Real-time 3D scene reconstruction and localization with surface optimization". In NAECON 2018 - IEEE National Aerospace and Electronics Conference. IEEE, 2018. http://dx.doi.org/10.1109/naecon.2018.8556661.
Hirosue, Kazuki, Shohei Ukawa, Yuichi Itoh, Takao Onoye, and Masanori Hashimoto. "GPGPU-based Highly Parallelized 3D Node Localization for Real-Time 3D Model Reproduction". In IUI'17: 22nd International Conference on Intelligent User Interfaces. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3025171.3025183.
Qiu, Jian, David Chu, Xiangying Meng, and Thomas Moscibroda. "On the feasibility of real-time phone-to-phone 3D localization". In the 9th ACM Conference. New York, New York, USA: ACM Press, 2011. http://dx.doi.org/10.1145/2070942.2070962.
Kulkarni, Ashutosh P., Glen P. Abousleman, and Jennie Si. "Real-time 3D target tracking and localization for arbitrary camera geometries". In Defense and Security Symposium, edited by Ivan Kadar. SPIE, 2007. http://dx.doi.org/10.1117/12.720064.
Zhu, Yipeng, Tao Wang, and Shiqiang Zhu. "Real-time Monocular 3D People Localization and Tracking on Embedded System". In 2021 6th IEEE International Conference on Advanced Robotics and Mechatronics (ICARM). IEEE, 2021. http://dx.doi.org/10.1109/icarm52023.2021.9536118.