A selection of scholarly literature on the topic "3D real-time localization"
Browse lists of current articles, books, dissertations, conference abstracts, and other scholarly sources on the topic "3D real-time localization".
Journal articles on the topic "3D real-time localization":
Liu, M. H., Sheng Zhang, and Yi Pan. "UWB-based Real-Time 3D High Precision Localization System." Journal of Physics: Conference Series 2290, no. 1 (June 1, 2022): 012082. http://dx.doi.org/10.1088/1742-6596/2290/1/012082.
Lynen, Simon, Bernhard Zeisl, Dror Aiger, Michael Bosse, Joel Hesch, Marc Pollefeys, Roland Siegwart, and Torsten Sattler. "Large-scale, real-time visual–inertial localization revisited." International Journal of Robotics Research 39, no. 9 (July 7, 2020): 1061–84. http://dx.doi.org/10.1177/0278364920931151.
LI, Wei, Yi WU, Chunlin SHEN, and Huajun GONG. "Robust 3D Surface Reconstruction in Real-Time with Localization Sensor." IEICE Transactions on Information and Systems E101.D, no. 8 (August 1, 2018): 2168–72. http://dx.doi.org/10.1587/transinf.2018edl8056.
Mair, Elmar, Klaus H. Strobl, Tim Bodenmüller, Michael Suppa, and Darius Burschka. "Real-time Image-based Localization for Hand-held 3D-modeling." KI - Künstliche Intelligenz 24, no. 3 (May 26, 2010): 207–14. http://dx.doi.org/10.1007/s13218-010-0037-z.
Będkowski, Janusz, Andrzej Masłowski, and Geert De Cubber. "Real time 3D localization and mapping for USAR robotic application." Industrial Robot: An International Journal 39, no. 5 (August 17, 2012): 464–74. http://dx.doi.org/10.1108/01439911211249751.
Baeck, P. J., N. Lewyckyj, B. Beusen, W. Horsten, and K. Pauly. "DRONE BASED NEAR REAL-TIME HUMAN DETECTION WITH GEOGRAPHIC LOCALIZATION." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-3/W8 (August 20, 2019): 49–53. http://dx.doi.org/10.5194/isprs-archives-xlii-3-w8-49-2019.
Hauser, Fabian, and Jaroslaw Jacak. "Real-time 3D single-molecule localization microscopy analysis using lookup tables." Biomedical Optics Express 12, no. 8 (July 16, 2021): 4955. http://dx.doi.org/10.1364/boe.424016.
Li, Yiming, Markus Mund, Philipp Hoess, Joran Deschamps, Ulf Matti, Bianca Nijmeijer, Vilma Jimenez Sabinina, Jan Ellenberg, Ingmar Schoen, and Jonas Ries. "Real-time 3D single-molecule localization using experimental point spread functions." Nature Methods 15, no. 5 (April 9, 2018): 367–69. http://dx.doi.org/10.1038/nmeth.4661.
Zhu, Wenjun, Peng Wang, Rui Li, and Xiangli Nie. "Real-time 3D work-piece tracking with monocular camera based on static and dynamic model libraries." Assembly Automation 37, no. 2 (April 3, 2017): 219–29. http://dx.doi.org/10.1108/aa-02-2017-018.
Feng, Sheng, Chengdong Wu, Yunzhou Zhang, and Shigen Shen. "Collaboration calibration and three-dimensional localization in multi-view system." International Journal of Advanced Robotic Systems 15, no. 6 (November 1, 2018): 172988141881377. http://dx.doi.org/10.1177/1729881418813778.
Dissertations on the topic "3D real-time localization":
Lee, Young Jin. "Real-Time Object Motion and 3D Localization from Geometry." The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1408443773.
Uhercik, Marian. "Surgical tools localization in 3D ultrasound images." PhD thesis, INSA de Lyon, 2011. http://tel.archives-ouvertes.fr/tel-00735702.
Picard, Quentin. "Proposition de mécanismes d'optimisation des données pour la perception temps-réel dans un système embarqué hétérogène." Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG039.
The development of autonomous systems creates a growing need for perception of the environment in embedded systems. Autonomous cars, drones, and mixed-reality devices have a limited form factor and a restricted power budget for real-time performance; these use cases have budgets in the range of 300W-10W, 15W-10W and 10W-10mW respectively. This thesis focuses on autonomous and mobile systems with a budget of 10mW to 15W that use image sensors and an inertial measurement unit (IMU). Simultaneous Localization And Mapping (SLAM) provides accurate and robust real-time perception of the environment without prior knowledge for autonomous and mobile systems. The thesis aims at real-time execution of the whole SLAM system, composed of advanced perception functions from localization to 3D reconstruction, with restricted hardware resources. In this context, two main questions are raised to answer the challenges of the literature. How can the resource requirements of advanced perception functions be reduced? How should the SLAM pipeline be partitioned across a heterogeneous system that integrates several computing units, from the chip embedded in the imager, to near-sensor processing (FPGA), to the embedded platform (ARM, embedded GPU)? The first issue addressed in the thesis is the need to reduce the hardware resources used by the SLAM pipeline, from the sensor output to the 3D reconstruction. In this regard, the manuscript provides two main contributions. The first concerns processing in the embedded chip, which affects the image characteristics by reducing the dynamic range. The second concerns management of the image flow injected into the SLAM pipeline with near-sensor processing.
The first contribution aims at reducing the memory footprint of the SLAM algorithms by evaluating the effect of pixel dynamic-range reduction on the accuracy and robustness of real-time localization and 3D reconstruction. The experiments show that the input data can be reduced by up to 75%, corresponding to 2 bits per pixel, while maintaining accuracy similar to the 8-bits-per-pixel baseline. These results were obtained by evaluating the accuracy and robustness of four SLAM algorithms on two databases. The second contribution aims at reducing the amount of data injected into SLAM with a decimation strategy that controls the input frame rate, called adaptive filtering. Data are initially injected at a constant rate (20 frames per second), which consumes energy, memory, and bandwidth and increases computational complexity. Can this amount of data be reduced? In SLAM, the accuracy and the number of operations depend on the movement of the system, so using the linear and angular accelerations from the IMU, data are injected based on the movement of the system. These key images are injected with the adaptive filtering (AF) approach. Although the results depend on the difficulty of the chosen database, the experiments show that AF allows decimation of up to 80% of the images while keeping localization and reconstruction errors as low as the baseline. This study also shows that, in the embedded context, peak memory consumption is reduced by up to 92%.
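The movement-driven decimation idea behind adaptive filtering can be sketched in a few lines: keep a frame only when the IMU reports enough accumulated motion since the last kept frame. This is a minimal illustrative sketch, not the thesis's implementation; the function name `adaptive_filter`, the per-frame motion summaries, and the thresholds are assumptions.

```python
def adaptive_filter(frames, imu_motion, lin_thresh=0.5, ang_thresh=0.2):
    """Decimate a constant-rate frame stream using IMU motion.

    frames     -- sequence of frames (any objects), captured at a fixed rate
    imu_motion -- per-frame (linear, angular) motion magnitudes from the IMU
    A frame is kept (injected into SLAM) only once the motion accumulated
    since the last kept frame exceeds either threshold; static phases are
    dropped, saving memory, bandwidth, and computation.
    """
    kept = []
    lin_acc = ang_acc = 0.0
    for frame, (lin, ang) in zip(frames, imu_motion):
        lin_acc += abs(lin)
        ang_acc += abs(ang)
        if lin_acc >= lin_thresh or ang_acc >= ang_thresh:
            kept.append(frame)          # key image: enough motion occurred
            lin_acc = ang_acc = 0.0     # restart accumulation
    return kept
```

With a slowly moving system most frames are filtered out, which is the mechanism behind the up-to-80% decimation reported above.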
Zarader, Pierre. "Transcranial ultrasound tracking of a neurosurgical microrobot." Electronic Thesis or Diss., Sorbonne université, 2024. http://www.theses.fr/2024SORUS054.
With the aim of treating brain tumors that are difficult to access with current surgical tools, Robeauté is developing an innovative microrobot to navigate deep brain areas with minimal invasiveness. The aim of this thesis was to develop and validate a transcranial ultrasound-based tracking system for the microrobot, so that robotic commands can be implemented and both the safety and the effectiveness of the intervention guaranteed. The proposed approach consists of positioning three ultrasound emitters on the patient's head and embedding an ultrasound receiver on the microrobot. Knowing the speed of sound in biological tissue and the skull thickness crossed, it is possible to estimate the distances from the emitters to the receiver by time-of-flight measurements, and to deduce its 3D position by trilateration. A proof of concept was first carried out using a skull phantom of constant thickness, demonstrating submillimeter localization accuracy. The system was then evaluated using a calvaria phantom whose thickness and speed of sound in front of each emitter were deduced from a CT scan. The system demonstrated a mean localization accuracy of 1.5 mm, i.e. a degradation of 1 mm compared with tracking through the skull phantom of constant thickness, explained by the uncertainty introduced by the heterogeneous shape of the calvaria.
Finally, three preclinical tests, without the possibility of assessing localization error, were carried out: (i) a post-mortem test on a human, (ii) a post-mortem test on a ewe, and (iii) an in vivo test on a ewe. Further improvements to the tracking system have been proposed, such as (i) the use of CT-scan-based transcranial ultrasound propagation simulation to account for skull heterogeneities, (ii) the miniaturization of the ultrasound sensor embedded in the microrobot, and (iii) the integration of ultrasound imaging to visualize local vascularization around the microrobot, thereby reducing the risk of lesions and detecting possible pathological angiogenesis.
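The geometric core of the approach above, converting three time-of-flight distances into a 3D position, is standard closed-form trilateration. The sketch below assumes ideal point emitters and a uniform speed of sound; the function name and the example coordinates are illustrative, not taken from the thesis.

```python
import math

def _sub(a, b): return tuple(x - y for x, y in zip(a, b))
def _add(a, b): return tuple(x + y for x, y in zip(a, b))
def _scale(a, s): return tuple(x * s for x in a)
def _dot(a, b): return sum(x * y for x, y in zip(a, b))
def _norm(a): return math.sqrt(_dot(a, a))
def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def trilaterate(p1, p2, p3, r1, r2, r3, upper=False):
    """Receiver position from three emitter positions and three distances
    (e.g. r = speed_of_sound * time_of_flight). Two mirror solutions exist,
    one on each side of the emitter plane; `upper` selects the +ez side."""
    # Build an orthonormal frame (ex, ey, ez) anchored at p1.
    ex = _scale(_sub(p2, p1), 1.0 / _norm(_sub(p2, p1)))
    i = _dot(ex, _sub(p3, p1))
    ey_raw = _sub(_sub(p3, p1), _scale(ex, i))
    ey = _scale(ey_raw, 1.0 / _norm(ey_raw))
    ez = _cross(ex, ey)
    d = _norm(_sub(p2, p1))
    j = _dot(ey, _sub(p3, p1))
    # Intersect the three spheres in that frame.
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = math.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    if not upper:
        z = -z  # e.g. the receiver sits below the plane of scalp emitters
    return _add(p1, _add(_scale(ex, x), _add(_scale(ey, y), _scale(ez, z))))
```

In the transcranial setting the distances would additionally be corrected for the locally measured skull thickness and speed of sound before being fed to this step.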
Huang, Liang-zheng, and 黃良正. "3D Feature-based Localization for Mobile Robot in Real-time Environment." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/56540472759084742787.
National Yunlin University of Science and Technology
Master's Program, Department of Computer Science and Information Engineering
Academic year 101 (ROC calendar)
In this thesis, 3D feature-based localization for a mobile robot in real-time environments is proposed. First, a 3D environment is obtained from the point cloud images captured by the LIDAR mounted on the robot. The proposed method then identifies predetermined 3D objects using the Fast Point Feature Histogram (FPFH) algorithm. Since the locations of the objects are known, the current pose of the robot can be computed from the geometric relation between the captured point cloud image and the 3D objects. The proposed method is more precise than methods based on 2D images because 3D images carry more information than 2D images. Furthermore, the proposed method also reduces image deformation and position displacement.
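The final pose-from-known-landmarks step alluded to above can be illustrated with a closed-form rigid alignment, shown here in 2D for brevity: given objects recognized in the robot frame whose map positions are known, solve for the robot's rotation and translation. The function name and least-squares formulation are illustrative assumptions, not the thesis's exact method.

```python
import math

def robot_pose_from_landmarks(map_pts, obs_pts):
    """Estimate the robot's 2D pose (x, y, heading) from >= 2 landmarks.

    map_pts -- known landmark positions in the map frame
    obs_pts -- the same landmarks as observed in the robot frame
    Closed-form least-squares fit of map = R(theta) * obs + t.
    """
    n = len(map_pts)
    mx = sum(p[0] for p in map_pts) / n; my = sum(p[1] for p in map_pts) / n
    ox = sum(p[0] for p in obs_pts) / n; oy = sum(p[1] for p in obs_pts) / n
    s = c = 0.0
    for (X, Y), (x, y) in zip(map_pts, obs_pts):
        dx, dy = X - mx, Y - my        # map frame, centered
        ex, ey = x - ox, y - oy        # robot frame, centered
        c += ex * dx + ey * dy         # cos component of best rotation
        s += ex * dy - ey * dx         # sin component of best rotation
    theta = math.atan2(s, c)
    # Translation maps the observed centroid onto the map centroid.
    tx = mx - (ox * math.cos(theta) - oy * math.sin(theta))
    ty = my - (ox * math.sin(theta) + oy * math.cos(theta))
    return tx, ty, theta
```

The 3D case follows the same pattern (e.g. via the Kabsch algorithm on the FPFH-matched object correspondences), with a rotation matrix replacing the single heading angle.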
Zhao, Shi. "3D real-time stockpile mapping and modelling with accurate quality calculation using voxels." Thesis, 2016. http://hdl.handle.net/2440/103494.
Thesis (Ph.D.) -- University of Adelaide, School of Mechanical Engineering, 2016.
Tefera, Yonas Teodros. "Perception-driven approaches to real-time remote immersive visualization." Doctoral thesis, 2022. http://hdl.handle.net/11562/1070226.
Book chapters on the topic "3D real-time localization":
Nagy, Balázs, and Csaba Benedek. "Real-Time Point Cloud Alignment for Vehicle Localization in a High Resolution 3D Map." In Lecture Notes in Computer Science, 226–39. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-11009-3_13.
Szklarski, Jacek, Cezary Ziemiecki, Jacek Szałtys, and Marian Ostrowski. "Real-Time 3D Mapping with Visual-Inertial Odometry Pose Coupled with Localization in an Occupancy Map." In Advances in Intelligent Systems and Computing, 388–97. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-13273-6_37.
Li, Ruijiang, Xun Jia, John H. Lewis, Xuejun Gu, Michael Folkerts, Chunhua Men, and Steve B. Jiang. "Single-Projection Based Volumetric Image Reconstruction and 3D Tumor Localization in Real Time for Lung Cancer Radiotherapy." In Medical Image Computing and Computer-Assisted Intervention – MICCAI 2010, 449–56. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15711-0_56.
Chen, Alvin I., Max L. Balter, Timothy J. Maguire, and Martin L. Yarmush. "3D Near Infrared and Ultrasound Imaging of Peripheral Blood Vessels for Real-Time Localization and Needle Guidance." In Lecture Notes in Computer Science, 388–96. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-46726-9_45.
Florido, Alberto Martín, Francisco Rivas Montero, and Jose María Cañas Plaza. "Robust 3D Visual Localization Based on RTABmaps." In Advancements in Computer Vision and Image Processing, 1–17. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-5628-2.ch001.
Lu, Qinghua, Guanhong Zhang, Xueqian Mao, and Xia Xu. "Research on High-Altitude Estimation Method Based on Fusion of Semantic Information with ElasticFusion." In Advances in Transdisciplinary Engineering. IOS Press, 2024. http://dx.doi.org/10.3233/atde240285.
Yang, Yingyi, Hao Wu, Fan Yang, Xiaoming Mai, and Hui Chen. "Design and Implementation of Substation Operation Safety Monitoring and Management System Based on Three-Dimensional Reconstruction." In Machine Learning and Artificial Intelligence. IOS Press, 2020. http://dx.doi.org/10.3233/faia200793.
Lou, E., A. Chan, B. Coutts, E. Parent, and J. Mahood. "Accuracy of pedicle localization using a 3D ultrasound navigator on vertebral phantoms for posterior spinal surgery." In Studies in Health Technology and Informatics. IOS Press, 2021. http://dx.doi.org/10.3233/shti210443.
Qize Yuan, Evan, and Calvin Sze Hang Ng. "Role of Hybrid Operating Room: Present and Future." In Immunosuppression. IntechOpen, 2020. http://dx.doi.org/10.5772/intechopen.91187.
Conference papers on the topic "3D real-time localization":
Jaworski, Wojciech, Pawel Wilk, Pawel Zborowski, Witold Chmielowiec, Andrew YongGwon Lee, and Abhishek Kumar. "Real-time 3D indoor localization." In 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN). IEEE, 2017. http://dx.doi.org/10.1109/ipin.2017.8115874.
Mouragnon, E., M. Lhuillier, M. Dhome, F. Dekeyser, and P. Sayd. "Real Time Localization and 3D Reconstruction." In 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06). IEEE, 2006. http://dx.doi.org/10.1109/cvpr.2006.236.
Zhu, Yilong, Bohuan Xue, Linwei Zheng, Huaiyang Huang, Ming Liu, and Rui Fan. "Real-Time, Environmentally-Robust 3D LiDAR Localization." In 2019 IEEE International Conference on Imaging Systems and Techniques (IST). IEEE, 2019. http://dx.doi.org/10.1109/ist48021.2019.9010305.
Li, Xuehong, and Shuhua Yang. "The indoor real-time 3D localization algorithm using UWB." In 2015 International Conference on Advanced Mechatronic Systems (ICAMechS). IEEE, 2015. http://dx.doi.org/10.1109/icamechs.2015.7287085.
Ruiz, Luis, and Zhidong Wang. "Real time multi robot 3D localization system using trilateration." In 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO). IEEE, 2016. http://dx.doi.org/10.1109/robio.2016.7866541.
Liu, Ruixu, Tao Peng, Vijayan K. Asari, and John S. Loomis. "Real-time 3D scene reconstruction and localization with surface optimization." In NAECON 2018 - IEEE National Aerospace and Electronics Conference. IEEE, 2018. http://dx.doi.org/10.1109/naecon.2018.8556661.
Hirosue, Kazuki, Shohei Ukawa, Yuichi Itoh, Takao Onoye, and Masanori Hashimoto. "GPGPU-based Highly Parallelized 3D Node Localization for Real-Time 3D Model Reproduction." In IUI'17: 22nd International Conference on Intelligent User Interfaces. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3025171.3025183.
Qiu, Jian, David Chu, Xiangying Meng, and Thomas Moscibroda. "On the feasibility of real-time phone-to-phone 3D localization." In the 9th ACM Conference. New York, New York, USA: ACM Press, 2011. http://dx.doi.org/10.1145/2070942.2070962.
Kulkarni, Ashutosh P., Glen P. Abousleman, and Jennie Si. "Real-time 3D target tracking and localization for arbitrary camera geometries." In Defense and Security Symposium, edited by Ivan Kadar. SPIE, 2007. http://dx.doi.org/10.1117/12.720064.
Zhu, Yipeng, Tao Wang, and Shiqiang Zhu. "Real-time Monocular 3D People Localization and Tracking on Embedded System." In 2021 6th IEEE International Conference on Advanced Robotics and Mechatronics (ICARM). IEEE, 2021. http://dx.doi.org/10.1109/icarm52023.2021.9536118.