Journal articles on the topic "Images 360 degrés"

To see the other types of publications on this topic, follow the link: Images 360 degrés.

Cite a source in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 journal articles for your research on the topic "Images 360 degrés".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style of your choice: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a .pdf file and read its abstract online, whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Hadi Ali, Israa, and Sarmad Salman. "360-Degree Panoramic Image Stitching for Un-ordered Images Based on Harris Corner Detection." Indian Journal of Science and Technology 12, no. 4 (January 1, 2019): 1–9. http://dx.doi.org/10.17485/ijst/2019/v12i4/140988.

2

Assens, Marc, Xavier Giro-i-Nieto, Kevin McGuinness, and Noel E. O’Connor. "Scanpath and saliency prediction on 360 degree images." Signal Processing: Image Communication 69 (November 2018): 8–14. http://dx.doi.org/10.1016/j.image.2018.06.006.

3

Barazzetti, L., M. Previtali, and F. Roncoroni. "CAN WE USE LOW-COST 360 DEGREE CAMERAS TO CREATE ACCURATE 3D MODELS?" ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2 (May 30, 2018): 69–75. http://dx.doi.org/10.5194/isprs-archives-xlii-2-69-2018.

Abstract:
360 degree cameras capture the whole scene around a photographer in a single shot. Cheap 360 cameras are a new paradigm in photogrammetry. The camera can be pointed to any direction, and the large field of view reduces the number of photographs. This paper aims to show that accurate metric reconstructions can be achieved with affordable sensors (less than 300 euro). The camera used in this work is the Xiaomi Mijia Mi Sphere 360, which has a cost of about 300 USD (January 2018). Experiments demonstrate that millimeter-level accuracy can be obtained during the image orientation and surface reconstruction steps, in which the solution from 360° images was compared to check points measured with a total station and laser scanning point clouds. The paper will summarize some practical rules for image acquisition as well as the importance of ground control points to remove possible deformations of the network during bundle adjustment, especially for long sequences with unfavorable geometry. The generation of orthophotos from images having a 360° field of view (that captures the entire scene around the camera) is discussed. Finally, the paper illustrates some case studies where the use of a 360° camera could be a better choice than a project based on central perspective cameras. Basically, 360° cameras become very useful in the survey of long and narrow spaces, as well as interior areas like small rooms.
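The image-orientation step described in the abstract above rests on the spherical camera model: every pixel of an equirectangular 360° image corresponds to a viewing direction on the unit sphere. A minimal sketch of that mapping (the function name and axis convention are my own, not taken from the paper):

```python
import math

def equirect_to_ray(u, v, width, height):
    """Map an equirectangular pixel (u, v) to a unit direction vector.

    Convention (assumed): longitude spans [-pi, pi) across the image
    width, latitude spans [pi/2, -pi/2] from top to bottom; y is up.
    """
    lon = (u / width) * 2.0 * math.pi - math.pi
    lat = math.pi / 2.0 - (v / height) * math.pi
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)
```

Bundle adjustment for such images optimizes camera poses so that rays of corresponding pixels intersect at consistent 3D points.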
4

Alves, Ricardo Martins, Luís Sousa, Aldric Trindade Negrier, João M. F. Rodrigues, Jânio Monteiro, Pedro J. S. Cardoso, Paulo Felisberto, and Paulo Bica. "Interactive 360 Degree Holographic Installation." International Journal of Creative Interfaces and Computer Graphics 8, no. 1 (January 2017): 20–38. http://dx.doi.org/10.4018/ijcicg.2017010102.

Abstract:
With new marketing strategies and technologies come new demands: the standard public-relations approach or salesperson is no longer enough, and customers tend to have higher standards while companies try to capture their attention, requiring creative content and ideas. For this purpose, this article describes how an interactive holographic installation was developed, using holographic technology to attract the attention of potential clients, either by acting as a host or by showing a product that advertises the company. The installation consists of a 360-degree (8-view) holographic avatar or object and, optionally, a screen where a set of menus with videos, images, and textual content is presented. It uses several Microsoft Kinect sensors to enable tracking of users (and other persons) and natural interaction around the installation through gestures and speech, while building statistics on the visualized content. All those statistics can be analyzed on the fly by the company to gauge the success of the event.
5

Banchi, Yoshihiro, Keisuke Yoshikawa, and Takashi Kawai. "Evaluating user experience of 180 and 360 degree images." Electronic Imaging 2020, no. 2 (January 26, 2020): 244–1. http://dx.doi.org/10.2352/issn.2470-1173.2020.2.sda-244.

Abstract:
This paper describes a comparison of user experience across virtual reality (VR) image formats. The authors prepared the following four conditions and evaluated the user experience while viewing VR images with a headset by measuring subjective and objective indices: Condition 1, monoscopic 180-degree image; Condition 2, stereoscopic 180-degree image; Condition 3, monoscopic 360-degree image; Condition 4, stereoscopic 360-degree image. From the results of the subjective indices (reality, presence, and depth sensation), Condition 4 was evaluated highest, and Conditions 2 and 3 were evaluated to the same extent. In addition, from the results of the objective indices (eye and head tracking), a tendency to suppress head movement was found for 180-degree images.
6

Hussain, Abuelainin. "Interactive 360-Degree Virtual Reality into eLearning Content Design." International Journal of Innovative Technology and Exploring Engineering 10, no. 2 (December 10, 2020): 1–4. http://dx.doi.org/10.35940/ijitee.b8219.1210220.

Abstract:
The study's main aim is the techniques and methods essential to creating 2D and 3D virtual-reality images that can be displayed on multimedia devices. Its primary concern is tools such as desktops, laptops, tablets, smartphones, and other multimedia devices that display such content, communicating it through videos, images, or sound that are realistic and useful to the user. Such content can be captured from different locations in virtual, imaginary sites through the above-named electronic devices. These are beneficial e-learning instructional techniques for students, especially in higher learning [1]. Considering architectural learners, who rely on such images to develop the simple designs expected in real construction, 360-degree imaging has to be considered in e-learning for their benefit. The primary platforms through which the content can be turned into virtual reality include YouTube and Facebook, both of which can display 360-degree virtual-environment content. Through this, learners will interact with virtual reality in such setups, thus enhancing their studies.
7

Lee, Hyunchul, and Okkyung Choi. "An efficient parameter update method of 360-degree VR image model." International Journal of Engineering Business Management 11 (January 1, 2019): 184797901983599. http://dx.doi.org/10.1177/1847979019835993.

Abstract:
Recently, with rapid growth in manufacturing and user convenience, technologies utilizing virtual-reality images have been increasing. To show image quality similar to the real world, it is very important to estimate the projected direction and position of each image; this estimation is solved using the relation that transforms the sphere into the expanded equirectangular projection. The transformation relationship can be divided into camera intrinsic and extrinsic parameters, and every image has its own camera parameters. Also, if several images were taken with the same camera, their intrinsic parameters will have the same values. However, simply fixing the intrinsic parameters to the same value for all images is not the best way to match them. To solve these problems and show images without a sense of heterogeneity, one needs to build a cost function by modeling the conversion relation and calculate the camera parameters that minimize the residual. In this article, we compare and analyze efficient camera-parameter update methods. For the comparative analysis, we use Levenberg–Marquardt, a parameter-optimization algorithm using corresponding points, and propose an efficient camera-parameter update method based on the analysis results.
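The abstract above minimizes a residual over camera parameters with Levenberg–Marquardt. As an illustration of the damping/accept-reject mechanics only (a one-parameter toy model, not the paper's sphere-to-equirectangular cost function):

```python
import math

def lm_fit_decay(xs, ys, k0=0.0, iters=100):
    """Estimate k in y = exp(-k*x) with a scalar Levenberg-Marquardt loop."""
    def cost(k):
        return sum((math.exp(-k * x) - y) ** 2 for x, y in zip(xs, ys))

    k, lam, c = k0, 1e-3, cost(k0)
    for _ in range(iters):
        # residual r_i = exp(-k x_i) - y_i; Jacobian J_i = dr_i/dk = -x_i exp(-k x_i)
        jtj = sum((x * math.exp(-k * x)) ** 2 for x in xs)
        jtr = sum(-x * math.exp(-k * x) * (math.exp(-k * x) - y)
                  for x, y in zip(xs, ys))
        delta = -jtr / (jtj + lam)      # damped normal-equation step
        if cost(k + delta) < c:         # accept: move toward Gauss-Newton
            k, c, lam = k + delta, cost(k + delta), lam / 10.0
        else:                           # reject: increase damping
            lam *= 10.0
    return k

xs = [0.25 * i for i in range(1, 21)]
ys = [math.exp(-0.7 * x) for x in xs]   # synthetic data with k = 0.7
k_hat = lm_fit_decay(xs, ys)
```

In the paper's setting the scalar `k` becomes the vector of intrinsic and extrinsic parameters and the residuals come from corresponding points, but the accept/reject damping loop is the same.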
8

Jauhari, Jauhari. "SOLO-YOGYA INTO 360-DEGREE PHOTOGRAPHY." Capture : Jurnal Seni Media Rekam 13, no. 1 (December 13, 2021): 17–31. http://dx.doi.org/10.33153/capture.v13i1.3627.

Abstract:
Currently, technological developments have made it possible for photographic works to be present not only as a flat, 180-degree, two-dimensional panorama, but even to represent reality with a 360-degree perspective. This research on the creation of photographic works aims to optimize photographic equipment for shooting with a 360-degree perspective. Even though many smartphone applications offer 360-degree capture, using a DSLR camera to create works with a 360-degree perspective has the advantage that they can be printed in large sizes at high resolution without pixelation. The method of creating this work is based on an experimental process of developing DSLR camera equipment. This 360-degree photography technique uses a 'panning-sequence' approach with 'continuous exposure', which allows the images captured by the camera to be combined into one panoramic image. In addition to producing an important and interesting visual appearance, the 360-degree perspective in this work can also bring new nuances to the art of photography.
9

Tsubaki, Ikuko, and Kazuo Sasaki. "An Interrupted Projection using Seam Carving for 360-degree Images." Electronic Imaging 2018, no. 2 (January 28, 2018): 414–1. http://dx.doi.org/10.2352/issn.2470-1173.2018.2.vipc-414.

10

Banchi, Yoshihiro, and Takashi Kawai. "Evaluating user experience of different angle VR images." Electronic Imaging 2021, no. 2 (January 18, 2021): 98–1. http://dx.doi.org/10.2352/issn.2470-1173.2021.2.sda-098.

Abstract:
This paper describes a comparison of user experience across virtual reality (VR) image angles. Seven angle conditions, every 30 degrees from 180 to 360 degrees, were prepared, and the user experience while viewing VR images with a headset was evaluated by measuring subjective and objective indices. From the results of the subjective indices (reality, presence, and depth sensation), the 360-degree image was evaluated highest, and evaluations differed between 240 and 270 degrees. In addition, from the results of the objective indices (eye and head tracking), eye and head movement tended to spread as the image angle increased.
11

Larabi, Mohamed-Chaker, Audrey Girard, Sami Jaballah, and Fan Yu. "Benchmark of 2D quality metrics for the assessment of 360-deg images." Color and Imaging Conference 2019, no. 1 (October 21, 2019): 262–67. http://dx.doi.org/10.2352/issn.2169-2629.2019.27.47.

Abstract:
Omnidirectional or 360-degree images are becoming very popular in many applications, and several challenges are raised by both the nature and the representation of the data. Quality assessment is one of them, from two different points of view: objective or subjective. In this paper, we propose to study the performance of different metrics belonging to various categories, including simple mathematical metrics, human-perception-based metrics, and spherically optimized metrics. The performance of these metrics is measured using tools such as PLCC, SROCC, KROCC, and RMSE, based on the only publicly available database, from Nanjing University. The results show that the metrics considered optimized for 360-degree images do not provide the best correlation with human judgement of quality.
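The four performance tools named in the abstract (PLCC, SROCC, KROCC, RMSE) are standard statistics comparing metric scores with subjective scores. A self-contained sketch of how they are computed (the KROCC here is tau-a, without tie correction, an assumption about the variant used):

```python
import math

def plcc(a, b):
    """Pearson linear correlation coefficient."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def _ranks(a):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(a)), key=lambda i: a[i])
    ranks = [0.0] * len(a)
    i = 0
    while i < len(a):
        j = i
        while j + 1 < len(a) and a[order[j + 1]] == a[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for t in range(i, j + 1):
            ranks[order[t]] = avg
        i = j + 1
    return ranks

def srocc(a, b):
    """Spearman rank-order correlation: Pearson on the ranks."""
    return plcc(_ranks(a), _ranks(b))

def krocc(a, b):
    """Kendall tau-a: (concordant - discordant pairs) / total pairs."""
    n = len(a)
    s = sum(((a[i] > a[j]) - (a[i] < a[j])) * ((b[i] > b[j]) - (b[i] < b[j]))
            for i in range(n) for j in range(i + 1, n))
    return 2.0 * s / (n * (n - 1))

def rmse(a, b):
    """Root-mean-square error (assumes scores on a common scale)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))
```

In practice, PLCC and RMSE are usually computed after fitting a logistic mapping from objective scores to subjective scores; that regression step is omitted here.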
12

Jiang, Hao, Gangyi Jiang, Mei Yu, Yun Zhang, You Yang, Zongju Peng, Fen Chen, and Qingbo Zhang. "Cubemap-Based Perception-Driven Blind Quality Assessment for 360-degree Images." IEEE Transactions on Image Processing 30 (2021): 2364–77. http://dx.doi.org/10.1109/tip.2021.3052073.

13

Kim, Seeun, Tae Hyun Baek, and Sukki Yoon. "The effect of 360-degree rotatable product images on purchase intention." Journal of Retailing and Consumer Services 55 (July 2020): 102062. http://dx.doi.org/10.1016/j.jretconser.2020.102062.

14

Zhu, Yucheng, Guangtao Zhai, and Xiongkuo Min. "The prediction of head and eye movement for 360 degree images." Signal Processing: Image Communication 69 (November 2018): 15–25. http://dx.doi.org/10.1016/j.image.2018.05.010.

15

Fang, Yuming, Xiaoqiang Zhang, and Nevrez Imamoglu. "A novel superpixel-based saliency detection model for 360-degree images." Signal Processing: Image Communication 69 (November 2018): 1–7. http://dx.doi.org/10.1016/j.image.2018.07.009.

16

Luo, Xin, Yue Chen, Yong Huang, Xiaodi Tan, and Hideyoshi Horimai. "360 degree realistic 3D image display and image processing from real objects." Optical Review 23, no. 6 (September 1, 2016): 1010–16. http://dx.doi.org/10.1007/s10043-016-0264-0.

17

Hasler, O., B. Loesch, S. Blaser, and S. Nebiker. "CONFIGURATION AND SIMULATION TOOL FOR 360-DEGREE STEREO CAMERA RIG." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2/W13 (June 5, 2019): 793–98. http://dx.doi.org/10.5194/isprs-archives-xlii-2-w13-793-2019.

Abstract:
The demand for capturing outdoor and indoor scenes is rising with the digitalization trend in the construction industry. An efficient solution for capturing these environments is mobile mapping. Image-based systems with 360° panoramic coverage allow rapid data acquisition and can be made user-friendly and accessible when hosted in a cloud-based 3D geoinformation service. The design of such a 360° stereo camera system is challenging, since multiple parameters like focal length, stereo base length, and environmental restrictions such as narrow corridors influence each other. Therefore, this paper presents a toolset which helps to configure and evaluate such a panoramic stereo camera rig. The first tool determines from which distance on 360° stereo coverage is achieved, depending on the parametrization of the rig. The second tool can be used to capture images with the parametrized camera rig in different virtual indoor and outdoor scenes. The last tool supports stitching the captured images together with respect to the intrinsic and extrinsic parameters from the configuration tool. This toolset radically simplifies the evaluation process of a 360° stereo camera configuration and decreases the number of physical MMS prototypes.
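As a toy version of what the first configuration tool computes, the distance at which stereo coverage begins can be derived for a simplified pair of parallel pinhole cameras (this geometry is my own simplification, not the paper's full rig model):

```python
import math

def stereo_overlap_distance(baseline_m, hfov_deg):
    """Distance (m) beyond which two parallel cameras with the given
    baseline and horizontal field of view see a common (stereo) region.

    Simplified pinhole model: camera A at x=0, camera B at x=baseline,
    both facing +z; their frusta overlap once z * tan(hfov/2) >= baseline/2.
    """
    half = math.radians(hfov_deg) / 2.0
    return baseline_m / (2.0 * math.tan(half))
```

A real 360° rig repeats such a check per adjacent camera pair around the ring; wider lenses or shorter baselines pull the full-coverage distance closer.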
18

Gafar, Ilham Afan, Zaenul Arif, and Syefudin. "Systematic Literature Review: Virtual Tour 360 Degree Panorama." International Journal of Engineering Business and Social Science 1, no. 01 (October 1, 2022): 01–10. http://dx.doi.org/10.58451/ijebss.v1i01.1.

Abstract:
The application of VR as a medium to promote a place, be it a tourist attraction or an educational, tourism, or health facility, has become common in the current era. The 360-degree panoramic virtual tour is itself one method of producing virtual reality: a collection of 360-degree images processed so that they can be enjoyed virtually as if they were real. The goal of this paper is to analyze the 360-degree panoramic virtual tour as a medium for promoting a place, by conducting in-depth reviews, evaluating searches through literature selected on defined criteria, and processing the selected studies to answer the research questions. A Systematic Literature Review (SLR) is a research method that aims to identify and evaluate research results with the best techniques, based on specific procedures and comparison of results. The selected journal topics (tourism, education, and health) can serve as the main reference on the 360-degree panoramic virtual tour, and various methods can be used, including MDLC (Luther-Sutopo), the Image Method, IMSDD, and qualitative approaches.
19

Cheng, Yih-Shyang, Zheng-Feng Chen, and Chih-Hung Chen. "Virtual-image generation in 360-degree viewable image-plane disk-type multiplex holography." Optics Express 21, no. 8 (April 19, 2013): 10301. http://dx.doi.org/10.1364/oe.21.010301.

20

Banchi, Yoshihiro, Keisuke Yoshikawa, and Takashi Kawai. "Effects of binocular parallax in 360-degree VR images on viewing behavior." Electronic Imaging 2019, no. 3 (January 13, 2019): 648–1. http://dx.doi.org/10.2352/issn.2470-1173.2019.3.sda-648.

21

Zhu, Yucheng, Guangtao Zhai, Xiongkuo Min, and Jiantao Zhou. "Learning a Deep Agent to Predict Head Movement in 360-Degree Images." ACM Transactions on Multimedia Computing, Communications, and Applications 16, no. 4 (December 16, 2020): 1–23. http://dx.doi.org/10.1145/3410455.

22

Delforouzi, Ahmad, Seyed Amir Hossein Tabatabaei, Kimiaki Shirahama, and Marcin Grzegorzek. "A polar model for fast object tracking in 360-degree camera images." Multimedia Tools and Applications 78, no. 7 (August 20, 2018): 9275–97. http://dx.doi.org/10.1007/s11042-018-6525-0.

23

KILINCI, Elif. "AN ANALYSIS OF VIRTUAL MUSEUM WHICH PROVIDES 360 DEGREE PANORAMIC IMAGE IN TERMS OF GRAPHICH DESIGN." TURKISH ONLINE JOURNAL OF DESIGN, ART AND COMMUNICATION 5, no. 4 (October 1, 2015): 57–65. http://dx.doi.org/10.7456/10504100/005.

24

See, Zi Siang, Lizbeth Goodman, Craig Hight, Mohd Shahrizal Sunar, Arindam Dey, Yen Kaow Ng, and Mark Billinghurst. "Creating high fidelity 360° virtual reality with high dynamic range spherical panorama images." Virtual Creativity 9, no. 1 (December 1, 2019): 73–109. http://dx.doi.org/10.1386/vcr_00006_1.

Abstract:
This research explores the development of a novel method and apparatus for creating spherical panoramas enhanced with high dynamic range (HDR) for high-fidelity Virtual Reality 360-degree (VR360) user experiences. A VR360 interactive panorama presentation using spherical panoramas can provide virtual interactivity and wider viewing coverage; with three degrees of freedom, users can look around in multiple directions within the VR360 experience, gaining the sense of being in control of their own engagement. This freedom is facilitated by the use of mobile displays or head-mounted devices. In terms of image reproduction, however, the exposure range can be a major difficulty in reproducing a high-contrast real-world scene. Imaging variables caused by difficulties and obstacles can occur during the production of an HDR spherical panorama, which may result in inaccurate image reproduction for location-based subjects and, in turn, a poor VR360 user experience. In this article we describe an HDR spherical-panorama reproduction approach (workflow and best practice) which can shorten the production process and reduce imaging variables, technical obstacles, and issues to a minimum. This leads to improved photographic image reproduction with fewer visual abnormalities for VR360 experiences, adaptable to a wide range of interactive design applications. We describe the process in detail and also report on a user study showing that the proposed approach creates images which viewers prefer, on the whole, to those created using more complicated HDR methods, or without HDR at all.
25

Ullah, Faiz, Oh-Jin Kwon, and Seungcheol Choi. "Generation of a Panorama Compatible with the JPEG 360 International Standard Using a Single PTZ Camera." Applied Sciences 11, no. 22 (November 21, 2021): 11019. http://dx.doi.org/10.3390/app112211019.

Abstract:
Recently, the JPEG working group (ISO/IEC JTC1 SC29 WG1) developed an international standard, JPEG 360, that specifies the metadata and functionalities for saving and sharing 360-degree images efficiently to create a more realistic environment in various virtual reality services. We surveyed the metadata formats of existing 360-degree images and compared them to the JPEG 360 metadata format. We found that existing omnidirectional cameras and stitching software packages use formats that are incompatible with the JPEG 360 standard to embed metadata in JPEG image files. This paper proposes an easy-to-use tool for embedding JPEG 360 standard metadata for 360-degree images in JPEG image files using a JPEG-defined box format: the JPEG universal metadata box format. The proposed implementation will help 360-degree cameras and software vendors provide immersive services to users in a standardized manner for various markets, such as entertainment, education, professional training, navigation, and virtual and augmented reality applications. We also propose and develop an economical JPEG 360 standard compatible panoramic image acquisition system from a single PTZ camera with a special-use case of a wide field of view image of a conference or meeting. A remote attendee of the conference/meeting can see the realistic and immersive environment through our PTZ panorama in virtual reality.
26

Ha, Van Kha Ly, Rifai Chai, and Hung T. Nguyen. "A Telepresence Wheelchair with 360-Degree Vision Using WebRTC." Applied Sciences 10, no. 1 (January 3, 2020): 369. http://dx.doi.org/10.3390/app10010369.

Abstract:
This paper presents an innovative approach to develop an advanced 360-degree vision telepresence wheelchair for healthcare applications. The study aims at improving a wide field of view surrounding the wheelchair to provide safe wheelchair navigation and efficient assistance for wheelchair users. A dual-fisheye camera is mounted in front of the wheelchair to capture images which can be then streamed over the Internet. A web real-time communication (WebRTC) protocol was implemented to provide efficient video and data streaming. An estimation model based on artificial neural networks was developed to evaluate the quality of experience (QoE) of video streaming. Experimental results confirmed that the proposed telepresence wheelchair system was able to stream a 360-degree video surrounding the wheelchair smoothly in real-time. The average streaming rate of the entire 360-degree video was 25.83 frames per second (fps), and the average peak signal to noise ratio (PSNR) was 29.06 dB. Simulation results of the proposed QoE estimation scheme provided a prediction accuracy of 94%. Furthermore, the results showed that the designed system could be controlled remotely via the wireless Internet to follow the desired path with high accuracy. The overall results demonstrate the effectiveness of our proposed approach for the 360-degree vision telepresence wheelchair for assistive technology applications.
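The 29.06 dB reported in the abstract is a peak signal-to-noise ratio, computed from the mean squared error between reference and streamed frames. A minimal sketch (flat pixel sequences stand in for images):

```python
import math

def psnr(reference, distorted, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two equally sized pixel
    sequences; returns infinity for identical inputs (zero error)."""
    mse = sum((r - d) ** 2 for r, d in zip(reference, distorted)) / len(reference)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(max_value ** 2 / mse)
```

For 8-bit video, values around 30 dB (as reported here) are generally considered acceptable streaming quality.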
27

Woo Han, Seo, and Doug Young Suh. "A 360-degree Panoramic Image Inpainting Network Using a Cube Map." Computers, Materials & Continua 66, no. 1 (2020): 213–28. http://dx.doi.org/10.32604/cmc.2020.012223.

28

TSUKADA, Shota, Yusuke HASEGAWA, Yoshihiro BANCHI, Hiroyuki MORIKAWA, and Takashi KAWAI. "1F4-2 Analysis of user behavior during viewing 360-degree virtual reality images." Japanese journal of ergonomics 52, Supplement (2016): S240—S241. http://dx.doi.org/10.5100/jje.52.s240.

29

BANCHI, Yoshihiro, Keisuke YOSHIKAWA, and Takashi KAWAI. "1A4-2 Behavioral Analysis while viewing 360-degree images using a HMD (1)." Japanese Journal of Ergonomics 54, Supplement (June 2, 2018): 1A4–2–1A4–2. http://dx.doi.org/10.5100/jje.54.1a4-2.

30

YOSHIKAWA, Keisuke, Yoshihiro BANCHI, and Takashi KAWAI. "1A4-3 Behavioral analysis while viewing 360-degree images using a HMD (2)." Japanese Journal of Ergonomics 54, Supplement (June 2, 2018): 1A4–3–1A4–3. http://dx.doi.org/10.5100/jje.54.1a4-3.

31

Madu, Chisom T., Taylor Phelps, Joel S. Schuman, Ronald Zambrano, Ting-Fang Lee, Joseph Panarelli, Lama Al-Aswad, and Gadi Wollstein. "Automated 360-degree goniophotography with the NIDEK Gonioscope GS-1 for glaucoma." PLOS ONE 18, no. 3 (March 7, 2023): e0270941. http://dx.doi.org/10.1371/journal.pone.0270941.

Abstract:
This study was registered with ClinicalTrials.gov (ID: NCT03715231). A total of 20 participants (37 eyes) who were 18 or older and had glaucoma or were glaucoma suspects were enrolled from the NYU Langone Eye Center and Bellevue Hospital. During their usual ophthalmology visit, they consented to the study and underwent 360-degree goniophotography using the NIDEK Gonioscope GS-1. Afterwards, three ophthalmologists separately examined the images obtained and determined the status of the iridocorneal angle in four quadrants using the Shaffer grading system. Physicians were masked to patient names and diagnoses. Inter-observer reproducibility was determined using Fleiss' kappa statistics and was significant among the three glaucoma specialists, with fair overall agreement (Fleiss' kappa: 0.266, p < .0001) in the interpretation of 360-degree goniophotographs. Images from automated 360-degree goniophotography using the NIDEK Gonioscope GS-1 are of sufficient quality to be interpreted similarly by independent expert observers. This indicates that angle investigation may be performed with this automated device, supporting the use of this technique to document and assess the anterior chamber angle in patients with, or suspected of, glaucoma and iridocorneal angle abnormalities.
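The Fleiss' kappa statistic used above for inter-observer reproducibility can be computed directly from a subjects-by-categories table of rating counts. A self-contained sketch (not the study's analysis code):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa from an N-subjects x k-categories table of rating
    counts; every row must sum to the same number of raters n."""
    n_subjects = len(counts)
    n_raters = sum(counts[0])
    total = n_subjects * n_raters
    k = len(counts[0])
    # marginal proportion of ratings falling in each category
    p = [sum(row[j] for row in counts) / total for j in range(k)]
    # per-subject pairwise agreement
    per_subject = [(sum(c * c for c in row) - n_raters)
                   / (n_raters * (n_raters - 1)) for row in counts]
    p_bar = sum(per_subject) / n_subjects      # observed agreement
    p_e = sum(x * x for x in p)                # chance agreement
    return (p_bar - p_e) / (1.0 - p_e)
```

On the common Landis-Koch scale, the study's value of 0.266 falls in the "fair agreement" band (0.21 to 0.40), as the abstract states.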
32

Qiu, Miaomiao, and Feng Shao. "Blind 360-degree image quality assessment via saliency-guided convolution neural network." Optik 240 (August 2021): 166858. http://dx.doi.org/10.1016/j.ijleo.2021.166858.

33

Sun, Wei, Xiongkuo Min, Guangtao Zhai, Ke Gu, Huiyu Duan, and Siwei Ma. "MC360IQA: A Multi-channel CNN for Blind 360-Degree Image Quality Assessment." IEEE Journal of Selected Topics in Signal Processing 14, no. 1 (January 2020): 64–77. http://dx.doi.org/10.1109/jstsp.2019.2955024.

34

YAMAGUCHI, Mai, and Takayoshi TOGINO. "The effect of a presence viewers position during viewing 360-degree image." Proceedings of the Annual Convention of the Japanese Psychological Association 75 (September 15, 2011): 3PM128. http://dx.doi.org/10.4992/pacjpa.75.0_3pm128.

35

Makiguchi, Motohiro, Hideaki Takada, Tohru Kawakami, and Mutsumi Sasai. "33‐2: Improving Image Quality of 360‐degree Tabletop 3D Screen System." SID Symposium Digest of Technical Papers 51, no. 1 (August 2020): 470–73. http://dx.doi.org/10.1002/sdtp.13907.

36

Janiszewski, Mateusz, Masoud Torkan, Lauri Uotinen, and Mikael Rinne. "Rapid Photogrammetry with a 360-Degree Camera for Tunnel Mapping." Remote Sensing 14, no. 21 (October 31, 2022): 5494. http://dx.doi.org/10.3390/rs14215494.

Abstract:
Structure-from-Motion Multi-View Stereo (SfM-MVS) photogrammetry is a viable method to digitize underground spaces for inspection, documentation, or remote mapping. However, the conventional image acquisition process can be laborious and time-consuming. Previous studies confirmed that the acquisition time can be reduced when using a 360-degree camera to capture the images. This paper demonstrates a method for rapid photogrammetric reconstruction of tunnels using a 360-degree camera. The method is demonstrated in a field test executed in a tunnel section of the Underground Research Laboratory of Aalto University in Espoo, Finland. A 10 m-long tunnel section with exposed rock was photographed using the 360-degree camera from 27 locations and a 3D model was reconstructed using SfM-MVS photogrammetry. The resulting model was then compared with a reference laser scan and a more conventional digital single-lens reflex (DSLR) camera-based model. Image acquisition with a 360-degree camera was 3x faster than with a conventional DSLR camera and the workflow was easier and less prone to errors. The 360-degree camera-based model achieved a 0.0046 m distance accuracy error compared to the reference laser scan. In addition, the orientation of discontinuities was measured remotely from the 3D model and the digitally obtained values matched the manual compass measurements of the sub-vertical fracture sets, with an average error of 2–5°.
37

Zhu, Yucheng, Guangtao Zhai, Xiongkuo Min, and Jiantao Zhou. "The Prediction of Saliency Map for Head and Eye Movements in 360 Degree Images." IEEE Transactions on Multimedia 22, no. 9 (September 2020): 2331–44. http://dx.doi.org/10.1109/tmm.2019.2957986.

38

Ling, Jing, Kao Zhang, Yingxue Zhang, Daiqin Yang, and Zhenzhong Chen. "A saliency prediction model on 360 degree images using color dictionary based sparse representation." Signal Processing: Image Communication 69 (November 2018): 60–68. http://dx.doi.org/10.1016/j.image.2018.03.007.

39

Hara, Takayuki, Yusuke Mukuta, and Tatsuya Harada. "Spherical Image Generation from a Single Image by Considering Scene Symmetry." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 2 (May 18, 2021): 1513–21. http://dx.doi.org/10.1609/aaai.v35i2.16242.

Abstract:
Spherical images taken in all directions (360 degrees by 180 degrees) allow the full surroundings of a subject to be represented, providing an immersive experience to viewers. Generating a spherical image from a single normal-field-of-view (NFOV) image is convenient and expands the usage scenarios considerably without relying on a specific panoramic camera or images taken from multiple directions; however, achieving such images remains a challenging and unresolved problem. The primary challenge is controlling the high degree of freedom involved in generating a wide area that includes all directions of the desired spherical image. We focus on scene symmetry, which is a basic property of the global structure of spherical images, such as rotational symmetry, plane symmetry, and asymmetry. We propose a method for generating a spherical image from a single NFOV image and controlling the degree of freedom of the generated regions using the scene symmetry. To estimate and control the scene symmetry using both a circular shift and flip of the latent image features, we incorporate the intensity of the symmetry as a latent variable into conditional variational autoencoders. Our experiments show that the proposed method can generate various plausible spherical images controlled from symmetric to asymmetric, and can reduce the reconstruction errors of the generated images based on the estimated symmetry.
40

Tai, Kuan-Chen, and Chih-Wei Tang. "Siamese Networks-Based People Tracking Using Template Update for 360-Degree Videos Using EAC Format." Sensors 21, no. 5 (March 1, 2021): 1682. http://dx.doi.org/10.3390/s21051682.

Abstract:
Rich information is provided by 360-degree videos. However, non-uniform geometric deformation caused by sphere-to-plane projection significantly decreases tracking accuracy of existing trackers, and the huge amount of data makes it difficult to achieve real-time tracking. Thus, this paper proposes a Siamese networks-based people tracker using template update for 360-degree equi-angular cubemap (EAC) format videos. Face stitching overcomes the problem of content discontinuity of the EAC format and avoids raising new geometric deformation in stitched images. Fully convolutional Siamese networks enable tracking at high speed. Mostly important, to be robust against combination of non-uniform geometric deformation of the EAC format and partial occlusions caused by zero padding in stitched images, this paper proposes a novel Bayes classifier-based timing detector of template update by referring to the linear discriminant feature and statistics of a score map generated by Siamese networks. Experimental results show that the proposed scheme significantly improves tracking accuracy of the fully convolutional Siamese networks SiamFC on the EAC format with operation beyond the frame acquisition rate. Moreover, the proposed score map-based timing detector of template update outperforms state-of-the-art score map-based timing detectors.
41

Kawakami, Tohru, Munekazu Date, Mutsumi Sasai, and Hideaki Takada. "360-degree screen-free floating 3D image in a crystal ball using a spatially imaged iris and rotational multiview DFD technologies." Applied Optics 56, no. 22 (July 25, 2017): 6156. http://dx.doi.org/10.1364/ao.56.006156.

42

An, So Rin, and Young Hoon Jo. "Building Virtual Reality Contents of Excavated Remains Based on 360-degree Panorama Image." Science and Engineering of Cultural Heritage 15, no. 1 (December 25, 2020): 93–99. http://dx.doi.org/10.37563/sech.15.1.10.

43

IMAI, Arisa, Takao KATO, Kazuhiro KANEKO, Mai YAMAGUCHI, and Shunichi ISHIHARA. "Examination of effect of undoing of natural environment in 360-degree panorama image." Proceedings of the Annual Convention of the Japanese Psychological Association 74 (September 20, 2010): 1EV120. http://dx.doi.org/10.4992/pacjpa.74.0_1ev120.

44

Cheng, Yih-Shyang, Yuan-Tien Su, and Chih-Hung Chen. "360-degree viewable image-plane disk-type multiplex holography by one-step recording." Optics Express 18, no. 13 (June 15, 2010): 14012. http://dx.doi.org/10.1364/oe.18.014012.

45

Kim, Hakdong, Heonyeong Lim, Minkyu Jee, Yurim Lee, MinSung Yoon, and Cheongwon Kim. "High-Precision Depth Map Estimation from Missing Viewpoints for 360-Degree Digital Holography." Applied Sciences 12, no. 19 (September 20, 2022): 9432. http://dx.doi.org/10.3390/app12199432.

Abstract:
In this paper, we propose a novel model to extract highly precise depth maps from missing viewpoints, especially for generating holographic 3D content. These depth maps are essential elements for phase extraction, which is required for the synthesis of computer-generated holograms (CGHs). The proposed model, called the holographic dense depth, estimates depth maps through feature extraction, combining up-sampling. We designed and prepared a total of 9832 multi-view images with resolutions of 640 × 360. We evaluated our model by comparing the estimated depth maps with their ground truths using various metrics. We further compared the CGH patterns created from estimated depth maps with those from ground truths and reconstructed the holographic 3D image scenes from their CGHs. Both quantitative and qualitative results demonstrate the effectiveness of the proposed method.
46

Jin, Xun, and Jongweon Kim. "Artwork Identification for 360-Degree Panoramic Images Using Polyhedron-Based Rectilinear Projection and Keypoint Shapes." Applied Sciences 7, no. 5 (May 19, 2017): 528. http://dx.doi.org/10.3390/app7050528.

47

Gao, Jiahao, Zhiwen Hu, Kaigui Bian, Xinyu Mao, and Lingyang Song. "AQ360: UAV-Aided Air Quality Monitoring by 360-Degree Aerial Panoramic Images in Urban Areas." IEEE Internet of Things Journal 8, no. 1 (January 1, 2021): 428–42. http://dx.doi.org/10.1109/jiot.2020.3004582.

48

Shimura, Masayasu, Shingo Yoshida, Kosuke Osawa, Yuki Minamoto, Takeshi Yokomori, Kaoru Iwamoto, Mamoru Tanahashi, and Hidenori Kosaka. "Micro particle image velocimetry investigation of near-wall behaviors of tumble enhanced flow in an internal combustion engine." International Journal of Engine Research 20, no. 7 (May 28, 2018): 718–25. http://dx.doi.org/10.1177/1468087418774710.

Abstract:
A micro particle image velocimetry has been performed to investigate tumble enhanced flow characteristics near piston top surface of a motored internal combustion engine for three inlet valve open timing (−30, −15, 0 crank angle degrees). Particle image velocimetry was conducted at 340, 350 and 360 crank angle degrees of the end of the compression stroke at the constant motored speed of 2000 r/min. The measurement region was 3.2 mm × 1.5 mm on the piston top including central axis of the cylinder. The spatial resolution of particle image velocimetry in the wall-normal direction was 75 µm and the vector spacing was 37.5 µm. The first velocity vector is located about 60 µm from the piston top surface. The micro particle image velocimetry measurements revealed that the ensemble-averaged flow near the piston top is not close to the turbulent boundary layer and rather has tendency of the Blasius theorem, whereas fluctuation root-mean-square velocity near the wall is not low. This result shows that revision of a wall heat transfer model based on an assumption of the proper characteristics of flow field near the piston top is required for more accurate prediction of heat flux in gasoline engines.
49

McCracken, D. Jay, Raymond A. Higginbotham, Jason H. Boulter, Yuan Liu, John A. Wells, Sameer H. Halani, Amit M. Saindane, Nelson M. Oyesiku, Daniel L. Barrow, and Jeffrey J. Olson. "Degree of Vascular Encasement in Sphenoid Wing Meningiomas Predicts Postoperative Ischemic Complications." Neurosurgery 80, no. 6 (January 28, 2017): 957–66. http://dx.doi.org/10.1093/neuros/nyw134.

Abstract:
Abstract BACKGROUND: Sphenoid wing meningiomas (SWMs) can encase arteries of the circle of Willis, increasing their susceptibility to intraoperative vascular injury and severe ischemic complications. OBJECTIVE: To demonstrate the effect of circumferential vascular encasement in SWM on postoperative ischemia. METHODS: A retrospective review of 75 patients surgically treated for SWM from 2009 to 2015 was undertaken to determine the degree of circumferential vascular encasement (0°-360°) as assessed by preoperative magnetic resonance imaging (MRI). A novel grading system describing "maximum" and "total" arterial encasement scores was created. Postoperative MRIs were reviewed for total ischemia volume measured on sequential diffusion-weighted images. RESULTS: Of the 75 patients, 89.3% had some degree of vascular involvement with a median maximum encasement score of 3.0 (2.0-3.0) in the internal carotid artery (ICA), M1, M2, and A1 segments; 76% of patients had some degree of ischemia with median infarct volume of 3.75 cm3 (0.81-9.3 cm3). Univariate analysis determined risk factors associated with larger infarction volume, which were encasement of the supraclinoid ICA (P < .001), M1 segment (P < .001), A1 segment (P = .015), and diabetes (P = .019). As the maximum encasement score increased from 1 to 5 in each of the significant arterial segments, so did mean and median infarction volume (P < .001). Risk for devastating ischemic injury >62 cm3 was found when the ICA, M1, and A1 vessels all had ≥360° involvement (P = .001). Residual tumor was associated with smaller infarct volumes (P = .022). As infarction volume increased, so did modified Rankin Score at discharge (P = .025). CONCLUSION: Subtotal resection should be considered in SWM with significant vascular encasement of proximal arteries to limit postoperative ischemic complications.
50

Momonoi, Yoshiharu, Koya Yamamoto, Yoshihiro Yokote, Atsushi Sato, and Yasuhiro Takaki. "Systematic Approach for Alignment of Light Field Mirage." Applied Sciences 12, no. 23 (December 4, 2022): 12413. http://dx.doi.org/10.3390/app122312413.

Abstract:
We previously proposed techniques to eliminate repeated three-dimensional (3D) images produced by the light field Mirage, which consists of circularly aligned multiple-slanted light field displays. However, we only constructed the lower half of the system to verify the proposed elimination techniques. In this study, we developed an alignment technique for a complete 360-degree display system. The alignment techniques for conventional 360-degree display systems, which use a large number of projectors, greatly depend on electronic calibration, which indispensably causes image quality degradation. We propose a systematic approach for the alignment for the light field Mirage, which causes less image quality degradation by taking advantage of the small number of display devices required for the light field Mirage. The calibration technique for light field displays, the image stitching technique, and the brightness matching technique are consecutively performed, and the generation of 360-degree 3D images is verified.