Scientific literature on the topic "360 VIDEO STREAMING"
Journal articles on the topic "360 VIDEO STREAMING"
Bradis, Nikolai Valerievich. "360 Video Protection and Streaming." International Journal of Advanced Trends in Computer Science and Engineering 8, no. 6 (December 15, 2019): 3289–96. http://dx.doi.org/10.30534/ijatcse/2019/99862019.
Jeong, JongBeom, Dongmin Jang, Jangwoo Son, and Eun-Seok Ryu. "3DoF+ 360 Video Location-Based Asymmetric Down-Sampling for View Synthesis to Immersive VR Video Streaming." Sensors 18, no. 9 (September 18, 2018): 3148. http://dx.doi.org/10.3390/s18093148.
Nguyen, Anh, and Zhisheng Yan. "Enhancing 360 Video Streaming through Salient Content in Head-Mounted Displays." Sensors 23, no. 8 (April 15, 2023): 4016. http://dx.doi.org/10.3390/s23084016.
Fan, Ching-Ling, Wen-Chih Lo, Yu-Tung Pai, and Cheng-Hsin Hsu. "A Survey on 360° Video Streaming." ACM Computing Surveys 52, no. 4 (September 18, 2019): 1–36. http://dx.doi.org/10.1145/3329119.
Wong, En Sing, Nur Haliza Abdul Wahab, Faisal Saeed, and Nouf Alharbi. "360-Degree Video Bandwidth Reduction: Technique and Approaches Comprehensive Review." Applied Sciences 12, no. 15 (July 28, 2022): 7581. http://dx.doi.org/10.3390/app12157581.
Garcia, Henrique D., Mylène C. Q. Farias, Ravi Prakash, and Marcelo M. Carvalho. "Statistical characterization of tile decoding time of HEVC-encoded 360° video." Electronic Imaging 2020, no. 9 (January 26, 2020): 285–1. http://dx.doi.org/10.2352/issn.2470-1173.2020.9.iqsp-285.
Nguyen, Dien, Tuan Le, Sangsoon Lee, and Eun-Seok Ryu. "SHVC Tile-Based 360-Degree Video Streaming for Mobile VR: PC Offloading Over mmWave." Sensors 18, no. 11 (November 1, 2018): 3728. http://dx.doi.org/10.3390/s18113728.
Chen, Xiaolei, Di Wu, and Ishfaq Ahmad. "Optimized viewport-adaptive 360-degree video streaming." CAAI Transactions on Intelligence Technology 6, no. 3 (March 3, 2021): 347–59. http://dx.doi.org/10.1049/cit2.12011.
Podborski, Dimitri, Emmanuel Thomas, Miska M. Hannuksela, Sejin Oh, Thomas Stockhammer, and Stefan Pham. "360-Degree Video Streaming with MPEG-DASH." SMPTE Motion Imaging Journal 127, no. 7 (August 2018): 20–27. http://dx.doi.org/10.5594/jmi.2018.2838779.
Peng, Shuai, Jialu Hu, Han Xiao, Shujie Yang, and Changqiao Xu. "Viewport-Driven Adaptive 360° Live Streaming Optimization Framework." Journal of Networking and Network Applications 1, no. 4 (January 2022): 139–49. http://dx.doi.org/10.33969/j-nana.2021.010401.
Texte intégralThèses sur le sujet "360 VIDEO STREAMING"
Kattadige, Chamara Manoj Madarasinghe. "Network and Content Intelligence for 360 Degree Video Streaming Optimization." Thesis, The University of Sydney, 2023. https://hdl.handle.net/2123/29904.
Corbillon, Xavier. "Enable the next generation of interactive video streaming." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2018. http://www.theses.fr/2018IMTA0103/document.
Texte intégralOmnidirectional videos, also denoted as spherical videos or 360° videos, are videos with pixels recorded from a given viewpoint in every direction of space. A user watching such an omnidirectional content with a Head Mounted Display (HMD) can select the portion of the videoto display, usually denoted as viewport, by moving her head. To feel high immersion inside the content a user needs to see viewport with 4K resolutionand 90 Hz frame rate. With traditional streaming technologies, providing such quality would require a data rate of more than 100 Mbit s−1, which is far too high compared to the median Internet access band width. In this dissertation, I present my contributions to enable the streaming of highly immersive omnidirectional videos on the Internet. We can distinguish six contributions : a viewport-adaptive streaming architecture proposal reusing a part of existing technologies ; an extension of this architecture for videos with six degrees of freedom ; two theoretical studies of videos with non homogeneous spatial quality ; an open-source software for handling 360° videos ; and a dataset of recorded users’ trajectories while watching 360° videos
Almquist, Mathias, and Viktor Almquist. "Analysis of 360° Video Viewing Behaviour." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-144405.
Almquist, Mathias, and Viktor Almquist. "Analysis of 360° Video Viewing Behaviours." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-144907.
Lindskog, Eric. "Developing an emulator for 360° video: intended for algorithm development." Thesis, Linköpings universitet, Databas och informationsteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-171369.
Mittal, Ashutosh. "Novel Approach to Optimize Bandwidth Consumption for Video Streaming using Eye Tracking." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-212061.
Texte intégralNya framsteg inom ögonstyrningsteknologi har möjliggjort att betrakta ögonstyrning (o.k.s. eyetracking) som ett billigt, pålitligt och effektivt tillägg till teknologier för människa-dator interaktion. Det här examensarbetet utforskar möjligheten att använda ögonstyrning för klientmedveten videoströmning. Allt fler personer förbrukar videoinnehåll av hög kvalitet genom trådlösa nätverk, därmed finns det ett behov av att optimera bandbreddskonsumtionen för effektiv leverans av ett sådant högkvalitativt innehåll, både för 2Doch 360°-videor.Det här arbetet introducerar SEEN (Smart Eye-tracking Enabled Networking), en ny approach för att strömma videoinnehåll, som bygger på realtidsinformation från ögonstyrning. Den använder HEVC-metoder för rutindelning av video för att visa högkvalitativt och lågkvalitativt innehåll i samma videoram, beroende på vart användaren tittar. Lönsamheten av den föreslagna approachen validerades med hjälp av omfattande användartester utförda på en testbädd för upplevelsekvalité (Quality of Experience, QoE) som också utvecklades som en del av det här examensarbetet. Testresultaten visar betydande bandbreddsbesparingar på upp till 71% för 2D-videor på vanliga 4K-skärmar samt upp till 83% för 360°-videor på VR-headset för acceptabla QoE-betyg. En komparativ studie om viewport tracking och ögonstyrning i VR-headset är också inkluderad i det här examensarbetet för att ytterligare förespråka behovet av ögonstyrning.Denna forskning genomfördes i samarbete med Ericsson, Tobii och KTH under paraplyprojektet SEEN: Smart Eye-tracking Enabled Networking.
Timoncini, Riccardo. "Streaming audio e video nei sistemi Peer-To-Peer TV: il caso Sopcast P2PTV." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amslaurea.unibo.it/3670/.
Yang, Cheng-Yu, and 楊正宇. "Visual attention guided 360-degree video streaming." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/c4yada.
Texte intégral國立中正大學
電機工程研究所
107
In recent years, with the development of multimedia video, smartphones and Virtual Reality (VR) headsets have become commonplace, and the video content we watch every day is developing in increasingly varied forms. For example, 360-degree video is popular: many YouTubers and Facebook users upload 360-degree videos of their travels and broadcast live events. To offer an immersive experience, storage and transmission bandwidth must be taken into account, and the huge data volume of 360-degree videos makes efficient transmission and storage a challenge. Over a limited-bandwidth network, 360-degree video playback suffers from problems such as frozen frames or poor quality in the demanded viewport, which degrades the quality of user experience. Efficient compression and low-latency transmission of 360-degree images and videos are therefore important. Based on human visual characteristics, this work proposes techniques for 360-degree image coding and 360-degree video streaming. In the proposed image coding technique, the saliency map is used to modify the distortion during the RDO (rate-distortion optimization) process, while in the proposed video coding technique it is used to predict the ROI (region of interest). The experimental results show that the proposed image coding technique achieves up to 14.71% bitrate reduction. For 360-degree video streaming, this work allocates more resources to the ROIs during rate control to ensure high quality in the viewport demanded by the user. Considering the variance of network bandwidth, MPEG-DASH is adopted and the proposed 360-degree video streaming technique is implemented on top of it. Both subjective and objective experiments indicate the superiority of the proposed technique over the anchor scheme.
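The rate-control idea in this abstract, giving saliency-favoured regions a larger share of the bit budget, can be sketched as a simple proportional allocation across tiles. This is an illustrative sketch under assumptions made here (a per-tile bitrate floor, normalized saliency weights), not the thesis's RDO-level implementation:

```python
def allocate_tile_bitrates(saliency, total_kbps, floor_kbps=100.0):
    """Split a bitrate budget across tiles: every tile gets a quality floor,
    and the remaining budget is shared in proportion to tile saliency."""
    n = len(saliency)
    remainder = total_kbps - n * floor_kbps
    if remainder < 0:
        raise ValueError("budget too small for the per-tile floor")
    total_sal = sum(saliency) or 1.0   # avoid division by zero
    return [floor_kbps + remainder * s / total_sal for s in saliency]

# Four tiles; the second one dominates the saliency map.
rates = allocate_tile_bitrates([0.1, 0.6, 0.2, 0.1], total_kbps=5000)
```

The floor term matters: without it, a tile with near-zero saliency would be starved entirely, which hurts badly when the viewer's head turns toward it.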
PURWAR, ABHINAV. "FOV PREDICTION FOR 360 VIDEO STREAMING IN VIRTUAL REALITY." Thesis, 2019. http://dspace.dtu.ac.in:8080/jspui/handle/repository/16427.
Lo, Wen-Chih, and 羅文志. "Edge-Assisted 360-degree Video Streaming for Head-Mounted Virtual Reality." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/849u53.
Texte intégral國立清華大學
資訊工程學系所
106
Over the past years, 360° video streaming has become increasingly popular. Watching these videos with Head-Mounted Displays (HMDs), also known as VR headsets, gives a more immersive experience than using traditional planar monitors. However, several open challenges keep state-of-the-art technology short of a fully immersive viewing experience, including high bandwidth consumption, long turnaround latency, and heterogeneous HMD devices. In this thesis, we propose an edge-assisted 360° video streaming system that leverages edge networks to perform viewport rendering. We formulate an optimization problem to determine which HMD clients should be served without overloading the edge devices. We design an algorithm to solve this problem and implement a real testbed as a proof of concept. The resulting edge-assisted 360° video streaming system is evaluated through extensive experiments with an open-source 360° viewing dataset. With the assistance of edge devices, we reduce the bandwidth usage and computation workload on HMD devices when serving viewers, while guaranteeing lower network latency. The results show that, compared to current 360° video streaming platforms such as YouTube, our edge-assisted rendering platform can: (i) save up to 62% in bandwidth consumption, (ii) achieve higher viewing video quality at a given bitrate, and (iii) reduce the computation workload for lightweight HMDs. Our proposed system and the viewing dataset are open source and can be leveraged by researchers and engineers to further improve 360° video streaming.
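The client-selection problem this abstract formulates, serving as many HMD clients as possible from the edge without exceeding its rendering capacity, resembles a knapsack problem. One common way to approximate such a selection is a greedy heuristic that ranks clients by benefit per unit of edge load; the sketch below is illustrative only (the client records, cost/saving units, and the greedy rule are assumptions made here, not the thesis's algorithm):

```python
def select_clients(clients, capacity):
    """Greedy knapsack-style sketch: admit clients with the best
    bandwidth-saving per unit of edge rendering load until the edge is full."""
    chosen, load = [], 0.0
    ranked = sorted(clients, key=lambda c: c["saving"] / c["cost"], reverse=True)
    for c in ranked:
        if load + c["cost"] <= capacity:
            chosen.append(c["id"])
            load += c["cost"]
    return chosen

clients = [
    {"id": "hmd1", "cost": 3.0, "saving": 9.0},  # cost: edge render load
    {"id": "hmd2", "cost": 2.0, "saving": 4.0},  # saving: Mbit/s avoided
    {"id": "hmd3", "cost": 4.0, "saving": 6.0},
]
served = select_clients(clients, capacity=5.0)  # -> ["hmd1", "hmd2"]
```

Clients not admitted would fall back to conventional client-side rendering, so the heuristic degrades gracefully rather than denying service.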
Book chapters on the topic "360 VIDEO STREAMING"
Curcio, Igor D. D., Dmitrii Monakhov, Ari Hourunranta, and Emre Baris Aksu. "Tile Priorities in Adaptive 360-Degree Video Streaming." In Lecture Notes in Computer Science, 212–23. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-54407-2_18.
Sekine, Arisa, and Masaki Bandai. "Tile Quality Selection Method in 360-Degree Tile-Based Video Streaming." In Advances in Intelligent Systems and Computing, 535–44. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-44038-1_49.
Zou, Wenjie, and Fuzheng Yang. "Measuring Quality of Experience of Novel 360-Degree Streaming Video During Stalling." In Communications and Networking, 417–24. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-78130-3_43.
Li, Yaru, Li Yu, Chunyu Lin, Yao Zhao, and Moncef Gabbouj. "Convolutional Neural Network Based Inter-Frame Enhancement for 360-Degree Video Streaming." In Advances in Multimedia Information Processing – PCM 2018, 57–66. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-00767-6_6.
Li, Yunqiao, Yiling Xu, Shaowei Xie, Liangji Ma, and Jun Sun. "Two-Layer FoV Prediction Model for Viewport Dependent Streaming of 360-Degree Videos." In Communications and Networking, 501–9. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-06161-6_49.
Skondras, Emmanouil, Konstantina Siountri, Angelos Michalas, and Dimitrios D. Vergados. "Personalized Real-Time Virtual Tours in Places With Cultural Interest." In Destination Management and Marketing, 802–20. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-2469-5.ch044.
Rowe, Neil C. "Critical Issues in Content Repurposing for Small Devices." In Encyclopedia of Multimedia Technology and Networking, Second Edition, 293–98. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-60566-014-1.ch040.
Texte intégralActes de conférences sur le sujet "360 VIDEO STREAMING"
Lu, Yiyun, Yifei Zhu, and Zhi Wang. "Personalized 360-Degree Video Streaming." In MM '22: The 30th ACM International Conference on Multimedia. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3503161.3548047.
Son, Jangwoo, Dongmin Jang, and Eun-Seok Ryu. "Implementing 360 video tiled streaming system." In MMSys '18: 9th ACM Multimedia Systems Conference. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3204949.3208119.
Chen, Xianda, Tianxiang Tan, and Guohong Cao. "Popularity-Aware 360-Degree Video Streaming." In IEEE INFOCOM 2021 - IEEE Conference on Computer Communications. IEEE, 2021. http://dx.doi.org/10.1109/infocom42981.2021.9488856.
Liu, Xing, Qingyang Xiao, Vijay Gopalakrishnan, Bo Han, Feng Qian, and Matteo Varvello. "360° Innovations for Panoramic Video Streaming." In HotNets-XVI: The 16th ACM Workshop on Hot Topics in Networks. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3152434.3152443.
Nguyen, Duc V., Hoang Van Trung, Hoang Le Dieu Huong, Truong Thu Huong, Nam Pham Ngoc, and Truong Cong Thang. "Scalable 360 Video Streaming using HTTP/2." In 2019 IEEE 21st International Workshop on Multimedia Signal Processing (MMSP). IEEE, 2019. http://dx.doi.org/10.1109/mmsp.2019.8901805.
Silva, Rodrigo M. A., Bruno Feijó, Pablo B. Gomes, Thiago Frensh, and Daniel Monteiro. "Real time 360° video stitching and streaming." In SIGGRAPH '16: Special Interest Group on Computer Graphics and Interactive Techniques Conference. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2945078.2945148.
Chopra, Lovish, Sarthak Chakraborty, Abhijit Mondal, and Sandip Chakraborty. "PARIMA: Viewport Adaptive 360-Degree Video Streaming." In WWW '21: The Web Conference 2021. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3442381.3450070.
Seo, Bong-Seok, Eunyoung Jeong, ChangJong Hyun, Dongho You, and Dong Ho Kim. "360-Degree Video Streaming Using Stitching Information." In 2019 IEEE International Conference on Consumer Electronics (ICCE). IEEE, 2019. http://dx.doi.org/10.1109/icce.2019.8661926.
Nasrabadi, Afshin Taghavi, Anahita Mahzari, Joseph D. Beshay, and Ravi Prakash. "Adaptive 360-degree video streaming using layered video coding." In 2017 IEEE Virtual Reality (VR). IEEE, 2017. http://dx.doi.org/10.1109/vr.2017.7892319.
Nasrabadi, Afshin Taghavi, Anahita Mahzari, Joseph D. Beshay, and Ravi Prakash. "Adaptive 360-Degree Video Streaming using Scalable Video Coding." In MM '17: ACM Multimedia Conference. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3123266.3123414.