Academic literature on the topic '360 VIDEO STREAMING'
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic '360 VIDEO STREAMING.'
Journal articles on the topic "360 VIDEO STREAMING"
Bradis, Nikolai Valerievich. "360 Video Protection and Streaming." International Journal of Advanced Trends in Computer Science and Engineering 8, no. 6 (December 15, 2019): 3289–96. http://dx.doi.org/10.30534/ijatcse/2019/99862019.
Jeong, JongBeom, Dongmin Jang, Jangwoo Son, and Eun-Seok Ryu. "3DoF+ 360 Video Location-Based Asymmetric Down-Sampling for View Synthesis to Immersive VR Video Streaming." Sensors 18, no. 9 (September 18, 2018): 3148. http://dx.doi.org/10.3390/s18093148.
Nguyen, Anh, and Zhisheng Yan. "Enhancing 360 Video Streaming through Salient Content in Head-Mounted Displays." Sensors 23, no. 8 (April 15, 2023): 4016. http://dx.doi.org/10.3390/s23084016.
Fan, Ching-Ling, Wen-Chih Lo, Yu-Tung Pai, and Cheng-Hsin Hsu. "A Survey on 360° Video Streaming." ACM Computing Surveys 52, no. 4 (September 18, 2019): 1–36. http://dx.doi.org/10.1145/3329119.
Wong, En Sing, Nur Haliza Abdul Wahab, Faisal Saeed, and Nouf Alharbi. "360-Degree Video Bandwidth Reduction: Technique and Approaches Comprehensive Review." Applied Sciences 12, no. 15 (July 28, 2022): 7581. http://dx.doi.org/10.3390/app12157581.
Garcia, Henrique D., Mylène C. Q. Farias, Ravi Prakash, and Marcelo M. Carvalho. "Statistical characterization of tile decoding time of HEVC-encoded 360° video." Electronic Imaging 2020, no. 9 (January 26, 2020): 285–1. http://dx.doi.org/10.2352/issn.2470-1173.2020.9.iqsp-285.
Nguyen, Dien, Tuan Le, Sangsoon Lee, and Eun-Seok Ryu. "SHVC Tile-Based 360-Degree Video Streaming for Mobile VR: PC Offloading Over mmWave." Sensors 18, no. 11 (November 1, 2018): 3728. http://dx.doi.org/10.3390/s18113728.
Chen, Xiaolei, Di Wu, and Ishfaq Ahmad. "Optimized viewport-adaptive 360-degree video streaming." CAAI Transactions on Intelligence Technology 6, no. 3 (March 3, 2021): 347–59. http://dx.doi.org/10.1049/cit2.12011.
Podborski, Dimitri, Emmanuel Thomas, Miska M. Hannuksela, Sejin Oh, Thomas Stockhammer, and Stefan Pham. "360-Degree Video Streaming with MPEG-DASH." SMPTE Motion Imaging Journal 127, no. 7 (August 2018): 20–27. http://dx.doi.org/10.5594/jmi.2018.2838779.
Peng, Shuai, Jialu Hu, Han Xiao, Shujie Yang, and Changqiao Xu. "Viewport-Driven Adaptive 360° Live Streaming Optimization Framework." Journal of Networking and Network Applications 1, no. 4 (January 2022): 139–49. http://dx.doi.org/10.33969/j-nana.2021.010401.
Full textDissertations / Theses on the topic "360 VIDEO STREAMING"
Kattadige, Chamara Manoj Madarasinghe. "Network and Content Intelligence for 360 Degree Video Streaming Optimization." Thesis, The University of Sydney, 2023. https://hdl.handle.net/2123/29904.
Corbillon, Xavier. "Enable the next generation of interactive video streaming." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2018. http://www.theses.fr/2018IMTA0103/document.
Full textOmnidirectional videos, also denoted as spherical videos or 360° videos, are videos with pixels recorded from a given viewpoint in every direction of space. A user watching such an omnidirectional content with a Head Mounted Display (HMD) can select the portion of the videoto display, usually denoted as viewport, by moving her head. To feel high immersion inside the content a user needs to see viewport with 4K resolutionand 90 Hz frame rate. With traditional streaming technologies, providing such quality would require a data rate of more than 100 Mbit s−1, which is far too high compared to the median Internet access band width. In this dissertation, I present my contributions to enable the streaming of highly immersive omnidirectional videos on the Internet. We can distinguish six contributions : a viewport-adaptive streaming architecture proposal reusing a part of existing technologies ; an extension of this architecture for videos with six degrees of freedom ; two theoretical studies of videos with non homogeneous spatial quality ; an open-source software for handling 360° videos ; and a dataset of recorded users’ trajectories while watching 360° videos
Almquist, Mathias, and Viktor Almquist. "Analysis of 360° Video Viewing Behaviour." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-144405.
Almquist, Mathias, and Viktor Almquist. "Analysis of 360° Video Viewing Behaviours." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-144907.
Lindskog, Eric. "Developing an emulator for 360° video: intended for algorithm development." Thesis, Linköpings universitet, Databas och informationsteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-171369.
Mittal, Ashutosh. "Novel Approach to Optimize Bandwidth Consumption for Video Streaming using Eye Tracking." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-212061.
Full textNya framsteg inom ögonstyrningsteknologi har möjliggjort att betrakta ögonstyrning (o.k.s. eyetracking) som ett billigt, pålitligt och effektivt tillägg till teknologier för människa-dator interaktion. Det här examensarbetet utforskar möjligheten att använda ögonstyrning för klientmedveten videoströmning. Allt fler personer förbrukar videoinnehåll av hög kvalitet genom trådlösa nätverk, därmed finns det ett behov av att optimera bandbreddskonsumtionen för effektiv leverans av ett sådant högkvalitativt innehåll, både för 2Doch 360°-videor.Det här arbetet introducerar SEEN (Smart Eye-tracking Enabled Networking), en ny approach för att strömma videoinnehåll, som bygger på realtidsinformation från ögonstyrning. Den använder HEVC-metoder för rutindelning av video för att visa högkvalitativt och lågkvalitativt innehåll i samma videoram, beroende på vart användaren tittar. Lönsamheten av den föreslagna approachen validerades med hjälp av omfattande användartester utförda på en testbädd för upplevelsekvalité (Quality of Experience, QoE) som också utvecklades som en del av det här examensarbetet. Testresultaten visar betydande bandbreddsbesparingar på upp till 71% för 2D-videor på vanliga 4K-skärmar samt upp till 83% för 360°-videor på VR-headset för acceptabla QoE-betyg. En komparativ studie om viewport tracking och ögonstyrning i VR-headset är också inkluderad i det här examensarbetet för att ytterligare förespråka behovet av ögonstyrning.Denna forskning genomfördes i samarbete med Ericsson, Tobii och KTH under paraplyprojektet SEEN: Smart Eye-tracking Enabled Networking.
Timoncini, Riccardo. "Streaming audio e video nei sistemi Peer-To-Peer TV: il caso Sopcast P2PTV." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amslaurea.unibo.it/3670/.
Full textYang, Cheng-Yu, and 楊正宇. "Visual attention guided 360-degree video streaming." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/c4yada.
National Chung Cheng University,
Graduate Institute of Electrical Engineering, academic year 107.
In recent years, with the development of multimedia video, smartphones and Virtual Reality (VR) headsets have become ubiquitous, and the video content we watch every day is gradually diversifying. For example, 360-degree video is popular, and many YouTube and Facebook users upload 360-degree videos to report their travels and broadcast live events. To offer an immersive experience, storage and transmission bandwidth must be taken into account: the huge data volume of 360-degree videos makes efficient transmission and storage a challenge. In a bandwidth-limited network, the playback of 360-degree video suffers problems such as frozen frames or poor quality in the demanded viewport, which degrade the quality of the user experience. Therefore, efficient compression and low-latency transmission of 360-degree images and videos are important. Based on human visual characteristics, this work proposes techniques for 360-degree image coding and 360-degree video streaming. For the proposed image coding technique, a saliency map is used to modify the distortion term during rate-distortion optimization (RDO), while for the proposed streaming technique it is used to predict the region of interest (ROI). The experimental results show a bitrate reduction of up to 14.71% for the proposed image coding technique. For 360-degree video streaming, this work allocates more resources to the ROIs during rate control to ensure that the viewport demanded by the user is delivered at high quality. Considering the variation of network bandwidth, MPEG-DASH is adopted and the proposed 360-degree video streaming technique is implemented on top of it. Both subjective and objective experiments indicate the superiority of the proposed technique over the anchor scheme.
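To make the saliency-guided RDO idea in this abstract concrete, the sketch below weights the distortion term of the rate-distortion cost by a per-block saliency value, so salient blocks favor lower-distortion coding modes. The weighting function w(s) = 1 + alpha*s and the example numbers are assumptions for illustration; the thesis's actual distortion modification may differ.

```python
# Hedged sketch of saliency-weighted rate-distortion optimization:
# J = w(s) * D + lambda * R, with w(s) = 1 + alpha * s and s in [0, 1].
def rd_cost(distortion: float, rate: float, lam: float,
            saliency: float, alpha: float = 1.0) -> float:
    """Rate-distortion cost with a saliency-scaled distortion term."""
    weight = 1.0 + alpha * saliency
    return weight * distortion + lam * rate

def best_mode(candidates, lam: float, saliency: float):
    """candidates: iterable of (mode_name, distortion, rate_bits) tuples."""
    return min(candidates,
               key=lambda c: rd_cost(c[1], c[2], lam, saliency))

# Example: in a highly salient block the lower-distortion (but costlier) mode
# wins, while in a non-salient block the cheaper mode wins.
modes = [("skip", 600.0, 8.0), ("intra", 300.0, 60.0)]
print(best_mode(modes, lam=10.0, saliency=0.9))   # -> ('intra', 300.0, 60.0)
print(best_mode(modes, lam=10.0, saliency=0.0))   # -> ('skip', 600.0, 8.0)
```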
Purwar, Abhinav. "FoV Prediction for 360 Video Streaming in Virtual Reality." Thesis, 2019. http://dspace.dtu.ac.in:8080/jspui/handle/repository/16427.
Lo, Wen-Chih (羅文志). "Edge-Assisted 360-degree Video Streaming for Head-Mounted Virtual Reality." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/849u53.
National Tsing Hua University,
Department of Computer Science, academic year 106.
Over the past years, 360° video streaming has become increasingly popular. Watching these videos with Head-Mounted Displays (HMDs), also known as VR headsets, gives a more immersive experience than using traditional planar monitors. However, several open challenges keep state-of-the-art technology away from a truly immersive viewing experience, including high bandwidth consumption, long turnaround latency, and heterogeneous HMD devices. In this thesis, we propose an edge-assisted 360° video streaming system, which leverages edge networks to perform viewport rendering. We formulate an optimization problem to determine which HMD clients should be served without overloading the edge devices, design an algorithm to solve it, and implement a real testbed as a proof of concept. The resulting edge-assisted 360° video streaming system is evaluated through extensive experiments with an open-source 360° viewing dataset. With the assistance of edge devices, we reduce the bandwidth usage and computation workload on HMD devices when serving viewers, while guaranteeing lower network latency. The results show that, compared to current 360° video streaming platforms such as YouTube, our edge-assisted rendering platform can: (i) save up to 62% in bandwidth consumption, (ii) achieve higher viewing video quality at a given bitrate, and (iii) reduce the computation workload for lightweight HMDs. Our proposed system and the viewing dataset are open-sourced and can be leveraged by researchers and engineers to further improve 360° video streaming.
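The abstract describes deciding which HMD clients the edge should serve without overloading the edge devices. A minimal greedy sketch of that admission decision is shown below, ranking clients by bandwidth saving per unit of rendering load; the cost model, the greedy rule, and all names are assumptions, since the thesis solves this with its own optimization formulation and algorithm.

```python
# Illustrative greedy admission sketch: the edge renders viewports for as
# many HMD clients as its capacity allows, preferring clients that save the
# most bandwidth per unit of rendering load. Not the thesis's actual solver.
def admit_clients(clients, edge_capacity: float):
    """clients: list of dicts with 'id', 'render_load', 'bandwidth_saving'.

    Returns the set of client ids served by the edge; the rest fall back to
    rendering the viewport on the HMD itself.
    """
    ranked = sorted(clients,
                    key=lambda c: c["bandwidth_saving"] / c["render_load"],
                    reverse=True)
    served, used = set(), 0.0
    for c in ranked:
        if used + c["render_load"] <= edge_capacity:
            served.add(c["id"])
            used += c["render_load"]
    return served

# Example with made-up loads and savings.
clients = [
    {"id": "hmd-1", "render_load": 2.0, "bandwidth_saving": 30.0},
    {"id": "hmd-2", "render_load": 1.0, "bandwidth_saving": 20.0},
    {"id": "hmd-3", "render_load": 3.0, "bandwidth_saving": 25.0},
]
print(admit_clients(clients, edge_capacity=3.5))  # serves hmd-1 and hmd-2
```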
Book chapters on the topic "360 VIDEO STREAMING"
Curcio, Igor D. D., Dmitrii Monakhov, Ari Hourunranta, and Emre Baris Aksu. "Tile Priorities in Adaptive 360-Degree Video Streaming." In Lecture Notes in Computer Science, 212–23. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-54407-2_18.
Sekine, Arisa, and Masaki Bandai. "Tile Quality Selection Method in 360-Degree Tile-Based Video Streaming." In Advances in Intelligent Systems and Computing, 535–44. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-44038-1_49.
Zou, Wenjie, and Fuzheng Yang. "Measuring Quality of Experience of Novel 360-Degree Streaming Video During Stalling." In Communications and Networking, 417–24. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-78130-3_43.
Li, Yaru, Li Yu, Chunyu Lin, Yao Zhao, and Moncef Gabbouj. "Convolutional Neural Network Based Inter-Frame Enhancement for 360-Degree Video Streaming." In Advances in Multimedia Information Processing – PCM 2018, 57–66. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-00767-6_6.
Li, Yunqiao, Yiling Xu, Shaowei Xie, Liangji Ma, and Jun Sun. "Two-Layer FoV Prediction Model for Viewport Dependent Streaming of 360-Degree Videos." In Communications and Networking, 501–9. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-06161-6_49.
Skondras, Emmanouil, Konstantina Siountri, Angelos Michalas, and Dimitrios D. Vergados. "Personalized Real-Time Virtual Tours in Places With Cultural Interest." In Destination Management and Marketing, 802–20. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-2469-5.ch044.
Rowe, Neil C. "Critical Issues in Content Repurposing for Small Devices." In Encyclopedia of Multimedia Technology and Networking, Second Edition, 293–98. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-60566-014-1.ch040.
Conference papers on the topic "360 VIDEO STREAMING"
Lu, Yiyun, Yifei Zhu, and Zhi Wang. "Personalized 360-Degree Video Streaming." In MM '22: The 30th ACM International Conference on Multimedia. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3503161.3548047.
Son, Jangwoo, Dongmin Jang, and Eun-Seok Ryu. "Implementing 360 video tiled streaming system." In MMSys '18: 9th ACM Multimedia Systems Conference. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3204949.3208119.
Chen, Xianda, Tianxiang Tan, and Guohong Cao. "Popularity-Aware 360-Degree Video Streaming." In IEEE INFOCOM 2021 - IEEE Conference on Computer Communications. IEEE, 2021. http://dx.doi.org/10.1109/infocom42981.2021.9488856.
Liu, Xing, Qingyang Xiao, Vijay Gopalakrishnan, Bo Han, Feng Qian, and Matteo Varvello. "360° Innovations for Panoramic Video Streaming." In HotNets-XVI: The 16th ACM Workshop on Hot Topics in Networks. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3152434.3152443.
Nguyen, Duc V., Hoang Van Trung, Hoang Le Dieu Huong, Truong Thu Huong, Nam Pham Ngoc, and Truong Cong Thang. "Scalable 360 Video Streaming using HTTP/2." In 2019 IEEE 21st International Workshop on Multimedia Signal Processing (MMSP). IEEE, 2019. http://dx.doi.org/10.1109/mmsp.2019.8901805.
Silva, Rodrigo M. A., Bruno Feijó, Pablo B. Gomes, Thiago Frensh, and Daniel Monteiro. "Real time 360° video stitching and streaming." In SIGGRAPH '16: Special Interest Group on Computer Graphics and Interactive Techniques Conference. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2945078.2945148.
Chopra, Lovish, Sarthak Chakraborty, Abhijit Mondal, and Sandip Chakraborty. "PARIMA: Viewport Adaptive 360-Degree Video Streaming." In WWW '21: The Web Conference 2021. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3442381.3450070.
Seo, Bong-Seok, Eunyoung Jeong, ChangJong Hyun, Dongho You, and Dong Ho Kim. "360-Degree Video Streaming Using Stitching Information." In 2019 IEEE International Conference on Consumer Electronics (ICCE). IEEE, 2019. http://dx.doi.org/10.1109/icce.2019.8661926.
Nasrabadi, Afshin Taghavi, Anahita Mahzari, Joseph D. Beshay, and Ravi Prakash. "Adaptive 360-degree video streaming using layered video coding." In 2017 IEEE Virtual Reality (VR). IEEE, 2017. http://dx.doi.org/10.1109/vr.2017.7892319.
Nasrabadi, Afshin Taghavi, Anahita Mahzari, Joseph D. Beshay, and Ravi Prakash. "Adaptive 360-Degree Video Streaming using Scalable Video Coding." In MM '17: ACM Multimedia Conference. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3123266.3123414.