Journal articles on the topic "Physically based rendering (PBR)"

To see other types of publications on this topic, follow the link: Physically based rendering (PBR).

Cite a source in APA, MLA, Chicago, Harvard, and other styles

Browse the top 50 journal articles for your research on the topic "Physically based rendering (PBR)".

Next to each entry in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style of your choice: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and compile your bibliography correctly.

1

Sato, Rion, and Michael Cohen. "Raytracing Render Switcher with Embree." SHS Web of Conferences 102 (2021): 04015. http://dx.doi.org/10.1051/shsconf/202110204015.

Abstract:
We introduce a way of implementing physically-based renderers that can switch rendering methods using a ray-tracing library. Various physically based rendering (PBR) methods can generate beautiful images that are close to how humans view the real world. However, to verify that an implementation correctly obeys the mathematical models of PBR, corresponding pairs of pixels in images generated by different rendering methods must be compared. For such a comparison, the resulting images must show the same scene at the same resolution from the same camera angle. We first explain the fundamental theory of PBR and then present an overview of Embree, a ray-tracing library developed by Intel, as the basis for a rendering-switchable implementation. Finally, we demonstrate result images computed by a renderer we developed, which can switch rendering methods and can be extended with implementations of other methods.
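The verification step this abstract describes, a pixel-wise comparison of two renders of the same scene, can be sketched in a few lines. This is only an illustrative check, not the authors' implementation; the nested-list image format is an assumption made for the example.

```python
import math

def compare_renders(img_a, img_b):
    """Root-mean-square error between two renders of the same scene.

    Images are nested [row][pixel][channel] lists. As the abstract
    requires, both renders must share scene, resolution, and camera
    angle; here that reduces to having the same number of values.
    """
    flat_a = [c for row in img_a for px in row for c in px]
    flat_b = [c for row in img_b for px in row for c in px]
    if len(flat_a) != len(flat_b):
        raise ValueError("renders must have identical resolution")
    sq_err = sum((x - y) ** 2 for x, y in zip(flat_a, flat_b))
    return math.sqrt(sq_err / len(flat_a))

# Identical renders give zero error; a uniform offset of 1 gives RMSE 1.
black = [[[0.0, 0.0, 0.0]] * 2] * 2
grey = [[[1.0, 1.0, 1.0]] * 2] * 2
print(compare_renders(black, black), compare_renders(black, grey))  # 0.0 1.0
```

In practice such a comparison would run over high-dynamic-range pixel buffers produced by the two rendering methods being verified.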
2

Dai, Peng, Zhuwen Li, Yinda Zhang, Shuaicheng Liu, and Bing Zeng. "PBR-Net: Imitating Physically Based Rendering Using Deep Neural Network." IEEE Transactions on Image Processing 29 (2020): 5980–92. http://dx.doi.org/10.1109/tip.2020.2987169.

3

Radosavljević, Ljupka. "Primena proceduralnih mapa u vizuelizaciji enterijera." Zbornik radova Fakulteta tehničkih nauka u Novom Sadu 36, no. 04 (April 7, 2021): 791–94. http://dx.doi.org/10.24867/12fa23radosavljevic.

4

Farella, Elisa Mariarosaria, Luca Morelli, Simone Rigon, Eleonora Grilli, and Fabio Remondino. "Analysing Key Steps of the Photogrammetric Pipeline for Museum Artefacts 3D Digitisation." Sustainability 14, no. 9 (May 9, 2022): 5740. http://dx.doi.org/10.3390/su14095740.

Abstract:
In recent years, massive digitisation of cultural heritage (CH) assets has become a focus of European programmes and initiatives. Among CH settings, attention is reserved for the immense and precious museum collections, whose digital 3D reproduction can support broader non-invasive analyses and stimulate the realisation of more attractive and interactive exhibitions. The reconstruction pipeline typically includes numerous processing steps when passive techniques are selected to deal with object digitisation. This article presents some insights on critical operations which, based on our experience, can determine the quality of the final models and the reconstruction times for delivering 3D heritage results, while boosting the sustainability of digital cultural content. The depth of field (DoF) problem is explored in the acquisition phase when surveying medium and small-sized objects. Techniques for deblurring images and masking object backgrounds are examined relative to the pre-processing stage. Some point cloud denoising and mesh simplification procedures are analysed in data post-processing. Hints on physically-based rendering (PBR) materials are also presented as closing operations of the reconstruction pipeline. This paper explores these processes mainly through experiments, providing a practical guide, tricks, and suggestions when tackling museum digitisation projects.
5

Huang, Cheng-Guo, Tsung-Shian Huang, Wen-Chieh Lin, and Jung-Hong Chuang. "Physically based cosmetic rendering." Computer Animation and Virtual Worlds 24, no. 3-4 (May 2013): 275–83. http://dx.doi.org/10.1002/cav.1523.

6

West, Rex. "Physically-based feature line rendering." ACM Transactions on Graphics 40, no. 6 (December 2021): 1–11. http://dx.doi.org/10.1145/3478513.3480550.

Abstract:
Feature lines visualize the shape and structure of 3D objects, and are an essential component of many non-photorealistic rendering styles. Existing feature line rendering methods, however, are only able to render feature lines in limited contexts, such as on immediately visible surfaces or in specular reflections. We present a novel, path-based method for feature line rendering that allows for the accurate rendering of feature lines in the presence of complex physical phenomena such as glossy reflection, depth-of-field, and dispersion. Our key insight is that feature lines can be modeled as view-dependent light sources. These light sources can be sampled as a part of ordinary paths, and seamlessly integrate into existing physically-based rendering methods. We illustrate the effectiveness of our method in several real-world rendering scenarios with a variety of different physical phenomena.
7

Ulbricht, Christiane, Alexander Wilkie, and Werner Purgathofer. "Verification of Physically Based Rendering Algorithms." Computer Graphics Forum 25, no. 2 (June 2006): 237–55. http://dx.doi.org/10.1111/j.1467-8659.2006.00938.x.

8

Simons, G., S. Herholz, V. Petitjean, T. Rapp, M. Ament, H. Lensch, C. Dachsbacher, M. Eisemann, and E. Eisemann. "Applying Visual Analytics to Physically Based Rendering." Computer Graphics Forum 38, no. 1 (July 4, 2018): 197–208. http://dx.doi.org/10.1111/cgf.13452.

9

Hullin, Matthias, Elmar Eisemann, Hans-Peter Seidel, and Sungkil Lee. "Physically-based real-time lens flare rendering." ACM Transactions on Graphics 30, no. 4 (July 1, 2011): 1. http://dx.doi.org/10.1145/2010324.1965003.

10

Lafortune, Eric P., and Yves D. Willems. "A Theoretical Framework for Physically Based Rendering." Computer Graphics Forum 13, no. 2 (May 1994): 97–107. http://dx.doi.org/10.1111/1467-8659.1320097.

11

Liu, Xincheng, Yi Chen, Haitong Zhang, Yuhong Zou, Zhangye Wang, and Qunsheng Peng. "Physically based modeling and rendering of avalanches." Visual Computer 37, no. 9-11 (July 13, 2021): 2619–29. http://dx.doi.org/10.1007/s00371-021-02215-1.

12

Lee, Young-Hun. "A Study on Physically-Based Lighting and Rendering." Cartoon and Animation Studies 55 (June 30, 2019): 345–63. http://dx.doi.org/10.7230/koscas.2019.55.345.

13

Iehl, J. C., and B. Péroche. "Towards perceptual control of physically based spectral rendering." Computers & Graphics 27, no. 5 (October 2003): 747–62. http://dx.doi.org/10.1016/s0097-8493(03)00148-1.

14

Aguerre, José Pedro, Elena García‐Nevado, Jairo Acuña Paz y Miño, Eduardo Fernández, and Benoit Beckers. "Physically Based Simulation and Rendering of Urban Thermography." Computer Graphics Forum 39, no. 6 (June 28, 2020): 377–91. http://dx.doi.org/10.1111/cgf.14044.

15

Schlick, Christophe. "An Inexpensive BRDF Model for Physically-based Rendering." Computer Graphics Forum 13, no. 3 (August 1994): 233–46. http://dx.doi.org/10.1111/1467-8659.1330233.

16

Bosch, Carles, Xavier Pueyo, Stephane Merillou, and Djamchid Ghazanfarpour. "A Physically-Based Model for Rendering Realistic Scratches." Computer Graphics Forum 23, no. 3 (September 2004): 361–70. http://dx.doi.org/10.1111/j.1467-8659.2004.00767.x.

17

Amador, Gonçalo N. P., and Abel J. P. Gomes. "A Simple Physically-Based 3D Liquids Surface Tracking Algorithm." International Journal of Creative Interfaces and Computer Graphics 2, no. 2 (July 2011): 37–48. http://dx.doi.org/10.4018/ijcicg.2011070103.

Abstract:
Navier-Stokes-based methods have been used in computer graphics to simulate liquids, especially water. These physically based methods are computationally intensive, and require rendering the water surface at each step of the simulation process. Rendering the water surface requires knowing which 3D grid cells are crossed by the water’s surface; that is, tracking the surface across the cells is necessary. Solutions to the water surface tracking and rendering problems exist in the literature, but they are either too computationally intensive to be appropriate for real-time scenarios, as is the case of deformable implicit surfaces and ray tracing, or too application-specific, as is the case of height fields used to simulate and render water mantles (e.g., lakes and oceans). This paper proposes a novel solution to water surface tracking that does not compromise the overall simulation performance. This approach differs from previous solutions in that it directly classifies and annotates the density of each 3D grid cell as either water, air, or water-air (i.e., water surface), opening the opportunity for easily reconstructing the water surface at an interactive frame rate.
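The cell classification this abstract describes can be illustrated with a small sketch. This is a 2D toy version under an assumed data structure (a boolean occupancy grid), not the authors' code; the paper performs the equivalent classification on a 3D grid.

```python
def classify_cells(is_water):
    """Label each cell of a 2D occupancy grid as 'water', 'air', or
    'surface'.

    A water cell with at least one air neighbour (4-connected) lies on
    the water-air interface and is labelled 'surface'; out-of-bounds
    neighbours are treated as interior.
    """
    rows, cols = len(is_water), len(is_water[0])
    labels = [["air"] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            if not is_water[i][j]:
                continue  # air cells keep their default label
            neighbours = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
            at_interface = any(
                0 <= a < rows and 0 <= b < cols and not is_water[a][b]
                for a, b in neighbours
            )
            labels[i][j] = "surface" if at_interface else "water"
    return labels

# Air above a block of water: the top water row becomes the surface.
grid = [[False, False, False],
        [True, True, True],
        [True, True, True]]
print(classify_cells(grid)[1][1], classify_cells(grid)[2][1])  # surface water
```

Once cells are labelled this way, only the 'surface' cells need to be visited to reconstruct the water surface each frame, which is what keeps the tracking cost low.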
18

ARAKI, Fumiaki. "Visualization of Super-Droplet Data with Physically-Based Rendering." Journal of the Visualization Society of Japan 28-1, no. 1 (2008): 439. http://dx.doi.org/10.3154/jvs.28.439.

19

Medeiros, Esdras, Harish Doraiswamy, Matthew Berger, and Claudio T. Silva. "Using physically Based Rendering to Benchmark Structured Light Scanners." Computer Graphics Forum 33, no. 7 (October 2014): 71–80. http://dx.doi.org/10.1111/cgf.12475.

20

Chin, Seongah. "Recent Technical Trends and Issues of Physically-based Rendering." Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology 6, no. 9 (September 30, 2016): 49–58. http://dx.doi.org/10.14257/ajmahs.2016.09.23.

21

Rokne, Jon G., Guangwu Xu, and Gladimir V. G. Baranoski. "Virtual spectrophotometric measurements for biologically and physically based rendering." Visual Computer 17, no. 8 (November 1, 2001): 506–18. http://dx.doi.org/10.1007/s003710100127.

22

Lee, Hee-Sub, and Mi You. "High Quality Realtime Rendering Utilizing Physically Based Renderer, Eevee." Korean Journal of animation 15, no. 3 (September 30, 2019): 139–56. http://dx.doi.org/10.51467/asko.2019.09.15.3.139.

23

Kim, Seongdong. "PBR(Physically based Render) simulation considered mathematical Fresnel model for Game Improvement." Journal of Korea Game Society 16, no. 1 (February 28, 2016): 111–18. http://dx.doi.org/10.7583/jkgs.2016.16.1.111.

24

Reischl, M., E. Derzapf, and M. Guthe. "Physically Based Real‐Time Rendering of Teeth and Partial Restorations." Computer Graphics Forum 39, no. 1 (May 15, 2019): 106–16. http://dx.doi.org/10.1111/cgf.13665.

25

Chermain, X., B. Sauvage, J. ‐M Dischler, and C. Dachsbacher. "Procedural Physically based BRDF for Real‐Time Rendering of Glints." Computer Graphics Forum 39, no. 7 (October 2020): 243–53. http://dx.doi.org/10.1111/cgf.14141.

26

Avanzini, Federico, and Paolo Crosato. "Integrating physically based sound models in a multimodal rendering architecture." Computer Animation and Virtual Worlds 17, no. 3-4 (2006): 411–19. http://dx.doi.org/10.1002/cav.144.

27

Wang, Yifan, Weiran Li, and Qing Zhu. "Ink Wash Painting Style Rendering With Physically-based Ink Dispersion Model." Journal of Physics: Conference Series 1004 (April 2018): 012026. http://dx.doi.org/10.1088/1742-6596/1004/1/012026.

28

Igouchkine, Oleg, Yubo Zhang, and Kwan-Liu Ma. "Multi-Material Volume Rendering with a Physically-Based Surface Reflection Model." IEEE Transactions on Visualization and Computer Graphics 24, no. 12 (December 1, 2018): 3147–59. http://dx.doi.org/10.1109/tvcg.2017.2784830.

29

Huang, Zhanpeng, Guanghong Gong, and Liang Han. "Physically-based modeling, simulation and rendering of fire for computer animation." Multimedia Tools and Applications 71, no. 3 (November 13, 2012): 1283–309. http://dx.doi.org/10.1007/s11042-012-1273-z.

30

Gao, Yanze, Xin Wang, Yanyan Li, Lang Zhou, Qingfeng Shi, and Zhuo Li. "Modeling method of a ladar scene projector based on physically based rendering technology." Applied Optics 57, no. 28 (September 26, 2018): 8303. http://dx.doi.org/10.1364/ao.57.008303.

31

Yoo, Sangwook, Cheongho Lee, and Seongah Chin. "Physically Based Soap Bubble Synthesis for VR." Applied Sciences 11, no. 7 (March 31, 2021): 3090. http://dx.doi.org/10.3390/app11073090.

Abstract:
To experience a real soap bubble show, materials and tools are required, as are skilled performers who produce the show. In a virtual space, however, where spatial and temporal constraints do not exist, bubble art can be performed without real materials and tools while still giving a sense of immersion. The realistic expression of soap bubbles is therefore an interesting topic for virtual reality (VR). However, the current rendering of VR soap bubbles does not satisfy users' high expectations. In this study, we therefore propose a physically based approach that reproduces the shape of the bubble by calculating the measured parameters required for bubble modeling and the physical motion of bubbles. In addition, we applied the measured change in the flow of the soap bubble's surface to the VR rendering. To improve users' VR experience, we propose that they experience the bubble show in a VR HMD (head-mounted display) environment.
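For context on the optics behind this entry: a soap bubble's shifting colours come from thin-film interference. The sketch below uses the standard two-beam approximation with illustrative values (film thickness, water's refractive index); it is not the parameter model measured by the authors.

```python
import math

def film_reflectance(wavelength_nm, thickness_nm, n_film=1.33, theta_i=0.0):
    """Two-beam thin-film interference reflectance for an air/film/air layer.

    The front- and back-surface reflections differ in phase by
    delta = 4*pi*n*d*cos(theta_t)/lambda, plus a pi flip at the front
    surface, giving R ~ 4*r^2*sin^2(delta/2) for a weakly reflecting
    film. R oscillates with wavelength, producing the bubble colours.
    """
    theta_t = math.asin(math.sin(theta_i) / n_film)   # Snell's law
    r = (1.0 - n_film) / (1.0 + n_film)               # Fresnel amplitude, normal incidence
    delta = 4.0 * math.pi * n_film * thickness_nm * math.cos(theta_t) / wavelength_nm
    return 4.0 * r * r * math.sin(delta / 2.0) ** 2

# A 200 nm film cancels green (~532 nm) almost completely at normal incidence.
print(round(film_reflectance(532.0, 200.0), 6))  # 0.0
```

Evaluating this reflectance per wavelength and per view angle is the kind of physically based parameterisation a bubble shader builds on.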
32

Heasly, B. S., N. P. Cottaris, D. P. Lichtman, B. Xiao, and D. H. Brainard. "RenderToolbox3: MATLAB tools that facilitate physically based stimulus rendering for vision research." Journal of Vision 14, no. 2 (February 7, 2014): 6. http://dx.doi.org/10.1167/14.2.6.

33

Wang, Qiuyan, and Hao Du. "Research on 3D Simulation Method of Nearshore Storm Surge Based on FLIP." Journal of Physics: Conference Series 2074, no. 1 (November 1, 2021): 012007. http://dx.doi.org/10.1088/1742-6596/2074/1/012007.

Abstract:
This paper proposes a FLIP-based three-dimensional storm surge simulation method. The flow field is computed with the Fluid Implicit Particle (FLIP) method, and the fluid-state model generated at each moment is then rendered in the Unity engine with standard PBR-based shading. The experimental results show that the method not only meets the realism requirements of nearshore storm surge simulation, but also effectively improves the efficiency of scene rendering. The results can be used not only in game production and movie special effects, but also in engineering simulations for fields such as ocean and environmental engineering, and the method has a wide range of application prospects and values.
34

Pajusalu, Mihkel, Iaroslav Iakubivskyi, Gabriel Jörg Schwarzkopf, Olli Knuuttila, Timo Väisänen, Maximilian Bührer, Mario F. Palos, et al. "SISPO: Space Imaging Simulator for Proximity Operations." PLOS ONE 17, no. 3 (March 4, 2022): e0263882. http://dx.doi.org/10.1371/journal.pone.0263882.

Abstract:
This paper describes the architecture and demonstrates the capabilities of a newly developed, physically-based imaging simulator environment called SISPO, developed for small solar system body fly-by and terrestrial planet surface mission simulations. The image simulator utilises the open-source 3-D visualisation system Blender and its Cycles rendering engine, which supports physically based rendering capabilities and procedural micropolygon displacement texture generation. The simulator concentrates on realistic surface rendering and has supplementary models to produce realistic dust- and gas-environment optical models for comets and active asteroids. The framework also includes tools to simulate the most common image aberrations, such as tangential and sagittal astigmatism, internal and external comatic aberration, and simple geometric distortions. The model framework’s primary objective is to support small-body space mission design by allowing better simulations for characterisation of imaging instrument performance, assisting mission planning, and developing computer-vision algorithms. SISPO allows the simulation of trajectories, light parameters, and the camera's intrinsic parameters.
35

Shiromi, Kai, Higashi Hiroshi, Mohammad Shehata, Shinsuke Shimojo, and Shigeki Nakauchi. "#TheDress type of color ambiguity induced by T-shirt image based on physically-based rendering." Journal of Vision 18, no. 10 (September 1, 2018): 221. http://dx.doi.org/10.1167/18.10.221.

36

Liang, Jianming, Jianhua Gong, and Yi Li. "Realistic rendering for physically based shallow water simulation in Virtual Geographic Environments (VGEs)." Annals of GIS 21, no. 4 (July 27, 2015): 301–12. http://dx.doi.org/10.1080/19475683.2015.1050064.

37

Bordegoni, Monica. "Haptic and Sound Interface for Shape Rendering." Presence: Teleoperators and Virtual Environments 19, no. 4 (August 1, 2010): 341–63. http://dx.doi.org/10.1162/pres_a_00010.

Abstract:
This paper presents a system for the evaluation of the shape of aesthetic products. The evaluation of shapes is based on characteristic curves, which is a typical practice in the industrial design domain. The system, inspired by characteristic curves, is based on a haptic strip that conforms to a curve that the designer wishes to feel, explore, and analyze by physically touching it. The haptic strip is an innovative solution in the haptics domain, although it has some limitations concerning the domain of curves that can be actually represented. In order to extend this domain and make users feel the various curve features, for example curvature discontinuities, sound has been exploited as an additional information modality.
38

Hong, Sungin, Chulhee Lee, and Seongah Chin. "Physically based optical parameter database obtained from real materials for real-time material rendering." Journal of Visual Languages & Computing 48 (October 2018): 29–39. http://dx.doi.org/10.1016/j.jvlc.2018.06.004.

39

Yi, Shinyoung, Donggun Kim, Kiseok Choi, Adrian Jarabo, Diego Gutierrez, and Min H. Kim. "Differentiable transient rendering." ACM Transactions on Graphics 40, no. 6 (December 2021): 1–11. http://dx.doi.org/10.1145/3478513.3480498.

Abstract:
Recent differentiable rendering techniques have become key tools to tackle many inverse problems in graphics and vision. Existing models, however, assume steady-state light transport, i.e., an infinite speed of light. While this is a safe assumption for many applications, recent advances in ultrafast imaging leverage the wealth of information that can be extracted from the exact time of flight of light. In this context, physically-based transient rendering makes it possible to efficiently simulate and analyze light transport while considering that the speed of light is indeed finite. In this paper, we introduce a novel differentiable transient rendering framework to help bring the potential of differentiable approaches into the transient regime. To differentiate the transient path integral we need to take into account that scattering events at path vertices are no longer independent; instead, tracking the time of flight of light requires treating such events jointly as a multidimensional, evolving manifold. We thus turn to the generalized transport theorem and introduce a novel correlated importance term, which links the time-integrated contribution of a path to its light throughput and allows us to handle discontinuities in the light and sensor functions. Lastly, we present results in several challenging scenarios where the time of flight of light plays an important role, such as optimizing indices of refraction, non-line-of-sight tracking with nonplanar relay walls, and non-line-of-sight tracking around two corners.
40

Maček, Nejc, Baran Usta, Elmar Eisemann, and Ricardo Marroquim. "Real-Time Relighting of Human Faces with a Low-Cost Setup." Proceedings of the ACM on Computer Graphics and Interactive Techniques 5, no. 1 (May 4, 2022): 1–19. http://dx.doi.org/10.1145/3522626.

Abstract:
Video-streaming services usually feature post-processing effects to replace the background. However, these often yield inconsistent lighting. Machine-learning-based relighting methods can address this problem, but, at real-time rates, are restricted to a low resolution and can result in an unrealistic skin appearance. Physically-based rendering techniques require complex skin models that can only be acquired using specialised equipment. Our method is lightweight and uses only a standard smartphone. By correcting imperfections during capture, we extract a convincing physically-based skin model. In combination with suitable acceleration techniques, we achieve real-time rates on commodity hardware.
41

Zhang, Yin Xia, Guo Hua Geng, and Ming Quan Zhou. "The Simulation of Rusty Phenomenon Based on Image Texture Feature." Applied Mechanics and Materials 513-517 (February 2014): 3972–75. http://dx.doi.org/10.4028/www.scientific.net/amm.513-517.3972.

Abstract:
The simulation of aged appearance is essential in realistic modeling. We propose a method that uses existing images of aging to simulate aged appearance: we establish a classification library, extract and save the texture features of different aging types, and set the weights of the texture features to synthesise textures according to the environment. In addition, the user can freely specify the position of the constrained texture mapping. This method can simulate various rust phenomena simultaneously. It can also be combined with physically-based and tone-spreading-based methods for rapid rendering of realistic 3D models.
42

Ward, Kelly, Nico Galoppo, and Ming Lin. "Interactive Virtual Hair Salon." Presence: Teleoperators and Virtual Environments 16, no. 3 (June 1, 2007): 237–51. http://dx.doi.org/10.1162/pres.16.3.237.

Abstract:
User interaction with animated hair is desirable for various applications but difficult because it requires real-time animation and rendering of hair. Hair modeling, including styling, simulation, and rendering, is computationally challenging due to the enormous number of deformable hair strands on a human head, elevating the computational complexity of many essential steps, such as collision detection and self-shadowing for hair. Using simulation localization techniques, multi-resolution representations, and graphics hardware rendering acceleration, we have developed a physically-based virtual hair salon system that simulates and renders hair at accelerated rates, enabling users to interactively style virtual hair. With a 3D haptic interface, users can directly manipulate and position hair strands, as well as employ real-world styling applications (cutting, blow-drying, etc.) to create hairstyles more intuitively than previous techniques.
43

Hiranyachattada, Tiantada, and Kampanat Kusirirat. "Using mobile augmented reality to enhancing students’ conceptual understanding of physically-based rendering in 3D animation." European Journal of Science and Mathematics Education 8, no. 1 (January 15, 2020): 1–5. http://dx.doi.org/10.30935/scimath/9542.

44

Miner, Joshua D. "Biased Render." Screen Bodies 4, no. 1 (June 1, 2019): 48–71. http://dx.doi.org/10.3167/screen.2019.040105.

Abstract:
This article explores the digitality of Indigenous bodies within contemporary 3D video games by mainstream and Indigenous developers. Its analysis relies on a critical examination of digital image synthesis via real-time graphics rendering, which algorithmically generates the visible world onscreen from 3D geometries by mapping textures, generating light and shadow, and simulating perceptual phenomena. At a time when physically based, unbiased rendering methods have made photorealistic styles and open-world structures common across AAA games in general, Indigenous game designers have instead employed simplified “low res” styles. Using bias as an interpretive model, this article unpacks how these designers critique mainstream rendering as a cultural-computational practice whose processes are encoded with cultural biases that frame the relation of player and screen body (avatar). The algorithmic production of digitally modeled bodies, as an essential but masked element of video games, offers a territory where Indigenous developers claim aesthetic presence in the medium.
45

Retzlaff, Max-Gerd, Johannes Hanika, Jürgen Beyerer, and Carsten Dachsbacher. "Physically based computer graphics for realistic image formation to simulate optical measurement systems." Journal of Sensors and Sensor Systems 6, no. 1 (May 8, 2017): 171–84. http://dx.doi.org/10.5194/jsss-6-171-2017.

Abstract:
Physically based image synthesis methods, a research direction in computer graphics (CG), are capable of simulating optical measuring systems in their entirety and thus constitute an interesting approach for the development, simulation, optimization, and validation of such systems. In addition, other CG methods, so-called procedural modeling techniques, can be used to quickly generate large sets of virtual samples and scenes thereof that comprise the same variety as physical testing objects and real scenes (e.g., if digitized sample data are not available or difficult to acquire). Appropriate image synthesis (rendering) techniques result in a realistic image formation for the virtual scenes, considering light sources, material, complex lens systems, and sensor properties, and can be used to evaluate and improve complex measuring systems and automated optical inspection (AOI) systems independent of a physical realization. In this paper, we provide an overview of suitable image synthesis methods and their characteristics, we discuss the challenges for the design and specification of a given measuring situation in order to allow for a reliable simulation and validation, and we describe an image generation pipeline suitable for the evaluation and optimization of measuring and AOI systems.
46

McGahan, Joseph R., J. David Williamson, and Jerome Stoecker. "Length, Weight, and Width: Covariation Assessments Based on Haptic Exploration." Perceptual and Motor Skills 86, no. 3_suppl (June 1998): 1459–68. http://dx.doi.org/10.2466/pms.1998.86.3c.1459.

Abstract:
Intuitive judgments about covariations of length, weight, and width were assessed in two experiments using a series of propositional statements. In Exp. 1, only a priori judgments were rendered, whereas in Exp. 2 blindfolded participants physically manipulated and described a series of objects varying on these dimensions before rendering their judgments. Analyses indicated participants judged weight and width as positively correlated, length and weight as uncorrelated and, to some extent, length and width as negatively correlated. When judgments were rendered after the haptic exploration phase, weight and width were (again) judged as positively correlated, and length and weight were still judged as uncorrelated. However, after the database intervention, length and width were judged as positively correlated. Results are discussed relative to research on reasoning about covariation and belief perseveration, as well as perceived covariations between height, weight, and body fat.
47

BLANCHETTE, DAMON, and EMMANUEL AGU. "REAL-TIME DISPERSIVE REFRACTION WITH ADAPTIVE SPECTRAL MAPPING." International Journal on Artificial Intelligence Tools 22, no. 06 (December 2013): 1360019. http://dx.doi.org/10.1142/s0218213013600191.

Abstract:
Spectral rendering, or the synthesis of images by taking into account the constituent wavelengths of white light, enables the rendering of iridescent colors caused by phenomena such as dispersion, diffraction, interference and scattering. Caustics, the focusing and defocusing of light through a refractive medium, can be interpreted as a special case of dispersion where all the wavelengths travel along the same paths. In this paper we extend Adaptive Caustic Mapping (ACM), a previously proposed caustics mapping algorithm, to handle physically-based dispersion. Because ACM can display caustics in real-time, it is amenable to extension to handle the more general case of dispersion. We also present a novel algorithm for filling in the gaps that occur due to discrete sampling of the spectrum. Our proposed method runs in screen-space, and is fast enough to display plausible dispersion phenomena at real-time and interactive frame rates.
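The physically-based dispersion this entry renders in real time stems from a wavelength-dependent refractive index. A minimal sketch, combining Cauchy's empirical equation with Snell's law; the coefficients are approximate BK7-glass values chosen for illustration and are not taken from the paper.

```python
import math

def cauchy_index(wavelength_um, a=1.5046, b=0.00420):
    """Cauchy's empirical dispersion equation n(lambda) = A + B/lambda^2,
    with the wavelength in micrometres; A, B roughly those of BK7 glass."""
    return a + b / wavelength_um ** 2

def refract_angle(theta_i, wavelength_um):
    """Refraction angle (radians) from air into the glass via Snell's law.

    Shorter (bluer) wavelengths see a higher index and bend more, which
    is what spreads white light into a visible spectrum.
    """
    return math.asin(math.sin(theta_i) / cauchy_index(wavelength_um))

# Red light (0.65 um) is deviated less than violet (0.40 um):
print(refract_angle(math.radians(45), 0.65) > refract_angle(math.radians(45), 0.40))  # True
```

A spectral renderer evaluates this kind of per-wavelength refraction for each sampled wavelength, which is why gaps appear between discrete samples and need the filling step the abstract mentions.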
48

Thonat, Theo, Francois Beaune, Xin Sun, Nathan Carr, and Tamy Boubekeur. "Tessellation-free displacement mapping for ray tracing." ACM Transactions on Graphics 40, no. 6 (December 2021): 1–16. http://dx.doi.org/10.1145/3478513.3480535.

Abstract:
Displacement mapping is a powerful mechanism for adding fine to medium geometric details over a 3D surface using a 2D map encoding them. While GPU rasterization supports it through the hardware tessellation unit, ray tracing surface meshes textured with high quality displacement requires a significant amount of memory. More precisely, the input surface needs to be pre-tessellated at the displacement map resolution before being enriched with its mandatory acceleration data structure. Consequently, designing displacement maps interactively while enjoying full physically-based rendering is often impossible, as simply tiling the map multiple times quickly saturates the graphics memory. In this work we introduce a new tessellation-free displacement mapping approach for ray tracing. Our key insight is to decouple the displacement from its base domain by mapping a displacement-specific acceleration structure directly onto the mesh. As a result, our method has a low memory footprint and fast high-resolution displacement rendering, making interactive displacement editing possible.
APA, Harvard, Vancouver, ISO, and other styles
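The core idea of intersecting a ray with displaced geometry evaluated on the fly, rather than against pre-tessellated triangles, can be illustrated with a minimal heightfield ray march. The procedural map and fixed-step scheme below are simplifying assumptions for the sketch; they are not the paper's displacement-specific acceleration structure:

```python
import math

def height(x, z):
    """Procedural displacement map sampled on demand (a made-up function);
    in the paper's setting this would be a 2D texture lookup."""
    return 0.1 * math.sin(8.0 * x) * math.cos(8.0 * z)

def ray_march_heightfield(origin, direction, t_max=10.0, step=1e-3):
    """Intersect a ray with the displaced surface y = height(x, z) by
    fixed-step marching, evaluating the map per step instead of storing
    pre-tessellated triangles. Returns the hit distance t, or None."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    t = 0.0
    while t < t_max:
        x, y, z = ox + t * dx, oy + t * dy, oz + t * dz
        if y <= height(x, z):   # ray dipped below the displaced surface
            return t            # first hit, within one step of accuracy
        t += step
    return None

# A ray cast straight down from (0, 1, 0) hits the surface near t = 1,
# since height(0, 0) = 0.
t_hit = ray_march_heightfield((0.0, 1.0, 0.0), (0.0, -1.0, 0.0))
```

Note the memory trade-off the abstract motivates: this sketch stores nothing but the map itself, at the cost of many per-step evaluations, which is what a displacement-aware acceleration structure is designed to reduce.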
49

Khenak, Nawel, Jeanne Vézien, David Théry, and Patrick Bourdot. "Spatial Presence in Real and Remote Immersive Environments and the Effect of Multisensory Stimulation." PRESENCE: Virtual and Augmented Reality 27, no. 3 (July 2020): 287–308. http://dx.doi.org/10.1162/pres_a_00332.

Full text of the source
Abstract:
This article presents a user experiment that assesses the feeling of spatial presence, defined as the sense of “being there,” in both a real and a remote environment (the so-called “natural presence” and “telepresence,” respectively). Twenty-eight participants performed a 3D-pointing task while being either physically located in a real office or remotely transported by a teleoperation system. The evaluation also included the effect of combining audio and visual rendering. Spatial presence and its components were evaluated using the ITC-SOPI questionnaire (Lessiter, Freeman, Keogh, & Davidoff, 2001). In addition, objective metrics based on user performance and behavioral indicators were logged. Results indicate that participants experienced a higher sense of spatial presence in the remote environment (hyper-presence) and higher ecological validity. In contrast, the objective metrics were higher in the real environment, which highlights the absence of correlation between spatial presence and the objective metrics used in the experiment. Moreover, results show the benefit of adding audio rendering in both environments to increase the sense of spatial presence, the performance of participants, and their engagement during the task.
APA, Harvard, Vancouver, ISO, and other styles
50

Huraibat, Khalil, Esther Perales, Alejandro Ferrero, Joaquín Campos, Ivo Van der Lans, and Eric Kirchner. "Physics-based modeling of a light booth to improve color accuracy of 3D rendering." London Imaging Meeting 2020, no. 1 (September 29, 2020): 54–59. http://dx.doi.org/10.2352/issn.2694-118x.2020.lim-10.

Full text of the source
Abstract:
Computer Aided Design (CAD) is increasingly used as a tool in industries varying from automotive to interior design. Digital visualization allows users to design their working and living spaces, and to select materials and colors for future products. The rendering software that is currently available often suggests photorealistic quality. However, visual comparisons of these images with the physical objects they represent reveal that the color accuracy of these methods is not good enough for critical applications such as automotive design. Therefore, we recently developed a spectral pipeline for rendering gonio-apparent materials such as effect coatings. In order to accurately render objects as they appear in a physical environment, this new approach requires a physics-based representation of the illumination surrounding the objects. In the present article we investigate how to physically represent one well-defined lighting environment: the lighting inside a recent, commercially available light booth that is widely used in the paint and graphics industries. We determined the spatial dimensions of the X-Rite SpectraLight QC light booth and built a digital geometrical model of it in the open-source software Blender. We then measured the spectral radiance emitted by the various light sources integrated in the luminaire of this light booth, as well as the illuminance on a grid of measurement spots on the platform below the luminaire. Using these measurement data, we were able to develop an accurate physical simulation of the light field inside the light booth. We plan to use this physics-based model of the lighting inside the light booth to set up visual tests in which physical objects inside the physical light booth are visually compared to images showing virtual objects inside the virtual light booth. These visual tests will form the basis for developing improved models for displaying the color and texture of gonio-apparent materials.
APA, Harvard, Vancouver, ISO, and other styles
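The light-field simulation the abstract describes can be sketched, in drastically simplified form, as a point-source illuminance computation over a grid of platform spots. The luminous intensity, luminaire height, and grid spacing below are made-up values for illustration, not measurements of the SpectraLight QC booth:

```python
import math

def illuminance(intensity_cd, light_pos, point):
    """Illuminance (lux) at a point on a horizontal platform from a point
    source of given luminous intensity (cd): inverse-square falloff times
    the cosine of the incidence angle (Lambert's cosine law)."""
    dx = light_pos[0] - point[0]
    dy = light_pos[1] - point[1]   # platform normal points up (+y)
    dz = light_pos[2] - point[2]
    d2 = dx * dx + dy * dy + dz * dz
    cos_theta = dy / math.sqrt(d2)
    return intensity_cd * cos_theta / d2

# Evaluate on a 5x5 grid of spots on the platform, mirroring the kind of
# measurement grid the article describes (assumed dimensions in metres).
light = (0.0, 0.6, 0.0)   # luminaire 0.6 m above the platform centre
grid = [(x * 0.2, 0.0, z * 0.2) for x in range(-2, 3) for z in range(-2, 3)]
field = [illuminance(100.0, light, p) for p in grid]
```

A real model of the booth would replace the single point source with the measured spectral radiance of each lamp in the luminaire, but the grid-of-spots validation against measured illuminance follows the same pattern.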
