A ready-made bibliography on "ARTISTIC STYLE TRANSFER"
Create accurate references in APA, MLA, Chicago, Harvard, and many other styles
Browse lists of current articles, books, dissertations, abstracts, and other scholarly sources on "ARTISTIC STYLE TRANSFER".
An "Add to bibliography" button appears next to every work in the bibliography. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a ".pdf" file and read its abstract online, when these are available in the work's metadata.
Journal articles on "ARTISTIC STYLE TRANSFER"
Zhu, Xuanying, Mugang Lin, Kunhui Wen, Huihuang Zhao, and Xianfang Sun. "Deep Deformable Artistic Font Style Transfer." Electronics 12, no. 7 (March 26, 2023): 1561. http://dx.doi.org/10.3390/electronics12071561.
Lyu, Yanru, Chih-Long Lin, Po-Hsien Lin, and Rungtai Lin. "The Cognition of Audience to Artistic Style Transfer." Applied Sciences 11, no. 7 (April 6, 2021): 3290. http://dx.doi.org/10.3390/app11073290.
Liu, Kunxiao, Guowu Yuan, Hao Wu, and Wenhua Qian. "Coarse-to-Fine Structure-Aware Artistic Style Transfer." Applied Sciences 13, no. 2 (January 10, 2023): 952. http://dx.doi.org/10.3390/app13020952.
Zhang, Chi, Yixin Zhu, and Song-Chun Zhu. "MetaStyle: Three-Way Trade-off among Speed, Flexibility, and Quality in Neural Style Transfer." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 1254–61. http://dx.doi.org/10.1609/aaai.v33i01.33011254.
Han, Xinying, Yang Wu, and Rui Wan. "A Method for Style Transfer from Artistic Images Based on Depth Extraction Generative Adversarial Network." Applied Sciences 13, no. 2 (January 8, 2023): 867. http://dx.doi.org/10.3390/app13020867.
Ruder, Manuel, Alexey Dosovitskiy, and Thomas Brox. "Artistic Style Transfer for Videos and Spherical Images." International Journal of Computer Vision 126, no. 11 (April 21, 2018): 1199–219. http://dx.doi.org/10.1007/s11263-018-1089-z.
Banar, Nikolay, Matthia Sabatelli, Pierre Geurts, Walter Daelemans, and Mike Kestemont. "Transfer Learning with Style Transfer between the Photorealistic and Artistic Domain." Electronic Imaging 2021, no. 14 (January 18, 2021): 41–1. http://dx.doi.org/10.2352/issn.2470-1173.2021.14.cvaa-041.
Hien, Ngo Le Huy, Luu Van Huy, and Nguyen Van Hieu. "Artwork style transfer model using deep learning approach." Cybernetics and Physics 10, no. 3 (October 30, 2021): 127–37. http://dx.doi.org/10.35470/2226-4116-2021-10-3-127-137.
Lang, Langtian. "Style transfer with VGG19." Applied and Computational Engineering 6, no. 1 (June 14, 2023): 149–58. http://dx.doi.org/10.54254/2755-2721/6/20230752.
Dinesh Kumar, R., E. Golden Julie, Y. Harold Robinson, S. Vimal, Gaurav Dhiman, and Murugesh Veerasamy. "Deep Convolutional Nets Learning Classification for Artistic Style Transfer." Scientific Programming 2022 (January 10, 2022): 1–9. http://dx.doi.org/10.1155/2022/2038740.
Doctoral dissertations on "ARTISTIC STYLE TRANSFER"
Zabaleta Razquin, Itziar. "Image processing algorithms as artistic tools in digital cinema". Doctoral thesis, Universitat Pompeu Fabra, 2021. http://hdl.handle.net/10803/672840.
The film industry has undergone a radical change in recent decades: the transition from film stock to digital cinema technology. As a consequence, some technical challenges have appeared, but at the same time, endless new possibilities have opened up with this new medium. This thesis proposes several tools that can be useful in the context of cinema. First, a tool for automatic color grading was developed. It is a method based on image statistics that transfers the style of a reference image to unprocessed footage. The advantages of the method are its simplicity and low computational cost, which make it suitable for real-time implementation, so that different styles and looks can be tried out directly on set. Second, a method was created for enhancing images by adding texture. In cinema, film grain is the most widely used texture, either because the recording is made directly on film or because it is added afterwards to content captured digitally. This thesis proposes a "retinal noise" method inspired by processes of the visual system, which produces natural and visually pleasing results. The model has parameters that allow the appearance of the texture to vary widely, so it can be used as an artistic tool for cinematography. Moreover, owing to the masking phenomenon of the visual system, adding this texture improves the perceived quality of images, which brings savings in bandwidth and bit rate. The method was validated through psychophysical experiments in which it was chosen over other film-grain emulation methods, from both academia and industry.
Finally, an image quality metric based on physiological phenomena is described, with applications in image processing in general and, more specifically, in cinema and image transmission: video coding, image compression, etc. The model's parameters are optimized so that it is competitive with other state-of-the-art methods. An advantage of this method is its small number of parameters compared with some deep-learning-based methods, which have several orders of magnitude more.
Teixeira, Inês Filipa Nunes. "Artistic Style Transfer for Textured 3D Models." Master's thesis, 2017. https://repositorio-aberto.up.pt/handle/10216/106653.
SAGAR. "Artistic Style Transfer Using Convolutional Neural Networks." Thesis, 2019. http://dspace.dtu.ac.in:8080/jspui/handle/repository/16763.
Wang, Shen-Chi (王聖棋). "Paint Style Transfer System with the Artistic Database." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/7zz79n.
Pełny tekst źródła國立東華大學
資訊工程學系
95
Digital painting synthesizes an output image by applying the paint styles of example images to an input source image. However, the synthesis procedure has typically required user intervention to select the patches from the example images that best describe their paint styles. This thesis presents a systematic framework for synthesizing example-based rendering images that requires no user intervention during synthesis. An artistic database is compiled in this work, from which the user can synthesize an image in the paint styles of different well-known artists. Mean-shift image segmentation and texture re-synthesis are used to construct the artistic database; correspondences are then found between the example textures and the mean-shift regions of the input source image, and the output image is synthesized with a patch-based sampling approach. The main contribution of this thesis is a systematic paint-style transfer system that synthesizes a new image without any user intervention. The artistic database consists of re-synthesized mean-shift example images from different artists, which serve as learning examples of each artist's paint style during synthesis, and the system automatically synthesizes a new image in the paint style of the artist the user selects from the database.
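The patch-based sampling step this abstract describes can be sketched in a few lines. The following is a minimal illustration under simplifying assumptions (grayscale images as numpy arrays, non-overlapping output patches, plain L2 matching), not the thesis's actual system; `patch_based_transfer` and its inputs are hypothetical names.

```python
import numpy as np

def extract_patches(img, size, stride):
    """Collect all size x size patches from a 2-D image at the given stride."""
    patches = []
    for y in range(0, img.shape[0] - size + 1, stride):
        for x in range(0, img.shape[1] - size + 1, stride):
            patches.append(img[y:y + size, x:x + size])
    return np.stack(patches)

def patch_based_transfer(target, example, size=8):
    """Rebuild `target` from the best-matching patches of `example`.

    Matching minimizes L2 distance on luminance, a crude stand-in for the
    correspondence search between example textures and segmented regions
    described in the abstract.
    """
    bank = extract_patches(example, size, size // 2)
    out = np.zeros_like(target)
    for y in range(0, target.shape[0] - size + 1, size):
        for x in range(0, target.shape[1] - size + 1, size):
            ref = target[y:y + size, x:x + size]
            dists = ((bank - ref) ** 2).sum(axis=(1, 2))
            out[y:y + size, x:x + size] = bank[int(np.argmin(dists))]
    return out
```

On real data the example image would be one of the re-synthesized mean-shift textures from the artistic database, and overlapping patches with blending would replace the hard patch boundaries used here.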
Tu, Ning (杜寧). "Video Cloning for Paintings via Artistic Style Transfer." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/6am2q7.
National Chung Cheng University (國立中正大學), Graduate Institute of Computer Science and Information Engineering (資訊工程研究所), ROC year 104 (2015–2016).
In the past, visual art was mostly static: paintings, photography, and sculpture. In recent years, many museums, galleries, and art exhibitions have presented dynamic artworks for visitors to enjoy. The most famous dynamic artwork is the moving-painting rendition of "Along the River During the Qingming Festival." Nevertheless, that work took two years to complete: every action of every character had to be planned first, each video frame drawn by animators, and seamless stitching finally achieved by rendering the scene onto the screen with many projectors. In our research, we propose a method for generating animated paintings. It needs only the millions of videos available in existing online databases and asks the user to perform a few simple auxiliary operations to achieve the animation-synthesis effect. First, our system lets the user select an object of the desired class in the first video frame; we then employ random forests as the learning algorithm to retrieve from the video the object the user wants to insert into an artwork. Second, we apply style transfer, which makes the video frames consistent with the style of the painting. Finally, we use a seamless image-cloning algorithm to produce a seamless composite. Our approach allows different users to synthesize animated paintings according to their own preferences; the resulting work not only maintains the original painter's style but also generates a variety of artistic moods for people to enjoy.
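The style-transfer stage in pipelines like this one is commonly implemented by matching Gram-matrix statistics of deep feature maps, the approach popularized by neural style transfer. A minimal numpy sketch of that style loss, with random arrays standing in for network activations, might look as follows; the function names are illustrative, not from the thesis.

```python
import numpy as np

def gram_matrix(features):
    """Channel-by-channel correlation of a (C, H, W) feature map."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

def style_loss(gen_features, style_features):
    """Mean squared difference between Gram matrices, summed over layers."""
    return sum(
        float(((gram_matrix(g) - gram_matrix(s)) ** 2).mean())
        for g, s in zip(gen_features, style_features)
    )
```

In practice the feature maps come from a pretrained CNN, and this loss is minimized over the frame's pixels together with a content term and, for video, a temporal-consistency term that keeps consecutive stylized frames coherent.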
Rebelo, Ana Daniela Peres. "The impact of artificial intelligence on the creativity of videos". Master's thesis, 2020. http://hdl.handle.net/10773/30624.
This study explored the impact of using an Artificial Intelligence (AI) algorithm, Style Transfer, on the evaluation of the creativity elements of artistic videos. The goal was to verify to what extent the use of this system changes the qualitative and quantitative perception of the creativity elements present in the videos, and which changes occur. An experimental study was conducted with two sets of videos viewed by two groups: 1) a control group (n=49, of whom 25 experts and 24 non-experts); 2) an experimental group (n=52, of whom 27 experts and 25 non-experts). The first set, composed of six videos without AI transformation (shared videos), was shown to both groups in order to verify the equivalence of evaluation criteria. The second set, composed of six (differentiated) videos, had an AI-transformed version (shown to the experimental group) and a non-transformed version (shown to the control group). Each participant was asked to rate the videos on a five-point Likert scale across six elements of creativity and to characterize the creativity of each video with two words. The quantitative results showed equivalence of criteria for the shared videos between the experimental and control groups (only one of 36 comparisons showed significant differences). In the quantitative comparison of the differentiated videos (experimental versus control), 10 ratings showed no significant differences, while five were rated higher in creativity by the experimental group and five by the control group. In the qualitative comparisons, the frequency of the terms used by participants of both groups was, in general, similar for the shared videos; some differences occurred for the differentiated videos.
Taken together, the results emphasize the importance of human mediation in applying an Artificial Intelligence algorithm to creative production, which reinforces the relevance of the concept of Hybrid Intelligence.
Master's in Multimedia Communication
Li, Yu-Ting (李毓婷). "Feature-based Artistic Styles Transfer Using Effective Texture Synthesis." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/06806315150453764693.
I-Shou University (義守大學), Master's Program, Department of Computer Science and Information Engineering (資訊工程學系碩士班), ROC year 96 (2007–2008).
Texture synthesis has been widely studied in recent years, and patch-based sampling has proven superior in synthesis quality and computation time. Many techniques have also extended the algorithm to texture transfer, rendering, and image inpainting. These approaches produce satisfactory results for a wide range of applications; however, they suffer a severe drawback in execution time, because image transfer requires very time-consuming blending and iteration. At SIGGRAPH 2001, Hertzmann et al. presented the image-analogies technique. This method was developed to learn very complex and non-linear image filters, for instance filters that convert a photograph into various types of artistic rendering with the appearance of oil, watercolor, or pen-and-ink, by analogy with actual (real-life) renderings in those styles. Although the method works well, its high complexity and long synthesis time make it less practical; moreover, both an input sample image and that sample's filtered version are required to render a new synthesized image. This work presents a feature-based artistic style transfer algorithm that employs patch-based texture synthesis and speeds up the synthesis process by exploiting Particle Swarm Optimization (PSO) during the matching search. We add new feature constraints to the existing algorithm, so the output image has a visual effect similar to a manual painting. Given a sample image in a certain artistic style, the algorithm accomplishes image analogy by transferring that style to the target.
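To make the PSO-accelerated matching search concrete, here is a minimal sketch that searches an example image for the patch offset minimizing L2 distance to a target patch. The swarm parameters and fitness function are illustrative assumptions, not the thesis's actual configuration.

```python
import numpy as np

def pso_patch_search(target_patch, example, size=8, particles=20, iters=30, seed=0):
    """Search (y, x) offsets in `example` with PSO for the patch closest to `target_patch`."""
    rng = np.random.default_rng(seed)
    hi = np.array([example.shape[0] - size, example.shape[1] - size], dtype=float)

    def cost(pos):
        # Fitness: sum of squared differences between the candidate and target patch.
        y, x = int(round(pos[0])), int(round(pos[1]))
        return float(((example[y:y + size, x:x + size] - target_patch) ** 2).sum())

    pos = rng.uniform(0, hi, (particles, 2))       # particle positions (offsets)
    vel = np.zeros((particles, 2))                 # particle velocities
    pbest = pos.copy()                             # per-particle best positions
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()      # global best position

    for _ in range(iters):
        r1, r2 = rng.random((2, particles, 2))
        # Standard PSO update: inertia + cognitive + social terms.
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, hi)
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()

    return (int(round(gbest[0])), int(round(gbest[1]))), cost(gbest)
```

Compared with exhaustively scanning every offset, the swarm evaluates only particles × iterations candidates, which is the source of the speed-up the abstract claims, at the cost of occasionally settling on a near-optimal match.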
Books on "ARTISTIC STYLE TRANSFER"
Engstrom, Craig Lee, and Derrick L. Williams. "Prisoners Rise, Rise, Rise!" University of Illinois Press, 2017. http://dx.doi.org/10.5406/illinois/9780252037702.003.0009.
Lee, Adam. The Platonism of Walter Pater. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780198848530.001.0001.
Manieson, Victor. Accelerated Keyboard Musicianship. Noyam Publishers, 2021. http://dx.doi.org/10.38159/npub.eb20211001.
Pełny tekst źródłaCzęści książek na temat "ARTISTIC STYLE TRANSFER"
Fu, Tsu-Jui, Xin Eric Wang, and William Yang Wang. "Language-Driven Artistic Style Transfer." In Lecture Notes in Computer Science, 717–34. Cham: Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-20059-5_41.
Ruder, Manuel, Alexey Dosovitskiy, and Thomas Brox. "Artistic Style Transfer for Videos." In Lecture Notes in Computer Science, 26–36. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-45886-1_3.
Qiu, Ting, Bingbing Ni, Ziang Liu, and Xuanhong Chen. "Fast Optimal Transport Artistic Style Transfer." In MultiMedia Modeling, 37–49. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-67832-6_4.
Shibly, Kabid Hassan, Sazia Rahman, Samrat Kumar Dey, and Shahadat Hossain Shamim. "Advanced Artistic Style Transfer Using Deep Neural Network." In Cyber Security and Computer Science, 619–28. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-52856-0_49.
Uhde, Florian, and Sanaz Mostaghim. "Towards a General Framework for Artistic Style Transfer." In Computational Intelligence in Music, Sound, Art and Design, 177–93. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-77583-8_12.
Chowdhury, Atiqul Islam, Fairuz Shadmani Shishir, Ashraful Islam, Eshtiak Ahmed, and Mohammad Masudur Rahman. "Artistic Natural Images Generation Using Neural Style Transfer." In Advances in Intelligent Systems and Computing, 309–17. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-15-9927-9_31.
Gupta, Bharath, K. Govinda, R. Rajkumar, and Jolly Masih. "Neural Artistic Style Transfer Using Deep Neural Networks." In Advances in Intelligent Systems and Computing, 1–12. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-6887-6_1.
Zhang, Yinshu, Jiayi Chen, Xiangyu Si, Zhiqiang Tian, and Xuguang Lan. "Image Artistic Style Transfer Based on Color Distribution Preprocessing." In Communications in Computer and Information Science, 155–64. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-13-7983-3_14.
Uhde, Florian. "Applicability of Convolutional Neural Network Artistic Style Transfer Algorithms." In Artificial Intelligence and the Arts, 61–81. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-59475-6_3.
Uhde, Florian, and Sanaz Mostaghim. "Dissecting Neural Networks Filter Responses for Artistic Style Transfer." In Artificial Intelligence in Music, Sound, Art and Design, 297–312. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72914-1_20.
Pełny tekst źródłaStreszczenia konferencji na temat "ARTISTIC STYLE TRANSFER"
Xing, Yeli, Jiawei Li, Tao Dai, Qingtao Tang, Li Niu, and Shu-Tao Xia. "Portrait-Aware Artistic Style Transfer." In 2018 25th IEEE International Conference on Image Processing (ICIP). IEEE, 2018. http://dx.doi.org/10.1109/icip.2018.8451054.
Zuo, Zhiwen, Lei Zhao, Shuobin Lian, Haibo Chen, Zhizhong Wang, Ailin Li, Wei Xing, and Dongming Lu. "Style Fader Generative Adversarial Networks for Style Degree Controllable Artistic Style Transfer." In Thirty-First International Joint Conference on Artificial Intelligence (IJCAI-22). California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/693.
Kotovenko, Dmytro, Artsiom Sanakoyeu, Sabine Lang, and Bjorn Ommer. "Content and Style Disentanglement for Artistic Style Transfer." In 2019 IEEE/CVF International Conference on Computer Vision (ICCV). IEEE, 2019. http://dx.doi.org/10.1109/iccv.2019.00452.
Liu, Zhi-Song, Li-Wen Wang, Wan-Chi Siu, and Vicky Kalogeiton. "Name your style: text-guided artistic style transfer." In 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW). IEEE, 2023. http://dx.doi.org/10.1109/cvprw59228.2023.00359.
Xu, Wenju, Chengjiang Long, and Yongwei Nie. "Learning Dynamic Style Kernels for Artistic Style Transfer." In 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2023. http://dx.doi.org/10.1109/cvpr52729.2023.00972.
Chen, Haibo, Lei Zhao, Zhizhong Wang, Huiming Zhang, Zhiwen Zuo, Ailin Li, Wei Xing, and Dongming Lu. "DualAST: Dual Style-Learning Networks for Artistic Style Transfer." In 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2021. http://dx.doi.org/10.1109/cvpr46437.2021.00093.
Zhang, Xinwei, and Xiaoyun Chen. "Fast Artistic Style Transfer via Wavelet Transforms." In 2023 4th International Conference on Information Science, Parallel and Distributed Systems (ISPDS). IEEE, 2023. http://dx.doi.org/10.1109/ispds58840.2023.10235622.
Buchnik, Itay, Or Berebi, Tammy Riklin Raviv, and Nir Shlezinger. "Generating Artistic Images Via Few-Shot Style Transfer." In 2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW). IEEE, 2023. http://dx.doi.org/10.1109/icasspw59220.2023.10193400.
Bae, Eunjee, Jaekyung Kim, and Sanghoon Lee. "Point Cloud-Based Free Viewpoint Artistic Style Transfer." In 2023 IEEE International Conference on Multimedia and Expo Workshops (ICMEW). IEEE, 2023. http://dx.doi.org/10.1109/icmew59549.2023.00058.
Dushkoff, Michael, Ryan McLaughlin, and Raymond Ptucha. "A temporally coherent neural algorithm for artistic style transfer." In 2016 23rd International Conference on Pattern Recognition (ICPR). IEEE, 2016. http://dx.doi.org/10.1109/icpr.2016.7900142.