A selection of scholarly literature on the topic "ARTISTIC STYLE TRANSFER"

Format your source according to APA, MLA, Chicago, Harvard, and other citation styles

Select a source type:

Consult the lists of relevant articles, books, dissertations, conference theses, and other scholarly sources on the topic "ARTISTIC STYLE TRANSFER".

Next to each work in the reference list there is an "Add to bibliography" button. Use it, and we will automatically format a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a publication as a ".pdf" file and read its abstract online, if the corresponding data are present in the record's metadata.

Journal articles on the topic "ARTISTIC STYLE TRANSFER"

1

Zhu, Xuanying, Mugang Lin, Kunhui Wen, Huihuang Zhao, and Xianfang Sun. "Deep Deformable Artistic Font Style Transfer." Electronics 12, no. 7 (March 26, 2023): 1561. http://dx.doi.org/10.3390/electronics12071561.

Full text of the source
Abstract:
The essence of font style transfer is to move the style features of an image into a font while maintaining the font’s glyph structure. At present, generative adversarial networks based on convolutional neural networks play an important role in font style generation. However, traditional convolutional neural networks that recognize font images suffer from poor adaptability to unknown image changes, weak generalization abilities, and poor texture feature extractions. When the glyph structure is very complex, stylized font images cannot be effectively recognized. In this paper, a deep deformable style transfer network is proposed for artistic font style transfer, which can adjust the degree of font deformation according to the style and realize the multiscale artistic style transfer of text. The new model consists of a sketch module for learning glyph mapping, a glyph module for learning style features, and a transfer module for a fusion of style textures. In the glyph module, the Deform-Resblock encoder is designed to extract glyph features, in which a deformable convolution is introduced and the size of the residual module is changed to achieve a fusion of feature information at different scales, preserve the font structure better, and enhance the controllability of text deformation. Therefore, our network has greater control over text, processes image feature information better, and can produce more exquisite artistic fonts.
APA, Harvard, Vancouver, ISO, and other styles
2

Lyu, Yanru, Chih-Long Lin, Po-Hsien Lin, and Rungtai Lin. "The Cognition of Audience to Artistic Style Transfer." Applied Sciences 11, no. 7 (April 6, 2021): 3290. http://dx.doi.org/10.3390/app11073290.

Full text of the source
Abstract:
Artificial Intelligence (AI) is becoming more popular in various fields, including the area of art creation. Advances in AI technology bring new opportunities and challenges in the creation, experience, and appreciation of art. Neural style transfer (NST) realizes the intelligent conversion of any artistic style using neural networks. However, artistic style is a product of cognition, involving a process that runs from visual perception to feeling. The purpose of this paper is to study the factors affecting audiences' cognitive differences and preferences regarding artistic style transfer. Those factors are discussed to investigate the application of AI generator models in art creation. Therefore, based on the artist's encoding attributes (color, stroke, texture) and the audience's decoding cognitive levels (technical, semantic, effectiveness), this study proposes a framework to evaluate artistic style transfer from the perspective of cognition. Thirty-one subjects with a background in art, aesthetics, and design were recruited to participate in the experiment. The experimental process consists of four style groups: Fauvism, Expressionism, Cubism, and Renaissance. According to the findings of this study, participants can still recognize different artistic styles after they have been transferred by neural networks. In addition, the features of texture and stroke have more impact on the perception of fitness than color does. The audience may prefer samples with high cognition at the semantic and effectiveness levels. The above indicates that, even with AI automating routine work, the audience's cognition of artistic style can still be kept and transferred.
APA, Harvard, Vancouver, ISO, and other styles
3

Liu, Kunxiao, Guowu Yuan, Hao Wu, and Wenhua Qian. "Coarse-to-Fine Structure-Aware Artistic Style Transfer." Applied Sciences 13, no. 2 (January 10, 2023): 952. http://dx.doi.org/10.3390/app13020952.

Full text of the source
Abstract:
Artistic style transfer aims to use a style image and a content image to synthesize a target image that retains the same artistic expression as the style image while preserving the basic content of the content image. Many recently proposed style transfer methods have a common problem; that is, they simply transfer the texture and color of the style image to the global structure of the content image. As a result, the content image has a local structure that is not similar to the local structure of the style image. In this paper, we present an effective method that can be used to transfer style patterns while fusing the local style structure to the local content structure. In our method, different levels of coarse stylized features are first reconstructed at low resolution using a coarse network, in which style color distribution is roughly transferred, and the content structure is combined with the style structure. Then, the reconstructed features and the content features are adopted to synthesize high-quality structure-aware stylized images with high resolution using a fine network with three structural selective fusion (SSF) modules. The effectiveness of our method is demonstrated through the generation of appealing high-quality stylization results and a comparison with some state-of-the-art style transfer methods.
APA, Harvard, Vancouver, ISO, and other styles
4

Zhang, Chi, Yixin Zhu, and Song-Chun Zhu. "MetaStyle: Three-Way Trade-off among Speed, Flexibility, and Quality in Neural Style Transfer." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 1254–61. http://dx.doi.org/10.1609/aaai.v33i01.33011254.

Full text of the source
Abstract:
An unprecedented boom has been witnessed in the research area of artistic style transfer ever since Gatys et al. introduced the neural method. One of the remaining challenges is to balance a trade-off among three critical aspects: speed, flexibility, and quality. (i) The vanilla optimization-based algorithm produces impressive results for arbitrary styles but is unsatisfyingly slow due to its iterative nature; (ii) fast approximation methods based on feed-forward neural networks generate satisfactory artistic effects but are bound to only a limited number of styles; and (iii) feature-matching methods such as AdaIN achieve arbitrary style transfer in real time but at the cost of compromised quality. We find it considerably difficult to balance the trade-off well using merely a single feed-forward step and ask, instead, whether there exists an algorithm that could adapt quickly to any style while the adapted model maintains high efficiency and good image quality. Motivated by this idea, we propose a novel method, coined MetaStyle, which formulates neural style transfer as a bilevel optimization problem and combines learning with only a few post-processing update steps to adapt to a fast approximation model with satisfying artistic effects, comparable to the optimization-based methods for an arbitrary style. The qualitative and quantitative analysis in the experiments demonstrates that the proposed approach achieves high-quality arbitrary artistic style transfer effectively, with a good trade-off among speed, flexibility, and quality.
APA, Harvard, Vancouver, ISO, and other styles
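For readers unfamiliar with the feature-matching family contrasted in the abstract above, the sketch below illustrates the core AdaIN operation: the channel-wise statistics of the content features are re-normalized to match those of the style features. This is a minimal numpy illustration of the general technique, not code from any of the cited papers; the array shapes and the toy usage at the end are assumptions made for the example.

```python
import numpy as np

def adain(content_feat, style_feat, eps=1e-5):
    """Adaptive Instance Normalization: shift the channel-wise mean/std of the
    content feature map to match those of the style feature map.

    Both inputs are assumed to be encoder activations of shape (C, H, W).
    """
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True) + eps
    # normalize away the content statistics, then re-scale/shift with the style statistics
    return s_std * (content_feat - c_mean) / c_std + s_mean

# toy usage with random arrays standing in for real encoder activations
content = np.random.rand(64, 32, 32).astype(np.float32)
style = np.random.rand(64, 32, 32).astype(np.float32)
stylized_features = adain(content, style)
```

In a full arbitrary-style pipeline the re-normalized features are passed to a learned decoder; the speed/quality trade-off discussed in the abstract comes precisely from replacing per-image optimization with this single feed-forward statistic matching.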
5

Han, Xinying, Yang Wu, and Rui Wan. "A Method for Style Transfer from Artistic Images Based on Depth Extraction Generative Adversarial Network." Applied Sciences 13, no. 2 (January 8, 2023): 867. http://dx.doi.org/10.3390/app13020867.

Full text of the source
Abstract:
Depth extraction generative adversarial network (DE-GAN) is designed for artistic work style transfer. Traditional style transfer models focus on extracting texture features and color features from style images through an autoencoding network by mixing texture features and color features using high-dimensional coding. In the aesthetics of artworks, the color, texture, shape, and spatial features of the artistic object together constitute the artistic style of the work. In this paper, we propose a multi-feature extractor to extract color features, texture features, depth features, and shape masks from style images with U-net, multi-factor extractor, fast Fourier transform, and MiDas depth estimation network. At the same time, a self-encoder structure is used as the content extraction network core to generate a network that shares style parameters with the feature extraction network and finally realizes the generation of artwork images in three-dimensional artistic styles. The experimental analysis shows that compared with other advanced methods, DE-GAN-generated images have higher subjective image quality, and the generated style pictures are more consistent with the aesthetic characteristics of real works of art. The quantitative data analysis shows that images generated using the DE-GAN method have better performance in terms of structural features, image distortion, image clarity, and texture details.
APA, Harvard, Vancouver, ISO, and other styles
6

Ruder, Manuel, Alexey Dosovitskiy, and Thomas Brox. "Artistic Style Transfer for Videos and Spherical Images." International Journal of Computer Vision 126, no. 11 (April 21, 2018): 1199–219. http://dx.doi.org/10.1007/s11263-018-1089-z.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
7

Banar, Nikolay, Matthia Sabatelli, Pierre Geurts, Walter Daelemans, and Mike Kestemont. "Transfer Learning with Style Transfer between the Photorealistic and Artistic Domain." Electronic Imaging 2021, no. 14 (January 18, 2021): 41–1. http://dx.doi.org/10.2352/issn.2470-1173.2021.14.cvaa-041.

Full text of the source
Abstract:
Transfer Learning is an important strategy in Computer Vision to tackle problems in the face of limited training data. However, this strategy still heavily depends on the amount of available data, which is a challenge for small heritage institutions. This paper investigates various ways of enriching smaller digital heritage collections to boost the performance of deep learning models, using the identification of musical instruments as a case study. We apply traditional data augmentation techniques as well as an external, photorealistic collection distorted by Style Transfer. Style Transfer techniques are capable of artistically stylizing images, reusing the style from any other given image. Hence, collections can easily be augmented with artificially generated images. We introduce the distinction between inner and outer style transfer and show that artificially augmented images in both scenarios consistently improve classification results, on top of traditional data augmentation techniques. However, and counter-intuitively, such artificially generated artistic depictions of works are surprisingly hard to classify. In addition, we discuss an example of negative transfer within the non-photorealistic domain.
APA, Harvard, Vancouver, ISO, and other styles
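The inner/outer style transfer augmentation described above amounts to enlarging a small collection by stylizing photographs with artworks (or vice versa). The sketch below shows that augmentation loop in outline; `stylize` is a stand-in for any style transfer backend (for example an AdaIN or Gatys-style pipeline) and is an assumption for illustration, not a function from the paper.

```python
import random
from pathlib import Path

def augment_with_style_transfer(content_paths, style_paths, stylize, per_image=2):
    """Yield (augmented_image, source_path) pairs produced by stylizing each
    content image with randomly chosen style exemplars.

    `stylize(content_path, style_path)` is an assumed callable wrapping any
    style transfer model; it should return the stylized image.
    """
    for content_path in content_paths:
        for _ in range(per_image):
            style_path = random.choice(style_paths)
            yield stylize(content_path, style_path), content_path

# hypothetical usage: augment a small heritage collection with artwork styles
contents = sorted(Path("photos").glob("*.jpg"))
styles = sorted(Path("artworks").glob("*.jpg"))
# augmented = list(augment_with_style_transfer(contents, styles, stylize=my_model))
```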
8

Hien, Ngo Le Huy, Luu Van Huy, and Nguyen Van Hieu. "Artwork style transfer model using deep learning approach." Cybernetics and Physics 10, no. 3 (October 30, 2021): 127–37. http://dx.doi.org/10.35470/2226-4116-2021-10-3-127-137.

Full text of the source
Abstract:
Art in general, and fine arts in particular, play a significant role in human life, entertaining people, dispelling stress, and motivating their creativeness in specific ways. Many well-known artists have left a rich treasure of paintings for humanity, preserving their exquisite talent and creativity through unique artistic styles. In recent years, a technique called 'style transfer' has allowed computers to apply famous artistic styles to a picture or photograph while retaining the shape of the image, creating superior visual experiences. The basic model of that process, named 'Neural Style Transfer,' was introduced promisingly by Leon A. Gatys; however, it has several limitations in output quality and implementation time, making it challenging to apply in practice. Based on that basic model, an image transform network is proposed in this paper to generate higher-quality artwork and to perform on a larger number of images. The proposed model significantly shortens the execution time and can be implemented in a real-time application, providing promising results and performance. The outcomes are auspicious and can serve as a reference model for color grading or semantic image segmentation, and future research will focus on improving its applications.
APA, Harvard, Vancouver, ISO, and other styles
9

Lang, Langtian. "Style transfer with VGG19." Applied and Computational Engineering 6, no. 1 (June 14, 2023): 149–58. http://dx.doi.org/10.54254/2755-2721/6/20230752.

Full text of the source
Abstract:
Style transfer is a widely used technique in image and photograph processing that can transfer the style of one image to a target image with different content. This image processing technique has been used in the algorithms of some image processing software as well as in modern artistic creation. However, the intrinsic principle of style transfer and its transfer accuracy are still not clear and stable. This article discusses a new method for preprocessing image data that uses feature extraction to form vector fields and utilizes multiple VGG19 networks to separately train on the distinct features in images to obtain a better prediction effect. Our model can generate more autonomous and original images, rather than simply adding a style filter to an image, which can help the development of AI style transfer and painting.
APA, Harvard, Vancouver, ISO, and other styles
10

Dinesh Kumar, R., E. Golden Julie, Y. Harold Robinson, S. Vimal, Gaurav Dhiman, and Murugesh Veerasamy. "Deep Convolutional Nets Learning Classification for Artistic Style Transfer." Scientific Programming 2022 (January 10, 2022): 1–9. http://dx.doi.org/10.1155/2022/2038740.

Full text of the source
Abstract:
Humans have mastered the skill of creativity for many decades. The process of replicating this mechanism has recently been introduced using neural networks, which replicate the functioning of the human brain, where each unit in the network represents a neuron that transmits messages to other neurons in order to perform subconscious tasks. There are established methods to render an input image in the style of famous artworks; this problem of generating art is normally called non-photorealistic rendering. Previous approaches rely on directly manipulating the pixel representation of the image. Using deep neural networks constructed for image recognition, this paper instead carries out the implementation in a feature space representing the higher-level content of the image. Previously, deep neural networks have been used for object recognition and style recognition to categorize artworks according to their creation time. This paper uses the Visual Geometry Group (VGG16) neural network to replicate this dormant task performed by humans. Two images are given as input: a content image, which contains the features to be retained in the output, and a style reference image, which contains the patterns of a famous painting. The two are blended to produce a new image in which the base image is transformed to keep the content image's structure while being "sketched" to look like the style image.
APA, Harvard, Vancouver, ISO, and other styles
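As background to the Gatys-style pipeline summarized in the abstract above, the sketch below shows how a style loss is typically built from Gram matrices of CNN feature maps and combined with a content term. It is a hedged, framework-agnostic numpy illustration: the layer names, weights, and dictionary structure are assumptions for the example, not the configuration used in the cited paper.

```python
import numpy as np

def gram_matrix(feat):
    """Channel-by-channel correlations of a (C, H, W) feature map."""
    c, h, w = feat.shape
    flat = feat.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

def nst_loss(gen_feats, content_feats, style_feats,
             content_layer="conv4_2", content_weight=1.0, style_weight=1e3):
    """Weighted sum of a content term (feature MSE) and a style term (Gram MSE).

    Each argument is a dict mapping layer names to (C, H, W) activations;
    `gen_feats` must contain the content layer and all style layers.
    """
    content_loss = np.mean((gen_feats[content_layer] - content_feats[content_layer]) ** 2)
    style_loss = sum(
        np.mean((gram_matrix(gen_feats[layer]) - gram_matrix(style_feats[layer])) ** 2)
        for layer in style_feats
    )
    return content_weight * content_loss + style_weight * style_loss
```

In the optimization-based setting, the generated image itself is updated by gradient descent on this loss; feed-forward variants instead train a network to minimize it on average over many images.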

Dissertations on the topic "ARTISTIC STYLE TRANSFER"

1

Zabaleta, Razquin Itziar. "Image processing algorithms as artistic tools in digital cinema." Doctoral thesis, Universitat Pompeu Fabra, 2021. http://hdl.handle.net/10803/672840.

Full text of the source
Abstract:
The industry of cinema has experienced a radical change in the last decades: the transition from film cinematography to its digital format. As a consequence, several challenges have appeared, but, at the same time, many possibilities are now open for cinematographers to explore with this new medium. In this thesis, we propose different tools that can be useful for cinematographers while doing their craft. First, we develop a tool for automatic color grading. It is a statistics-based method to automatically transfer the style from a graded image to unprocessed footage. Some advantages of the model are its simplicity and low computational cost, which make it amenable to real-time implementation, allowing cinematographers to experiment on-set with different styles and looks. Then, a method for adding texture to footage is created. In cinema, the most commonly used texture is film grain, obtained either by shooting directly on film or by adding synthetic grain later at the post-production stage. We propose a model of "retinal noise" which is inspired by processes in the visual system and produces results that look natural and visually pleasing. It has parameters that allow the resulting texture appearance to be varied widely, which makes it an artistic tool for cinematographers. Moreover, due to the "masking" phenomenon of the visual system, the addition of this texture improves the perceived visual quality of images, resulting in bit-rate and bandwidth savings. The method has been validated through psychophysical experiments in which observers, including cinema professionals, preferred it over film grain emulation alternatives from academia and industry. Finally, we introduce a physiology-based image quality metric, which can have several applications in the image processing field, and more specifically in the cinema and broadcasting context: video coding, image compression, etc. We study an optimization of the model parameters in order to be competitive with state-of-the-art quality metrics. An advantage of the method is its reduced number of parameters compared with some state-of-the-art methods based on deep learning, which have a number of parameters several orders of magnitude larger.
APA, Harvard, Vancouver, ISO, and other styles
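The automatic color-grading tool described in the abstract is statistics-based. As a rough idea of what such a transfer can look like, the sketch below matches the per-channel mean and standard deviation of ungraded footage to a graded reference frame, in the spirit of Reinhard-style color transfer. This is only a generic illustration under assumed float RGB inputs in [0, 1]; it does not reproduce the thesis's actual model.

```python
import numpy as np

def match_color_statistics(source, reference, eps=1e-6):
    """Shift each channel of `source` so its mean/std match `reference`.

    Both images are float arrays of shape (H, W, 3) with values in [0, 1].
    Working in a decorrelated color space (e.g., Lab) usually improves results,
    but the statistic matching itself is identical.
    """
    src_mean = source.mean(axis=(0, 1))
    src_std = source.std(axis=(0, 1)) + eps
    ref_mean = reference.mean(axis=(0, 1))
    ref_std = reference.std(axis=(0, 1))
    graded = (source - src_mean) / src_std * ref_std + ref_mean
    return np.clip(graded, 0.0, 1.0)

# toy usage with random images standing in for real footage and a graded reference
footage = np.random.rand(270, 480, 3)
reference = np.random.rand(270, 480, 3)
graded_frame = match_color_statistics(footage, reference)
```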
2

Teixeira, Inês Filipa Nunes. "Artistic Style Transfer for Textured 3D Models." Master's thesis, 2017. https://repositorio-aberto.up.pt/handle/10216/106653.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
3

Teixeira, Inês Filipa Nunes. "Artistic Style Transfer for Textured 3D Models." Dissertação, 2017. https://repositorio-aberto.up.pt/handle/10216/106653.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
4

SAGAR. "ARTISTIC STYLE TRANSFER USING CONVOLUTIONAL NEURAL NETWORKS." Thesis, 2019. http://dspace.dtu.ac.in:8080/jspui/handle/repository/16763.

Full text of the source
Abstract:
One of the exciting research fields to have emerged is Neural Style Transfer (NST), a technique for transforming images in an artistic way. Two images are taken as input, namely a style image and a content image, and they are used to transform another base image with the help of an optimization technique. NST can be carried out with convolutional neural network (CNN) models, and many researchers have achieved good results using CNN architectures. One of the well-known and efficient pre-trained architectures is VGG16, and Gatys et al. [2] were able to generate good results based upon the VGG model. [2] Many popular mobile and web applications, such as DeepArt, Prisma, and Pikazoapp, have used these models to transform images in an artistic way. [6] [27] We first discuss different Neural Style Transfer techniques and then classify artistic style transfer approaches. We implement the model in Keras with the pre-trained VGG19 CNN, adjusting the hyperparameters and transformation coefficients. The VGG19 model has been trained on the ImageNet dataset, and we use it for feature extraction; for testing, we use two datasets, namely Caltech101 and Caltech256. The fundamentals of NST are also discussed in the in-depth literature survey, which can be found in Chapter 2.
APA, Harvard, Vancouver, ISO, and other styles
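The thesis above uses a pre-trained VGG19 in Keras purely as a feature extractor. A minimal sketch of that setup is shown below; `tf.keras.applications.VGG19` and the layer names follow the standard Keras API, but the specific layers, preprocessing, and loss wiring chosen in the thesis are not known, so treat this as an assumed configuration rather than the author's code.

```python
import tensorflow as tf

def build_vgg19_extractor(layer_names):
    """Return a model mapping an image batch to the activations of the given VGG19 layers."""
    vgg = tf.keras.applications.VGG19(include_top=False, weights="imagenet")
    vgg.trainable = False
    outputs = [vgg.get_layer(name).output for name in layer_names]
    return tf.keras.Model(inputs=vgg.input, outputs=outputs)

# commonly used layers: one deeper layer for content, several shallow layers for style
content_layers = ["block5_conv2"]
style_layers = ["block1_conv1", "block2_conv1", "block3_conv1", "block4_conv1", "block5_conv1"]
extractor = build_vgg19_extractor(style_layers + content_layers)

# images are expected as float batches preprocessed for VGG
image = tf.random.uniform((1, 224, 224, 3), maxval=255.0)
features = extractor(tf.keras.applications.vgg19.preprocess_input(image))
```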
5

Wang, Shen-Chi, and 王聖棋. "Paint Style Transfer System with the Artistic Database." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/7zz79n.

Full text of the source
Abstract:
Master's thesis
National Dong Hwa University
Department of Computer Science and Information Engineering
Academic year 95 (ROC calendar)
Digital painting synthesizes an output image with the paint styles of example images along with the input source image. However, the synthesis procedure usually requires user intervention in selecting patches from example images that best describe their paint styles. This thesis presents a systematic framework for synthesizing example-based rendering images that requires no user intervention in the synthesis procedure. An artistic database is built in this work, and the user can synthesize an image according to the paint styles of different well-known artists. We use a mean shift image segmentation procedure and a texture re-synthesis method to construct the artistic database, then find the correspondence between example textures and the mean-shifted regions of the input source image, and finally synthesize the output images using a patch-based sampling approach. The main contribution of this thesis is a systematic paint style transfer system for synthesizing a new image without requiring any user intervention. The artistic database is composed of re-synthesized mean-shifted example images from different artists, which are adopted as learning examples of the paint styles of different well-known artists during the synthesis procedure, and the system automatically synthesizes a new image in the paint style of the artist the user selects from the database.
APA, Harvard, Vancouver, ISO, and other styles
6

Tu, Ning, and 杜寧. "Video Cloning for Paintings via Artistic Style Transfer." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/6am2q7.

Full text of the source
Abstract:
Master's thesis
National Chung Cheng University
Graduate Institute of Computer Science and Information Engineering
Academic year 104 (ROC calendar)
In the past, the visual arts usually meant static art such as paintings, photography, and sculpture. In recent years, many museums, galleries, and art exhibitions have presented dynamic artworks for visitors to enjoy. The most famous dynamic artwork is the moving-painting version of "Along the River During the Qingming Festival". Nevertheless, it took two years to complete this work: each action of every character had to be planned first, then every video frame was drawn by animators, and finally many projectors were used to render the scene seamlessly on screen. In our research, we propose a method for generating animated paintings. It only needs the millions of videos available in existing online databases and requires users to perform some simple auxiliary operations to achieve the effect of animation synthesis. First, our system lets the user select an object of the desired class from the first video frame. We then employ random forests as the learning algorithm to retrieve from the video the object that the user wants to insert into an artwork. Second, we apply style transfer so that the video frames become consistent with the style of the painting. Finally, we use a seamless image cloning algorithm to yield a seamless synthesis result. Our approach allows different users to synthesize animated paintings according to their own preferences. The resulting work not only maintains the original author's painting style, but also generates a variety of artistic moods for people to enjoy.
APA, Harvard, Vancouver, ISO, and other styles
7

Rebelo, Ana Daniela Peres. "The impact of artificial intelligence on the creativity of videos." Master's thesis, 2020. http://hdl.handle.net/10773/30624.

Full text of the source
Abstract:
In this study, the impact of using an Artificial Intelligence (AI) algorithm, Style Transfer, on the evaluation of the creative elements of artistic videos was explored. The aim of this study was to verify to what extent the use of this system contributes to changes in the qualitative and quantitative perception of the elements of creativity present in the videos, and to verify what changes occur. An experimental study was carried out, including two sets of videos that were watched by two groups: 1) a control group (n = 49, composed by 25 experts and 24 non-experts); 2) an experimental group (n = 52, composed by 27 experts and 25 non-experts). The first set, consisting of six videos without AI transformation (shared videos), was shown to both groups, aiming at verifying the equivalence of evaluation criteria. The second set, consisting of six videos (differentiated) with a version transformed by AI (displayed to the experimental group) and another untransformed (displayed to the control group). Each participant was asked to rate the videos, on a five-point Likert scale, on six elements of creativity and to characterize the creativity of each video with two words. The quantitative results showed equivalence of criteria in the shared videos between the experimental group and the control group (only one of 36 comparisons showed significant differences). Regarding quantitative comparisons of the differentiated (experimental versus control), 10 evaluations showed no significant differences, while five had higher evaluations of creativity in the experimental group and five in the control group. Concerning qualitative comparisons, in general, the frequency of terms used by participants in both groups was similar in the shared videos. In the differentiated videos there were some differences. Taken together, the results emphasize the importance of human mediation in the application of an Artificial Intelligence algorithm in creative production, which reinforces the relevance of the concept of Hybrid Intelligence.
Master's in Multimedia Communication
APA, Harvard, Vancouver, ISO, and other styles
8

Li, Yu-Ting, and 李毓婷. "Feature-based Artistic Styles Transfer Using Effective Texture Synthesis." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/06806315150453764693.

Full text of the source
Abstract:
Master's thesis
I-Shou University
Master's Program, Department of Computer Science and Information Engineering
Academic year 96 (ROC calendar)
Texture synthesis has been widely studied in recent years, and patch-based sampling has proven superior in synthesis quality and computational time. Many techniques have also extended the algorithm to perform texture transfer, rendering, and image inpainting. These approaches can produce satisfactory results for a wide range of applications. However, they exhibit a severe drawback in execution time, because image transfer requires very time-consuming blending and iterative processing. At SIGGRAPH 2001, Hertzmann presented the image analogies technique. This method was developed to learn very complex and non-linear image filters, for instance filters that can convert a photograph into various types of artistic renderings having the appearance of oil, watercolor, or pen-and-ink, by analogy with actual (real-life) renderings in these styles. Although this method works well, it is less practical due to its high complexity and long synthesis time. Besides, an input sample image and that sample's filtered image are both required to render a new synthesized image. This work presents a feature-based artistic style transfer algorithm, which employs patch-based texture synthesis and speeds up the synthesis process by exploiting Particle Swarm Optimization (PSO) during the matching search. We add some new feature constraints to the existing algorithm, so that the output image has a visual effect similar to a manual painting. Once a sample image with a certain artistic style is presented, the algorithm can accomplish the image analogy by transferring the style to the target.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "ARTISTIC STYLE TRANSFER"

1

Engstrom, Craig Lee, and Derrick L. Williams. “Prisoners Rise, Rise, Rise!”. University of Illinois Press, 2017. http://dx.doi.org/10.5406/illinois/9780252037702.003.0009.

Full text of the source
Abstract:
This chapter provides a rhetorical analysis of “consciousness-raising hip-hop.” Merging personal stories with an encyclopedic knowledge of contemporary pop culture, it argues that a politically savvy subgenre of hip-hop artists are raising awareness about incarceration in the black community and producing effective strategies for community activism. The hip-hop movement plays an important role in illuminating the problems of the prison-industrial complex by creating spaces of prison protest and modeling sources of community care. The analysis of hip-hop focuses on the artists, music, and (life)styles that promote a type of citizen-orator that is Ciceronian in character. Particular attention is given to those hip-hop artists who fit the definition of “consciousness-raising” by providing hope to prisoners and communities working to transform the U.S. criminal-justice system.
APA, Harvard, Vancouver, ISO, and other styles
2

Lee, Adam. The Platonism of Walter Pater. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780198848530.001.0001.

Full text of the source
Abstract:
This book examines Walter Pater’s deep engagement with Platonism throughout his career, as a teacher of Plato in Oxford’s Literae Humaniores, from his earliest known essay, ‘Diaphaneitè’ (1864), to his final book, Plato and Platonism (1893), treating both his criticism and fiction, including his studies on myth. Pater is influenced by several of Plato’s dialogues, including Phaedrus, Symposium, Theaetetus, Cratylus, and The Republic, which inform his philosophy of aesthetics, history, myth, epistemology, ethics, language, and style. As a philosopher, critic, and artist, Plato embodies what it means to be an author to Pater, who imitates his creative practice from vision to expression. Through the recognition of form in matter, Pater views education as a journey to refine one’s knowledge of beauty in order to transform oneself. Platonism is a point of contact with his contemporaries, including Matthew Arnold and Oscar Wilde, offering a means to take new measure of their literary relationships. The philosophy also provides boundaries for critical encounters with figures across history, including Wordsworth, Michelangelo and Pico della Mirandola in The Renaissance (1873), Marcus Aurelius and Apuleius in Marius the Epicurean (1885), and Montaigne and Giordano Bruno in Gaston de Latour (1896). In the manner Platonism holds that soul or mind is the essence of a person, Pater’s criticism seeks the mind of the author as an affinity, so that his writing enacts Platonic love.
APA, Harvard, Vancouver, ISO, and other styles
3

Manieson, Victor. Accelerated Keyboard Musicianship. Noyam Publishers, 2021. http://dx.doi.org/10.38159/npub.eb20211001.

Full text of the source
Abstract:
Approaches towards the formal learning of piano playing with respect to musicianship is one that demands the understanding of musical concepts and their applications. Consequently, it requires the boldness to immerse oneself in performance situations while trusting one’s instincts. One needs only to cultivate an amazing ear and a good understanding of music theory to break down progressions “quickly”. Like an alchemist, one would have to pick their creative impulses from their musical toolbox, simultaneously compelling their fingers to coordinate with the brain and the music present to generate “pleasant sounds”. My exploration leading to what will be considered Keyboard Musicianship did not begin in a formal setting. Rather it was the consolidation of my involvement in playing the organ at home, Sunday school, boarding school at Presec-Legon, and playing at weekly gospel band performances off-campus and other social settings that crystalized approaches that can be formally structured. In fact, I did not then consider this lifestyle of musical interpretation worthy of academic inclusivity until I graduated from the national academy of music and was taken on the staff as an instructor in September, 1986. Apparently, what I did that seemed effortless was a special area that was integral to holistic music development. The late Dr. Robert Manford, the then director of the Academy, assigned me to teach Rudiments and Theory of Music to first year students, Keyboard Musicianship to final year students, and to continue giving Piano Accompaniment to students – just as I have been voluntarily doing to help students. The challenge was simply this; there was no official textbook or guide to use in teaching keyboard musicianship then and I was to help guide especially non-piano majors for practical exams in musicianship. What an enterprise! The good news though was that exemplifying functionalism in keyboard, organ, piano, etc. has been my survival activity off campus particularly in church and social settings.Having reflected thoroughly and prayerfully, it dawned on me that piano literacy repertoires were crafted differently than my assignments in Musicianship. Piano literacy repertoires of western music were abundant on campus but applied musicianship demanded a different approach. Playing a sonata, sonatina, mazurka, and waltzes at different proficiency levels was different from punching chords in R&B, Ballard style, Reggae, Highlife or even Hymn playing. However, there are approaches that can link them and also interpretations that can categorize them in other applicable dimensions. A “Retrospective Introspection” demanded that I confront myself constructively with two questions: 1. WHAT MUSICAL ACTIVITIES have I already enjoyed myself in that WARRANT or deserve this challenging assignment? 2. WHAT MUSICAL NOURISHMENT do l believe enriched my artistry that was so observable and Measurable? The answers were shocking! They were: 1. My weekend sojourn from Winneba to Accra to play for churches, brass bands, gospel bands and teaching of Choirs – which often left me penniless. 2. Volunteering to render piano accompaniment to any Voice Major student on campus since my very first year. 3. Applying a principle, I learnt from my father – TRANSFER OF LEARNING – I exported the functionalism of my off-campus musical activities to compliment my formal/academic work. 4. The improvisational influences of Rev. 
Stevenson Alfred Williams (gospel jazz pianist), Bessa Simmons (band director & keyboardist) and at Ghana Broadcasting Corporation, Mr. Ray Ellis “Afro Piano Jazz Fusion Highlife” The trust and support from lecturers and students in the academy injected an overwhelming and high sense of responsibility in me which nevertheless, guided me to observe structures of other established course outlines and apply myself with respect to approaches that were deemed relevant. Thus, it is in this light that I selected specific concepts worth exploring to validate the functionalism of what my assignment required. Initially, hymn structures, chords I, IV, V and short highlife chordal progressions inverted here and there were considered. Basic reading of notes and intense audiation were injected even as I developed technical exercises to help with the dexterity of stiff fingers. I conclude this preface by stating that, this “Instructional guide/manual” is actually a developmental workbook. I have deliberately juxtaposed simple original piano pieces with musicianship approaches. The blend is to equip learners to develop music literacy and performance proficiencies. The process is expected to compel the learner to immerse/initiate themselves into basic keyboard musicianship. While it is a basic book, I expect it to be a solid foundation for those who commit to it. Many of my former and present students have been requesting for a sort of guide to aid their teaching or refresh their memories. Though not exhaustive, the selections presented here are a response to a long-awaited workbook. I have used most of them not only in Winneba, but also at the Callanwolde Fine Arts Center (Atlanta) and the Piano Lab (Accra). I found myself teaching the same course in the 2009 – 2013 academic year in the Music Department of the University of Education, Winneba when Prof C.W.K Merekeu was Head of Department. My observation is that we still have a lot of work to do in bridging academia and industry. This implies that musicianship must be considered as the bloodline of musicality not only in theory but in practice. I have added simplified versions of my old course outlines as a guide for anyone interested in learning. Finally, I contend that Keyboard Musicianship is a craft and will require of the learner a consistent discipline and respect for: 1. The art of listening 2. Skill acquisition/proficient dexterity 3. Ability to interpret via extemporization and delivery/showmanship. For learners who desire to challenge themselves in intermediate and advanced piano, I recommend my book, “African Pianism. (A contribution to Africology)”
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "ARTISTIC STYLE TRANSFER"

1

Fu, Tsu-Jui, Xin Eric Wang, and William Yang Wang. "Language-Driven Artistic Style Transfer." In Lecture Notes in Computer Science, 717–34. Cham: Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-20059-5_41.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
2

Ruder, Manuel, Alexey Dosovitskiy, and Thomas Brox. "Artistic Style Transfer for Videos." In Lecture Notes in Computer Science, 26–36. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-45886-1_3.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
3

Qiu, Ting, Bingbing Ni, Ziang Liu, and Xuanhong Chen. "Fast Optimal Transport Artistic Style Transfer." In MultiMedia Modeling, 37–49. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-67832-6_4.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
4

Shibly, Kabid Hassan, Sazia Rahman, Samrat Kumar Dey, and Shahadat Hossain Shamim. "Advanced Artistic Style Transfer Using Deep Neural Network." In Cyber Security and Computer Science, 619–28. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-52856-0_49.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
5

Uhde, Florian, and Sanaz Mostaghim. "Towards a General Framework for Artistic Style Transfer." In Computational Intelligence in Music, Sound, Art and Design, 177–93. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-77583-8_12.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
6

Chowdhury, Atiqul Islam, Fairuz Shadmani Shishir, Ashraful Islam, Eshtiak Ahmed, and Mohammad Masudur Rahman. "Artistic Natural Images Generation Using Neural Style Transfer." In Advances in Intelligent Systems and Computing, 309–17. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-15-9927-9_31.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
7

Gupta, Bharath, K. Govinda, R. Rajkumar, and Jolly Masih. "Neural Artistic Style Transfer Using Deep Neural Networks." In Advances in Intelligent Systems and Computing, 1–12. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-6887-6_1.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
8

Zhang, Yinshu, Jiayi Chen, Xiangyu Si, Zhiqiang Tian, and Xuguang Lan. "Image Artistic Style Transfer Based on Color Distribution Preprocessing." In Communications in Computer and Information Science, 155–64. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-13-7983-3_14.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
9

Uhde, Florian. "Applicability of Convolutional Neural Network Artistic Style Transfer Algorithms." In Artificial Intelligence and the Arts, 61–81. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-59475-6_3.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
10

Uhde, Florian, and Sanaz Mostaghim. "Dissecting Neural Networks Filter Responses for Artistic Style Transfer." In Artificial Intelligence in Music, Sound, Art and Design, 297–312. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72914-1_20.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "ARTISTIC STYLE TRANSFER"

1

Xing, Yeli, Jiawei Li, Tao Dai, Qingtao Tang, Li Niu, and Shu-Tao Xia. "Portrait-Aware Artistic Style Transfer." In 2018 25th IEEE International Conference on Image Processing (ICIP). IEEE, 2018. http://dx.doi.org/10.1109/icip.2018.8451054.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
2

Zuo, Zhiwen, Lei Zhao, Shuobin Lian, Haibo Chen, Zhizhong Wang, Ailin Li, Wei Xing, and Dongming Lu. "Style Fader Generative Adversarial Networks for Style Degree Controllable Artistic Style Transfer." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/693.

Full text of the source
Abstract:
Artistic style transfer is the task of synthesizing content images with learned artistic styles. Recent studies have shown the potential of Generative Adversarial Networks (GANs) for producing artistically rich stylizations. Despite the promising results, they usually fail to control the generated images' style degree, which is inflexible and limits their applicability for practical use. To address the issue, in this paper, we propose a novel method that for the first time allows adjusting the style degree for existing GAN-based artistic style transfer frameworks in real time after training. Our method introduces two novel modules into existing GAN-based artistic style transfer frameworks: a Style Scaling Injection (SSI) module and a Style Degree Interpretation (SDI) module. The SSI module accepts the value of Style Degree Factor (SDF) as the input and outputs parameters that scale the feature activations in existing models, offering control signals to alter the style degrees of the stylizations. And the SDI module interprets the output probabilities of a multi-scale content-style binary classifier as the style degrees, providing a mechanism to parameterize the style degree of the stylizations. Moreover, we show that after training our method can enable existing GAN-based frameworks to produce over-stylizations. The proposed method can facilitate many existing GAN-based artistic style transfer frameworks with marginal extra training overheads and modifications. Extensive qualitative evaluations on two typical GAN-based style transfer models demonstrate the effectiveness of the proposed method for gaining style degree control for them.
APA, Harvard, Vancouver, ISO, and other styles
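To make the Style Scaling Injection idea in the abstract above more concrete, the toy sketch below scales intermediate feature activations by per-channel factors predicted from a scalar style degree factor. It is only a loose illustration of the mechanism the abstract describes, with an assumed linear predictor; it is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class StyleScalingInjection:
    """Map a scalar style degree factor (SDF) to per-channel scales for a feature map."""

    def __init__(self, channels):
        # assumed tiny linear predictor: scale_c = 1 + sdf * w_c
        self.weights = rng.normal(scale=0.1, size=channels)

    def __call__(self, features, sdf):
        # features: (C, H, W); sdf in [0, 1], 0 = weak stylization, 1 = full stylization
        scales = 1.0 + sdf * self.weights
        return features * scales[:, None, None]

# toy usage: dial the style degree of some intermediate activations up and down
features = rng.random((64, 32, 32)).astype(np.float32)
ssi = StyleScalingInjection(channels=64)
weakly_stylized = ssi(features, sdf=0.2)
fully_stylized = ssi(features, sdf=1.0)
```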
3

Kotovenko, Dmytro, Artsiom Sanakoyeu, Sabine Lang, and Bjorn Ommer. "Content and Style Disentanglement for Artistic Style Transfer." In 2019 IEEE/CVF International Conference on Computer Vision (ICCV). IEEE, 2019. http://dx.doi.org/10.1109/iccv.2019.00452.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
4

Liu, Zhi-Song, Li-Wen Wang, Wan-Chi Siu, and Vicky Kalogeiton. "Name your style: text-guided artistic style transfer." In 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW). IEEE, 2023. http://dx.doi.org/10.1109/cvprw59228.2023.00359.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
5

Xu, Wenju, Chengjiang Long, and Yongwei Nie. "Learning Dynamic Style Kernels for Artistic Style Transfer." In 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2023. http://dx.doi.org/10.1109/cvpr52729.2023.00972.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
6

Chen, Haibo, Lei Zhao, Zhizhong Wang, Huiming Zhang, Zhiwen Zuo, Ailin Li, Wei Xing, and Dongming Lu. "DualAST: Dual Style-Learning Networks for Artistic Style Transfer." In 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2021. http://dx.doi.org/10.1109/cvpr46437.2021.00093.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
7

Zhang, Xinwei, and Xiaoyun Chen. "Fast Artistic Style Transfer via Wavelet Transforms." In 2023 4th International Conference on Information Science, Parallel and Distributed Systems (ISPDS). IEEE, 2023. http://dx.doi.org/10.1109/ispds58840.2023.10235622.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
8

Buchnik, Itay, Or Berebi, Tammy Riklin Raviv, and Nir Shlezinger. "Generating Artistic Images Via Few-Shot Style Transfer." In 2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW). IEEE, 2023. http://dx.doi.org/10.1109/icasspw59220.2023.10193400.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
9

Bae, Eunjee, Jaekyung Kim, and Sanghoon Lee. "Point Cloud-Based Free Viewpoint Artistic Style Transfer." In 2023 IEEE International Conference on Multimedia and Expo Workshops (ICMEW). IEEE, 2023. http://dx.doi.org/10.1109/icmew59549.2023.00058.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
10

Dushkoff, Michael, Ryan McLaughlin, and Raymond Ptucha. "A temporally coherent neural algorithm for artistic style transfer." In 2016 23rd International Conference on Pattern Recognition (ICPR). IEEE, 2016. http://dx.doi.org/10.1109/icpr.2016.7900142.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles