A collection of scholarly literature on the topic "Transfert de style"
Format a source in APA, MLA, Chicago, Harvard, and other citation styles
Browse lists of current articles, books, dissertations, conference abstracts, and other scholarly sources on the topic "Transfert de style".
Next to each work in the list there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a .pdf file and read its abstract online, whenever this information is available in the source's metadata.
Journal articles on the topic "Transfert de style"
Duez, Bernard. "Le transfert comme paradigme processuel de la groupalité psychique : de l'habitude au style." Revue de psychothérapie psychanalytique de groupe 45, no. 2 (2005): 31. http://dx.doi.org/10.3917/rppg.045.0031.
Okubo, Miki. "La notion "gothique" traduite dans la culture pop du Japon contemporain." Jangada: crítica | literatura | artes 1, no. 17 (August 6, 2021): 390–408. http://dx.doi.org/10.35921/jangada.v1i17.351.
Koffi, Vivi, and Jean Lorrain. "L’intégration du successeur dans l’équipe de gestion des entreprises familiales : le cas des femmes chefs d’entreprise." Revue internationale P.M.E. 18, no. 3-4 (February 16, 2012): 73–92. http://dx.doi.org/10.7202/1008483ar.
Gold, Iwala. "L’emprunt En Traduction." Tasambo Journal of Language, Literature, and Culture 3, no. 01 (February 15, 2024): 390–95. http://dx.doi.org/10.36349/tjllc.2024.v03i01.045.
Lonvaud-Funel, Aline. "La désacidification biologique des vins. Etat de la question, perspectives d'avenir." OENO One 28, no. 2 (June 30, 1994): 161. http://dx.doi.org/10.20870/oeno-one.1994.28.2.1149.
Feuillat, François, Jean-René Perrin, René Keller, Danièle Aubert, Pierre Gelhaye, Claude Houssement, Jean Perrin, and Pierre Michel. "Simulation expérimentale de « l'expérience tonneau ». Mesure des cinétiques d'imprégnation du liquide dans le bois et d'évaporation de surface." OENO One 28, no. 3 (September 30, 1994): 227. http://dx.doi.org/10.20870/oeno-one.1994.28.3.1140.
Bouillon, Pierrette, and Katharina Boesefeldt. "Problèmes de traduction automatique dans les sous-langages des bulletins d’avalanches." Meta 37, no. 4 (September 30, 2002): 635–46. http://dx.doi.org/10.7202/002108ar.
Bara, Florence, and Marie-France Morin. "Est-il nécessaire d’enseigner l’écriture script en première année ? Les effets du style d’écriture sur le lien lecture/écriture." Nouveaux cahiers de la recherche en éducation 12, no. 2 (July 30, 2013): 149–60. http://dx.doi.org/10.7202/1017456ar.
Dorin, Stéphane. "Style du Velours : sociologie du transfert de capital symbolique entre Andy Warhol et le Velvet Underground (1965-1967)." A contrario 3, no. 1 (2005): 45. http://dx.doi.org/10.3917/aco.031.67.
Handayani, Sri. "La chanson folklorique enfantine comme media de l’apprentissage interculturel et du transfert des valeurs morales." Digital Press Social Sciences and Humanities 3 (2019): 00040. http://dx.doi.org/10.29037/digitalpress.43313.
Повний текст джерелаДисертації з теми "Transfert de style"
Mohammed, Omar. "Méthodes d'apprentissage approfondi pour l'extraction et le transfert de style." Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAT035.
One aspect of a successful human-machine interface (e.g. human-robot interaction, chatbots, speech, handwriting, etc.) is the ability to have a personalized interaction. This affects the overall human experience and allows for a more fluent interaction. At the moment, a lot of work uses machine learning to model such interactions. However, these models do not address the issue of personalized behavior: they average over the different examples from different people in the training set. Identifying human styles (personas) opens the possibility of biasing the models' output to take human preferences into account. In this thesis, we focus on the problem of styles in the context of handwriting. Defining and extracting handwriting styles is a challenging problem, since there is no formal definition of those styles (i.e., it is an ill-posed problem). Styles are both social - depending on the writer's training, especially in middle school - and idiosyncratic - depending on the writer's letter shaping (roundness, sharpness, etc.) and force distribution over time. As a consequence, there are no easy or generic metrics for measuring the quality of style in a machine's behavior. We may also want to change the task or adapt to a new person. Collecting data in the human-machine interface domain can be quite expensive and time-consuming. Although the new task usually has much in common with the old one, traditional machine learning techniques fail to take advantage of this commonality, leading to a quick degradation in performance. Thus, one objective of this thesis is to study and evaluate the idea of transferring knowledge about styles between different tasks, within the machine learning paradigm.
Available to us is the IRONOFF dataset, an online handwriting dataset with 410 writers and ~25K examples of uppercase letters, lowercase letters, and digit drawings. For transfer learning, we used an extra dataset, QuickDraw!, a sketch drawing dataset containing ~50 million drawings over 345 categories. The major contributions of this thesis are:
1) Proposing a pipeline to study the problem of styles in handwriting, including a methodology, benchmarks, and evaluation metrics. We chose the temporal generative models paradigm in deep learning to generate drawings and evaluate their proximity/relevance to the intended (ground truth) drawings. We proposed two metrics to evaluate the curvature and the length of the generated drawings. To ground these metrics, we proposed multiple benchmarks whose relative power we know in advance, and then verified that the metrics actually respect this relative power relationship.
2) Proposing a framework to study and extract styles, and verifying its advantage on the previously proposed benchmarks. We settled on a deep conditioned autoencoder to summarize and extract the style information without focusing on the task identity (since it is given as a condition). We validated this framework against the benchmarks using our evaluation metrics. We also visualized the extracted styles, leading to some exciting outcomes.
3) Using the proposed framework, proposing a way to transfer information about styles between different tasks, together with a protocol to evaluate the quality of the transfer. We leveraged the deep conditioned autoencoder by extracting its encoder part - which we believe holds the relevant information about styles - and reusing it in new models trained on new tasks. We extensively tested this paradigm over a range of tasks, on both the IRONOFF and QuickDraw! datasets. We show that we can successfully transfer style information between different tasks.
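The abstract above proposes length and curvature metrics for generated drawings without giving formulas. As a hedged illustration only, such metrics could be computed over a 2-D pen trajectory along the following lines (the discrete turning-angle notion of curvature and the function names are my assumptions, not the thesis's):

```python
import numpy as np

def drawing_length(points: np.ndarray) -> float:
    """Total arc length of an (N, 2) polyline of pen positions."""
    return float(np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1)))

def drawing_curvature(points: np.ndarray) -> float:
    """Sum of absolute turning angles along the polyline, a discrete
    stand-in for integrated curvature."""
    segments = np.diff(points, axis=0)
    angles = np.arctan2(segments[:, 1], segments[:, 0])
    turns = np.diff(angles)
    # wrap each turning angle into [-pi, pi)
    turns = (turns + np.pi) % (2 * np.pi) - np.pi
    return float(np.sum(np.abs(turns)))

# A unit square drawn counter-clockwise: length 4, three 90-degree turns.
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]], dtype=float)
print(drawing_length(square))     # 4.0
print(drawing_curvature(square))  # 3*pi/2, about 4.712
```

Metrics like these make "how curvy" and "how long" a generated drawing is directly comparable against the ground-truth trajectory.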
Cifka, Ondrej. "Deep learning methods for music style transfer." Electronic Thesis or Diss., Institut polytechnique de Paris, 2021. http://www.theses.fr/2021IPPAT029.
Recently, deep learning methods have enabled transforming musical material in a data-driven manner. The focus of this thesis is a family of tasks which we refer to as (one-shot) music style transfer, where the goal is to transfer the style of one musical piece or fragment onto another. In the first part of this work, we focus on supervised methods for symbolic music accompaniment style transfer, aiming to transform a given piece by generating a new accompaniment for it in the style of another piece. The method we have developed is based on supervised sequence-to-sequence learning using recurrent neural networks (RNNs) and leverages a synthetic parallel (pairwise aligned) dataset generated for this purpose using existing accompaniment generation software. We propose a set of objective metrics to evaluate performance on this new task, and we show that the system succeeds in generating an accompaniment in the desired style while following the harmonic structure of the input. In the second part, we investigate a more basic question: the role of positional encodings (PE) in music generation using Transformers. In particular, we propose stochastic positional encoding (SPE), a novel form of PE capturing relative positions while remaining compatible with a recently proposed family of efficient Transformers. We demonstrate that SPE allows for better extrapolation beyond the training sequence length than the commonly used absolute PE. Finally, in the third part, we turn from symbolic music to audio and address the problem of timbre transfer. Specifically, we are interested in transferring the timbre of an audio recording of a single musical instrument onto another such recording while preserving the pitch content of the latter.
We present a novel method for this task, based on an extension of the vector-quantized variational autoencoder (VQ-VAE), along with a simple self-supervised learning strategy designed to obtain disentangled representations of timbre and pitch. As in the first part, we design a set of objective metrics for the task, and we show that the proposed method outperforms existing ones.
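The VQ-VAE extension itself is not spelled out in this abstract. As a hedged illustration of the building block it extends, the core vector-quantization step maps each encoder output to its nearest codebook entry; the shapes and values below are arbitrary toy choices, not the thesis's:

```python
import numpy as np

def vector_quantize(z: np.ndarray, codebook: np.ndarray):
    """Replace each row of z (N, D) with its nearest row of codebook (K, D).
    Returns the chosen indices and the quantized vectors."""
    # squared Euclidean distances between every latent and every code
    d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = d.argmin(axis=1)
    return idx, codebook[idx]

codebook = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])  # toy codes
z = np.array([[0.1, -0.1], [0.9, 1.2]])                    # toy latents
idx, z_q = vector_quantize(z, codebook)
print(idx)  # [0 1]
```

In a full VQ-VAE the discrete bottleneck this step creates is what makes it possible to separate a compact learned code (here, timbre-like information) from conditioning signals such as pitch.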
Fares, Mireille. "Multimodal Expressive Gesturing With Style." Electronic Thesis or Diss., Sorbonne université, 2023. http://www.theses.fr/2023SORUS017.
The generation of expressive gestures allows Embodied Conversational Agents (ECAs) to articulate speech intent and content in a human-like fashion. The central theme of the manuscript is to leverage and control the ECAs' behavioral expressivity by modelling the complex multimodal behavior that humans employ during communication. The driving forces of the thesis are twofold: (1) to exploit speech prosody, visual prosody, and language with the aim of synthesizing expressive and human-like behaviors for ECAs; (2) to control the style of the synthesized gestures such that we can generate them with the style of any speaker. With these motivations in mind, we first propose a semantically aware and speech-driven facial and head gesture synthesis model trained on the TEDx Corpus, which we collected. We then propose ZS-MSTM 1.0, an approach to synthesize stylized upper-body gestures, driven by the content of a source speaker's speech and corresponding to the style of any target speaker, seen or unseen by our model. It is trained on the PATS Corpus, which includes multimodal data of speakers with different behavioral styles. ZS-MSTM 1.0 is not limited to PATS speakers and can generate gestures in the style of any newly arriving speaker without further training or fine-tuning, rendering our approach zero-shot. Behavioral style is modelled based on multimodal speaker data - language, body gestures, and speech - and is independent of the speaker's identity ("ID"). We additionally propose ZS-MSTM 2.0 to generate stylized facial gestures in addition to the upper-body gestures. We train ZS-MSTM 2.0 on the PATS Corpus, which we extended to include dialog acts and 2D facial landmarks.
Shen, Tianxiao. "Language style transfer." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/117822.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 41-45).
This thesis studies style transfer on the basis of non-parallel text. This is an instance of a broad family of problems including machine translation, decipherment, and attribute modification. The key challenge is to separate content from style in an unsupervised manner. We assume a shared latent content distribution across different text corpora, and propose a method that leverages refined alignment of latent representations to perform style transfer. The transferred sentences from one style should match example sentences from the other style as a population. To demonstrate the flexibility of the proposed model, we test it on three tasks: sentiment modification, decipherment of word substitution ciphers, and word order recovery. In both automatic and human evaluation our method achieves strong performance.
by Tianxiao Shen.
S.M.
Hart, David Marvin. "Light-Field Style Transfer." BYU ScholarsArchive, 2019. https://scholarsarchive.byu.edu/etd/7763.
Graffieti, Gabriele. "Style Transfer with Generative Adversarial Networks." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/17015/.
Matthews, Nicholas (Nicholas J.). "Evaluating style transfer in natural language." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/119734.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 46-47).
Style transfer is an active area of research growing in popularity in the natural language setting. The goal of this thesis is to present a comprehensive review of the style transfer tasks used to date, analyze these tasks, and delineate important properties and candidate tasks for future methods researchers. Several challenges still exist, including the difficulty of distinguishing between content and style in a sentence. While some state-of-the-art models attempt to overcome this problem, even tasks as simple as sentiment transfer are still non-trivial. Problems of granularity, transferability, and distinguishability have yet to be solved. I provide a comprehensive analysis of the popular sentiment transfer task along with a number of metrics that highlight its shortcomings. Finally, I introduce possible new tasks for consideration, news outlet style transfer and non-parallel error correction, and provide a similar analysis of the feasibility of using these tasks as style transfer baselines.
by Nicholas Matthews.
M. Eng.
Battilana, Pietro. "Convolutional Neural Networks for Image Style Transfer." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/16770/.
Shih, YiChang. "Data-driven photographic style using local transfer." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/99846.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 139-154).
After taking pictures, photographers often seek to convey their unique moods by altering the style of their photographs, which can involve meticulous contrast management, lighting, dodging, and burning. In this sense, not only are advanced photographers concerned about their pictures' styles; casual photographers who take pictures with cellphone cameras also process their pictures using built-in applications to adjust the image's luminance, coloring, and details. In general, photographers who stylize pictures give them new, different visual appearances, while also preserving the original content. In this context, we investigate problems with novel image stylization, including reproducing the precise time-of-day where the lighting and atmosphere can make a landscape glow, and making a portrait style resemble that created by a renowned photographer. Given an already captured image, however, automatically achieving given styles is challenging. In fact, changing the appearance in a photograph to mimic another time-of-day requires the analysis and modeling of complex 3-D physical light interactions in the scene, while reproducing a portrait photographer's unique style require computers to acquire artistic tastes and a glimpse of the artist's creative process. In this dissertation, we sidestep these Al-complete problems to instead leverage the power of data. We exploit an image database consisting of time-lapse data describing variations in scene appearance during the course of an entire day, and stylish portraits that are already deliberately processed by artists. To leverage these data, we present new algorithms that put input images in dense and local correspondence with examples. In our first method, we change the time-of-day with a single image as the input, which we put in correspondence with a reference time-lapse video. We then extract the local appearance transformations between different frames of the reference, and apply them to the input. 
In our second method, we transfer the style of a portrait onto a new input by way of local and multi-scale transformations. We demonstrate our methods on public datasets and a large set of photos downloaded from the Internet. We show that we can successfully handle lightings at different times of day and styles by a variety of different artists.
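The dense local correspondences underlying these methods are beyond a short snippet, but a much-simplified global relative, per-channel mean/variance matching in the spirit of classic statistics-based color transfer, conveys the flavor. The function name and array shapes below are my own, not the dissertation's:

```python
import numpy as np

def match_color_stats(inp: np.ndarray, ref: np.ndarray) -> np.ndarray:
    """Shift and scale each channel of `inp` (H, W, C) so its mean and
    standard deviation match those of `ref`: a global, statistics-only
    stand-in for the local appearance transfer described above."""
    out = inp.astype(float).copy()
    for c in range(out.shape[-1]):
        m_in, s_in = inp[..., c].mean(), inp[..., c].std()
        m_ref, s_ref = ref[..., c].mean(), ref[..., c].std()
        out[..., c] = (out[..., c] - m_in) / (s_in + 1e-8) * s_ref + m_ref
    return out

rng = np.random.default_rng(0)
day = rng.random((8, 8, 3))          # stand-in "input" image
golden_hour = rng.random((8, 8, 3))  # stand-in "reference" image
restyled = match_color_stats(day, golden_hour)
```

A local method would instead apply such a transformation per region or per matched patch, which is what lets it model spatially varying effects like a glowing horizon.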
by YiChang Shih.
Ph. D.
Tao, Joakim, and David Thimrén. "Smoothening of Software documentation : comparing a self-made sequence to sequence model to a pre-trained model GPT-2." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-178186.
This thesis was presented on June 22, 2021; the presentation was held online via Microsoft Teams.
Books on the topic "Transfert de style"
Style and the successful girl: Transform your look, transform your life. New York: Gotham Books, 2013.
Stencil style: Ideas and projects to transform your home. London: Ward Lock, 1995.
Riley, Lesley. Creative image transfer--any artist, any style, any surface: 16 new mixed-media projects using transfer artist paper. Lafayette, CA: C&T Publishing, 2014.
Bump it up: Transform your pregnancy into the ultimate style statement. New York: Ballantine Books, 2010.
Irv, Sternberg, ed. How to run your business so you can leave it in style. New York, NY: Amacom, American Management Association, 1990.
B, Carroll Kathryn, and Brown John H. 1947-, eds. The completely revised how to run your business so you can leave it in style. 2nd ed. [Denver, Colo.]: Business Enterprise Press, 1997.
Rosenberg, Merrick. Taking flight!: Master the four behavioral styles and transform your career, your relationships-- your life. Upper Saddle River, N.J: FT Press, 2013.
Hughes, Mike. Tweak to transform: Improving teaching : a practical handbook for school leaders. Stafford: Network Educational Press, 2002.
1969-, Wang Xun, ed. Ji ceng zhi li mo shi zhuan xing: Yang Cun ge an yan jiu = Administration style transfer at the grass-roots : a case study of Yang Village. Beijing Shi: She hui ke xue wen xian chu ban she, 2008.
Знайти повний текст джерелаЧастини книг з теми "Transfert de style"
Sarang, Poornachandra. "Style Transfer." In Artificial Neural Networks with TensorFlow 2, 577–611. Berkeley, CA: Apress, 2020. http://dx.doi.org/10.1007/978-1-4842-6150-7_12.
Liu, Shiguang. "Style Transfer." In Synthesis Lectures on Visual Computing: Computer Graphics, Animation, Computational Photography and Imaging, 55–64. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-26030-8_6.
Wutt, Karl. "Stile." In Edition Transfer, 54–57. Vienna: Springer Vienna, 2010. http://dx.doi.org/10.1007/978-3-211-99154-1_7.
Kim, Sunnie S. Y., Nicholas Kolkin, Jason Salavon, and Gregory Shakhnarovich. "Deformable Style Transfer." In Computer Vision – ECCV 2020, 246–61. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58574-7_15.
Chen, Dongdong, Lu Yuan, and Gang Hua. "Deep Style Transfer." In Computer Vision, 1–8. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-03243-2_863-1.
Dana, Kristin J. "Texture Style Transfer." In Computational Texture and Patterns, 47–50. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-031-01823-7_6.
Paper, David. "Fast Style Transfer." In State-of-the-Art Deep Learning Models in TensorFlow, 295–319. Berkeley, CA: Apress, 2021. http://dx.doi.org/10.1007/978-1-4842-7341-8_12.
Chen, Dongdong, Lu Yuan, and Gang Hua. "Deep Style Transfer." In Computer Vision, 269–76. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-63416-2_863.
Teoh, Teik Toe, and Zheng Rong. "Neural Style Transfer." In Machine Learning: Foundations, Methodologies, and Applications, 303–19. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-8615-3_19.
McAllister, Tyler, and Björn Gambäck. "Music Style Transfer Using Constant-Q Transform Spectrograms." In Artificial Intelligence in Music, Sound, Art and Design, 195–211. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-03789-4_13.
Повний текст джерелаТези доповідей конференцій з теми "Transfert de style"
Schaldenbrand, Peter, Zhixuan Liu, and Jean Oh. "StyleCLIPDraw: Coupling Content and Style in Text-to-Drawing Translation." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/688.
Ponamaryov, Valeriy, and Victor Kitov. "Image Style Transfer With a Group of Similar Styles." In 33rd International Conference on Computer Graphics and Vision. Keldysh Institute of Applied Mathematics, 2023. http://dx.doi.org/10.20948/graphicon-2023-565-571.
Chen, Jiafu, Boyan Ji, Zhanjie Zhang, Tianyi Chu, Zhiwen Zuo, Lei Zhao, Wei Xing, and Dongming Lu. "TeSTNeRF: Text-Driven 3D Style Transfer via Cross-Modal Learning." In Thirty-Second International Joint Conference on Artificial Intelligence {IJCAI-23}. California: International Joint Conferences on Artificial Intelligence Organization, 2023. http://dx.doi.org/10.24963/ijcai.2023/642.
Yi, Xiaoyuan, Zhenghao Liu, Wenhao Li, and Maosong Sun. "Text Style Transfer via Learning Style Instance Supported Latent Space." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/526.
Zuo, Zhiwen, Lei Zhao, Shuobin Lian, Haibo Chen, Zhizhong Wang, Ailin Li, Wei Xing, and Dongming Lu. "Style Fader Generative Adversarial Networks for Style Degree Controllable Artistic Style Transfer." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/693.
Lu, Haofei, and Zhizhong Wang. "Universal Video Style Transfer via Crystallization, Separation, and Blending." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/687.
Wang, Zhizhong, Lei Zhao, Haibo Chen, Zhiwen Zuo, Ailin Li, Wei Xing, and Dongming Lu. "DivSwapper: Towards Diversified Patch-based Arbitrary Style Transfer." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/690.
Yin, Di, Shujian Huang, Xin-Yu Dai, and Jiajun Chen. "Utilizing Non-Parallel Text for Style Transfer by Making Partial Comparisons." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/747.
Srivastava, Hardik, Sneha Sunil, and K. Shantha Kumari. "Neural Text Style Transfer with Custom Language Styles for Personalized Communication Systems." In 2022 International Conference on Knowledge Engineering and Communication Systems (ICKECS). IEEE, 2022. http://dx.doi.org/10.1109/ickecs56523.2022.10059789.
Semmo, Amir, Tobias Isenberg, and Jürgen Döllner. "Neural style transfer." In the Symposium. New York, New York, USA: ACM Press, 2017. http://dx.doi.org/10.1145/3092919.3092920.
Повний текст джерелаЗвіти організацій з теми "Transfert de style"
Stock, Gregory J. An Investigation of Applications of Neural Style Transfer to Forensic Footwear Comparison. Gaithersburg, MD: National Institute of Standards and Technology, 2023. http://dx.doi.org/10.6028/nist.gcr.23-040.
Stock, Gregory. An Investigation of Applications of Neural Style Transfer to Forensic Footwear Comparison. Gaithersburg, MD: National Institute of Standards and Technology, 2023. http://dx.doi.org/10.6028/nist.ir.8460.
Eyal, Yoram, and Sheila McCormick. Molecular Mechanisms of Pollen-Pistil Interactions in Interspecific Crossing Barriers in the Tomato Family. United States Department of Agriculture, May 2000. http://dx.doi.org/10.32747/2000.7573076.bard.
Law, Edward, Samuel Gan-Mor, Hazel Wetzstein, and Dan Eisikowitch. Electrostatic Processes Underlying Natural and Mechanized Transfer of Pollen. United States Department of Agriculture, May 1998. http://dx.doi.org/10.32747/1998.7613035.bard.
Xu, Jin-Rong, and Amir Sharon. Comparative studies of fungal pathogeneses in two hemibiotrophs: Magnaporthe grisea and Colletotrichum gloeosporioides. United States Department of Agriculture, May 2008. http://dx.doi.org/10.32747/2008.7695585.bard.