To view the other types of publications on this topic, follow this link: Audio cover song.

Journal articles on the topic "Audio cover song"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Select a source type:

Consult the top 29 journal articles for your research on the topic "Audio cover song".

Next to every work in the bibliography, an "Add to bibliography" option is available. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication in PDF format and read its abstract online, whenever the relevant parameters are available in the metadata.

Browse journal articles across a wide range of disciplines and compile your bibliography correctly.

1

Sarno, Riyanarto, Dedy Rahman Wijaya, and Muhammad Nezar Mahardika. "Music fingerprinting based on bhattacharya distance for song and cover song recognition". International Journal of Electrical and Computer Engineering (IJECE) 9, no. 2 (April 1, 2019): 1036. http://dx.doi.org/10.11591/ijece.v9i2.pp1036-1044.

Full text of the source
Abstract:
People often have trouble recognizing a song, especially when it is sung by an artist other than the original, i.e. a cover song. Hence, an identification system might be used to help recognize a song or to detect copyright violations. In this study, we try to recognize a song and a cover song using a fingerprint of the song represented by features extracted from MPEG-7. The fingerprint of the song is represented by the Audio Signature Type, while the fingerprint of the cover song is represented by Audio Spectrum Flatness and Audio Spectrum Projection. Furthermore, we propose a sliding algorithm and k-Nearest Neighbor (k-NN) with the Bhattacharyya distance for song recognition and cover song recognition. The results of this experiment show that the proposed fingerprint technique has an accuracy of 100% for song recognition and 85.3% for cover song recognition.
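The distance measure at the core of this entry is standard. As an illustration only (not the authors' implementation; the fingerprint vectors and reference names below are made up), a Bhattacharyya-distance nearest-neighbour lookup can be sketched in Python as:

```python
import math

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance between two discrete probability distributions."""
    # Bhattacharyya coefficient: sum of sqrt(p_i * q_i) over matching bins.
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    return -math.log(bc)

def nearest_song(query, references):
    """1-NN lookup: return the reference name whose fingerprint is closest."""
    return min(references, key=lambda name: bhattacharyya_distance(query, references[name]))

# Toy usage with hypothetical normalized fingerprints:
refs = {"original": [0.7, 0.3], "unrelated": [0.1, 0.9]}
print(nearest_song([0.7, 0.3], refs))
```

The sliding-window step of the paper is omitted; the sketch only shows how the distance drives the k-NN (here 1-NN) decision.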
APA, Harvard, Vancouver, ISO, and other citation styles
2

Maršík, Ladislav, Petr Martišek, Jaroslav Pokorný, Martin Rusek, Kateřina Slaninová, Jan Martinovič, Matthias Robine, Pierre Hanna, and Yann Bayle. "KaraMIR: A Project for Cover Song Identification and Singing Voice Analysis Using a Karaoke Songs Dataset". International Journal of Semantic Computing 12, no. 04 (December 2018): 501–22. http://dx.doi.org/10.1142/s1793351x18400202.

Abstract:
We introduce KaraMIR, a musical project dedicated to karaoke song analysis. Within KaraMIR, we define Kara1k, a dataset composed of 1000 cover songs provided by Recisio's Karafun application, and the corresponding 1000 songs by the original artists. Kara1k is dedicated mainly to cover song identification and singing voice analysis. For both tasks, Kara1k offers novel approaches, as each cover song is a studio-recorded song with the same arrangement as the original recording, but with different singers and musicians. Essentia, harmony-analyser, Marsyas, Vamp plugins and YAAFE have been used to extract audio features for each track in Kara1k. We provide metadata such as the title, genre, original artist, year, International Standard Recording Code, and the ground truths for the singer's gender, backing vocals, duets, and lyrics' language. The KaraMIR project focuses on defining new problems and describing features and tools to solve them. We thus provide a comparison of traditional and new features for a cover song identification task using statistical methods, as well as the dynamic time warping method on chroma, MFCC, chord, key, and chord-distance features. A supporting experiment on the singer gender classification task is also proposed. The KaraMIR project website facilitates continued research.
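The dynamic time warping comparison mentioned in this abstract is a textbook algorithm. A minimal sketch (illustrative only, with toy feature vectors rather than the Kara1k chroma/MFCC features) might look like:

```python
import math

def euclidean(x, y):
    """Frame-to-frame distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def dtw(seq_a, seq_b, dist):
    """Classic dynamic time warping cost between two feature sequences.

    cost[i][j] holds the minimal accumulated cost of aligning the first
    i frames of seq_a with the first j frames of seq_b.
    """
    n, m = len(seq_a), len(seq_b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = dist(seq_a[i - 1], seq_b[j - 1])
            # Allowed moves: insertion, deletion, match.
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m]

# Identical sequences align with zero cost:
print(dtw([[0.0], [1.0], [2.0]], [[0.0], [1.0], [2.0]], euclidean))
```

A cover and its original, played at different tempos, yield a low DTW cost because the warping path can stretch or compress time; unrelated songs yield a high one.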
3

Susanti, Eka Dian, Ari Sapto, and Dewa Agung Gede Agung. "Pengembangan Media ECHA (Elaboration, Cover Song, Historycal Content, Audio Visual) Berbasis Vlog Dalam Pembelajaran Sejarah". Jurnal Pendidikan: Teori, Penelitian, dan Pengembangan 5, no. 3 (March 9, 2020): 326. http://dx.doi.org/10.17977/jptpp.v5i3.13252.

Abstract:
Abstract: Vlog-based ECHA is a learning medium developed to suit the character and needs of students in history subjects. ECHA stands for elaboration, cover song, historycal content, and audio visual: the medium combines songs whose lyrics have been rewritten to carry historical material, presented in audio-visual form. ECHA was developed as a vlog to facilitate access for teachers and students, so that it can later be used independently. The results of the trial show that this product is practical and can increase motivation in class XI IPS 3 at SMA Negeri 1 Sumberpucung.
4

Cooper, B. Lee. "Promoting Social Change Through Audio Repetition". Journal of Popular Music Studies 30, no. 3 (September 3, 2018): 45–56. http://dx.doi.org/10.1525/jpms.2018.200015.

Abstract:
The development of contemporary American music is clearly reflected in the integration of black composers, performers, and their songs into mainstream popular record charts. Between 1953 and 1978 a fascinating role reversal occurred. During that quarter century black artists shifted from creators to revivalists. The same role reversal did not apply to white artists, who tended to evolve along a more consistent audience-acceptance continuum. How can this 25-year cycle of social change best be illustrated? What particular elements of black music dramatically entered the pop spectrum during the fifties, and later gained dominance by the end of the sixties? Why did black artists become more and more conservative during the late seventies? A careful examination of audio repetition – cover recordings and song revivals – offers a great deal of revealing information about changes in social, economic and artistic life in America after 1953.
5

Putra, I. Putu Lukita Wiweka Nugraha. "Kearifan Lokal Musikal dalam Lagu-lagu Album Bali Kumara". Journal of Music Science, Technology, and Industry 1, no. 1 (August 31, 2018): 99. http://dx.doi.org/10.31091/jomsti.v1i1.506.

Abstract:
Abstract: The topic of this article is the local wisdom in the songs of the first Bali Kumara album, and its purpose is to describe that local wisdom. The method used in the study is descriptive-qualitative. I Komang Darmayuda, the composer of the songs on the first Bali Kumara album, was the informant for this research, and the data are audio recordings of the album's songs. The Balinese local wisdom in these songs lies in the use of the pelog and slendro scales (titi laras), as well as in the use of the Balinese language registers singgih and sor. The pelog and slendro scales are implemented in the main melodies, whether played on keyboard or gamelan or sung. Some songs also contain other scales, such as diatonic and chromatic, which do not diminish the pelog and slendro colours. Seen from the point of view of sor singgih basa (the register levels of the Balinese language), most of the songs on the Bali Kumara album use basa singgih, and some use basa sor. Balinese pop songs, as one of the profane arts that grow and develop in Bali, should contain elements of Balinese local wisdom. The local wisdom contained in the songs of the Bali Kumara album can be a foothold for performers of Balinese pop songs, especially composers, in creating subsequent works, so that the identity of the Balinese pop song is maintained. Keywords: music, local wisdom, song, Bali Kumara album.
6

Ingolfsson, Vala, Carolina Montenegro, William D. Service, and Christopher B. Sturdy. "Manual versus automatic identification of black-capped chickadee (Poecile atricapillus) vocalizations". Alberta Academic Review 2, no. 2 (September 10, 2019): 41–42. http://dx.doi.org/10.29173/aar48.

Abstract:
One time-consuming aspect of bioacoustic research is identifying vocalizations from long audio recordings. SongScope (version 4.1.5. Wildlife Acoustics, Inc.) is a computer program capable of developing acoustic recognizers that can identify wildlife vocalizations. The goal of the current study was to compare the effectiveness of manual identification of black-capped chickadee vocalizations to identification by SongScope recognizers. A recognizer was developed for each main chickadee vocalization by providing previously annotated audio of chickadees. Six chickadees (three male, three female) were recorded in one-hour intervals with and without anthropogenic (i.e., man-made) noise to provide a variety of samples to test the recognizer. These recordings were analyzed via the recognizer and two human coders, with an additional third coder reviewing a random subset of recordings for reliability. Strong agreement was found between the human coders, κ = 0.76, p < 0.00. Agreement between human coders and the recognizer was moderate for fee songs, κ = 0.46, p < 0.00, and strong for fee-bee songs, κ = 0.77, p < 0.00, as well as for chick-a-dee calls, κ = 0.82, p < 0.00. Results showed that male chickadees produced more tseet calls in silence and females produced more gargle calls during noise. No differences were found in vocalizations based on time of day. Our observations also suggest that the chick-a-dee recognizer was capable of identifying gargle and tseet calls along with the intended chick-a-dee calls. Overall, SongScope was effective at identifying fee-bee songs and chick-a-dee calls, but not as effective for identifying fee songs. These recognizers can allow for faster acoustic analyses (by approximately four times) and be continuously improved for greater accuracy.
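The agreement statistics reported in this abstract are Cohen's kappa. A minimal illustrative implementation (not tied to SongScope or the study's data; the label lists below are invented) is:

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for agreement between two raters on categorical labels."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)
    # Observed agreement: fraction of items the raters label identically.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement: product of each rater's marginal rates.
    expected = sum((labels_a.count(c) / n) * (labels_b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Toy usage with hypothetical call-type labels from two coders:
coder_1 = ["fee-bee", "chick-a-dee", "fee-bee", "tseet"]
coder_2 = ["fee-bee", "chick-a-dee", "tseet", "tseet"]
print(cohens_kappa(coder_1, coder_2))
```

Values near 1 indicate strong agreement (as for the fee-bee songs and chick-a-dee calls above), values near 0 indicate chance-level agreement.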
7

Kocherzhuk, D. V. "Sound recording in pop art: differencing the «remake» and «remix» musical versions". Aspects of Historical Musicology 14, no. 14 (September 15, 2018): 229–44. http://dx.doi.org/10.34064/khnum2-14.15.

Abstract:
Background. Contemporary audio art, in its search for new sound design, and artists working in the music show business, in an attempt to draw attention to already well-known musical works, often turn to the forms of the “remake” or the “remix”. However, there are certain disagreements in the understanding of these terms among artists, vocalists, producers and professional sound engineering teams. It therefore becomes relevant to clarify the concepts of “remake” and “remix” and to identify the key differences between these musical phenomena. The article presents reasoned positions, from the point of view of art criticism, concerning the misunderstanding of these terms, which are widely used in the media industry. The objective of the article is to explore the key differences between such principles of processing borrowed musical material as the “remix” and the “remake” in contemporary popular music, in particular in recording studios. Research methodology. In the course of the study, the two concepts “remake” and “remix” were considered and compared using practical examples from the works of famous pop vocalists from Ukraine and abroad. The research methodology thus includes analysis of examples from Ukrainian, Russian and world show business and of the existing definitions of the concepts “remake” and “remix”; comparison, verification and coordination of the latter; and the formalization and generalization of data to obtain the results of the study. Modern strategies in the development of “remake” variants in the work of musicians are taken into account, and the latest trends in the creation of “remix” versions by world-class artists and performers of contemporary Ukrainian pop music are reflected. The results of the study. The research results reveal the significance of the terminology pair “remix” and “remake” in the activities of the pop singer.
It was found that not all artists in the music industry understand the differences between these two similar terms. The article analyzes the main scholarly works of specialists in the audiovisual and musical arts and in philosophical and sociological areas who have addressed this issue in the structure of music, such as the studies by V. Tormakhova, V. Otkydach, V. Myslavskyi, I. Tarasova, Yu. Koliadych, L. Zdorovenko and several others, and on this basis reveals the essence of the concepts “remake” and “remix”. The phenomenon of the “remake” is described in detail in the dictionary of V. Mislavsky [5], where the author separately outlined the concept of “remake” not only in musical art but also in the film industry and the structure of video games. The researcher I. Tarasova also notes the term “remake” in connection with the problem of intellectual property protection and the certification of the copyright of the performer and the composer who made the original version of the work [13]. At the same time, the term “remix” has not yet found a precise definition in musicology. In contemporary youth pop culture, the principle of varying someone else’s musical material called “remix” is associated with club dance music, while the principle of the “remake” is associated with the interpretation of another’s musical work by other artist-singers. A “remake” is a new version or interpretation of a previously published work [5: 31]. Close to the concept of the “remake” is the term “cover version”, which is now used even more often in the field of modern pop music: a repetition of the storyline laid down by the author or performer of the original version, but in another artist’s own interpretation, while the texture and structure of the work are preserved. A. M. Tormakhova deciphered the term “remake” as a wide spectrum of changes in the musical material associated with the repetition of plot themes and techniques [14: 8].
In a general sense, “a wide spectrum of changes” covers not only the technical and emotional interpretation of the work, including the changes the performer makes in style, tempo, rhythm and tessitura, but also an aspect of composing activity. For a composer, it is an expression of creative thinking, the embodiment of his own vision in the arrangement of the material. For a sound director and a sound engineer, a “remix” means working with computer programs and saturating music with sound effects; for a producer and media corporations, it is a business. The “remake” is a rather controversial phenomenon in the music world. On the one hand, it is training for beginners in the field of art; on the other hand, the use of someone else’s musical material can border on plagiarism and provoke conflicts between artists. From the point of view of show business, a “remake” is only a method of reminding the public of a piece for the purpose of its commercial use, no matter who performs the song. Basically, an agreement is concluded between the artists on the transfer of copyright and of the right to perform the work for profit. For example, the song “Diva” by F. Kirkorov is a “remake” of a work borrowed from another performer, the winner of the Eurovision Song Contest 1998, Dana International [17; 20], which is reflected in the relevant agreement on the commercial use of the musical material. A remix as a music product is created using computer equipment or the Live Looping music platform by processing the original and introducing various sound effects into the initial track. Interest in this principle of material processing arose in the 1980s, when dance, club and DJ music entered mass use [18]. A remix can be considered a single piece of music taken as the main component, which is complemented in sequence by the components of the DJ profile.
These can be various samples, changes to the playback speed or the tonality of the work, the “mutation” of the soloist’s voice, or the saturation of the voice with effects to achieve a uniform musical ensemble. The commercial activities of entertainment venues (clubs, concert halls, etc.) contribute to the development of the “remix” phenomenon. The remix principle is connected with the renewal of a musical “hit” whose popularity has gradually decreased and whose rotation during broadcast no longer reaches a certain number of listeners. Conclusions. The musical art of the 21st century is full of new experimental and creative phenomena. The process of the birth of modified forms of pop works deserves constant attention not only from representatives of the show business and audiovisual industries but also from musicologists. Such popular musical phenomena as the “remix” and the “remake” have a number of differences. A “remix” is a technical form of interpreting a piece of music with the help of computer processing of both instrumental parts and voices; it is associated with the introduction of new, often very heterogeneous, elements and with tempo changes. A musical product created according to this principle is intended for listeners of “club music” and is not related to the studio work of the performer. The main feature of the “remake” is the presence of studio work by the sound engineer, composer and vocalist; this work is aimed at modernizing the character of the song so that it differs from the original version. The texture of the original composition should, at base, be preserved, but it can be saturated with new sound elements, and the vocal line and harmony can be partially changed according to the interpreter’s own scheme.
Introducing scientific definitions of these terms into the common base of musical concepts, and further in-depth study of all the theoretical and practical components behind them, will contribute to correct orientation in terminology among scholars of the artistic sphere and performing vocalists.
8

Hao, Yiya, Yaobin Chen, Weiwei Zhang, Gong Chen, and Liang Ruan. "A real-time music detection method based on convolutional neural network using Mel-spectrogram and spectral flux". INTER-NOISE and NOISE-CON Congress and Conference Proceedings 263, no. 1 (August 1, 2021): 5910–18. http://dx.doi.org/10.3397/in-2021-11599.

Abstract:
Audio processing, including speech enhancement, improves speech intelligibility and quality in real-time communication (RTC) such as online meetings and online education. However, such processing, primarily noise suppression and automatic gain control, is harmful to music quality when the captured signal is music instead of speech. A music detector can solve this issue by switching off the speech processing when music is detected. In RTC scenarios, the music detector should be low-complexity and cover various situations, including different types of music, background noises, and other acoustical environments. In this paper, a real-time music detection method with low computational complexity is proposed, based on a convolutional neural network (CNN) using the Mel-spectrogram and spectral flux as input features. The proposed method achieves 90.63% overall accuracy across different music types (classical music, instrument solos, sung songs, etc.), speech languages (English and Mandarin), and noise types. The proposed method is built on a lightweight CNN model with a small feature size, which guarantees real-time processing.
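Spectral flux, one of the two input features named in this abstract, is commonly defined as the half-wave-rectified frame-to-frame difference of the magnitude spectrum. An illustrative sketch (naive DFT on toy frames; not the paper's implementation, which would use an FFT and Mel filtering) is:

```python
import cmath
import math

def magnitude_spectrum(frame):
    """Naive DFT magnitude spectrum of one frame (illustration only)."""
    n = len(frame)
    return [abs(sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
            for k in range(n // 2 + 1)]

def spectral_flux(frames):
    """Half-wave-rectified spectral flux between consecutive frames.

    Only increases in spectral energy count, so note onsets (common in
    music) produce large flux values while steady noise produces small ones.
    """
    spectra = [magnitude_spectrum(f) for f in frames]
    flux = []
    for prev, cur in zip(spectra, spectra[1:]):
        flux.append(sum(max(c - p, 0.0) for c, p in zip(cur, prev)))
    return flux

# A silent frame followed by an impulse gives a positive flux value:
print(spectral_flux([[0.0, 0.0, 0.0, 0.0], [1.0, 0.0, 0.0, 0.0]]))
```

A sequence of such flux values, alongside the Mel-spectrogram, would form the per-frame feature input to a classifier.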
9

Waldron, David. "Witchcraft for Sale! Commodity vs. Community in the Neopagan Movement". Nova Religio 9, no. 1 (August 1, 2005): 32–48. http://dx.doi.org/10.1525/nr.2005.9.1.032.

Abstract:
The growth of the Pagan and Witchcraft revivalist movements (Neopaganism) is well documented in the Anglophone world. However, Witchcraft movements are also closely linked to a vibrant set of subcultures and a multitude of representations in popular culture. In this context, investigating the relationship between Witchcraft as a religious community and its representation in consumer culture and mass media is extremely significant. This article examines the ambiguous relationship between witch and Wiccan communities and the vast array of merchandising, popular culture and media representations that surround them. In creating the WIKID WITCH KIT I hope to take you on a magickal and exciting journey! Through ritual, music, song and spoken word I will help you unleash your inner magick and discover the wonderful and positively empowering world of Witchcraft. As part of this journey you will discover your WIKID magickal name, giving you access to our exclusive website and online coven. There you can meet up with other WIKID Witches to swap spells, stories, and ideas. And every full moon I will personally join you for an online gathering, which will be truly WIKID. WIKID Witch Kit features WIKID Magick Fizz/WIKID Magick Potions/WIKID Magick Fire/WIKID Magick Star/WIKID Magick Cord/WIKID Magick Audio CD.
10

Williams, Richard David, and Rafay Mahmood. "A Soundtrack for Reimagining Pakistan? Coke Studio, Memory and the Music Video". BioScope: South Asian Screen Studies 10, no. 2 (December 2019): 111–28. http://dx.doi.org/10.1177/0974927619896771.

Abstract:
Since 2007, Coke Studio has rapidly become one of the most influential platforms in televisual, digital and musical media, and has assumed a significant role in generating new narratives about Pakistani modernity. The musical pieces in Coke Studio’s videos re-work a range of genres and performing arts, encompassing popular and familiar songs, as well as resuscitating classical poetry and the musical traditions of marginalised communities. This re-working of the creative arts of South Asia represents an innovative approach to sound, language, and form, but also poses larger questions about how cultural memory and national narratives can be reimagined through musical media, and then further reworked by media consumers and digital audiences. This article considers how Coke Studio’s music videos have been both celebrated and criticised, and explores the online conversations that compared new covers to the originals, be they much loved or long forgotten. The ways in which the videos are viewed, shared, and dissected online sheds light on new modes of media consumption and self-reflection. Following specific examples, we examine the larger implications of the hybrid text–video–audio object in the digital age, and how the consumers of Coke Studio actively participate in developing new narratives about South Asian history and Pakistani modernity.
11

Okuno, Hiroshi G., and Kazuhiro Nakadai. "Special Issue on Robot Audition Technologies". Journal of Robotics and Mechatronics 29, no. 1 (February 20, 2017): 15. http://dx.doi.org/10.20965/jrm.2017.p0015.

Abstract:
Robot audition, the ability of a robot to listen to several things at once with its own “ears,” is crucial to the improvement of interactions and symbiosis between humans and robots. Since robot audition was originally proposed and has been pioneered by Japanese research groups, this special issue on robot audition technologies of the Journal of Robotics and Mechatronics covers a wide collection of advanced topics studied mainly in Japan. Specifically, two consecutive JSPS Grants-in-Aid for Scientific Research (S) on robot audition (PI: Hiroshi G. Okuno) from 2007 to 2017, the JST Japan-France Research Cooperative Program on binaural listening for humanoids (PI: Hiroshi G. Okuno and Patrick Danès) from 2009 to 2013, and the ImPACT Tough Robotics Challenge (PM: Prof. Satoshi Tadokoro) on extreme audition for search and rescue robots since 2015 have contributed to the promotion of robot audition research, and most of the papers in this issue are the outcome of these projects. Robot audition was surveyed in the special issue on robot audition in the Journal of the Robotics Society of Japan, Vol.28, No.1 (2011) and in our IEEE ICASSP-2015 paper. This issue covers the most recent topics in robot audition, except for human-robot interaction, which is covered by many papers appearing in Advanced Robotics as well as other journals and international conferences, including IEEE IROS. This issue consists of twenty-three papers accepted through peer review. They are classified into four categories: signal processing, music and pet robots, search and rescue robots, and monitoring animal acoustics in natural habitats. In signal processing for robot audition, Nakadai, Okuno, et al. report on the HARK open source software for robot audition, Takeda, et al. develop noise-robust MUSIC sound source localization (SSL), and Yalta, et al. use deep learning for SSL. Odo, et al. develop active SSL by moving artificial pinnae, and Youssef, et al. propose binaural SSL for an immobile or mobile talker. Suzuki, Otsuka, et al. evaluate the influence of six impulse-response-measuring signals on MUSIC-based SSL, Sekiguchi, et al. give an optimal allocation of distributed microphone arrays for sound source separation, and Tanabe, et al. develop 3D SSL using a microphone array and LiDAR. Nakadai and Koiwa present audio-visual automatic speech recognition, and Nakadai, Tezuka, et al. suppress ego-noise, that is, noise generated by the robot itself.
In music and pet robots, Ohkita, et al. propose audio-visual beat tracking for a robot to dance with a human dancer, and Tomo, et al. develop a robot that operates a wayang puppet, an Indonesian world cultural heritage, by recognizing emotion in Gamelan music. Suzuki, Takahashi, et al. develop a pet robot that approaches a sound source. In search and rescue robots, Hoshiba, et al. implement real-time SSL with a microphone array installed on a multicopter UAV, and Ishiki, et al. design a microphone array for multicopters. Ohata, et al. detect a sound source with a multicopter microphone array, and Sugiyama, et al. identify detected acoustic events through a combination of signal processing and deep learning. Bando, et al. enhance the human voice online and offline for a hose-shaped rescue robot with a microphone array. In monitoring animal acoustics in natural habitats, Suzuki, Matsubayashi, et al. design and implement HARKBird, Matsubayashi, et al. report on the experience of monitoring birds with HARKBird, and Kojima, et al. use a spatial-cue-based probabilistic model to analyze the songs of birds singing in their natural habitat. Aihara, et al. analyze a chorus of frogs with dozens of Firefly sound-to-light conversion devices, the design and analysis of which are reported by Mizumoto, et al. The editors and authors hope that this special issue will promote the further evolution of robot audition technologies in a diversity of applications.
12

Lawrentschuk, Nathan, Andrew Evans, John Srigley, Joseph L. Chin, Bish Bora, Amber Hunter, Robin McLeod, and Neil E. Fleshner. "Surgical margin status among men with organ-confined (pT2) prostate cancer: a population-based study". Canadian Urological Association Journal 5, no. 3 (April 4, 2013): 161. http://dx.doi.org/10.5489/cuaj.637.

Abstract:
Background: Following prostate cancer surgery, positive surgicalmargin (PSM) status varies among institutions and there is evidencethat high-volume surgeons and centres obtain better oncologicalresults. However, larger studies recording PSM for radicalprostatectomy (RP) are from large “centres of excellence” and notpopulation-based. Cancer Care Ontario undertook an audit ofpathology reports to determine the province-wide PSM rate forpathological stage T2 (pT2) disease prostate cancer and to assessthe overall and regional-based PSM rates based on surgical volumeto understand gaps in quality of care prior to undertaking qualityimprovement initiatives.Methods: Data were extracted as part of the Pathology ProjectAudit data output (2005, 2006). Pathology reports were submittedto Cancer Care Ontario by Ontario hospitals electronically viathe Pathology Information Management System. An experiencedcancer pathology coder extracted the PSM data from eligible RPcancer specimen pathology reports. Only reports that provideda pathological stage were included in the analysis. Biopsy andtransurethral resection of the prostate reports were excluded. Aconvenience sample of 1346 reports from 2006 and 728 from2005 were analyzed. Regression analysis was performed to assessvolume-margin associations.Results: The median province-wide surgical PSM rate for pT2disease was 33%, ranging 0-100% among 43 hospitals whereRP volumes ranged 12-625. There was no significant correlation(p > 0.05) between volume and PSM by logistic regression withvariable odds ratios (95% confidence interval [CI]) for PSM by quartile(1st = 1.66 [0.93-2.96]; 2nd = 0.97 [0.58-1.62]; 3rd = 1.44[0.91-2.29]) compared to the highest volume last quartile. 
Mean PSMrates between community and teaching hospitals were not significantlydifferent.Conclusions: The province-wide PSM rate for pT2 disease prostatecancer undergoing RP is higher than those published from “centresof excellence.” Results from larger volume centres were not statisticallysignificantly better, which contradicts previously publisheddata. Factors, such as individual surgeon, patient selection, pathologicalprocessing and interpretation, may explain the differences.Contexte : Après une chirurgie pour traiter un cancer de la prostate,la présence de marges chirurgicales positives (MCP) varie d’unétablissement à l’autre. Des données montrent que les chirurgiens etles centres qui traitent des nombres élevés de patients obtiennent demeilleurs résultats oncologiques. Cela dit, les études de plus grandeenvergure ayant noté la présence de MCP après une prostatectomieradicale (PR) ont été menées dans de grands « centres d’excellence »et ne sont donc pas fondées sur la population. Action Cancer Ontarioa entrepris une vérification de rapports de pathologie afin de déterminerle taux provincial de MCP pour le cancer de la prostate et lestaux de MCP en fonction du nombre de chirurgies dans le but decomprendre les lacunes dans la qualité des soins avant de lancer desinitiatives d’amélioration de la qualité.Méthodologie : Les données ont été obtenues par le PathologyProject Audit (2005, 2006). Des rapports de pathologie ont été soumispar voie électronique à Action Cancer Ontario par des hôpitauxde la province par le biais du Système de gestion d’informationpathologique. Un programmeur expérimenté en pathologie cancéreusea extrait l’information concernant les MCP des rapports depathologie portant sur des échantillons provenant de cas admissiblesde cancer de la prostate traités par PR. Seuls les rapportsfournissant un stade pathologique ont été inclus dans l’analyse. Lesrapports concernant les biopsies et résections transurétrales de laprostate ont été exclus. 
A convenience sample of 1346 reports from 2006 and 728 reports from 2005 was analyzed. Regression analysis was used to assess associations between case volume and surgical margins. Results: The median province-wide PSM rate for pT2 cases was 33%, ranging from 0 to 100% across 43 hospitals where RP volumes ranged from 12 to 625. No significant correlation (p > 0.05) was found between surgical volume and PSM on logistic regression, with odds ratios (95% confidence interval [CI]) for positive surgical margins by quartile (1st = 1.66 [0.93-2.96]; 2nd = 0.97 [0.58-1.62]; 3rd = 1.44 [0.91-2.29]) compared with the highest-volume last quartile. PSM rates did not differ significantly between community and teaching hospitals. Conclusions: The province-wide PSM rate for pT2 prostate cancer cases undergoing RP is higher than the rates reported from “centres of excellence.” Results from higher-volume centres were not statistically significantly better, which contradicts previously published data. Factors such as the individual surgeon, patient selection, and pathological processing and interpretation may explain the differences.
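The quartile results above come from a logistic regression reported as odds ratios with 95% confidence intervals. As a sketch of the underlying arithmetic only, an unadjusted odds ratio with a Wald-type CI for one volume quartile against the reference quartile can be computed as follows (the function name and all counts are hypothetical illustrations, not figures from the audit):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald-type
    95% confidence interval computed on the log scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (not from the audit): PSM+/PSM- cases in a
# low-volume quartile versus the highest-volume reference quartile.
or_, lo, hi = odds_ratio_ci(60, 120, 50, 150)
```

A CI that spans 1.0, as in each quartile reported above, indicates no statistically significant volume-margin association.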
APA, Harvard, Vancouver, ISO and other citation styles
13

Bonham, Oliver, Bruce Broster, David Cane, Keith Johnson, and Kate MacLachlan. “The Development of Canada's Competency Profile for Professional Geoscientists at Entry-to-Practice.” Geoscience Canada 44, No. 2 (21 July 2017): 77–84. http://dx.doi.org/10.12789/geocanj.2017.44.118.

Full text of the source
Annotation:
Competency-based assessment approaches to professional registration reflect the move by professions, both in Canada and around the world, away from traditional credentials-based assessments centred on a combination of academic achievements and supervised practice time. Entry-to-practice competencies are the abilities required to enable effective and safe entry-level practice in a profession. In 2012, Geoscientists Canada received funding from the Government of Canada’s Foreign Credentials Recognition Program. A central component of the funding involved the development of a competency profile to assist in assessment for licensing in the geoscience profession. Work concluded with the approval of the Competency Profile for Professional Geoscientists at Entry to Practice by Geoscientists Canada in November 2014. The Competency Profile comprises concise statements in plain language, setting out the skills and abilities that are required to be able to work as a geoscientist, in an effective and safe manner, independent of direct supervision. It covers competencies common to all geoscientists; competencies for the primary subdisciplines of geoscience (geology, environmental geoscience and geophysics); and a generic set of high-level competencies that can apply in any specific work context in geoscience. The paper is in two parts. Part 1 puts the concept of competencies in context and describes the approach taken to develop the profile, including: input from Subject Matter Experts (practising geoscientists representing a diverse sampling of the profession); extensive national consultation and refinement; and a validation procedure, including a survey of practising Canadian geoscientists. Part 2 introduces the profile, explains its structure, and provides examples of some of the competencies. The full competency profile can be obtained from the Geoscientists Canada website www.geoscientistscanada.ca.
Future work will identify specific indicators of proficiency related to each competency and suggest appropriate methodologies to assess such competencies. It will also involve mapping the profile to the existing Canadian reference standard, Geoscience Knowledge and Experience Requirements for Professional Registration in Canada. RÉSUMÉ: Competency-based assessment approaches to professional registration reflect the move by professions, both in Canada and throughout the world, away from traditional credentials-based assessments centred on a combination of academic achievement and supervised practice time. Entry-level competencies are the abilities required for effective and safe practice at that level in a profession. In 2012, Geoscientists Canada received funding from the Government of Canada's Foreign Credentials Recognition Program. A central component of the funding involved the development of a competency profile to facilitate assessment for licensure in the geoscience profession. This work concluded in November 2014 with the approval by Geoscientists Canada of the Competency Profile for Professional Geoscientists at Entry to Practice. The competency profile comprises concise plain-language statements defining the skills and abilities required to practise as a geoscientist effectively, safely, and independently of any direct supervision. It covers competencies common to all geoscientists; competencies for the primary subdisciplines of geoscience (geology, environmental geoscience and geophysics); and a generic set of high-level competencies applicable in any specific geoscience work context. The document comprises two parts.
Part 1 contextualises the concept of competencies and describes the approach taken to develop the profile, including: input from subject-matter experts (professional geoscientists representing a diverse sampling of the profession); extensive national consultation and refinement; and a validation procedure, including a survey of practising Canadian geoscientists. Part 2 presents the profile, explains its structure, and provides examples of some of the competencies. The complete competency profile is available on the Geoscientists Canada website, www.geoscientistscanada.ca. Future work will identify specific indicators of proficiency tied to each competency and suggest appropriate methodologies for their assessment. It will also involve mapping the profile to the existing Canadian reference standard, the Geoscience Knowledge and Experience Requirements for Professional Registration in Canada.
APA, Harvard, Vancouver, ISO and other citation styles
14

Collins, Steve. “Amen to That.” M/C Journal 10, No. 2 (1 May 2007). http://dx.doi.org/10.5204/mcj.2638.

Full text of the source
Annotation:
In 1956, John Cage predicted that “in the future, records will be made from records” (Duffel, 202). Certainly, musical creativity has always involved a certain amount of appropriation and adaptation of previous works. For example, Vivaldi appropriated and adapted the “Cum sancto spiritu” fugue of Ruggieri’s Gloria (Burnett, 4; Forbes, 261). If stuck for a guitar solo on stage, Keith Richards admits that he’ll adapt Buddy Holly for his own purposes (Street, 135). Similarly, Nirvana adapted the opening riff from Killing Joke’s “Eighties” for their song “Come as You Are”. Musical “quotation” is actively encouraged in jazz, and contemporary hip-hop would not exist if the genre’s pioneers and progenitors had not plundered and adapted existing recorded music. Sampling technologies, however, have taken musical adaptation a step further and realised Cage’s prediction. Hardware and software samplers have developed to the stage where any piece of audio can be appropriated and adapted to suit the creative impulses of the sampling musician (or samplist). The practice of sampling challenges established notions of creativity, with whole albums created with no original musical input as most would understand it—literally “records made from records.” Sample-based music is premised on adapting audio plundered from the cultural environment. This paper explores the ways in which technology is used to adapt previous recordings into new ones, and how musicians themselves have adapted to the potentials of digital technology for exploring alternative approaches to musical creativity. Sampling is frequently defined as “the process of converting an analog signal to a digital format.” While this definition remains true, it does not acknowledge the prevalence of digital media. The “analogue to digital” method of sampling requires a microphone or instrument to be recorded directly into a sampler. Digital media, however, simplifies the process.
For example, a samplist can download a video from YouTube and rip the audio track for editing, slicing, and manipulation, all using software within the noiseless digital environment of the computer. Perhaps it is more prudent to describe sampling simply as the process of capturing sound. Regardless of the process, once a sound is loaded into a sampler (hardware or software) it can be replayed using a MIDI keyboard, trigger pad or sequencer. Use of the sampled sound, however, need not be a faithful rendition or clone of the original. At the most basic level of manipulation, the duration and pitch of sounds can be altered. The digital processes that are implemented into the Roland VariOS Phrase Sampler allow samplists to eliminate the pitch or melodic quality of a sampled phrase. The phrase can then be melodically redefined as the samplist sees fit: adapted to a new tempo, key signature, and context or genre. Similarly, software such as Propellerhead’s ReCycle slices drum beats into individual hits for use with a loop sampler such as Reason’s Dr Rex module. Once loaded into Dr Rex, the individual original drum sounds can be used to program a new beat divorced from the syncopation of the original drum beat. Further, the individual slices can be subjected to pitch, envelope (a component that shapes the volume of the sound over time) and filter (a component that emphasises and suppresses certain frequencies) control, thus an existing drum beat can easily be adapted to play a new rhythm at any tempo. For example, this rhythm was created from slicing up and rearranging Clyde Stubblefield’s classic break from James Brown’s “Funky Drummer”. Sonic adaptation of digital information is not necessarily confined to the auditory realm. An audio editor such as Sony’s Sound Forge is able to open any file format as raw audio. For example, a Word document or a Flash file could be opened with the data interpreted as audio. 
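The two manipulations described above, repitching by changing playback rate and ReCycle-style slicing of a loop into re-sequenceable hits, can be sketched in a few lines of numpy. This is a simplified illustration under stated assumptions: the function names are invented, real samplers use band-limited interpolation rather than linear, and real slicers detect drum onsets rather than cutting equal lengths.

```python
import numpy as np

def resample_ratio(samples, ratio):
    """Crude sampler-style repitch: reading the samples back at `ratio`
    times the original rate raises pitch by that ratio and shortens the
    sound correspondingly (linear interpolation between samples)."""
    idx = np.arange(0, len(samples) - 1, ratio)
    left = idx.astype(int)
    frac = idx - left
    return samples[left] * (1 - frac) + samples[left + 1] * frac

def slice_beat(samples, n_slices):
    """Simplified beat slicing: chop a loop into equal pieces that can
    then be re-sequenced into a new rhythm."""
    return np.array_split(samples, n_slices)

loop = np.sin(2 * np.pi * 440 * np.arange(22050) / 22050)  # 1 s test tone
up_an_octave = resample_ratio(loop, 2.0)  # double pitch, half duration
hits = slice_beat(loop, 8)
new_beat = np.concatenate([hits[i] for i in (0, 3, 2, 3, 1, 3, 2, 0)])
```

Re-sequencing the slices, as in `new_beat`, is the digital analogue of programming a fresh rhythm from the original drum sounds, divorced from the source loop's syncopation.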
Admittedly, the majority of results obtained are harsh white noise, but there is scope for serendipitous anomalies such as a glitchy beat that can be extracted and further manipulated by audio software. Audiopaint is an additive synthesis application created by Nicolas Fournel for converting digital images into audio. Each pixel position and colour is translated into information designating frequency (pitch), amplitude (volume) and pan position in the stereo image. The user can determine which one of the three RGB channels corresponds to either of the stereo channels. Further, the oscillator for the wave form can be either the default sine wave or an existing audio file such as a drum loop can be used. The oscillator shapes the end result, responding to the dynamics of the sine wave or the audio file. Although Audiopaint labours under the same caveat as with the use of raw audio, the software can produce some interesting results. Both approaches to sound generation present results that challenge distinctions between “musical sound” and “noise”. Sampling is also a cultural practice, a relatively recent form of adaptation extending out of a time honoured creative aesthetic that borrows, quotes and appropriates from existing works to create new ones. Different fields of production, as well as different commentators, variously use terms such as “co-creative media”, “cumulative authorship”, and “derivative works” with regard to creations that to one extent or another utilise existing works in the production of new ones (Coombe; Morris; Woodmansee). The extent of the sampling may range from subtle influence to dominating significance within the new work, but the constant principle remains: an existing work is appropriated and adapted to fit the needs of the secondary creator. 
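The pixel-to-sound mapping attributed to Audiopaint above (pixel position designating frequency, colour intensity designating amplitude) can be sketched as a toy additive synthesiser. This is an assumption-laden illustration, not Fournel's actual implementation: the function name, frequency range, and greyscale input are invented, and the pan/RGB-channel routing and custom-oscillator features are omitted.

```python
import numpy as np

SR = 22050   # sample rate in Hz
DUR = 0.5    # seconds of audio per image column

def image_column_to_audio(column, f_lo=110.0, f_hi=880.0, sr=SR, dur=DUR):
    """Additive synthesis from one greyscale image column: each pixel's
    row position picks a sine-wave frequency, and its brightness sets
    that partial's amplitude. Returns a mono float array."""
    t = np.arange(int(sr * dur)) / sr
    # Map row index 0..n-1 onto a frequency range (low rows = low pitch).
    freqs = np.linspace(f_lo, f_hi, len(column))
    out = np.zeros_like(t)
    for amp, f in zip(column, freqs):
        out += amp * np.sin(2 * np.pi * f * t)
    # Normalise so many simultaneous partials cannot clip.
    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out

# A fake 4-pixel column: only the lowest and highest "pixels" are lit.
audio = image_column_to_audio(np.array([1.0, 0.0, 0.0, 0.5]))
```

Sweeping across an image column by column and concatenating the results yields the kind of evolving spectral texture the article describes.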
Proponents of what may be broadly referred to as the “free culture” movement argue that creativity and innovation inherently rely on the appropriation and adaptation of existing works (for example, see Lessig, Future of Ideas; Lessig, Free Culture; McLeod, Freedom of Expression; Vaidhyanathan). For example, Gwen Stefani’s 2004 release “Rich Girl” is based on Louchie Lou and Michie One’s 1994 single of the same title. Lou and One’s “Rich Girl”, in turn, is a reggae dance hall adaptation of “If I Were a Rich Man” from Fiddler on the Roof. Stefani’s “na na na” vocal riff shares the same melody as the “Ya ha deedle deedle, bubba bubba deedle deedle dum” riff from Fiddler on the Roof. Samantha Mumba adapted David Bowie’s “Ashes to Ashes” for her second single “Body II Body”. Similarly, Richard X adapted Tubeway Army’s “Are ‘Friends’ Electric?” and Adina Howard’s “Freak Like Me” for a career-saving single for the Sugababes. Digital technologies enable and even promote the adaptation of existing works (Morris). The ease of appropriating and manipulating digital audio files has given rise to a form of music known variously as mash-up, bootleg, or bastard pop. Mash-ups are the most recent stage in a history of musical appropriation and they epitomise the sampling aesthetic. Typically produced in bedroom computer-based studios, mash-up artists use software such as Acid or Cool Edit Pro to cut up digital music files and reassemble the fragments to create new songs, arbitrarily adding self-composed parts if desired. Comprised almost exclusively from sections of captured music, mash-ups have been referred to as “fictional pop music” because they conjure up scenarios where, for example, Destiny’s Child jams in a Seattle garage with Nirvana or the Spice Girls perform with Nine Inch Nails (Petridis). Once the initial humour of the novelty has passed, the results can be deeply alluring.
Mash-ups extract the distinctive characteristics of songs and place them in new, innovative contexts. As Dale Lawrence writes: “the vocals are often taken from largely reviled or ignored sources—cornball acts like Aguilera or Destiny’s Child—and recast in wildly unlikely contexts … where against all odds, they actually work”. Similarly, Crawford argues that “part of the art is to combine the greatest possible aesthetic dissonance with the maximum musical harmony. The pleasure for listeners is in discovering unlikely artistic complementarities and revisiting their musical memories in mutated forms” (36). Sometimes the adaptation works in the favour of the sampled artist: George Clinton claims that because of sampling he is more popular now than in 1976—“the sampling made us big again” (Green). The creative aspect of mash-ups is unlike that usually associated with musical composition and has more in common with DJing. In an effort to further clarify this aspect, we may regard DJ mixes as “mash-ups on the fly.” Grandmaster Flash’s quilt-pop masterpiece, “Adventures of Grandmaster Flash on the Wheels of Steel,” was recorded as he performed live, demonstrating his precision and skill with turntables. Modern audio editing software facilitates the capture and storage of sound, allowing mash-up artists to manipulate sound bytes outside of “real-time” and the live performance parameters within which Flash worked. Thus, the creative element is not the traditional arrangement of chords and parts, but rather “audio contexts”. If, as Riley pessimistically suggests, “there are no new chords to be played, there are no new song structures to be developed, there are no new stories to be told, and there are no new themes to explore,” then perhaps it is understandable that artists have searched for new forms of musical creativity.
The notes and chords of mash-ups are segments of existing works sequenced together to produce inter-layered contexts rather than purely tonal patterns. The merit of mash-up culture lies in its function of deconstructing the boundaries of genre and providing new musical possibilities. The process of mashing-up genres functions to critique contemporary music culture by “pointing a finger at how stifled and obvious the current musical landscape has become. … Suddenly rap doesn’t have to be set to predictable funk beats, pop/R&B ballads don’t have to come wrapped in cheese, garage melodies don’t have to recycle the Ramones” (Lawrence). According to Theodor Adorno, the Frankfurt School critic, popular music (of his time) was irretrievably simplistic and constructed from easily interchangeable, modular components (McLeod, “Confessions”, 86). A standardised and repetitive approach to musical composition fosters a mode of consumption dubbed by Adorno “quotation listening” and characterised by passive acceptance of, and obsession with, a song’s riffs (44-5). As noted by Em McAvan, Adorno’s analysis elevates the producer over the consumer, portraying a culture industry controlling a passive audience through standardised products (McAvan). The characteristics that Adorno observed in the popular music of his time are classic traits of contemporary popular music. Mash-up artists, however, are not representative of Adorno’s producers for a passive audience, instead opting to wrest creative control from composers and the recording industry and adapt existing songs in pursuit of their own creative impulses. Although mash-up productions may consciously or unconsciously criticise the current state of popular music, they necessarily exist in creative symbiosis with the commercial genres: “if pop songs weren’t simple and formulaic, it would be much harder for mashup bedroom auteurs to do their job” (McLeod, “Confessions”, 86). 
Arguably, when creating mash-ups, some individuals are expressing their dissatisfaction with the stagnation of the pop industry and are instead working to create music that they as consumers wish to hear. Sample-based music—as an exercise in adaptation—encourages a Foucauldian questioning of the composer’s authority over their musical texts. Recorded music is typically a passive medium in which the consumer receives the music in its original, unaltered form. DJ Dangermouse (Brian Burton) breached this pact to create his Grey Album, which is a mash-up of an a cappella version of Jay-Z’s Black Album and the Beatles’ eponymous album (also known as the White Album). Dangermouse says that “every kick, snare, and chord is taken from the Beatles White Album and is in their original recording somewhere.” In deconstructing the Beatles’ songs, Dangermouse turned the recordings into a palette for creating his own new work, adapting audio fragments to suit his creative impulses. As Joanna Demers writes, “refashioning these sounds and reorganising them into new sonic phrases and sentences, he creates acoustic mosaics that in most instances are still traceable to the Beatles source, yet are unmistakeably distinct from it” (139-40). Dangermouse’s approach is symptomatic of what Schütze refers to as remix culture: an open challenge to a culture predicated on exclusive ownership, authorship, and controlled distribution … . Against ownership it upholds an ethic of creative borrowing and sharing. Against the original it holds out an open process of recombination and creative transformation. It equally calls into question the categories, rifts and borders between high and low cultures, pop and elitist art practices, as well as blurring lines between artistic disciplines. Using just a laptop, an audio editor and a calculator, Gregg Gillis, a.k.a. Girl Talk, created the Night Ripper album using samples from 167 artists (Dombale). 
Although all the songs on Night Ripper are blatantly sample-based, Gillis sees his creations as “original things” (Dombale). The adaptation of sampled fragments culled from the Top 40 is part of Gillis’ creative process: “It’s not about who created this source originally, it’s about recontextualising—creating new music. … I’ve always tried to make my own songs” (Dombale). Gillis states that his music has no political message, but is a reflection of his enthusiasm for pop music: “It’s a celebration of everything Top 40, that’s the point” (Dombale). Gillis’ “celebratory” exercises in creativity echo those of various fan-fiction authors who celebrate the characters and worlds that constitute popular culture. Adaptation through sampling is not always centred solely on music. Sydney-based Tom Compagnoni, a.k.a. Wax Audio, adapted a variety of sound bytes from politicians and media personalities including George W. Bush, Alexander Downer, Alan Jones, Ray Hadley, and John Howard in the creation of his Mediacracy EP. In one particular instance, Compagnoni used a myriad of samples culled from various media appearances by George W. Bush to recreate the vocals for John Lennon’s “Imagine”. Created in early 2005, the track, which features sped-up instrumental samples from a karaoke version of Lennon’s original, is an immediate, irony-fuelled comment on the invasion of Iraq. The rationale underpinning the song is further emphasised when “Imagine This” reprises into “Let’s Give Peace a Chance” interspersed with short vocal fragments of “Come Together”. Compagnoni justifies his adaptations by presenting appropriated media sound bytes that deliberately set out to demonstrate the way information is manipulated to present any particular point of view. Playing the media like an instrument, Wax Audio juxtaposes found sounds in a way that forces the listener to confront the bias, contradiction and sensationalism inherent in their daily intake of media information.
… Oh yeah—and it’s bloody funny hearing George W Bush sing “Imagine”. Notwithstanding the humorous quality of the songs, Mediacracy represents a creative outlet for Compagnoni’s political opinions that is emphasised by the adaptation of Lennon’s song. Through his adaptation, Compagnoni revitalises Lennon’s sentiments about the Vietnam War and superimposes them onto the US policy on Iraq. An interesting aspect of sample-based music is the re-occurrence of particular samples across various productions, which demonstrates that the same fragment can be adapted for a plethora of musical contexts. For example, Clyde Stubblefield’s “Funky Drummer” break is reputed to be the most sampled break in the world. The break from 1960s soul/funk band the Winstons’ “Amen Brother” (the B-side to their 1969 release “Color Him Father”), however, is another candidate for the title of “most sampled break”. The “Amen break” was revived with the advent of the sampler. Having featured heavily in early hip-hop records such as “Words of Wisdom” by 3rd Bass and “Straight Outta Compton” by NWA, the break “appears quite adaptable to a range of music genres and tastes” (Harrison, 9m 46s). Beginning in the early 1990s, adaptations of this break became a constant of jungle music as sampling technology developed to facilitate more complex operations (Harrison, 5m 52s). The break features on Shy FX’s “Original Nutta”, L Double & Younghead’s “New Style”, Squarepusher’s “Big Acid”, and a cover version of Led Zeppelin’s “Whole Lotta Love” by Jane’s Addiction front man Perry Farrell. This is to name but a few tracks that have adapted the break. Wikipedia offers a list of songs employing an adaptation of the “Amen break”. This list, however, falls short of the “hundreds of tracks” argued for by Nate Harrison, who notes that “an entire subculture based on this one drum loop … six seconds from 1969” has developed (8m 45s).
The “Amen break” is so ubiquitous that, much like the twelve-bar blues structure, it has become a foundational element of an entire genre and has been adapted to satisfy a plethora of creative impulses. The sheer prevalence of the “Amen break” simultaneously illustrates the creative nature of music adaptation as well as the potentials for adaptation stemming from digital technology such as the sampler. The cut-up and rearrangement aspect of creative sampling technology at once suggests the original but also something new and different. Sampling in general, and the phenomenon of the “Amen break” in particular, ensures the longevity of the original sources; sample-based music exhibits characteristics acquired from the source materials, yet the illegitimate offspring are not their parents. Sampling as a technology for creatively adapting existing forms of audio has encouraged alternative approaches to musical composition. Further, it has given rise to a new breed of musician that has adapted to technologies of adaptation. Mash-up artists and samplists demonstrate that recorded music is not simply a fixed or read-only product but one that can be freed from the composer’s original arrangement to be adapted and reconfigured. Many mash-up artists such as Gregg Gillis are not trained musicians, but their ears are honed from enthusiastic consumption of music. Individuals such as DJ Dangermouse, Gregg Gillis and Tom Compagnoni appropriate, reshape and re-present the surrounding soundscape to suit diverse creative urges, thereby adapting the passive medium of recorded sound into an active production tool.
References
Adorno, Theodor. “On the Fetish Character in Music and the Regression of Listening.” The Culture Industry: Selected Essays on Mass Culture. Ed. J. Bernstein. London, New York: Routledge, 1991.
Burnett, Henry. “Ruggieri and Vivaldi: Two Venetian Gloria Settings.” American Choral Review 30 (1988): 3.
Compagnoni, Tom. “Wax Audio: Mediacracy.” Wax Audio. 2005. 2 Apr. 2007 <http://www.waxaudio.com.au/downloads/mediacracy>.
Coombe, Rosemary. The Cultural Life of Intellectual Properties. Durham, London: Duke University Press, 1998.
Demers, Joanna. Steal This Music: How Intellectual Property Law Affects Musical Creativity. Athens, London: University of Georgia Press, 2006.
Dombale, Ryan. “Interview: Girl Talk.” Pitchfork. 2006. 9 Jan. 2007 <http://www.pitchforkmedia.com/article/feature/37785/Interview_Interview_Girl_Talk>.
Duffel, Daniel. Making Music with Samples. San Francisco: Backbeat Books, 2005.
Forbes, Anne-Marie. “A Venetian Festal Gloria: Antonio Lotti’s Gloria in D Major.” Music Research: New Directions for a New Century. Eds. M. Ewans, R. Halton, and J. Phillips. London: Cambridge Scholars Press, 2004.
Green, Robert. “George Clinton: Ambassador from the Mothership.” Synthesis. Undated. 15 Sep. 2005 <http://www.synthesis.net/music/story.php?type=story&id=70>.
Harrison, Nate. “Can I Get an Amen?” Nate Harrison. 2004. 8 Jan. 2007 <http://www.nkhstudio.com>.
Lawrence, Dale. “On Mashups.” Nuvo. 2002. 8 Jan. 2007 <http://www.nuvo.net/articles/article_292/>.
Lessig, Lawrence. The Future of Ideas. New York: Random House, 2001.
———. Free Culture: How Big Media Uses Technology and the Law to Lock Down Culture and Control Creativity. New York: The Penguin Press, 2004.
McAvan, Em. “Boulevard of Broken Songs: Mash-Ups as Textual Re-Appropriation of Popular Music Culture.” M/C Journal 9.6 (2006). 3 Apr. 2007 <http://journal.media-culture.org.au/0612/02-mcavan.php>.
McLeod, Kembrew. “Confessions of an Intellectual (Property): Danger Mouse, Mickey Mouse, Sonny Bono, and My Long and Winding Path as a Copyright Activist-Academic.” Popular Music & Society 28.79.
———. Freedom of Expression: Overzealous Copyright Bozos and Other Enemies of Creativity. United States: Doubleday Books.
Morris, Sue. “Co-Creative Media: Online Multiplayer Computer Game Culture.” Scan 1.1 (2004). 8 Jan. 2007 <http://scan.net.au/scan/journal/display_article.php?recordID=16>.
Petridis, Alexis. “Pop Will Eat Itself.” The Guardian UK. March 2003. 8 Jan. 2007 <http://www.guardian.co.uk/arts/critic/feature/0,1169,922797,00.html>.
Riley. “Pop Will Eat Itself—Or Will It?” The Truth Unknown (archived at Archive.org). 2003. 9 Jan. 2007 <http://web.archive.org/web/20030624154252/www.thetruthunknown.com/viewnews.asp?articleid=79>.
Schütze, Bernard. “Samples from the Heap: Notes on Recycling the Detritus of a Remixed Culture.” Horizon Zero 2003. 8 Jan. 2007 <http://www.horizonzero.ca/textsite/remix.php?tlang=0&is=8&file=5>.
Vaidhyanathan, Siva. Copyrights and Copywrongs: The Rise of Intellectual Property and How It Threatens Creativity. New York, London: New York University Press, 2003.
Woodmansee, Martha. “On the Author Effect: Recovering Collectivity.” The Construction of Authorship: Textual Appropriation in Law and Literature. Eds. M. Woodmansee and P. Jaszi. Durham, London: Duke University Press, 1994. 15.
Citation reference for this article:
MLA Style: Collins, Steve. “Amen to That: Sampling and Adapting the Past.” M/C Journal 10.2 (2007). <http://journal.media-culture.org.au/0705/09-collins.php>.
APA Style: Collins, S. (May 2007). “Amen to That: Sampling and Adapting the Past.” M/C Journal, 10(2). Retrieved from <http://journal.media-culture.org.au/0705/09-collins.php>.
APA, Harvard, Vancouver, ISO and other citation styles
15

Ryan, Robin Ann. “Forest as Place in the Album ‘Canopy’: Culturalising Nature or Naturalising Culture?” M/C Journal 19, No. 3 (22 June 2016). http://dx.doi.org/10.5204/mcj.1096.

Full text of the source
Annotation:
Every act of art is able to reveal, balance and revive the relations between a territory and its inhabitants (François Davin, Southern Forest Sculpture Walk Catalogue). Introducing the Understory Art in Nature Trail. In February 2015, a colossal wildfire destroyed 98,300 hectares of farm and bushland surrounding the town of Northcliffe, located 365 km south of Perth, Western Australia (WA). As the largest fire in the recorded history of the southwest region (Southern Forest Arts, After the Burn 8), the disaster attracted national attention; however, the extraordinary contribution of local knowledge in saving a town considered by authorities to be “undefendable” (Kennedy) is yet to be widely appreciated. In accounting for a creative scene that survived the conflagration, this case study sees culture mobilised as a socioeconomic resource for conservation and the healing of community spirit. Northcliffe (population 850) sits on a coastal plain that hosts majestic old-growth forest and lush bushland. In 2006, Southern Forest Arts (SFA) dedicated a Southern Forest Sculpture Walk for creative professionals to develop artworks along a 1.2 km walk trail through pristine native forest. It was re-branded “Understory—Art in Nature” in 2009, then “Understory Art in Nature Trail” in 2015, the understory vegetation layer beneath the canopy being symbolic of Northcliffe’s deeply layered cache of memories, including “the awe, love, fear, and even the hatred that these trees have provoked among the settlers” (Davin in SFA Catalogue). In the words of the SFA Trailguide, “Every place (no matter how small) has ‘understories’—secrets, songs, dreams—that help us connect with the spirit of place.” In the view of forest arts ecologist Kumi Kato, “It is a sense of place that underlies the commitment to a place’s conservation by its community, broadly embracing those who identify with the place for various reasons, both geographical and conceptual” (149).
In bioregional terms such communities form a terrain of consciousness (Berg and Dasmann 218), extending responsibility for conservation across cultures, time and space (Kato 150). A sustainable thematic of place must also include livelihood as the third party between culture and nature that establishes the relationship between them (Giblett 240). With these concepts in mind I gauge creative impact on forest as place, and, in turn, (altered) forest’s impact on people. My abstraction of physical place is inclusive of humankind moving in dialogic engagement with forest. A mapping of Understory’s creative activities sheds light on how artists express physical environments in situated creative practices, clusters, and networks. These, it is argued, constitute unique types of community operating within (and beyond) a foundational scene of inspiration and mystification that is metaphorically “rising from the ashes.” In transcending disconnectedness between humankind and landscape, Understory may be understood to both culturalise nature (as an aesthetic system), and naturalise culture (as an ecologically modelled system), to build on a trope introduced by Feld (199). Arguably when the bush is cultured in this way it attracts consumers who may otherwise disconnect from nature.The trail (henceforth Understory) broaches the histories of human relations with Northcliffe’s natural systems of place. Sub-groups of the Noongar nation have inhabited the southwest for an estimated 50,000 years and their association with the Northcliffe region extends back at least 6,000 years (SFA Catalogue; see also Crawford and Crawford). 
An indigenous sense of the spirit of forest is manifest in Understory sculpture, literature, and—for the purpose of this article—the compilation CD Canopy: Songs for the Southern Forests (henceforth Canopy, Figure 1).

As a cultural and environmental construction of place, Canopy sustains the land with acts of seeing, listening to, and interpreting nature; of remembering indigenous people in the forest; and of recalling the hardships of the early settlers. I acknowledge SFA coordinator and Understory custodian Fiona Sinclair for authorising this investigation; Peter Hill for conservation conversations; Robyn Johnston for her Canopy CD sleeve notes; Della Rae Morrison for permissions; and David Pye for discussions.

Figure 1. Canopy: Songs for the Southern Forests (CD, 2006). Cover image by Raku Pitt, 2002. Courtesy Southern Forest Arts, Northcliffe, WA.

Forest Ecology, Emotion, and Action

Established in 1924, Northcliffe’s ill-founded Group Settlement Scheme resulted in frontier hardship and heartbreak, and deforestation of the southwest region for little economic return. An historic forest controversy (1992-2001) attracted media to Northcliffe when protesters attempting to disrupt logging chained themselves to tree trunks and suspended themselves from branches. The signing of the Western Australian Regional Forest Agreement in 1999 was followed, in 2001, by deregulation of the dairy industry and a sharp decline in area population.

Moved by the gravity of this situation, Fiona Sinclair won her pitch to the Manjimup Council for a sound alternative industry for Northcliffe with projections of jobs: a forest where artists could work collectively and sustainably to reveal the beauty of natural dimensions. A 12-acre pocket of allocated Crown Land adjacent to the town was leased as an A-Class Reserve vested for Education and Recreation, for which SFA secured unified community ownership and grants.
Conservation protocols stipulated that no biomass could be removed from the forest and that predominantly raw, natural materials were to be used (F. Sinclair and P. Hill, personal interview, 26 Sep. 2014). With forest as prescribed image (wider than the bounded chunk of earth), Sinclair invited the artists to consider the themes of spirituality, creativity, history, dichotomy, and the sensory as a basis for work that was to be “fresh, intimate, and grounded in place.” Her brief encouraged artists to work with humanity and imagination to counteract residual community divisiveness and resentment. Sinclair describes this form of implicit environmentalism as an “around the back” approach that avoids lapsing into political commentary or judgement: “The trail is a love letter from those of us who live here to our visitors, to connect with grace” (F. Sinclair, telephone interview, 6 Apr. 2014). Renewing community connections to local place is essential if our lives and societies are to become more sustainable (Pedelty 128). To define Northcliffe’s new community phase, artists respected differing associations between people and forest. A structure on a karri tree by Indigenous artist Norma MacDonald presents an Aboriginal man standing tall and proud on a rock to become one with the tree and the forest, as it was for thousands of years before European settlement (MacDonald in SFA Catalogue). As Feld observes, “It is the stabilizing persistence of place as a container of experiences that contributes so powerfully to its intrinsic memorability” (201).

Adhering to the philosophy that nature should not be used or abused for the sake of art, the works resonate with the biorhythms of the forest, e.g. functional seats and shelters and a cascading retainer that directs rainwater back to the resident fauna. Some sculptures function as receivers for picking up wavelengths of ancient forest.
Forest Folk lurk around the understory, while mysterious stone art represents a life-shaping force of planet history. To represent the reality of bushfire, Natalie Williamson’s sculpture wraps itself around a burnt-out stump. The work plays with scale as small native sundew flowers are enlarged and a subtle beauty, easily overlooked, becomes apparent (Figure 2). The sculptor hopes that “spiders will spin their webs about it, incorporating it into the landscape” (SFA Catalogue).

Figure 2. Sundew. Sculpture by Natalie Williamson, 2006. Understory Art in Nature Trail, Northcliffe, WA. Image by the author, 2014.

Memory is naturally place-oriented or at least place-supported (Feld 201). Topaesthesia (sense of place) denotes movement that connects our biography with our route. This is resonant for the experience of regional character, including the tactile, olfactory, gustatory, visual, and auditory qualities of a place (Ryan 307). By walking, we are in a dialogue with the environment; both literally and figuratively, we re-situate ourselves into our story (Schine 100). For example, during a summer exploration of the trail (5 Jan. 2014), I intuited a personal attachment based on my grandfather’s small bush home being razed by fire, and his struggle to support seven children.

Understory’s survival depends on vigilant controlled (cool) burns around its perimeter (Figure 3), organised by volunteer Peter Hill. These burns also hone the forest. On 27 Sep. 2014, the charred vegetation spoke a spring language of opportunity for nature to reassert itself as seedpods burst and continue the cycle; while an autumn walk (17 Mar. 2016) yielded a fresh view of forest colour, patterning, light, shade, and sound.

Figure 3. Understory Art in Nature Trail. Map created by Fiona Sinclair for Southern Forest Sculpture Walk Catalogue (2006).
Courtesy Southern Forest Arts, Northcliffe, WA.

Understory and the Melody of Canopy

Forest resilience is celebrated in five MP3 audio tours produced for visitors to dialogue with the trail in sensory contexts of music, poetry, sculptures and stories that name or interpret the setting. The trail starts in heathland and includes three creek crossings. A zone of acacias gives way to stands of the southwest signature trees karri (Eucalyptus diversicolor), jarrah (Eucalyptus marginata), and marri (Corymbia calophylla). Following a sheoak grove, a riverine environment re-enters heathland. Birds, insects, mammals, and reptiles reside around and between the sculptures, rendering the earth-embedded art a fusion of human and natural orders (concept after Relph 141).

On Audio Tour 3, Songs for the Southern Forests, the musician-composers reflect on their regionally focused items, each having been birthed according to a personal musical concept (the manner in which an individual artist holds the totality of a composition in cultural context). Arguably the music in question, its composers, performers, audiences, and settings all have a role to play in defining the processes and effects of forest arts ecology. Local musician Ann Rice billeted a cluster of musicians (mostly from Perth) at her Windy Harbour shack. The energy of the production experience was palpable as all participated in on-site forest workshops and supported each other’s items as a musical collective (A. Rice, telephone interview, 2 Oct. 2014). Collaborating under producer Lee Buddle’s direction, they orchestrated rich timbres (tone colours) to evoke different musical atmospheres (Table 1).

1. Ann Rice, “My Place”: vocals/guitars/accordion
2. David Pye, “Cicadan Rhythms”: angklung/violin/cello/woodblocks/temple blocks/clarinet/tapes
3. Mel Robinson, “Shelter”: vocal/cello/double bass
4. Djiva, “Ngank Boodjak”: vocals/acoustic, electric and slide guitars/drums/percussion
5. Cathie Travers, “Lament”: accordion/vocals/guitar/piano/violin/drums/programming
6. Brendon Humphries and Kevin Smith, “When the Wind First Blew”: vocals/guitars/dobro/drums/piano/percussion
7. Libby Hammer, “The Glade”: vocal/guitar/soprano sax/cello/double bass/drums
8. Pete and Dave Jeavons, “Sanctuary”: guitars/percussion/talking drum/cowbell/soprano sax
9. Tomás Ford, “White Haze”: vocal/programming/guitar
10. David Hyams, “Awakening / Shaking the Tree / When the Light Comes”: guitar/mandolin/dobro/bodhran/rainstick/cello/accordion/flute
11. Bernard Carney, “The Destiny Waltz”: vocal/guitar/accordion/drums/recording of The Destiny Waltz
12. Joel Barker, “Something for Everyone”: vocal/guitars/percussion

Table 1. Music Composed for Canopy: Songs for the Southern Forests. Source: CD sleeve and http://www.understory.com.au/art.php.

Composing out of their own strengths, the musicians transformed the geographic region into a living myth. As Pedelty has observed of similar musicians, “their sounds resonate because they so profoundly reflect our living sense of place” (83-84). The remainder of this essay evidences the capacity of indigenous song, art music, electronica, folk, and jazz-blues to celebrate, historicise, or re-imagine place. Firstly, two items represent the phenomenological approach of site-specific sensitivity to acoustic, biological, and cultural presence/loss, including the materiality of forest as a living process.

“Singing Up the Land”

In Aboriginal Australia “there is no place that has not been imaginatively grasped through song, dance and design, no place where traditional owners cannot see the imprint of sacred creation” (Rose 18).
Canopy’s part-Noongar language song thus repositions the ancient Murrum-Noongar people within their life-sustaining natural habitat and spiritual landscape.

Noongar Yorga woman Della Rae Morrison of the Bibbulmun and Wilman nations co-founded The Western Australian Nuclear Free Alliance to campaign against the uranium mining industry threatening Ngank Boodjak (her country, “Mother Earth”) (D.R. Morrison, e-mail, 15 July 2014). In 2004, Morrison formed the duo Djiva (meaning seed power or life force) with Jessie Lloyd, a Murri woman of the Guugu Yimidhirr Nation from North Queensland. After discerning the fundamental qualities of the Understory site, Djiva created the song Ngank Boodjak: “This was inspired by walking the trail […] feeling the energy of the land and the beautiful trees and hearing the birds. When I find a spot that I love, I try to feel out the lay-lines, which feel like vortexes of energy coming out of the ground; it’s pretty amazing” (Morrison in SFA Canopy sleeve). Stanza 1 points to the possibilities of being more fully “in country”:

Ssh!
Ni dabarkarn kooliny, ngank boodja kookoorniny
Listen, walk slowly, beautiful Mother Earth

The inclusion of indigenous language powerfully implements an indigenous interpretation of forest: “My elders believe that when we leave this life from our physical bodies that our spirit is earthbound and is living in the rocks or the trees and if you listen carefully you might hear their voices and maybe you will get some answers to your questions” (Morrison in SFA Catalogue).

Cicadan Rhythms, by composer David Pye, echoes forest as a lively “more-than-human” world. Pye took his cue from the ambient pulsing of male cicadas communicating in plenum (full assembly) by means of airborne sound. The species were sounding together in tempo with individual rhythm patterns that interlocked to create one fantastic rhythm (Australian Broadcasting Corporation, Composer David Pye).
The cicada chorus (the loudest known lovesong in the insect world) is the unique summer soundmark (a term coined by Truax, Handbook) of the southern forests. Pye chased various cicadas through Understory until he was able to notate the rhythms of some individuals in a patch of low-lying scrub. To simulate cicada clicking, the composer set pointillist patterns for the Indonesian angklung (jointed bamboo tubes suspended within a frame to produce notes when the frame is shaken or tapped). Using instruments made of wood to enhance the rich forest imagery, Pye created all parts using sampled instrumental sounds placed against layers of pre-recorded ambient sounds (D. Pye, telephone interview, 3 Sep. 2014). He takes the listener through a “geographical linear representation” of the trail: “I walked around it with a stopwatch and noted how long it took to get through each section of the forest, and that became the musical timing of the various parts of the work” (Pye in SFA Canopy sleeve). That Understory is a place where reciprocity between nature and culture thrives is, likewise, evident in the remaining tracks.

Musicalising Forest History and Environment

Three tracks distinguish Canopy as an integrative site for memory. Bernard Carney’s waltz honours the Group Settlers who battled insurmountable terrain without any idea of their destiny: men who, having migrated with a promise of owning their own dairy farms, had to clear trees by hand and build furniture from kerosene tins and gelignite cases. Carney illuminates the culture of Saturday night dancing in the schoolroom to popular tunes like The Destiny Waltz (performed on the Titanic in 1912). His original song fades to strains of the Victor Military Band (1914), to “pay tribute to the era where the inspiration of the song came from” (Carney in SFA Canopy sleeve).
Likewise, Cathie Travers’s Lament is an evocation of remote settler history that creates a “feeling of being in another location, other timezone, almost like an endless loop” (Travers in SFA Canopy sleeve).

An instrumental medley by David Hyams opens with Awakening: the morning sun streaming through tall trees, and the nostalgic sound of an accordion waltz. Shaking the Tree, an Irish jig, recalls humankind’s struggle with forest and the forces of nature. A final title, When the Light Comes, defers to the saying by conservationist John Muir that “The wrongs done to trees, wrongs of every sort, are done in the darkness of ignorance and unbelief, for when the light comes the heart of the people is always right” (quoted by Hyams in SFA Canopy sleeve). Local musician Joel Barker wrote Something for Everyone to personify the old-growth karri as a king with a crown, with “wisdom in his bones.”

Kevin Smith’s father was born in Northcliffe in 1924. He and Brendon Humphries fantasise the untouchability of a maiden (pre-human) moment in a forest in their song When the Wind First Blew. In Libby Hammer’s The Glade (a lover’s lament), instrumental timbres project their own affective languages. The jazz singer intended the accompanying double bass to speak resonantly of old-growth forest; the cello to express suppleness and renewal; a soprano saxophone to impersonate a bird; and the drums to imitate the insect community’s polyrhythmic undercurrent (after Hammer in SFA Canopy sleeve).

A hybrid aural environment of synthetic and natural forest sounds contrasts collision with harmony in Sanctuary. The Jeavons brothers sampled rustling wind on nearby Mt Chudalup to absorb into the track’s opening, and crafted a snare groove for the quirky eco-jazz/trip-hop by banging logs together and banging rocks against logs.
This imaginative use of percussive found objects enhanced their portrayal of forest as “a living, breathing entity.”

In dealing with recent history in My Place, Ann Rice cameos a happy childhood growing up on a southwest farm, “damming creeks, climbing trees, breaking bones and skinning knees.” The rich string harmonies of Mel Robinson’s Shelter sculpt the shifting environment of a brewing storm, while White Haze by Tomás Ford describes a smoky controlled burn as “a kind of metaphor for the beautiful mystical healing nature of Northcliffe”:

Someone’s burning off the scrub
Someone’s making sure it’s safe
Someone’s whiting out the fear
Someone’s letting me breathe clear

As Sinclair illuminates in a post-fire interview with Sharon Kennedy (Website): “When your map, your personal map of life involves a place, and then you think that that place might be gone…” Fiona doesn’t finish the sentence. “We all had to face the fact that our little place might disappear.” Ultimately, only one house was lost. Pasture and fences, sheds and forest are gone. Yet, says Fiona, “We still have our town.”

As part of SFA’s ongoing commission, forest rhythm workshops explore different sound properties of potential materials for installing sound sculptures mimicking the surrounding flora and fauna. In 2015, SFA mounted After the Burn (a touring photographic exhibition) and Out of the Ashes (paintings and woodwork featuring ash, charcoal, and resin) (SFA, After the Burn 116). The forthcoming community project Rising From the Ashes will commemorate the fire and allow residents to connect and create as they heal and move forward—ten years on from the foundation of Understory.

Conclusion

The Understory Art in Nature Trail stimulates curiosity.
It clearly illustrates links between place-based social, economic and material conditions and creative practices and products within a forest that has both given shelter and “done people in.” The trail is an experimental field, a transformative locus in which dedicated physical space frees artists to culturalise forest through varied aesthetic modalities. Conversely, forest possesses agency for naturalising art as a symbol of place. Djiva’s song Ngank Boodjak “sings up the land” to revitalise the timelessness of prior occupation, while David Pye’s Cicadan Rhythms foregrounds the seasonal cycle of entomological music.

In drawing out the richness and significance of place, the ecologically inspired album Canopy suggests that the community identity of a forested place may be informed by cultural, economic, geographical, and historical factors as well as endemic flora and fauna. Finally, the musical representation of place is not contingent upon blatant forms of environmentalism. The portrayals of Northcliffe respectfully associate Western Australian people and forests, yet as a place, the town has become an enduring icon for the plight of the universal old-growth forest in all its natural glory, diverse human uses, and (real or perceived) abuses.

References

Australian Broadcasting Corporation. “Canopy: Songs for the Southern Forests.” Into the Music. Prod. Robyn Johnston. Radio National, 5 May 2007. 12 Aug. 2014 <http://www.abc.net.au/radionational/programs/intothemusic/canopy-songs-for-the-southern-forests/3396338>.
———. “Composer David Pye.” Interview with Andrew Ford. The Music Show. Radio National, 12 Sep. 2009. 30 Jan. 2015 <http://canadapodcasts.ca/podcasts/MusicShowThe/1225021>.
Berg, Peter, and Raymond Dasmann. “Reinhabiting California.” Reinhabiting a Separate Country: A Bioregional Anthology of Northern California. Ed. Peter Berg. San Francisco: Planet Drum, 1978. 217-20.
Crawford, Patricia, and Ian Crawford. Contested Country: A History of the Northcliffe Area, Western Australia. Perth: UWA P, 2003.
Feld, Steven. “Lift-Up-Over Sounding.” The Book of Music and Nature: An Anthology of Sounds, Words, Thoughts. Ed. David Rothenberg and Marta Ulvaeus. Middletown, CT: Wesleyan UP, 2001. 193-206.
Giblett, Rod. People and Places of Nature and Culture. Bristol: Intellect, 2011.
Kato, Kumi. “Addressing Global Responsibility for Conservation through Cross-Cultural Collaboration: Kodama Forest, a Forest of Tree Spirits.” The Environmentalist 28.2 (2008): 148-54. 15 Apr. 2014 <http://link.springer.com/article/10.1007/s10669-007-9051-6#page-1>.
Kennedy, Sharon. “Local Knowledge Builds Vital Support Networks in Emergencies.” ABC South West WA, 10 Mar. 2015. 26 Mar. 2015 <http://www.abc.net.au/local/stories/2015/03/09/4193981.htm?site=southwestwa>.
Morrison, Della Rae. E-mail. 15 July 2014.
Pedelty, Mark. Ecomusicology: Rock, Folk, and the Environment. Philadelphia, PA: Temple UP, 2012.
Pye, David. Telephone interview. 3 Sep. 2014.
Relph, Edward. Place and Placelessness. London: Pion, 1976.
Rice, Ann. Telephone interview. 2 Oct. 2014.
Rose, Deborah Bird. Nourishing Terrains: Australian Aboriginal Views of Landscape and Wilderness. Australian Heritage Commission, 1996.
Ryan, John C. Green Sense: The Aesthetics of Plants, Place and Language. Oxford: Trueheart Academic, 2012.
Schine, Jennifer. “Movement, Memory and the Senses in Soundscape Studies.” Canadian Acoustics: Journal of the Canadian Acoustical Association 38.3 (2010): 100-01. 12 Apr. 2016 <http://jcaa.caa-aca.ca/index.php/jcaa/article/view/2264>.
Sinclair, Fiona. Telephone interview. 6 Apr. 2014.
Sinclair, Fiona, and Peter Hill. Personal interview. 26 Sep. 2014.
Southern Forest Arts. Canopy: Songs for the Southern Forests. CD coordinated by Fiona Sinclair. Recorded and produced by Lee Buddle. Sleeve notes by Robyn Johnston. West Perth: Sound Mine Studios, 2006.
———. Southern Forest Sculpture Walk Catalogue. Northcliffe, WA, 2006. Unpaginated booklet.
———. Understory—Art in Nature. 2009. 12 Apr. 2016 <http://www.understory.com.au/>.
———. Trailguide. Understory. Presented by Southern Forest Arts, n.d.
———. After the Burn: Stories, Poems and Photos Shared by the Local Community in Response to the 2015 Northcliffe and Windy Harbour Bushfire. 2nd ed. Ed. Fiona Sinclair. Northcliffe, WA, 2016.
Truax, Barry, ed. Handbook for Acoustic Ecology. 2nd ed. Cambridge Street Publishing, 1999. 10 Apr. 2016 <http://www.sfu.ca/sonic-studio/handbook/Soundmark.html>.
APA, Harvard, Vancouver, ISO, and other citation styles
16

Cover, Rob. “Reading the Remix: Methods for Researching and Analysing the Interactive Textuality of Remix”. M/C Journal 16, No. 4 (12 Aug. 2013). http://dx.doi.org/10.5204/mcj.686.

Full text of the source
Annotation:
Introduction

With the proliferation of remixed (audio-visual) texts such as fan music videos, slash video, mash-ups and digital stories utilising existing and new visual and audio material on sites such as YouTube, questions are opened as to the efficacy of current forms of media textual analysis. Remixed texts have been positioned as a new and transformative form of art that, despite industry copyright concerns, does not compete with existing texts but makes use of them as ‘found material’ in order to produce an ostensibly intertextual experience (Lessig). Intertexts include pastiche, parody and/or allusion to extant texts and, at the same time, acknowledge that no text is purely original but is built on its ostensible or tacit relationality with a broad range of other texts—relationalities which may be activated in reading or be coded into the text. Remixes are often the work of fan audiences who seek to engage in a participatory manner—a particular reading position that shifts into the act of writing—with the texts, television, film and music of which one is an avid audience member, or of a community audience whose members engage with each other through the collaborative production of new texts based on old.
The remix is a substantial outcome of such readerly, writerly and collaborative engagement whereby meanings drawn intertextually from the original text are re-produced, expanded upon, critiqued or re-framed through several different activities, which may include:

- re-ordering existing audio-visual material in a way which, according to Constance Penley, was once done by Star Trek fans using magnetic tape and two video recorders to produce new narratives and interpretative frames by cutting and suturing material in an order different from that broadcast (Penley);
- presenting new meanings to texts, stories or narratives by taking visual material either in short cuts or long scenes and layering popular music audio tracks over it, which is commonly done by television fans, such as fans of Buffy the Vampire Slayer in the early 2000s, who produce new meanings or re-emphasise old ones around relationships by bringing together (sometimes cheesy) songs with televised footage (Cover, More Than a Watcher).

In both cases the texts are at once new and old—they are a remix of existing material, but the act of remixing produces new frames for the activations of meanings, or new narratives, that sit between the interactive and the intertextual. The fact that these forms can be traced back to pre-digital technologies of the 1970s (in the case of Penley’s Star Trek fan videos) or to pre-YouTube, pre-Web 2.0 participatory sharing (in the case of Cover’s Buffy the Vampire Slayer fan videos, distributed through newslists, email and a private website) indicates the deep-seated cultural desire or demand for participatory engagement and co-creative forms of encountering texts (Cover, Interactivity).
In light of this new form of participatory communication experience, a methodological gap has emerged, requiring new frameworks for researching and analysing the remix text as a text, and within the context of its interactivity, intertextuality, layering and the ways in which these together reconfigure existing narratives and produce new ones. This paper outlines some approaches used in teaching students about contemporary interactive and convergent digital texts by undertaking practical textual analyses of sample remix audio-video texts. I will discuss some of the more important theoretical issues concerning the analysis of remix texts, with particular attention to notions of interactivity, intertextuality and layering. I will then outline some practical steps for undertaking this kind of analysis in the classroom. By understanding the remix text through a metaphor of layering (drawn from Photoshopping and digital manipulation terminology), a method for ‘remix analysis’ can be put forward that presents innovative ways of engaging with textuality and narrative. Such analyses incorporate narrative sourcing, identification of user-generated content, sequencing, digital manipulation, framing and audio/visual juxtaposition as starting points for reading the remix text. Remix analyses, in this framework, optimise a reflective engagement with contemporary issues of copyright and intellectual property, textual genealogy, intertextuality, co-creative production and emergent forms of interactivity.

Interactivity and Intertextuality

For Lawrence Lessig, remix is a form of creativity that puts in question the separation between reader and writer. It emphasises instead the participatory form in which read-write creativity (or co-creativity) becomes the normative standard of high-level engagement with extant texts through both selection and arrangement (56).
Remix culture, for Lessig, makes use of digital technologies that have been developed for other purposes and practices and delivers forms of collage, complexity, and co-creativity directed towards a broader audience. The role played by YouTube as a sharing site which makes available the massive number of remixed texts is testament to the form’s significance as an interactive, intertextual creation or co-creation. As Burgess and Green have argued, consumer co-creation is fundamental to YouTube’s mission and role in the distribution of texts (4-5). It is more than a peripheral site for re-distribution of either existing texts or private video-logs but, today, operates as a mainstream component in a broader media matrix. In this matrix, the experience of textual audiencehood is re-coded as participatory engagement with prior texts, in order both to reflect on those texts and to produce new ones in a co-creative capacity. This is not to suggest that YouTube is not complicit in copyright regimes that actively seek to restrict participatory and co-creative artistic practice in favour of older models of textual ownership and control over distribution (Cover, Audience Inter/Active). Its digital capacity to police remixed texts that have been marked by corporate copyright holders as unavailable for further use or manipulation has been a substantial development on the side of traditional copyright in the push-pull struggle between free co-creativity and limiting regimes (Cover, Interactivity), although this does not altogether stem the production of the remix as a substantial experience of artistic practice and of user participatory engagement with media matrices. 
Central to understanding and analysing the remix as a text in its own right is the fact that it is interactive, a point which leads to the assertion that analytic tools suited to traditional, non-interactive texts are not always going to be adequate to the task of unpacking and drawing out thematic and conceptual material from a remix’s narrative. Although interactivity has been difficult to define, the form of interactivity in which we see the remix is that which involves an element of co-creativity between the author of a source text and the user of the text who interacts with the source to transform it into something new. Spiro Kiousis has argued that while definitions of interactivity are amorphous, there is value in the concept “as long as we all accept that the term implies some degree of receiver feedback and is usually linked to new technologies” (357). For Lelia Green, however, interactivity implies the capacity of a communication medium to have its products altered by the actions of a user or audience (xx). In the case of the latter, interactivity covers not only the sorts of texts in which audience or user engagement is required as a built-in part of the process, such as in digital games, but those texts, forms, mediums and experiences in which existing texts are manipulated, revised, re-used or brought together, such as in the remix. Drawing on Bordewijk and van Kaam, Sally McMillan delineated the concept of interactivity into a typology of four intersecting levels or uses:

1. Allocution, in which interactive engagement is minimal, and is set within the context of a single, central broadcaster and multiple receivers on the periphery (273). This would ordinarily include most traditional mass media forms such as television and the selection of channels.

2. Consultation, which occurs in the use of a database, such as a website, where a user actively searches for pre-provided information but does not seek to alter that information (273). Access here does not alter the content, source, narrative or information that has been requested. Drawing information from Wikipedia without the intent of editing information may alter the metadata or framework through providing the site with tracking information, but in this case the textuality of the text as accessed is not transformed through this level of interactivity.

3. Registration, which does record access patterns and accumulates information from the periphery in a central registry which alters the information, significance or context of the material (273). McMillan’s early Web 1.0 example of registrational interactivity was the internet ‘cookie’, which tracks and customises content of internet sites visited by the user. However, as a category of interactivity in which the narrative or form of the text itself is altered in its reading or use, it might also be said to include the electronic game, as well as forms of communications engagement which access a source text; manipulate, customise or re-form it using commonly available or sophisticated software; and re-distribute it through digital means. Here, the narrative is knowingly acted upon in ways which alter it for other uses.

4. Conversational, which occurs when individuals interact directly with each other, usually in real time, in ways which mimic face-to-face engagement without physical presence at a locale (273). An online written chat using a relay platform provided by a social networking site that does not record the text is an example; likewise, using a video Skype account is also conversational interactivity.
While McMillan’s ‘registrational’ definition of interactivity, as the one which gives the greatest capacity to an audience to change, alter and manipulate a text or a textual narrative, allows considerable redefinition of the traditional author-text-audience relationship, none of the four definitions adequately allows for the ways in which remix texts are at once interactive, intertextual, intermedial and built through participatory re-layering and re-organising of a broader corpus of material in ways typically not invited by the original texts or their original distributional mediums—hence the concerns around copyright and distributional control (Cover, Audience Inter/Active). As an outcome of registrational interactivity, the remix presents itself not merely in terms of how the relationship is structured in the context of new digital media, but also shifts how the audience has been conceived historically in terms of its ability to control the text and its internal structure and coherence. In light of both new developments in interactivity with the text, as found in the increasing popularity of new media forms such as electronic gaming, and the ‘backlash’ development of new technologies, software and legal methods that actively seek to prevent the alteration and re-distribution of texts, the historical and contemporary conception of the author-text-audience affinity can be characterised as a tactical war of contention for control over the text. This is a struggle set across a number of different contexts, media forms, sites and author/audience capacities, but it is embodied in the legal, cultural and economic skirmishes over the form and use of remix texts. More significantly, however, the remix is an interactivity that is conscious of the intertextuality that produces the various juxtapositions to create new narratives.
All texts are intertextual, and the concept of intertextuality takes into account the network of other, similar texts to which any new text contributes and by which it is influenced. This similarity can be produced by several factors, including genre, allusion, trace, pastiche and aesthetics. Intertextuality can include the fact that a text is related to and permeated by the discourse of its sources (Bignell 92), but in all cases it shapes the meanings, significations and potential readings of a text in a way attuned to the polysemy of contemporary cultural production. In the context of interactivity, however, it is through co-creative engagement that the intertextuality of both the source and the new text is drawn out as a deliberate act of creation.

Layering

As an interactive and deliberately intertextual text, the remix or mash-up is best understood as layered intermedia: a narrative composed of, or fused between, moving image and sound, with audio that includes dialogue, effects, and incidental and narrative-related music. In that context, no individual component of the text can be understood or analysed apart from the elements into which it has been remixed. New meanings emerge in intermedia remixes not simply because originary or new intertextualities are produced by user-creators relying on existing sources, but because those sources themselves no longer operate with the same set of meanings and significations, allowing the productive activation of new meanings (Bennett). While the narrative of a remix text works only through the reconfiguration of the intermedia of audio and visual to create a new text with new potential meanings, the analysis must also pay attention to the various forms of layering that constitute all audio-visual texts. For Lessig, such layering is a digital form of collage (70).
However, it is also the means by which, on the one hand, new intertextualities are developed through the juxtaposition of different sources, giving them all new significations and activating new meanings, and, on the other hand, attention is drawn to the existing potential intertextuality of the sources and the polysemy of meaning. Understanding the layering of texts involves understanding a text in a three-dimensional capacity. This is where some basic awareness of digital image manipulation through application software such as Photoshop and Gimp can be instructive in providing frameworks through which to understand contemporary digital media forms and analyse the ways in which they productively activate sets or ranges of meanings. Such digital manipulation programs require the user to think about, say, an image as being built upon and manipulated across different layers, whereby a core image is ‘drawn out’ into its third dimension through adding, shifting, changing, re-figuring and re-framing, layer over layer. The core remains, but is radically altered by what occurs at the different layers. Likewise, the remix is produced through interacting with a number of different source texts within a conceptual framework that is three-dimensional and operates across layers. These include the two primary layers of the visual and the audio, for remixes are typically audio-visual, but also a range of intertextual meanings that can likewise be understood in three-dimensional layers across the temporality of an audio-visual moving text.

Method of Analysis

A simple typology for analysing remix texts, focusing particularly on fan videos on YouTube, including same-sexualised fan fiction known as slash and those texts which re-order television and film material juxtaposed against popular music tracks, emerged from a first-year undergraduate digital media cultures course I taught at The University of Adelaide in 2010.
With a broad range of meanings, views, interpretations and engagements emerging in large-group teaching, we workshopped possible scenarios with the aim of establishing some steps that can be used to consider the place of the remix in the context of its narrative interactivity and intertextual groundings. A typological method of analysis is not necessarily the most sophisticated way to draw out narrative threads and strands from a remix text and, indeed, there may be value in exploring remixed texts from other perspectives, such as the YouTube-enabled participatory reflectiveness that emerges from community and commentary perspectives. However, to understand the narrative elements that emerge from a remix, there is also great value in beginning with an unstitching of its constituent components in order to understand the interactive, intertextual, intermedial formation of the remix through its structuration and its selection and assembly of extant texts. To best describe a typology for analysing the remix as a text and an interactive intertext, we might use an example. Let us say, hypothetically, a YouTube remix video of three minutes and fifty seconds in length that takes various scenes from the television series Arrested Development, perhaps of the two characters of adult brothers GOB and Michael Bluth, from across its four years and sets them against a single audio track, Belle & Sebastian’s Seeing Other People. Such an example would not be an uncommon remix, which may be an expression of fandom for Arrested Development or perhaps an expression of critical engagement that actively draws attention to the range of reading positions, formations and potential productive activations of meaning (Bennett) around sibling relationships in the original.
That is, juxtaposing a popular audio track about the awkwardness of romantic relationships against images of the closeness, distance and competitiveness of the two brothers gives it a ‘slash’ element, thereby presenting a narrative which either implies a pseudo-sexual or romantic component to the brotherly relationship (an activity not uncommon in the production of slash) or makes a critical statement about the way in which the theatrics of touch, familial hugging, looking and seeing, or positioning in visual frames is utilised in the series in ways open to alternative readings. Having determined that such a remix might actively and self-consciously play with the juxtaposition across two layers to create additional meanings, the real work of analysis can be undertaken. This, of course, could include thematic, discursive or narrative analysis of the text alone. However, if one is to work with the notion that a remix is always produced in both interactivity and intertextuality, then a number of steps must be taken at the level of individual layers and, subsequently, together. This aids in understanding the sourcing, collocation, positioning and re-ordering, in order to come to a depth of interpretation as to a possible meaning of the remix among the many available in a polysemic cultural product.

Step One: Determine the Video Narrative Source. This involves establishing whether the remix’s video material is from a singular source (such as a single film or television episode) or multiple sources (many films) and, if multiple, whether these are from the same genre, with the same actors and the same director, or different in each case. It also involves ascertaining whether there is user-generated visual content such as additional material, animation or captioning.
Exploring the possible arrangements of the visual source, while assuming that the audio track remains singular and identifiable, provides opportunities to consider the thematic, genre and story elements and their significations for the resulting new, co-creative narrative of the remix. This step invites the scholar to consider how the remix’s discernible narrative differs from the scholar’s reading of the source video texts, how the visual material signifies without its original audio component (for example, the dialogue in a television episode) and the ways in which the separation of the source visual from its audio presents new interpretative frames.

Step Two: Understand the Narrative Sequence. Has the video material been cut (pieces extracted and re-joined)? Has the temporal order of the video material been re-sequenced? How do these shifts and changes impact on the narrative or story told? In our example here, we might find a series of scenes of two characters hugging or touching, with the narrative elements from the original episode that occur between them, that is, that give a context to those hugs, removed. Asking how the removal of that contextual material presents the source clips as a new narrative and a new interactively-derived creation is central to this step.

Step Three: Visual Manipulation. What additional visual manipulation features have been added: fade-ins, fade-outs, framing, changes to the speed or playback time of the source video? Accounting for these enables the viewer to position the remix narrative at a point of distance from the source, shifting from derivative to intertextual. Naturally, these must be understood in the context of the earlier steps while foregrounding the interactive form of the remix as a co-created piece that is more than just an intervention into an original text: it is the utilisation, through manipulation, of a range of texts to produce a new one.

Step Four: Narrative Engagement and Collocation.
Here, the scholar must assess the extent to which the audio source has a ‘fit’ with the visual. Thematic and discourse analysis (among others) can be applied to determine the way in which the audio track, in addition to the manipulations identified in the preceding steps, productively activates new meanings, contexts and frames in the narrative. Importantly, however, this step requires asking not only what the audio does to the video, but the reverse. Using the Arrested Development example, one must ask what the visual material does for the meanings denoted within the audio, its musical elements and its lyrics: to what extent does the video source ‘fit’ with or re-position the significance of the audio track and present it with meanings it would not otherwise have in an audio-only context (or, of course, in the context of its use in an ‘authorised’ music video)? Together, these four steps present one possible means of ‘coming at’ the interactive and intertextual roots of the remix as a co-creative text. The aim is not merely to analyse how the source has been used, or how the remix allows the sources to be presented or distributed differently, but to understand how new narratives emerge in the context of the various ‘mixings’ that come out of interactive engagement with the text to produce the intertextual activation of meanings. Analysing remix texts through this method opens the possibility not only of articulating readings of the text that are built on interactivity and layering, but also of critiquing the contemporary conditions of textual production. By demonstrating the ways in which a text can be understood to be located not just within intertextuality but within intertextual layers, it is possible to reflect more broadly on all textuality as being produced, disseminated and having its meanings productively activated in the context of ‘degrees’ of layers and ‘degrees’ of interactivity.

References

Bennett, T. “Texts, Readers, Reading Formations.” Literature and History 9.2 (1983): 214-227.
Bignell, J. Media Semiotics: An Introduction. Manchester: Manchester UP, 1997.
Bordewijk, J.L., and B. van Kaam. “Towards a New Classification of Tele-Information Services.” InterMedia 14.1 (1986): 16-21.
Burgess, J., and J. Green. YouTube: Online Video and Participatory Culture. Cambridge: Polity, 2009.
Cover, R. “Interactivity: Reconceiving the Audience in the Struggle for Textual ‘Control’.” Australian Journal of Communication 31.1 (2004): 107-120.
———. “Audience Inter/Active: Interactive Media, Narrative Control & Reconceiving Audience History.” New Media & Society 8.1 (2006): 213-232.
———. “More than a Watcher: Buffy Fans, Amateur Music Videos, Romantic Slash and Intermedia.” Music, Sound and Silence in Buffy the Vampire Slayer. Ed. P. Attinello, J.K. Halfyard and V. Knights. London: Ashgate, 2010. 131-148.
Green, L. Communication, Technology and Society. St Leonards, NSW: Allen & Unwin, 2002.
Jenkins, H. “What Happened before YouTube.” YouTube: Online Video and Participatory Culture. Ed. J. Burgess and J. Green. Cambridge: Polity, 2009. 109-125.
Kiousis, S. “Interactivity: A Concept Explication.” New Media & Society 4.3 (2002): 355-383.
Lessig, L. Remix: Making Art and Commerce Thrive in the Hybrid Economy. London: Bloomsbury Academic, 2008.
McMillan, S. “A Four-Part Model of Cyber-Interactivity: Some Cyber-Places Are More Interactive than Others.” New Media & Society 4.2 (2002): 271-291.
Penley, C. Nasa/Trek: Popular Science and Sex in America. London and New York: Verso, 1997.
17

Cruikshank, Lauren. „Synaestheory: Fleshing Out a Coalition of Senses“. M/C Journal 13, Nr. 6 (25.11.2010). http://dx.doi.org/10.5204/mcj.310.

The full content of the source
Annotation:
Everyone thinks I named my cat Mango because of his orange eyes but that’s not the case. I named him Mango because the sounds of his purrs and his wheezes and his meows are all various shades of yellow-orange. (Mass 3) Synaesthesia, a condition where stimulus in one sense is perceived in that sense as well as in another, is thought to be a neurological fluke, marked by cross-sensory reactions. Mia, a character in the children’s book A Mango-Shaped Space, has audition colorée or coloured hearing, the most common form of synaesthesia, where sounds create dynamic coloured photisms in the visual field. Others with the condition may taste shapes (Cytowic 5), feel colours (Duffy 52), taste sounds (Cytowic 118) or experience a myriad of other sensory combinations. Most non-synaesthetes have never heard of synaesthesia and many treat the condition with disbelief upon learning of it, while synaesthetes are often surprised to hear that others don’t have it. Although there has been a resurgence of interest in synaesthesia recently in psychology, neuroscience and philosophy (Ward and Mattingley 129), there is no widely accepted explanation for how or why synaesthetic perception occurs. However, if we investigate what meaning this particular condition may offer for rethinking not only what constitutes sensory normalcy, but also the ocular-centric bias in cultural studies, especially media studies, synaesthesia may present us with very productive coalitions indeed. Some theorists posit the ultimate role of media of all forms “to transfer sense experiences from one person to another” (Bolter and Grusin 3). Alongside this claim, many “have also maintained that the ultimate function of literature and the arts is to manifest this fusion of the senses” found in synaesthesia (Dann ix). If the most primary of media aims are to fuse and transfer sensory experiences, manifesting these goals would be akin to transferring synaesthetic experience to non-synaesthetes.
In some cases, this synaesthetic transfer has been the explicit goal of media forms, from the invention of kaleidoscopes as colour symphonies in 1818 (Dann 66) to the 2002 launch of the video game Rez, the packaging for which reads “Discover a new world. A world of sound, visuals and vibrations. Release your instincts, open your senses and experience synaesthesia” (Rez). Recent innovations such as touch screen devices, advances in 3D film and television technologies and a range of motion-sensing video gaming consoles extend media experience far beyond the audio-visual and, as such, present both serious challenges and important opportunities for media and culture scholars to reinvigorate ways of thinking about media experience, sensory embodiment and what might be learned from engaging with synaesthesia.

Fleshing Out the Field

While acknowledging synaesthesia as a specific condition that enhances and complicates the lives of many individuals, I also suggest that synaesthesia is a useful mode of interference into our current ocular-centric notions of culture. Vision and visual phenomena hold a particularly powerful role in producing and negotiating meanings, values and relationships in the contemporary cultural arena and, as a result, the eye has become privileged as the “master sense of the modern era” (Jay, Scopic 3). Proponents of visual culture claim that the majority of modern life takes place through sight and that “human experience is now more visual and visualized than ever before ... in this swirl of imagery, seeing is much more than believing. It is not just a part of everyday life, it is everyday life” (Mirzoeff 1). In order to enjoy this privilege as the master sense, vision has been disentangled from the muscles and nerves of the eyeball and relocated to the “mind’s eye”, a metaphor that equates a kind of disembodied vision with knowledge.
Vision becomes the most non-sensual of the senses, and is made to appear “as a negative reference point for the other senses...on the side of detachment, separation” (Connor) or even “as the absence of sensuality” (Haraway). This creates a paradoxical “visual culture” in which the embodied eye is, along with the ear, skin, tongue and nose, strangely absent. If visual culture has been based on the separation of the senses and, in fact, a refutation of embodied senses altogether, what about that which we might encounter and know in the world that is not encompassed by the mind’s eye? By silencing the larger sensory context, what are we missing? What ocular-centric assumptions have we been making? What responsibilities have we ignored? This critique does not wish to do away with the eye, but instead to re-embrace and extend the field of vision to include an understanding of the eye and what it sees within the context of its embodied abilities and limitations. Although the mechanics of the eye make it an important and powerful sensory organ, able to perceive at a distance and provide a wealth of information about our surroundings, it is also prone to failures. Equipped as it is with eyelids and blind spots, reliant upon light and gullible to optical illusions (Jay, Downcast 4), the eye has its weaknesses and these must be addressed along with its abilities. Moreover, by focusing only on what is visual in culture, we are missing much that is of import. The study of visual culture is not unlike studying an electrical storm from afar. The visually impressive jagged flash seems the principal aspect of the storm and quite separate from the rumbling sound that rolls after it. We perceive them and name them as two distinct phenomena: thunder and lightning. However, this separation is a feature only of the distance between where we stand and the storm.
Those who have found themselves in the eye of an electrical storm know that the sight of the bolt, the sound of the crash, the static tingling and vibration of the crack and the smell of ozone are mingled. At a remove, the bolt appears separate from the noise only artificially, because of the safe distance. The closer we are to the phenomenon, the more completely it envelops us. Although getting up close and personal with an electrical storm may not be as comfortable as viewing it from afar, it does offer the opportunity to better understand the total experience and the thrill of intensities it can engage across the sensory palette. Similarly, the false separation of the visual from the rest of embodied experience may be convenient, but in order to flesh out this field, other embodied senses and sensory coalitions must be reclaimed for theorising practices. The senses as they are traditionally separated are, simply put, false categories.

Towards Synaestheory

Any inquiry inspired by synaesthesia must hold at its core the idea that the senses cannot be responsibly separated. This notion applies firstly to the separation of senses from one another. Synaesthetic experience and experiment both insist that there is rich cross-fertility between senses in synaesthetes and non-synaesthetes alike. The French verb sentir is instructive here, as it can mean “to smell”, “to taste” or “to feel”, depending on the context it is used in. It can also mean simply “to sense” or “to be aware of”. In fact, the origin of the phrase “common sense” meant exactly that: the point at which the senses meet. There must also be recognition that the senses cannot be separated from cognition or, in the Cartesian sense, that body and mind cannot be divided.
An extensive and well-respected study of synaesthesia conducted in the 1920s by Raymond Wheeler and Thomas Cutsforth, non-synaesthetic and synaesthetic researchers respectively, revealed that the condition was not only a quirk of perception, but of conception. Synaesthetic activity, the team deduced, “is an essential mechanism in the construction of meaning that functions in the same way as certain unattended processes in non-synaesthetes” (Dann 82). With their synaesthetic imagery impaired, synaesthetes are unable to do even a basic level of thinking or recalling (Dann, Cytowic). In fact, synaesthesia may be a universal process, but in synaesthetes, “a brain process that is normally unconscious becomes bared to consciousness so that synaesthetes know they are synaesthetic while the rest of us do not” (166). Maurice Merleau-Ponty agrees, claiming: “Synaesthetic perception is the rule, and we are unaware of it only because scientific knowledge shifts the centre of gravity of experience, so that we unlearn how to see, hear, and generally speaking, feel in order to deduce, from our bodily organisation and the world as the physicist conceives it, what we are to see, hear and feel” (229). With this in mind, neither the mind’s role nor the body’s role in synaesthesia can be isolated, since the condition itself maintains unequivocally that the two are one. The rich and rewarding correlations between senses in synaesthesia prompt us to consider sensory coalitions in other experiences and contexts as well. We are urged to consider flows of sensation seriously as experiences in and of themselves, with or without interpretation and explanation. As well, the debates around synaesthetic experience remind us that in order to speak to phenomena perceived and conceived, it is necessary to recognise the specificities, ironies and responsibilities of any embodied experience.
Ultimately, synaesthesia helps to highlight the importance of relationships and the complexity of concepts necessary in order to practice a more embodied and articulate theorising. We might call this more inclusive approach “synaestheory”.

Synaestheorising Media

Dystopia, a series of photographs by artists Anthony Aziz and Sammy Cucher, suggests a contemporary take on Descartes’s declaration that “I will now close my eyes, I will stop my ears, I will turn away my senses from their objects” (86). These photographs consist of digitally altered faces where the subject’s skin has been stretched over the openings of eyes, nose, mouth and ears, creating an interesting image both in process and in product. The product of a media mix that incorporates photography and computer modification, this image suggests the effects of the separation from our senses that these media may imply. The popular notion that media allow us to surpass our bodies and meet without our “meat” tagging along is a trope that Aziz and Cucher expose here with their computer-generated cover-up. By sealing off the senses, they show us how little we now seem to value them in a seemingly virtual, post-embodied world. If “hybrid media require hybrid analyses” (Lunenfeld in Graham 158), then in our multimedia, mixed media, “mongrel media” (Dovey 114) environment we need mongrel theory, synaestheory, to begin to discuss the complexities at hand. The goal here is producing an understanding of both media and sensory intelligences as hybrid. Symptomatic of our simple sense of media is our tendency to refer to media experiences as “audio-visual”: stimuli for the ear, eye or both. However, even if media are engineered to be predominately audio and/or visual, we are not. Synaestheory examines embodied media use, including the sensory information that the media do not claim to concentrate on, but that is still engaged and present in every mediated experience.
It also examines embodied media use by paying attention to the pops and clicks of the material human-media interface. It does not assume simple sensory engagement or smooth engagement with media. These bumps, blisters, misfirings and errors are just as crucial a part of embodied media practice as smooth and successful interactions. Most significantly, synaesthesia insists simply that sensation matters. Sensory experiences are material, rich, emotional, memorable and important to the one sensing them, synaesthete or not. This declaration contradicts a legacy of distrust of the sensory in academic discourse that privileges the more intellectual and abstract, usually in the form of the detached text. However, academic texts are sensory too, of course. Sound, feeling, movement and sight are all inseparable from reading and writing, speaking and listening. We might do well to remember these as root sensory situations and, by extension, recognise the importance of other sensual forms.

Indeed, we have witnessed a rise of media genres that appeal to our senses first with brilliant and detailed visual and audio information, and story or narrative second, if at all. These media are “direct and one-dimensional, about little, other than their ability to commandeer the sight and the senses” (Darley 76). Whereas any attention to the construction of the media product is a disastrous distraction in narrative-centred forms, spectacular media reveals and revels in artifice and encourages the spectator to enjoy the simulation as part of the work’s allure. It is “a pleasure of control, but also of being controlled” (MacTavish 46). Like viewing abstract art, the impact of the piece will be missed if we are obsessed with what the artwork “is about”.
Instead, we can reflect on spectacular media’s ability, like that of an abstract artwork, to impact our senses and, as such, “renew the present” (Cubitt 32). In this framework, participation in any medium can be enjoyed not only as an interpretative opportunity, but also as an experience of sensory dexterity and relevance with its own pleasures and intelligences; a “being-present”. By focusing our attention on sensory flows, we may be able to perceive aspects of the world or ourselves that we had previously missed. Every one of us–synaesthete or nonsynaesthete–has a unique blueprint of reality, a unique way of coding knowledge that is different from any other on earth [...] By quieting down the habitually louder parts of our mind and turning the dial of our attention to its darker, quieter places, we may hear our personal code’s unique and usually unheard “song”, needing the touch of our attention to turn up its volume. (Duffy 123) This type of presence to oneself has been termed a kind of “perfect immediacy” and is believed to be cultivated through meditation or other sensory-focused experiences such as sex (Bolter and Grusin 260), art (Cubitt 32), drugs (Dann 184) or even physical pain (Gromala 233). Immersive media could also be added to this list if, as Bolter and Grusin suggest, we now “define immediacy as being in the presence of media” (236). In this case, immediacy has become effectively “media-cy.” A related point is the recognition of sensation’s transitory nature. Synaesthetic experiences and sensory experiences are vivid and dynamic. They do not persist. Instead, they flow through us and disappear, despite any attempts to capture them. You cannot stop or relive pure sound, for example (Gross). If you stop it, you silence it. If you relive it, you are experiencing another rendition, different even if almost imperceptibly from the last time you heard it. Media themselves are increasingly transitory and shifting phenomena.
As media forms emerge and fall into obsolescence, spawning hybrid forms and spinoffs, the stories and memories safely fixed into any given media become outmoded and ultimately inaccessible very quickly. This trend towards flow over fixation is also informed by an embodied understanding of our own existence. Our sensations flow through us as we flow through the world. Synaesthesia reminds us that all sensation and indeed all sensory beings are dynamic. Despite our rampant lust for stasis (Haraway), it is important to theorise with the recognition that bodies, media and sensations all flow through time and space, emerging and disintegrating. Finally, synaesthesia also encourages an always-embodied understanding of ourselves and our interactions with our environment. In media experiences that traditionally rely on vision, the body is generally not only denied, but repressed (Balsamo 126). Claims to disembodiment flood the rhetoric around new media as an emancipatory element of mediated experience and, somehow, seeing is superimposed on embodied being to negate it. However, phenomena such as migraines, sensory release hallucinations, photo-memory, after-images, optical illusions and, most importantly here, the “crosstalk” of synaesthesia (Cohen Kadosh et al. 489) all attest to the co-involvement of the body and brain in visual experience. Perhaps useful here for understanding media involvement in light of synaestheory is a philosophy of “mingled bodies” (Connor), where the world and its embodied agents intermingle. There are no discrete divisions, but plenty of translation and transfer. As Sean Cubitt puts it, “the world, after all, touches us at the same moment that we touch it” (37). We need to employ non-particulate metaphors that do away with the dichotomies of mind/body, interior/exterior and real/virtual. A complex embodied entity is not an object or even a series of objects, but embodiment work.
“Each sense is in fact a nodal cluster, a clump, confection or bouquet of all the other senses, a mingling of the modalities of mingling [...] the skin encompasses, implies, pockets up all the other sense organs: but in doing so, it stands as a model for the way in which all the senses in their turn also invaginate all the others” (Connor). The danger here is of delving into a nostalgic discussion of a sort of “sensory unity before the fall” (Dann 94). The theory that we are all synaesthetes in some ways can lead to wistfulness for a perfect fusion of our senses, a kind of synaesthetic sublime that we had at one point, but lost. This loss occurs in childhood in some theories (Maurer and Mondloch) and in our aboriginal histories in others (Dann 101). This longing for “original syn” is often done within a narrative that equates perfect sensory union with a kind of transcendence from the physical world. Dann explains that “during the modern upsurge in interest that has spanned the decades from McLuhan to McKenna, synaesthesia has continued to fulfil a popular longing for metaphors of transcendence” (180). This is problematic, since elevating the sensory to the sublime does no more service to understanding our engagements with the world than ignoring or degrading the sensory. Synaestheory does not tolerate a simplification of synaesthesia or any condition as a ticket to transcendence beyond the body and world that it is necessarily grounded in and responsible to. At the same time, it operates with a scheme of senses that are not a collection of separate parts, but blended: a field of intensities, a corporeal coalition of senses. It likewise refuses to participate in the false separation of body and mind, perception and cognition. More useful and interesting is to begin with metaphors that assume complexity without breaking phenomena into discrete pieces.
This is the essence of a new anti-separatist synaestheory, a way of thinking through embodied humans in relationships with media and culture that promises to yield more creative, relevant and ethical theorising than the false isolation of one sense or the irresponsible disregard of the sensorium altogether.

References

Aziz, Anthony, and Sammy Cucher. Dystopia. 1994. 15 Sep. 2010 <http://www.azizcucher.net/1994.php>.
Balsamo, Anne. “The Virtual Body in Cyberspace.” Technologies of the Gendered Body: Reading Cyborg Women. Durham: Duke UP, 1997. 116-32.
Bolter, Jay David, and Richard Grusin. Remediation: Understanding New Media. Cambridge: MIT Press, 1999.
Cohen Kadosh, Roi, Avishai Henik, and Vincent Walsh. “Synaesthesia: Learned or Lost?” Developmental Science 12.3 (2009): 484-491.
Connor, Steven. “Michel Serres’ Five Senses.” Michel Serres Conference. Birkbeck College, London. May 1999. 5 Oct. 2010 <http://www.bbk.ac.uk/eh/skc/5senses.htm>.
Cubitt, Sean. “It’s Life, Jim, But Not as We Know It: Rolling Backwards into the Future.” Fractal Dreams: New Media in Social Context. Ed. Jon Dovey. London: Lawrence and Wishart, 1996. 31-58.
Cytowic, Richard E. The Man Who Tasted Shapes: A Bizarre Medical Mystery Offers Revolutionary Insights into Emotions, Reasoning and Consciousness. New York: Putnam Books, 1993.
Dann, Kevin T. Bright Colors Falsely Seen: Synaesthesia and the Search for Transcendental Knowledge. New Haven: Yale UP, 1998.
Darley, Andrew. Visual Digital Culture: Surface Play and Spectacle in New Media Genres. London: Routledge, 2000.
Descartes, Rene. Discourse on Method and the Meditations. Trans. John Veitch. New York: Prometheus Books, 1989.
Dovey, Jon. “The Revelation of Unguessed Worlds.” Fractal Dreams: New Media in Social Context. Ed. Jon Dovey. London: Lawrence and Wishart, 1996. 109-35.
Duffy, Patricia Lynne. Blue Cats and Chartreuse Kittens: How Synesthetes Color Their Worlds. New York: Times Books, 2001.
Graham, Beryl. “Playing with Yourself: Pleasure and Interactive Art.” Fractal Dreams: New Media in Social Context. Ed. Jon Dovey. London: Lawrence and Wishart, 1996. 154-81.
Gromala, Diana. “Pain and Subjectivity in Virtual Reality.” Clicking In: Hot Links to a Digital Culture. Ed. Lynn Hershman Leeson. Seattle: Bay Press, 1996. 222-37.
Haraway, Donna. “At the Interface of Nature and Culture.” Seminar. European Graduate School. Saas-Fee, Switzerland, 17-19 Jun. 2003.
Jay, Martin. Downcast Eyes: The Denigration of Vision in Twentieth Century French Thought. Berkeley: U of California P, 1993.
———. “Scopic Regimes of Modernity.” Vision and Visuality. Ed. Hal Foster. New York: Dia Art Foundation, 1988. 2-23.
MacTavish, Andrew. “Technological Pleasure: The Performance and Narrative of Technology in Half-Life and Other High-Tech Computer Games.” ScreenPlay: Cinema/Videogames/Interfaces. Eds. Geoff King and Tanya Krzywinska. London: Wallflower P, 2002.
Mass, Wendy. A Mango-Shaped Space. Little, Brown and Co., 2003.
Maurer, Daphne, and Catherine J. Mondloch. “Neonatal Synaesthesia: A Re-Evaluation.” Synaesthesia: Perspectives from Cognitive Neuroscience. Eds. Lynn C. Robertson and Noam Sagiv. Oxford: Oxford UP, 2005.
Merleau-Ponty, Maurice. Phenomenology of Perception. Trans. Colin Smith. London: Routledge, 1989.
Mirzoeff, Nicholas. “What Is Visual Culture?” The Visual Culture Reader. Ed. Nicholas Mirzoeff. London: Routledge, 1998. 3-13.
Rez. United Game Artists. PlayStation 2. 2002.
Stafford, Barbara Maria. Good Looking: Essays on the Virtue of Images. Cambridge: MIT Press, 1996.
Ward, Jamie, and Jason B. Mattingley. “Synaesthesia: An Overview of Contemporary Findings and Controversies.” Cortex 42.2 (2006): 129-136.
APA, Harvard, Vancouver, ISO and other citation styles
18

Maher, Laura-Jane. “You Got Spirit, Kid: Transmedial Life-Writing across Time and Space”. M/C Journal 21, No. 1 (14 Mar. 2018). http://dx.doi.org/10.5204/mcj.1365.

The full text of the source
Annotation:
In November 2015 the progressive rock band Coheed and Cambria released their latest album and art-book, both titled The Color before the Sun (Color) (2015). This album deviates from their previous six releases by explicitly using a biographical frame for the art-book, the album, and their paratexts. This is a divergence from the band’s concept-album approach, a transmedia storyworld, The Amory Wars (TAW) (2002-17), which fictionalised the life experiences of Claudio Sanchez, the band’s lead singer. When scholars discuss transmedia they often refer to fantastic and speculative fictions, such as the Star Wars (1977-2018), Star Trek (1966-2018), Doctor Who (1963-2018) and Marvel Universe (1961-2018) franchises, and TAW fits this framework. However, there is increasing consideration of the impact transmedia reading and writing practices have on storytelling that straddles representations of the “real” world. By making collaborative life-writing explicit, Color encourages readers to resist colonising ontologies. Framing the life-writing within the band’s earlier auto-fiction(s) (TAW), Color destabilises genre divides between fiction and life-writing, and positions readers to critique Sanchez’s narration of his subjectivity. This enables readers to abstract their critique to ontological narratives that have a material impact on their own subjectivities: law, medicine, religion, and economics.

The terms subject and identity are often used interchangeably in the study of life-writing. By “subjectivity” I mean the individual’s understanding of their status and role in relation to their community, culture, socio-political context, and the operations of power dynamics therein. In contrast, “identity” speaks to the sense of self.
While TAW and Color share differing literary conceits—one is a space opera, the other is more explicitly biographical—they both explore Sanchez’s subjectivity and can be imagined as a web of connections between recordings (both audio and video), social media, books (comics, art books, novels and scripts), and performances that contribute to a form of transmedia life-writing. Life-writing is a generic term that covers “protean forms of contemporary personal narrative” (Eakin 1). These narratives can be articulated across expressive practices, including interviews, profiles, diaries, social media, prose, poetry and so on. Zachary Leader notes in his introduction to On Life-Writing that “theoreticians and historians of life-writing commonly fuse or meld sub-genres [… and this] blurring of distinctions may help to account for life-writing’s growing acceptance as a field of academic study” (1-2). The growing relationship between life-writing and transmedia is therefore unsurprising.

This article ties my research considering the construction of subjectivity through transmedia life-writing with Emma Hill and Máiréad Nic Craith’s consideration of transmedia storytelling’s political potential (87-109). My intention is to determine how readers might construct their own subjectivity to resist oppressive interpellations. Hill and Nic Craith argue that the “lack of closure” in transmedia storyworlds creates “a greater breadth and depth of interpretation … than a single telling could achieve” (104). They conclude that “this expansive quality has allowed the campaigners to continue their activism in a number of different arenas” (104). I contest their assertion that transmedia lacks closure, and instead contend that closure, or the recognition of meaning, inheres with the reader (McCloud 33) rather than in a universalised meaning attributed to the text: transmedia storytelling therefore arouses political potential in reading communities.
It is precisely this feature that enables the “expansive quality” valued in political activism. I therefore focus my discussion on the readers of transmedia life-writing, rather than on its writer(s). I argue that in reading a life or lives across multiple media the reader is exposed to the texts’ self-referential citations, their extra-diegetic reiterations, and their contradictions. The reader is invited to make meaning from these citations, reiterations and contradictions; they are positioned to confront the ways in which space and time shape life-writing and subjectivity. Transmedia life-writing can therefore empower readers to invoke critical reading practices.

The reader’s agency offers the potential for resistance and revolution. This agency is invited in Color, where readers are asked to straddle the fictional world of TAW and the “real” world. The Unravelling Palette of Dawn (2015) is the literary narrative that parallels this album. The book is written by Chondra Echert, Sanchez’s collaborator and wife, and is an amalgam of personal essay and photo-book. It opens by invoking the space opera that informs The Amory Wars: “Sector.12, Paris, Earth. A man and a woman sit in a café debating their fate” (n.p.). This situates the reader in the fictional world of TAW, but also brings the reader into the mundanity and familiarity of a discussion between two people. The reader is witness to a discussion between intimates that focusses on the question of “where to from here.” The idea of “fate” is either misunderstood or misapplied: fate is predetermined, and undebatable. The reader is therefore positioned to remember the band’s previous “concept” and juxtapose it against a new “realistic” trajectory: fictional characters might have a fate that is determined by their writer, but does that fate extend to the writer themselves?
To what extent is Sanchez and Echert’s auto/biography crafted by writers other than themselves?

The opening passage provides a skin for the protagonists of the essay, enabling a fantastical space within which Echert and Sanchez might cloak themselves, as they have done throughout TAW. However, this conceit is peeled away on the second page:

This might have been the story you find yourself holding. A Sci-fi tale, shrouded in fiction. The real life details modified. All names changed. Threads neatly tied up at the end and altered for the sake of ego and feelings.

But the truth is rarely so well planned. The story isn’t filled with epic action scenes or glossed-over romance. Reality is gritty and mucky and thrown together in the last seconds. It’s painful. It is not beautiful … and so it is. The events that inspired this record are acutely personal. (n.p.)

In this passage Echert makes reference to the method of storytelling employed throughout the texts that make up TAW. She lays bare the shroud of fiction that covers the lived realities of her and her husband’s lives. She goes on to note that their lives have been interpreted “to fit the bounds of the concept” (n.p.), that is, TAW as a space opera, and that the current album was an opportunity to “pull back the curtain” (n.p.) on this conceit. This narrative is echoed by Sanchez in the documentary component of the project, The Physics of Color (2015). Like Echert, Sanchez locates the narrative’s genesis in Paris, but in the Paris of our own world, where he and Echert finalised the literary component of the band’s previous project, The Afterman (2012). Color, like the previous works, is written as a collaboration, not just between Sanchez and Echert, but also by the other members of the band, who contributed to the composition of each track.
This collaborative writing is an example of relationality that facilitates a critical space for readers and invites them to consider the ways in which their own subjectivity is constructed.

Ivor Goodson and Scherto Gill provide a means of critically engaging with relational reading practices. They position narrative as a tool that can be used to engage in critical self- and social reflection. Their theory of critical narrative as a form of pedagogy enables readers to shift away from reading Color as auto-fiction and towards reading it as an act of collaborative auto/biography. This transition reflects a shifting imperative from the personal, particularly questions of identity, to the political: to engaging with the web of human relations in order to explore subjectivity. Given that transmedia is generally employed by writers of fantasy and speculative literatures, it can be difficult for readers to negotiate their expectations: transmedia is not just a tool for franchises, but can also be a tool for political resistance.

Henry Jenkins initiated the conversation about transmedia reading practices and reality television in his chapters about early seasons of Survivor and American Idol in his book Convergence Culture. He identifies the relationship between viewers and these shows as one that shifts from “real-time interaction toward asynchronous participation” (59): viewers continue their engagement with the shows even when they are not watching a broadcast. Hill and Nic Craith provide a departure from literary and media studies approaches to transmedia by utilising an anthropological approach to understanding storyworlds. They maintain that both media studies and anthropological methodologies “recognize that storytelling is a continually contested act between different communities (whether media communities or social communities), and that the final result is indicative of the collective rather than the individual” (88–89).
They argue that this collectivity results from “negotiated meaning” between the text and members of the reading community. This is a recognition of the significance held by readers of life-writing regarding the “biographical contract” (Lejeune 22) resulting from the “rationally motivated intersubjective recognition of norms” (Habermas n.p.). Collectivity is analogous to relationality: the way in which the readers’ subjectivity is impacted upon by their engagement with the storyworld, helixed with the writer(s) of transmedia life-writing having their subjectivity impacted upon by their engagement with reader responses to their developing texts. However, the term “relationality” is used to slightly different effect in transmedia and life-writing studies. Colin Harvey’s definition of transmedia storytelling as relational emphasises the relationships between different media “with the wider storyworld in question, and by extension the wider culture” (2). This can be juxtaposed with Paul John Eakin’s assertion that life-writing is a genre that requires interaction between the author and their audience: “autobiography of the self but the biography and autobiography of the other” (58). It seems to me that the differing articulations of “relationality” arising from life-writing and transmedia scholarship rely on, but elide, the relationship between the reader and the storyworld. In both instances it is left to the reader to make meaning from the text, both in terms of understanding the subject(s) represented in relation to their own, and also as the nexus between the transmedia text, the storyworld, and the broader culture.
The readers’ own experiences, their memories, are central to this relationality.

The song “Colors” (2015), which Echert notes in her essay was the first song written for the album, chronicles the anxieties that arose after Sanchez and Echert discovered that their home (which they had been leasing out) had been significantly damaged by their tenants. In the documentary The Physics of Color, both Echert and Sanchez speak about this song as a means for Sanchez to reassert his identity as a musician after an extended period in which he struggled with the song-writing process. The song is pared back, the staccato guitar in the introduction echoing a similar theme in the introduction to the song “The Afterman” (2012), which was released on the band’s previous album. This tonal similarity, the plucked electric guitar and the shared rhythm, provides a sense of continuity between the songs, inviting the listener to remember the ways in which the music on Color is in conversation with the previous albums. This conversation is significant: it relies on the reader’s experience of their own memory. In his book Fantastic Transmedia, Colin Harvey argues that memories are “the mechanisms by which the ‘storyworld’ was effectively sewn together, helping create a common diegetic space for me—and countless others—to explore” (viii).
Both readers’ and creators’ experiences of personal and political time and space in relation to the storyworld challenge traditional understandings of readers’ agency in relation to the storyworld, and this challenge can be abstracted to frame the reader’s agency in relation to other economic, political, and social manifestations of power.

In “The Audience” Sanchez sings:

This is my audience, forever one
Together burning stars
Cut from the same disease
Ever longing what and who we are

In the documentary, Sanchez states that this song is an acknowledgement that he, the band and their audience are “one and the same in [their] oddity, and it’s like … family.” Echert echoes this, referring to the intimate relationships built with fans over the years at conventions, shows and through social media: “they’ve superseded fandom and become a part of this extended family.” Readers come to this song with the memory of TAW: the memory of “burning Star IV,” a line that is included in the titles of two of Coheed’s albums (Good Apollo, I’m Burning Star IV Vols. 1 (2005) and 2 (2007)), and of the Monstar disease that is referenced throughout Second Stage Turbine Blade, both the album (2002) and the comic books (2010). As a depiction of his destabilised identity, however, the lyrics can also be read as a poetic commentary on Sanchez’s experiences with renegotiating his subjectivity: his status as an identity that gains its truth through consensus with others, an audience who is “ever longing what and who we are.” In the documentary Sanchez states: “I could do the concept thing again with this album, you know, take it and manipulate it and make it this other sort of dimension … but this one … it means so much more to be … I really wanted this to be exposed, I really want this to be my story.” Sanchez imagines that his story, its truth, its sacredness, is contingent on its exposure, on being shared with an audience.
For Sanchez, his subjectivity arises from his relationality with his audience. This puts the reader at the centre of the storyworld. The assertion of subjectivity arises as a result of community.

However, there is an uncertainty that floats in the lacunae between the texts contributing to the Color storyworld. As noted, in the documentary both Echert and Sanchez speak lovingly of their relationships with Coheed audiences, but Sanchez goes on to acknowledge that “there’s a little bit of darkness in there too, that I don’t know if I want to bring up… I’ll keep that a mystery,” and some of “The Audience” lyrics hint at a more sinister relationship between the audience and the band:

Thieves of our time
Watch as they rape your integrity
March as the beat suggests.

One reader, Hecatonchair, discusses these lyrics in a Reddit post responding to “The Audience”. They write:

The lyrics are pretty aggressive, and could easily be read as an attack against either the music industry or the fans. Considering the title and chorus, I think the latter is who it was intended to reach, but both interpretations are valid.

This acknowledgement by the poster that the lyrics are polyvalent speaks to the decisions that readers are positioned to make in responding to the storyworld. The phrase “both interpretations are valid” makes explicit the inconsistency between what Sanchez says about the band’s fans and what he feels. It is left to the reader to account for this inconsistency between the song lyrics and the writers’ assertions. Hecatonchair and the five readers who respond to their post all write that they enjoy the song, regardless of what they read as its aggressive position on the band’s relationship with them as audience members. In identifying as both audience members and readers with different interpretations, the Reddit commentators recognise their identities in intersecting communities, and demonstrate their agency as subjects.
Goodson and Gill invoke Charles Taylor’s assertion that one of the defining elements of “identity” is a “defining community,” that is, “identity is lived in social and historical particulars, such as the literature, philosophy, religious teaching and great conversations taking place along one’s life’s journeys” (Goodson and Gill 27).

Harvey identified readers as central to transmedia practices. In reading a life across multiple media, readers assert agency within the storyworld: they choose which texts to engage with, and how and when to engage with them. They must remember, or more specifically re-member, the life or lives with which they are engaging. This re-membering is an evocative metaphor: it could be described as Frankensteinian, the bringing together of texts and media through a reading that is stretched across the narrative, like the creature’s yellow skin. It also invokes older stories of death (the author’s) and resurrection (of the author, by the reader): the murder and dismemberment of Osiris by his brother Set, and Isis, Osiris’s wife, who rejoins the fragmented pieces of Osiris and briefly brings him back.

Coheed and Cambria regularly cite musical themes or motifs across their albums, while song lyrics are quoted in the text of comic books and the novel. The readers recognise and weave together these citations with the more explicitly autobiographical writing in Color. Readers are positioned to critique the function of a canonical truth underpinning the storyworld: whose life is being told? Sanchez invokes memory throughout the album by incorporating soundscapes, such as the sounds of a train line on the song “Island.” Sanchez notes that he and his wife would hear these sounds as they took the train from their home in Brooklyn to the island of Manhattan. Sanchez brings his day-to-day experiences to his readers as overlapping but not identical accounts of perspectives.
They enable a plurality of truths and destabilise the Western focus on a singular or universal truth of lived experience.

When life-writing is constructed transmedially, the author must—of necessity—relinquish control over their story’s temporality. This includes both the story’s internal and external temporalities. By internal temporality I am referring to the manner in which time plays out within the story: given that the reader can enter into and engage with the story through a number of media, the responsibility for constructing the story’s timeline lies with the reader; they may therefore choose, or only be able, to engage with the story’s timeline in a haphazard, rather than a chronological, manner. For example, in Sanchez’s previous work, TAW, comic book components of the storyworld were often released years after the albums with which they were paired. Readers can only engage with the timelines as they are published, as they loop back through and between the storyworld’s temporality.

The different media—CD, comic, novel, or art-book—often represent different perspectives or experiences within the same, or at least overlapping, internal temporalities: significant incidences are narrated between the media. This results in an unstable external temporality, over which the author, again, has no control. The reader may listen to the music before reading the book, or the other way around, but reading the book and listening to the music simultaneously may not be feasible, and may detract from the experience of engaging with each aspect of the storyworld.
This brings us back to the importance of memory to readers of transmedia narratives: they must remember in order to, as Harvey says, stitch together a common “diegetic space.” Although the author often relinquishes control of the external temporality of the text, placing the reader in control of the internal temporality of their life-writing destabilises the authority that is often attributed to an auto/biographer. It also makes explicit that transmedia life-writing is an ongoing project. This allows the author(s) to account for “a reflexive process where individuals take the opportunity to evaluate their actions in connection with their intentions and thus ‘write a further part’ of their histories” (Goodson and Gill 33).

Goodson and Gill note that “life’s events are never linear and any intention for life to be coherent and progressive in accordance with a ‘plan’ will constantly be interrupted” (30). This is why transmedia offers writers and readers a more authentic means of engaging with life-writing. Its weblike structure enables readers to view subjectivity through a number of lenses: transmedia life-writing narrates a relational subjectivity that resists attempts at delineation. There is still a “continuity” that arises when Sanchez invokes the storyworld’s self-referential citations, reiterations, and contradictions in order to “[define] narratives within a temporal, social and cultural framework” (Goodson and Gill 29); however, transmedia life-writing refuses to limit itself, or its readers, to the narratives of space and time that regulate mono-medial life-writing. Instead it positions readers to “unmask the world and then change it” (43).

References

Arendt, Hannah. The Human Condition. Chicago: U of Chicago P, 1958.
Coheed and Cambria. Second Stage Turbine Blade. New York: Equal Vision Records, 2002.
———. In Keeping Secrets of Silent Earth: 3. New York: Equal Vision Records, 2003.
———. Good Apollo I’m Burning Star IV, Vol. 1: From Fear through the Eyes of Madness. New York: Columbia, 2005.
———. Good Apollo I’m Burning Star IV, Vol. 2: No World for Tomorrow. New York: Columbia, 2007.
———. The Year of the Black Rainbow. New York: Columbia, 2010.
———. The Afterman: Ascension. Los Angeles: Hundred Handed/Everything Evil, 2012.
———. The Afterman: Descension. Los Angeles: Hundred Handed/Everything Evil, 2013.
———. The Colour before the Sun. Brooklyn: the bag.on-line.adventures and Everything Evil Records, 2015.
———. “The Physics of Color.” Documentary DVD. Brooklyn: Everything Evil Records, 2015.
Eakin, Paul John. How Our Lives Become Stories: Making Selves. Ithaca: Cornell UP, 1999.
———. The Ethics of Life Writing. Ithaca: Cornell UP, 2004.
Echert, Chondra. The Unravelling Palette of Dawn. Brooklyn: the bag.on-line.adventures and Everything Evil Records, 2015.
Goodson, Ivor, and Scherto Gill. Critical Narrative as Pedagogy. London: Bloomsbury Publishing, 2014.
Habermas, Jürgen. The Theory of Communicative Action, Vol. 1: Reason and the Rationalisation of Society. Trans. Thomas McCarthy. Cambridge: Polity Press, 1984.
Harvey, Colin. Fantastic Transmedia: Narrative, Play and Memory across Science-Fiction and Fantasy Storyworlds. London: Palgrave Macmillan, 2015.
Hecatonchair. “r/TheFence’s Song of the Day Database Update Day 9: The Audience.” 11 Feb. 2018 <https://www.reddit.com/r/TheFence/comments/4eno9o/rthefences_song_of_the_day_database_update_day_9/>.
Hill, Emma, and Máiréad Nic Craith. “Medium and Narrative Change: The Effects of Multiple Media on the ‘Glasgow Girls’ Story and Their Real-Life Campaign.” Narrative Culture 3.1 (2016). 9 Dec. 2017 <http://www.jstor.org/stable/10.13110/narrcult.3.1.0087>.
Jenkins, Henry. Convergence Culture: Where Old and New Media Collide. New York: New York UP, 2006.
Leader, Zachary, ed. On Life-Writing. Oxford: Oxford UP, 2015.
Lejeune, Philippe, and Paul John Eakin, eds. On Autobiography. Trans. Katherine Leary. Minneapolis: U of Minnesota P, 1989.
McCloud, Scott. Understanding Comics: The Invisible Art. New York: Harper Perennial, 1994.
Sanchez, Claudio, and Gus Vasquez. The Amory Wars Sketchbook. Los Angeles: Evil Ink Comics, 2006.
———, Gus Vasquez, et al. The Amory Wars: The Second Stage Turbine Blade Ultimate Edition. Los Angeles: BOOM! Studios, 2010.
———, Peter David, Chris Burnham, et al. In Keeping Secrets of Silent Earth: 3 Ultimate Edition. Los Angeles: BOOM! Studios, 2010.
———, and Christopher Shy. Good Apollo I’m Burning Star IV, Vol. 1: From Fear through the Eyes of Madness. Los Angeles: Evil Ink Comics, 2005.
———, and Peter David. Year of the Black Rainbow. Nashville: Evil Ink Books, 2010.
———, and Nathan Spoor. The Afterman. Los Angeles: Evil Ink Comics/Hundred Handed Inc., 2012.
APA, Harvard, Vancouver, ISO and other citation styles
19

Charman, Suw, and Michael Holloway. “Copyright in a Collaborative Age”. M/C Journal 9, No. 2 (1 May 2006). http://dx.doi.org/10.5204/mcj.2598.

The full text of the source
Annotation:
The Internet has connected people and cultures in a way that, just ten years ago, was unimaginable. Because of the net, materials once scarce are now ubiquitous. Indeed, never before in human history have so many people had so much access to such a wide variety of cultural material, yet far from heralding a new cultural nirvana, we are facing a creative lock-down. Over the last hundred years, copyright term has been extended time and again by a creative industry eager to hold on to the exclusive rights to its most lucrative materials. Previously, these rights guaranteed a steady income because the industry controlled supply and, in many cases, manufactured demand. But now culture has moved from being physical artefacts that can be sold, or performances that can be experienced, to being collections of 1s and 0s that can be easily copied and exchanged. People are revelling in the opportunity to acquire and experience music, movies, TV, books, photos, essays and other materials that they would otherwise have missed out on; and they are picking up the creative ball and running with it, making their own versions, remixes, mash-ups and derivative works. More importantly, people are producing and sharing their own cultural resources, publishing their own original photos, movies, music and writing. You name it, somewhere someone is making it, just for the love of it. Whilst the creative industries are using copyright law in every way they can to prosecute, shut down, and scare people away from even legitimate uses of cultural materials, the law itself is becoming increasingly inadequate. It can no longer deal with society’s demands and expectations, nor can it cope with modern forms of collaboration facilitated by technologies that the lawmakers could never have anticipated.
Understanding Copyright

Copyright is a complex area of law, and even a seemingly simple task like determining whether a work is in or out of copyright can be a difficult calculation, as illustrated by flowcharts from Tim Padfield of the National Archives examining the British system, and from Bromberg & Sunstein LLP, which covers American works. Despite the complexity, understanding copyright is essential in our burgeoning knowledge economies. It is becoming increasingly clear that sharing knowledge, skills and expertise is of great importance not just within companies but also within communities and for individuals. There are many tools available today that allow people to work, synchronously or asynchronously, on creative endeavours via the Web, including: ccMixter, a community music site that helps people find material to remix; YouTube, which hosts movies; and JumpCut, which allows people to share and remix their movies. These tools are being developed because of the increasing number of cultural movements toward the appropriation and reuse of culture that are encouraging people to get involved. These movements vary in their constituencies and foci, and include the student movement FreeCulture.org, the Free Software Foundation, and the UK-based Remix Commons. Even big business has acknowledged the importance of cultural exchange and development, with Apple using the tagline ‘Rip. Mix. Burn.’ for its controversial 2001 advertising campaign. But creators—the writers, musicians, film-makers and remixers—frequently lose themselves in the maze of copyright legislation, a maze complicated by the international aspect of modern collaboration. Understanding of copyright law is at such a low ebb because current legislation is too complex and, in parts, out of step with modern technology and expectations.
Creators have neither the time nor the motivation to learn more—they tend to ignore potential issues and continue labouring under any misapprehensions they have acquired along the way. The authors believe that there is an urgent need for review, modernisation and simplification of intellectual property laws. Indeed, in the UK, intellectual property is currently being examined by a Treasury-level review led by Andrew Gowers. The Gowers Review is, at the time of writing, accepting submissions from interested parties and is due to report in the Autumn of 2006. Internationally, however, the situation is likely to remain difficult, so creators must grasp the nettle, educate themselves about copyright, and ensure that they understand the legal ramifications of collaboration, publication and reuse.

What Is Collaboration?

Wikipedia, a free online encyclopaedia created and maintained by unpaid volunteers, defines collaboration as “all processes wherein people work together—applying both to the work of individuals as well as larger collectives and societies” (Wikipedia, “Collaboration”). These varied practices are some of our most common and basic tendencies and apply in almost every sphere of human behaviour; working together with others might be described as an instinctive, pragmatic or social urge. We know we are collaborating when we work in teams with colleagues or brainstorm an idea with a friend, but there are many less familiar examples of collaboration, such as taking part in a Mexican wave or standing in a queue. In creative works, the law expects collaborators to obtain permission to reuse work created by others before they embark upon that reuse. Yet this distinction between ‘my’ work and ‘your’ work is entirely a legal and social construct, as opposed to an absolute fact of human nature, and new technologies are blurring the boundaries between what is ‘mine’ and what is ‘yours’, whilst new cultural movements posit a third position: ‘ours’.
Yochai Benkler coined the term ‘commons-based peer production’ (Benkler, Coase’s Penguin; The Wealth of Networks) to describe collaborative efforts, such as free and open-source software or projects such as Wikipedia itself, which are based on sharing information. Benkler posits this particular form of collaboration as an alternative model for economic development, in contrast to the ‘firm’ and the ‘market’. Benkler’s notion sits uncomfortably with the individualistic precepts of originality which dominate IP policy, but with examples of commons-based peer production on the increase, it cannot be ignored when considering how new technologies and ways of working interact with existing and future copyright legislation.

The Development of Collaboration

When we think of collaboration we frequently imagine academics working together on a research paper, or musicians jamming together to write a new song. In academia, researchers working on a project are expected to write papers for publication in journals on a regular basis. The motto ‘publish or die’ is well known to anyone who has worked in academic circles—publishing papers is the lifeblood of the academic career, forming the basis of a researcher’s status within the academic community and providing data and theses for other researchers to test and build upon. In these circumstances, copyright is often assigned by the authors to a journal and, because there is no direct commercial outcome for the authors, conflicts regarding copyright tend to be restricted to issues such as reuse and reproduction. Within the creative industries, however, the focus of the collaboration is to derive commercial benefit from the work, so copyright issues—such as division of fees and royalties, plagiarism, and rights for reuse—involve much more money and are hence more vigorously pursued. All of these issues are commonly discussed, documented and well understood.
Less well understood is the interaction between copyright and the types of collaboration that the Internet has facilitated over the last decade.

Copyright and Wikis

Ten years ago, Ward Cunningham invented the ‘wiki’—a Web page which can be edited in situ by anyone with a browser. A wiki allows multiple users to read and edit the same page and, in many cases, those users are either anonymous or identified only by a nickname. The most famous example of a wiki is Wikipedia, which was started by Jimmy Wales in 2001 and now has over a million articles and over 1.2 million registered users (Wikipedia, “Wikipedia Statistics”). The culture of online wiki collaboration is a gestalt—the whole is greater than the sum of the parts, and the collaborators see the overall success of the project as more important than their individual contributions to it. Most wiki software records every single edit to every page, creating a perfect audit trail of who changed which page and when. Because copyright is granted for the expression of an idea, in theory this comprehensive edit history would allow users to assert copyright over their contributions, but in practice it is not possible to delineate clearly between different people’s contributions and, even if it were possible, it would simply create a thicket of rights which could never be untangled. In most cases, wiki users do not wish to assert copyright and are not interested in financial gain, but when wikis are set up to provide a source of information for reuse, copyright licensing becomes an issue. In the UK, it is not possible to dedicate a piece of work to the public domain, nor can you waive your copyright in a work. When copyright holders wish to license their work, they can only assign that licence to another person or a legal entity such as a company.
This is because in the UK, the public domain is formed of the ‘leftovers’ of intellectual property—works for which copyright has expired or those aspects of creative works which do not qualify for protection. It cannot be formally added to, although it certainly can be reduced by, for example, extension of copyright term which removes work from the public domain by re-copyrighting previously unprotected material. So the question becomes, to whom does the content of a wiki belong? At this point traditional copyright doctrines are of little use. The concept of individuals owning their original contribution falls down when contributions become so entangled that it’s impossible to split one person’s work from another. In a corporate context, individuals have often signed an employment contract in which they assign copyright in all their work to their employer, so all material created individually or through collaboration is owned by the company. But in the public sphere, there is no employer, there is no single entity to own the copyright (the group of contributors not being in itself a legal entity), and therefore no single entity to give permission to those who wish to reuse the content. One possible answer would be if all contributors assigned their copyright to an individual, such as the owner of the wiki, who could then grant permission for reuse. But online communities are fluid, with people joining and leaving as the mood takes them, and concepts of ownership are not as straightforward as in the offline world. Instead, authors who wished to achieve the equivalent of assigning rights to the public domain would have to publish a free licence to ‘the world’ granting permission to do any act otherwise restricted by copyright in the work. Drafting such a licence so that it is legally binding is, however, beyond the skills of most and could be done effectively only by an expert in copyright. 
The majority of creative people, however, do not have the budget to hire a copyright lawyer, and pro bono resources are few and far between.

Copyright and Blogs

Blogs are a clearer-cut case. Blog posts are usually written by one person, even if the blog they are contributing to has multiple authors. Copyright therefore resides clearly with the author. Even if the blog has a copyright notice at the bottom—© A.N. Other Entity—unless there has been an explicit or implied agreement to transfer rights from the writer to the blog owner, copyright resides with the originator. Simply putting a copyright notice on a blog does not constitute such an agreement. Equally, copyright in blog comments resides with the commenter, not the site owner. This mirrors the position with personal letters—the copyright in a letter resides with the letter writer, not the recipient, and owning letters does not confer a right to publish them. Obviously, by clicking the ‘submit’ button, commenters have decided to publish, but it should be remembered that that action does not transfer copyright to the blog owner without specific agreement from the commenter.

Copyright and Musical Collaboration

Musical collaboration is generally accepted by legal systems, at least in terms of recording (duets, groups and orchestras) and writing (partnerships). The practice of sampling—taking a snippet of a recording for use in a new work—has, however, changed the nature of collaboration, shaking up the recording industry and causing a legal furore. Musicians have been borrowing directly from each other since time immemorial, and the student of classical music can point to many examples of composers ‘quoting’ each other’s melodies in their own work. Folk musicians too have been borrowing words and music from each other for centuries.
But sampling in its modern form goes back to the musique concrète movement of the 1940s, when musicians used portions of other recordings in their own new compositions. The practice developed through the 50s and 60s, with The Beatles’ “Revolution 9” (from The White Album) drawing heavily on samples of orchestral and other recordings, along with speech incorporated live from a radio playing in the studio at the time. Contemporary examples of sampling are too common to pick highlights, but Paul D. Miller, a.k.a. DJ Spooky ‘that Subliminal Kid’, has written an analysis of what he calls ‘Rhythm Science’, which examines the phenomenon. To begin with, sampling was ignored as it was rare and commercially insignificant. But once rap artists started to make significant amounts of money using samples, legal action was taken by originators claiming copyright infringement. Notable cases of illegal sampling were “Pump Up the Volume” by M/A/R/R/S in 1987 and Vanilla Ice’s use of Queen and David Bowie’s “Under Pressure” in the early 90s. Where once artists would use a sample and sort out the legal mess afterwards, such high-profile litigation has forced artists to secure permission for (or ‘clear’) their samples before use, and record companies will now refuse to release any song with uncleared samples. As software and technology progress, so sampling progresses along with them. Indeed, sampling has now spawned mash-ups, where two or more songs are combined to create a musical hybrid. Instead of using just a portion of a song in a new composition which may be predominantly original, mash-ups often use no original material and rely instead upon mixing together tracks creatively, often juxtaposing musical styles or lyrics in a humorous manner. One of the most illuminating examples of a mash-up is DJ Food’s Raiding the 20th Century, which itself gives a history of sampling and mash-ups using samples from over 160 sources, including other mash-ups.
Mash-ups are almost always illegal, and this illegality drives mash-up artists underground. Yet, despite the fact that good mash-ups can spread like wildfire on the Internet, bringing new interest to old and jaded tracks and, potentially, new income to artists whose work had been forgotten, this form of musical expression is aggressively demonised by the industry. Given the opportunity, the industry will instead prosecute for infringement. But clearing rights is a complex and expensive procedure well beyond the reach of the average mash-up artist. First, you must identify the owner of the sound recording, a task easier said than done. The name of the rights holder may not be included in the original recording’s packaging, and as rights regularly change hands when an artist’s contract expires or when a record label is sold, any indication as to the rights holder’s identity may be out of date. Online musical databases such as AllMusic can be of some use, but in the case of older or obscure recordings, it may not be possible to locate the rights holder at all. Works where there is no identifiable rights holder are called ‘orphaned works’, and the longer the term of copyright, the more works are orphaned. Once you know who the rights holder is, you can negotiate terms for your proposed usage. Standard fees are extremely high, especially in the US, and typically discourage use. This convoluted legal culture is an anachronism in desperate need of reform: sampling has produced some of the most culturally interesting and financially valuable recordings of the past thirty years, so it should be supported rather than marginalised. Unless the legal culture develops an acceptance of these practices, the associated financial and cultural benefits for society will not be realised. The irony is that there is already a successful model for simplifying licensing.
If a musician wishes to record a cover version of a song, then royalty terms are set by law and there is no need to seek permission. In this case, the lawmakers have recognised the social and cultural benefit of cover versions and created a workable solution to the permissions problem. There is no logical reason why a similar system could not be put in place for sampling.

Alternatives to Traditional Copyright

Copyright, in its default structure, is a disabling force. It says that you may not do anything with my work without my permission, and forces creators wishing to make a derivative work to contact me in order to obtain that permission in writing. This ‘permissions society’ has become the norm, but it is clear that it is not beneficial to society to hide away so much of our culture behind copyright, far beyond the reach of the individual creator. Fortunately there are fast-growing alternatives which simplify licensing whilst encouraging creativity. Creative Commons is a global movement started by academic lawyers in the US who set out to write a set of more flexible copyright licences for creative works. These licences enable creators to tailor precisely the restrictions imposed on subsequent users of their work, prompting the tag-line ‘some rights reserved’. Creators decide if they will allow redistribution, commercial or non-commercial re-use, or require attribution, and can combine these permissions in whichever way they see fit. They may also choose to authorise others to sample their works. Built upon the foundation of copyright law, Creative Commons licences now apply to some 53 million works world-wide (Doctorow), and operate in over 60 jurisdictions. Their success is testament to the fact that collaboration and sharing are a fundamental part of human nature, and treating cultural output as property to be locked away goes against the grain for many people.
Creative Commons are now also helping scientists to share not just the results of their research, but also data and samples so that others can easily replicate experiments and verify or refute results. They have thus created Science Commons in an attempt to free up data and resources from unnecessary private control. Scientists have been sharing their work via personal Web pages and other Websites for many years, and additional tools which allow them to benefit from network effects are to be welcomed. Another example of functioning alternative practices is the Remix Commons, a grassroots network spreading across the UK that facilitates artistic collaboration. Their Website is a forum for the exchange of cultural materials, providing a space for creators both to locate and to present work for possible remixing. Any artistic practice which can reasonably be rendered online is welcomed in their broad church. The network’s rapid expansion is in part attributable to its developers’ understanding of the need for tangible, practicable examples of a social movement, as embodied by their ‘free culture’ workshops.

Collaboration, Copyright and the Future

There has never been a better time to collaborate. The Internet is providing us with ways to work together that were unimaginable even just a decade ago, and high broadband penetration means that exchanging large amounts of data is not only feasible, but also getting easier and easier. It is possible now to work with other artists, writers and scientists around the world without ever physically meeting. To imagine that the Internet may one day simply contain the sum of human knowledge is to underestimate its potential. The Internet is not just a repository; it is a mechanism for new discoveries, for expanding our knowledge, and for making links between people that would previously have been impossible. Copyright law has, in general, failed to keep up with the amazing progress shown by technology and human ingenuity.
It is time that the lawmakers learnt how to collaborate with the collaborators in order to bring copyright up to date.

References

Apple. “Rip. Mix. Burn.” Advertisement. 28 Apr. 2006 <http://www.theapplecollection.com/Collection/AppleMovies/mov/concert_144a.html>.
Benkler, Yochai. Coase’s Penguin. Yale Law School, 1 Dec. 2002. 14 Apr. 2006 <http://www.benkler.org/CoasesPenguin.html>.
———. The Wealth of Networks. New Haven: Yale UP, 2006.
Bromberg & Sunstein LLP. Flowchart for Determining When US Copyrights in Fixed Works Expire. 14 Apr. 2006 <http://www.bromsun.com/practices/copyright-portfolio-development/flowchart.htm>.
DJ Food. Raiding the 20th Century. 14 Apr. 2006 <http://www.ubu.com/sound/dj_food.html>.
Doctorow, Cory. “Yahoo Finds 53 Million Creative Commons Licensed Works Online.” BoingBoing 5 Oct. 2005. 14 Apr. 2006 <http://www.boingboing.net/2005/10/05/yahoo_finds_53_milli.html>.
Miller, Paul D. Rhythm Science. Cambridge, Mass.: MIT Press, 2004.
Padfield, Tim. “Duration of Copyright.” The National Archives. 14 Apr. 2006 <http://www.kingston.ac.uk/library/copyright/documents/DurationofCopyrightFlowchartbyTimPadfieldofTheNationalArchives_002.pdf>.
Wikipedia. “Collaboration.” 14 Apr. 2006 <http://en.wikipedia.org/wiki/Collaboration>.
———. “Wikipedia Statistics.” 14 Apr. 2006 <http://en.wikipedia.org/wiki/Special:Statistics>.

Citation reference for this article

MLA Style: Charman, Suw, and Michael Holloway. "Copyright in a Collaborative Age." M/C Journal 9.2 (2006). <http://journal.media-culture.org.au/0605/02-charmanholloway.php>.
APA Style: Charman, S., and M. Holloway. (May 2006) "Copyright in a Collaborative Age," M/C Journal, 9(2). Retrieved from <http://journal.media-culture.org.au/0605/02-charmanholloway.php>.
APA, Harvard, Vancouver, ISO and other citation styles
20

Chapman, Owen. “Mixing with Records”. M/C Journal 4, No. 2 (1 Apr. 2001). http://dx.doi.org/10.5204/mcj.1900.

Full text of the source
Annotation:
Introduction

"Doesn't that wreck your records?" This is one of the first things I generally get asked when someone watches me at work in my home or while spinning at a party. It reminds me of a different but related question I once asked someone who worked at Rotate This!, a particularly popular Toronto DJ refuge, a few days after I had bought my first turntable: DJO: "How do you stop that popping and crackling sound your record gets when you scratch back and forth on the same spot for a while?" CLERK: "You buy two copies of everything, one you keep at home all wrapped-up nice and never use, and the other you mess with." My last $150 had just managed to pay for an old Dual direct drive record player. The precious few recently-released records I had were gifts. I nodded my head and made my way over to the rows of disks, which I flipped through to make it look like I was maybe going to buy something. LP cover after LP cover stared back at me, all with names I had absolutely never heard of before, organised according to a hyper-hybridised classification scheme that completely escaped my dictionary-honed alphabetic expectations. Worst of all, there seemed to be only single copies of everything left! A sort of outsider's vertigo washed over me, and 3 minutes after walking into unfamiliar territory, I zipped back out onto the street. Thus began my love/hate relationship with the source of all DJ sounds, surliness and misinformation--the independent record shop. My query had (without my planning) boldly pronounced my neophyte status. The response it solicited challenged my seriousness. How much was I willing to invest in order to ride "the wheels of steel"?

Sequence 1

Will Straw describes the meteoric rise to prominence of the CD format: If the compact disk has emerged as one of the most dazzlingly effective of commodity forms, this has little to do with its technical superiority to the vinyl record (which we no longer remember to notice).
Rather, the effectiveness has to do with its status as the perfect crossover consumer object. As a cutting-edge audiophile invention, it seduced the technophilic, connoisseurist males who typically buy new sound equipment and quickly build collections of recordings. At the same time, its visual refinement and high price rapidly rendered it legitimate as a gift. In this, the CD has found a wide audience among the population of casual record buyers.(61) Straw's point has to do with the fate of musical recordings within contemporary commodity culture. In the wake of a late 70's record industry slump, music labels turned their attention toward the recapturing of casual record sales (read: aging baby boomers). The general shape of this attempt revolved around a re-configuring of the record- shopping experience dedicated towards reducing "the intimidation seen as endemic to the environment of the record store."(59) The CD format, along with the development of super-sized, general interest (all-genre) record outlets has worked (according to Straw) to streamline record sales towards more-predictable patterns, all the while causing less "selection stress."(59) Re-issues and compilations, special-series trademarks, push-button listening stations, and maze-like display layouts, combined with department store-style service ("Can I help you find anything?") all work towards eliminating the need for familiarity with particular music "scenes" in order to make personally gratifying (and profit engendering) musical choices. Straw's analysis is exemplary in its dissatisfaction with treating the arena of personal musical choice as unaffected by any constraints apart from subjective matters of taste. Straw's evaluation also isolates the vinyl record as an object eminently ready (post-digital revolution) for subcultural appropriation. 
Its displacement by the CD as the dominant medium for collecting recorded music involved the recasting of the turntable as outdated and inferior, thereby relegating it to the dusty attic, basement or pawn shop (along with crates upon crates upon crates of records). These events set the stage for vinyl's spectacular rise from the ashes. The most prominent feature of this re-emergence has to do not simply with possession of the right kind of stuff (the cachet of having a music collection difficult for others to borrow aside), but with what vinyl and turntable technology can do.

Bridge

In Subculture: The Meaning of Style, Dick Hebdige claims that subcultures are "cultures of conspicuous consumption...and it is through the distinctive rituals of consumption, through style, that the subculture at once reveals its 'secret identity' and communicates its forbidden meanings. It is basically the way in which commodities are used in subculture which mark the subculture off from more orthodox cultural formations" (103). Hebdige borrows the notion of bricolage from Lévi-Strauss in order to describe the particular kind of use subcultures make of the commodities they appropriate. Relationships of identity, difference and order are developed from out of the minds of those who make use of the objects in question and are not necessarily determined by particular qualities inherent to the objects themselves. Hence a safety pin more often used for purposes like replacing missing buttons or temporarily joining pieces of fabric can become a punk fashion statement once placed through the nose, ear or torn Sex Pistols tee-shirt. In the case of DJ culture, it is the practice of mixing which most obviously presents itself as definitive of subcultural participation. The objects of conspicuous consumption in this case--record tracks. If mixing can be understood as bricolage, then attempts "to discern the hidden messages inscribed in code" (18) by such a practice are not in vain.
Granting mixing the power of meaning sets a formidable (semiotic) framework in place for investigating the practice's outwardly visible (spectacular) form and structure. Hebdige's description of bricolage as a particularly conspicuous and codified type of using, however, runs the risk of privileging an account of record collecting and mixing which interprets it entirely on the model of subjective expression.(1.) What is necessary is a means of access to the dialogue which takes place between a DJ and her records as such. The contents of a DJ's record bag (like Straw's CD shopping bag) are influenced by more than just her imagination, pocket book and exposure to different kinds of music. They are also determined in an important way by each other. Audio mixing is not one practice, it is many, and the choice to develop or use one sort of skill over another is intimately tied up with the type and nature of track one is working with.

Sequence 2

The raw practice of DJing relies heavily on a slider integral to DJ mixers known as the cross-fader. With the standard DJ set up, when the cross-fader is all the way to the left, the left turntable track plays through the system; vice versa when the fader is all the way to the right. In between is the "open" position, which allows both inputs to be heard simultaneously. The most straightforward mixing technique, "cutting," involves using this toggle to quickly switch from one source to another--resulting in the abrupt end of one sound-flow followed by its instantaneous replacement.
This technique can be used to achieve a variety of different effects--from the rather straightforward stringing together of the final beat of a four bar sequence from one track with a strong downbeat from something new in order to provide continuous, but sequential musical output, to the thoroughly difficult practice of "beat juggling," where short excerpts of otherwise self-contained tracks ("breaks") are isolated and then extended indefinitely through the use of two copies of the same record (while one record plays, the DJ spins the other back to the downbeat of the break in question, which is then released in rhythm). In both cases timing and rhythm are key. These features of the practice help to explain DJ predilections for tracks which make heavy, predictable use of their rhythm sections. "Blending" is a second technique which uses the open position on the cross-fader to mix two inputs into a live sonic collage. Tempo, rhythm and "density" of source material have an enormous impact on the end result. While any two tracks can be layered in this way, beats that are not synchronized are quick to create cacophony, and vocals also tend to clash dramatically. Melodic lines in general pose certain challenges here since these are in particular keys and have obvious starts and finishes. This is one reason why tracks produced specifically for DJing often have such long, minimal intros and exits. This makes it much easier to create "natural" sounding blends. Atmospheric sounds, low-frequency hums, speech samples and repetitive loops with indeterminate rhythm structures are often used for these segments in order to allow drawn-out, subtle transitions when moving between tracks. If an intro contains a fixed beat (as is the case often with genres constructed specifically for non-stop dancing like house, techno and to some extent drum and bass), then those who want seamless blends need to "beat match" if they want to maintain a dancer's groove. 
The roots of this technique go back to disco and demand fairly strict genre loyalty in order to ensure that a set's worth of tracks all hover around the same tempo, defined in beats-per-minute, or BPMs. The basic procedure involves finding the downbeat of the track one wishes to mix through a set of headphones, releasing that beat in time with the other record while making fine tempo-adjustments via the turntable's pitch control to the point where the track coming through the earphones and the track being played over the system are in synch. The next step is "back-spinning" or "needle dropping" to the start of the track to be mixed, then releasing it again, this time with the cross-fader open. Volume levels can then be adjusted in order to allow the new track to slowly take prominence (the initial track being close to its end at this point) before the cross-fader is closed into the new position and the entire procedure is repeated. Scratching is perhaps the most notorious mixing technique and involves the greatest variety of manipulations. The practice is most highly developed in hip hop (and related genres like drum and bass) and is used both as an advanced cutting technique for moving between tracks and as a sonic end-in-itself. Its genesis is attributed to a South Bronx DJ known as Grand Wizard Theodore, who was the first (1977) to try to make creative use of the sound associated with moving a record needle back and forth over the same drumbeat, a phenomenon familiar to DJs used to cueing-up downbeats through headphones. This trick is now referred to as the "baby scratch," and it, along with an ever-increasing host of mutations and hybrids, makes up the skills that pay the bills for hip hop DJs. In the case of many of these techniques, the cross-fader is once again used heavily in order to remove unwanted elements of particular scratches from the mix, as well as adding certain staccato and volume-fading effects.
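The beat-matching and cross-fading procedures described above come down to simple arithmetic: the pitch control scales playback speed (and hence tempo) by a percentage, and the cross-fader weights the two incoming signals. The sketch below is illustrative only and not drawn from the essay; the function names are invented for the purpose.

```python
def pitch_adjustment(source_bpm: float, target_bpm: float) -> float:
    """Percent change on the pitch slider needed so a record at
    source_bpm plays back at target_bpm."""
    return (target_bpm / source_bpm - 1.0) * 100.0

def crossfade(a: list[float], b: list[float], position: float) -> list[float]:
    """Linear cross-fader over two equal-length sample buffers:
    position 0.0 = all 'a', 1.0 = all 'b', 0.5 = the 'open' position."""
    return [(1.0 - position) * x + position * y for x, y in zip(a, b)]

# A record at 120 BPM must be sped up by 2.5% to match a track at 123 BPM:
print(pitch_adjustment(120.0, 123.0))  # 2.5
```

Real turntable pitch controls typically offer only around ±8% of adjustment, which is one practical reason why beat matching demands tracks of broadly similar tempo in the first place.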
Isolated, "pure" sounds are easiest to scratch with and are therefore highly sought after by this sort of DJ--a pastime affectionately referred to as "digging in the crates." Sources of such sounds are extremely diverse, but inevitably revolve around genres which use minimal orchestration (like movie soundtracks), accentuated rhythms with frequent breakdowns (like funk or jazz), or which eschew musical form altogether (like sound-effects, comedy and children's records).

Exit

To answer the question which started this investigation: in the end, how wrecked my records get depends a lot on what I'm using them for. To be sure, super-fast scratching patterns and tricks that use lots of back-spinning, like beat-juggling, will eventually "burn" static into spots on one's records. But with used records costing as little as $1 for three, and battle records (2.) widely available, the effect of this feature of the technology on the actual pursuit of the practice is negligible. And most techniques don't noticeably burn records at all, especially if a DJ's touch is light enough to allow for minimal tone-arm weight (a parameter which controls a turntable's groove-tracking ability). This is the kind of knowledge which comes from interaction with objects. It is also the source of a great part of the subcultural bricoleur's stylistic savvy. Herein lies the essence of the intimidating power of the indie record shop--its display of intimate, physical familiarity with the hidden particularities of the new vinyl experience. Investigators confronted with such familiarity need to find ways to go beyond analyses which stop at the level of acknowledgment of the visible logic displayed by spectacular subcultural practices if they wish to develop nuanced accounts of subcultural life. Such plumbing of the depths often requires listening in the place of observing--whether to first-hand accounts collected through ethnography or to the subtle voice of the objects themselves. (1.)
An example of such an account: "DJ-ing is evangelism; a desire to share songs. A key skill is obviously not just to drop the popular, well-known songs at the right part of the night, but to pick the right new releases, track down the obscurer tunes and newest imports, get hold of next month's big tune this month; you gather this pile, this tinder, together, then you work the records, mix them, drop them, cut them, scratch them, melt them, beat them all together until they unite. Voilà; disco inferno." Dave Haslam, "DJ Culture," p. 169. (2.) Records specifically designed by and for scratch DJs, consisting of long strings of scratchable sounds.
References
Haslam, Dave. "DJ Culture." The Clubcultures Reader. Oxford: Blackwell Publishers, 1997.
Hebdige, Dick. Subculture: The Meaning of Style. London: Methuen & Co., 1979.
Straw, Will. "Organized Disorder: The Changing Space of the Record Shop." The Clubcultures Reader. Oxford: Blackwell Publishers, 1997.
APA, Harvard, Vancouver, ISO, and other citation styles
21

Mathur, Suchitra. "From British 'Pride' to Indian 'Bride'". M/C Journal 10, No. 2 (1 May 2007). http://dx.doi.org/10.5204/mcj.2631.

The full content of the source
Annotation:
The release in 2004 of Gurinder Chadha’s Bride and Prejudice marked yet another contribution to celluloid’s Austen mania that began in the 1990s and is still going strong. Released almost simultaneously on three different continents (in the UK, US, and India), and in two different languages (English and Hindi), Bride and Prejudice, however, is definitely not another Anglo-American period costume drama. Described by one reviewer as “East meets West”, Chadha’s film “marries a characteristically English saga [Austen’s Pride and Prejudice] with classic Bollywood format”, “transforming corsets to saris, … the Bennetts to the Bakshis and … pianos to bhangra beats” (Adarsh). Bride and Prejudice, thus, clearly belongs to the emerging genre of South Asian cross-over cinema in its diasporic incarnation. Such cross-over cinema self-consciously acts as a bridge between at least two distinct cinematic traditions—Hollywood and Bollywood (Indian Hindi cinema). By taking Austen’s Pride and Prejudice as her source text, Chadha has added another dimension to the intertextuality of such cross-over cinema, creating a complex hybrid that does not fit neatly into binary hyphenated categories such as “Asian-American cinema” that film critics such as Mandal invoke to characterise diaspora productions. An embodiment of contemporary globalised (post?)coloniality in its narrative scope, embracing not just Amritsar and LA, but also Goa and London, Bride and Prejudice refuses to fit into a neat East versus West cross-cultural model. How, then, are we to classify this film? Is this problem of identity indicative of postmodern indeterminacy of meaning, or can the film be seen to occupy a “third” space, to act as a postcolonial hybrid that successfully undermines (neo)colonial hegemony (Sangari, 1-2)? To answer this question, I will examine Bride and Prejudice as a mimic text, focusing specifically on its complex relationship with Bollywood conventions.
According to Gurinder Chadha, Bride and Prejudice is a “complete Hindi movie” in which she has paid “homage to Hindi cinema” through “deliberate references to the cinema of Manoj Kumar, Raj Kapoor, Yash Chopra and Karan Johar” (Jha). This list of film makers is associated with a specific Bollywood sub-genre: the patriotic family romance. Combining aspects of two popular Bollywood genres, the “social” (Prasad, 83) and the “romance” (Virdi, 178), this sub-genre enacts the story of young lovers caught within complex familial politics against the backdrop of a nationalist celebration of Indian identity. Using a cinematic language that is characterised by the spectacular in both its aural and visual aspects, the patriotic family romance follows a typical “masala” narrative pattern that brings together “a little action and some romance with a touch of comedy, drama, tragedy, music, and dance” (Jaikumar). Bride and Prejudice’s successful mimicry of this language and narrative pattern is evident in film reviews consistently pointing to its being very “Bollywoodish”: “the songs and some sequences look straight out of a Hindi film” says one reviewer (Adarsh), while another wonders “why this talented director has reduced Jane Austen’s creation to a Bollywood masala film” (Bhaskaran). Setting aside, for the moment, these reviewers’ condemnation of such Bollywood associations, it is worthwhile to explore the implications of yoking together a canonical British text with Indian popular culture. According to Chadha, this combination is made possible since “the themes of Jane Austen’s novels are a ‘perfect fit’ for a Bollywood style film” (Wray). Ostensibly, such a comment may be seen to reinforce the authority of the colonial canonical text by affirming its transnational/transhistorical relevance. 
From this perspective, the Bollywood adaptation not only becomes a “native” tribute to the colonial “master” text, but also, implicitly, marks the necessary belatedness of Bollywood as a “native” cultural formation that can only mimic the “English book”. Again, Chadha herself seems to subscribe to this view: “I chose Pride and Prejudice because I feel 200 years ago, England was no different than Amritsar today” (Jha). The ease with which the basic plot premise of Pride and Prejudice—a mother with grown-up daughters obsessed with their marriage—transfers to a contemporary Indian setting does seem to substantiate this idea of belatedness. The spatio-temporal contours of the narrative require changes to accommodate the transference from eighteenth-century English countryside to twenty-first-century India, but in terms of themes, character types, and even plot elements, Bride and Prejudice is able to “mimic” its master text faithfully. While the Bennets, Bingleys and Darcy negotiate the relationship between marriage, money and social status in an England transformed by the rise of industrial capitalism, the Bakshis, Balraj and, yes, Will Darcy, undertake the same tasks in an India transformed by corporate globalisation. Differences in class are here overlaid with those in culture as a middle-class Indian family interacts with wealthy non-resident British Indians and American owners of multinational enterprises, mingling the problems created by pride in social status with prejudices rooted in cultural insularity. However, the underlying conflicts between social and individual identity, between relationships based on material expediency and romantic love, remain the same, clearly indicating India’s belated transition from tradition to modernity. It is not surprising, then, that Chadha can claim that “the transposition [of Austen to India] did not offend the purists in England at all” (Jha). 
But if the purity of the “master” text is not contaminated by such native mimicry, then how does one explain the Indian anglophile rejection of Bride and Prejudice? The problem, according to the Indian reviewers, lies not in the idea of an Indian adaptation, but in the choice of genre, in the devaluation of the “master” text’s cultural currency by associating it with the populist “masala” formula of Bollywood. The patriotic family romance, characterised by spectacular melodrama with little heed paid to psychological complexity, is certainly a far cry from the restrained Austenian narrative that achieves its dramatic effect exclusively through verbal sparring and epistolary revelations. When Elizabeth and Darcy’s quiet walk through Pemberley becomes Lalita and Darcy singing and dancing through public fountains, and the private economic transaction that rescues Lydia from infamy is translated into fisticuffs between Darcy and Wickham in front of an applauding cinema audience, mimicry does smack too much of mockery to be taken as a tribute. It is no wonder, then, that “the news that [Chadha] was making Bride and Prejudice was welcomed with broad grins by everyone [in Britain] because it’s such a cheeky thing to do” (Jha). This cheekiness is evident throughout the film, which provides a splendid over-the-top cinematic translation of Pride and Prejudice that deliberately undermines the seriousness accorded to the Austen text, not just by the literary establishment, but also by cinematic counterparts that attempt to preserve its cultural value through carefully constructed period pieces. Chadha’s Bride and Prejudice, on the other hand, marries British high culture to Indian popular culture, creating a mimic text that is, in Homi Bhabha’s terms, “almost the same, but not quite” (86), thus undermining the authority, the primacy, of the so-called “master” text. This postcolonial subversion is enacted in Chadha’s film at the level of both style and content.
If the adaptation of fiction into film is seen as an activity of translation, of a semiotic shift from one language to another (Boyum, 21), then Bride and Prejudice can be seen to enact this translation at two levels: the obvious translation of the language of novel into the language of film, and the more complex translation of Western high culture idiom into the idiom of Indian popular culture. The very choice of target language in the latter case clearly indicates that “authenticity” is not the intended goal here. Instead of attempting to render the target language transparent, making it a non-intrusive medium that derives all its meaning from the source text, Bride and Prejudice foregrounds the conventions of Bollywood masala films, forcing its audience to grapple with this “new” language on its own terms. The film thus becomes a classic instance of the colony “talking back” to the metropolis, of Caliban speaking to Prospero, not in the language Prospero has taught him, but in his own native tongue. The burden of responsibility is shifted; it is Prospero/audiences in the West that have the responsibility to understand the language of Bollywood without dismissing it as gibberish or attempting to domesticate it, to reduce it to the familiar. The presence in Bride and Prejudice of song and dance sequences, for example, does not make it a Hollywood musical, just as the focus on couples in love does not make it a Hollywood-style romantic comedy. Neither The Sound of Music (Robert Wise, 1965) nor You’ve Got Mail (Nora Ephron, 1998) corresponds to the Bollywood patriotic family romance that combines various elements from distinct Hollywood genres into one coherent narrative pattern. Instead, it is Bollywood hits like Dilwale Dulhaniya Le Jayenge (Aditya Chopra, 1995) and Pardes (Subhash Ghai, 1997) that constitute the cinema tradition to which Bride and Prejudice belongs, and against which backdrop it needs to be seen. 
This is made clear in the film itself where the climactic fight between Darcy and Wickham is shot against a screening of Manoj Kumar’s Purab Aur Paschim (East and West) (1970), establishing Darcy, unequivocally, as the Bollywood hero, the rescuer of the damsel in distress, who deserves, and gets, the audience’s full support, denoted by enthusiastic applause. Through such intertextuality, Bride and Prejudice enacts a postcolonial reversal whereby the usual hierarchy governing the relationship between the colony and the metropolis is inverted. By privileging through style and explicit reference the Indian Bollywood framework in Bride and Prejudice, Chadha implicitly minimises the importance of Austen’s text, reducing it to just one among several intertextual invocations without any claim to primacy. It is, in fact, perfectly possible to view Bride and Prejudice without any knowledge of Austen; its characters and narrative pattern are fully comprehensible within a well-established Bollywood tradition that is certainly more familiar to a larger number of Indians than is Austen. An Indian audience, thus, enjoys a home court advantage with this film, not the least of which is the presence of Aishwarya Rai, the Bollywood superstar who is undoubtedly the central focus of Chadha’s film. But star power apart, the film consolidates the Indian advantage through careful re-visioning of specific plot elements of Austen’s text in ways that clearly reverse the colonial power dynamics between Britain and India. The re-casting of Bingley as the British Indian Balraj re-presents Britain in terms of its immigrant identity. White British identity, on the other hand, is reduced to a single character—Johnny Wickham—which associates it with a callous duplicity and devious exploitation that provide the only instance in this film of Bollywood-style villainy. 
This re-visioning of British identity is evident even at the level of the film’s visuals where England is identified first by a panning shot that covers everything from Big Ben to a mosque, and later by a snapshot of Buckingham Palace through a window: a combination of its present multicultural reality juxtaposed against its continued self-representation in terms of an imperial tradition embodied by the monarchy. This reductionist re-visioning of white Britain’s imperial identity is foregrounded in the film by the re-casting of Darcy as an American entrepreneur, which effectively shifts the narratorial focus from Britain to the US. Clearly, with respect to India, it is now the US which is the imperial power, with London being nothing more than a stop-over on the way from Amritsar to LA. This shift, however, does not in itself challenge the more fundamental West-East power hierarchy; it merely indicates a shift of the imperial centre without any perceptible change in the contours of colonial discourse. The continuing operation of the latter is evident in the American Darcy’s stereotypical and dismissive attitude towards Indian culture as he makes snide comments about arranged marriages and describes Bhangra as an “easy dance” that looks like “screwing in a light bulb with one hand and patting a dog with the other.” Within the film, this cultural snobbery of the West is effectively challenged by Lalita, the Indian Elizabeth, whose “liveliness of mind” is exhibited here chiefly through her cutting comebacks to such disparaging remarks, making her the film’s chief spokesperson for India. When Darcy’s mother, for example, dismisses the need to go to India since yoga and Deepak Chopra are now available in the US, Lalita asks her whether going to Italy has become redundant because Pizza Hut has opened around the corner.
Similarly, she undermines Darcy’s stereotyping of India as the backward Other where arranged marriages are still the norm, by pointing out the eerie similarity between so-called arranged marriages in India and the attempts of Darcy’s own mother to find a wife for him. Lalita’s strategy, thus, is not to invert the hierarchy by proving the superiority of the East over the West; instead, she blurs the distinction between the two, while simultaneously introducing the West (as represented by Darcy and his mother) to the “real India”. The latter is achieved not only through direct conversational confrontations with Darcy, but also indirectly through her own behaviour and deportment. Through her easy camaraderie with local Goan kids, whom she joins in an impromptu game of cricket, and her free-spirited guitar-playing with a group of backpacking tourists, Lalita clearly shows Darcy (and the audience in the West) that so-called “Hicksville, India” is no different from the so-called cosmopolitan sophistication of LA. Lalita is definitely not the stereotypical shy retiring Indian woman; this jean-clad, tractor-riding gal is as comfortable dancing the garba at an Indian wedding as she is sipping margaritas in an LA restaurant. Interestingly, this East-West union in Aishwarya Rai’s portrayal of Lalita as a modern Indian woman de-stabilises the stereotypes generated not only by colonial discourse but also by Bollywood’s brand of conservative nationalism. As Chadha astutely points out, “Bride and Prejudice is not a Hindi film in the true sense. That rikshawallah in the front row in Patna is going to say, ‘Yeh kya hua? Aishwarya ko kya kiya?’ [What happened? What did you do to Aishwarya?]” (Jha). This disgruntlement of the average Indian Hindi-film audience, which resulted in the film being a commercial flop in India, is a result of Chadha’s departures from the conventions of her chosen Bollywood genre at both the cinematic and the thematic levels.
The perceived problem with Aishwarya Rai, as articulated by the plaintive question of the imagined Indian viewer, is precisely her presentation as a modern (read Westernised) Indian heroine, which is pretty much an oxymoron within Bollywood conventions. In all her mainstream Hindi films, Aishwarya Rai has conformed to these conventions, playing the demure, sari-clad, conventional Indian heroine who is untouched by any “anti-national” western influence in dress, behaviour or ideas (Gangoli,158). Her transformation in Chadha’s film challenges this conventional notion of a “pure” Indian identity that informs the Bollywood “masala” film. Such re-visioning of Bollywood’s thematic conventions is paralleled, in Bride and Prejudice, with a playfully subversive mimicry of its cinematic conventions. This is most obvious in the song-and-dance sequences in the film. While their inclusion places the film within the Bollywood tradition, their actual picturisation creates an audio-visual pastiche that freely mingles Bollywood conventions with those of Hollywood musicals as well as contemporary music videos from both sides of the globe. A song, for example, that begins conventionally enough (in Bollywood terms) with three friends singing about one of them getting married and moving away, soon transforms into a parody of Hollywood musicals as random individuals from the marketplace join in, not just as chorus, but as developers of the main theme, almost reducing the three friends to a chorus. And while the camera alternates between mid and long shots in conventional Bollywood fashion, the frame violates the conventions of stylised choreography by including a chaotic spill-over that self-consciously creates a postmodern montage very different from the controlled spectacle created by conventional Bollywood song sequences. Bride and Prejudice, thus, has an “almost the same, but not quite” relationship not just with Austen’s text but also with Bollywood. 
Such dual-edged mimicry, which foregrounds Chadha’s “outsider” status with respect to both traditions, eschews all notions of “authenticity” and thus seems to become a perfect embodiment of postcolonial hybridity. Does this mean that postmodern pastiche can fulfill the political agenda of postcolonial resistance to the forces of globalised (neo)imperialism? As discussed above, Bride and Prejudice does provide a postcolonial critique of (neo)colonial discourse through the character of Lalita, while at the same time escaping the trap of Bollywood’s explicitly articulated brand of nationalism by foregrounding Lalita’s (Westernised) modernity. And yet, ironically, the film unselfconsciously remains faithful to contemporary Bollywood’s implicit ideological framework. As most analyses of Bollywood blockbusters in the post-liberalisation (post-1990) era have pointed out, the contemporary patriotic family romance is distinct from its earlier counterparts in its unquestioning embrace of neo-conservative consumerist ideology (Deshpande, 187; Virdi, 203). This enthusiastic celebration of globalisation in its most recent neo-imperial avatar is, interestingly, not seen to conflict with Bollywood’s explicit nationalist agenda; the two are reconciled through a discourse of cultural nationalism that happily co-exists with a globalisation-sponsored rampant consumerism, while studiously ignoring the latter’s neo-colonial implications. Bride and Prejudice, while self-consciously redefining certain elements of this cultural nationalism and, in the process, providing a token recognition of neo-imperial configurations, does not fundamentally question this implicit neo-conservative consumerism of the Bollywood patriotic family romance. 
This is most obvious in the film’s gender politics where it blindly mimics Bollywood conventions in embodying the nation as a woman (Lalita) who, however independent she may appear, not only requires male protection (Darcy is needed to physically rescue Lakhi from Wickham) but also remains an object of exchange between competing systems of capitalist patriarchy (Uberoi, 207). At the film’s climax, Lalita walks away from her family towards Darcy. But before Darcy embraces the very willing Lalita, his eyes seek out and receive permission from Mr Bakshi. Patriarchal authority is thus granted due recognition, and Lalita’s seemingly bold “independent” decision remains caught within the politics of patriarchal exchange. This particular configuration of gender politics is very much a part of Bollywood’s neo-conservative consumerist ideology wherein the Indian woman/nation is given enough agency to make choices, to act as a “voluntary” consumer, within a globalised marketplace that is, however, controlled by the interests of capitalist patriarchy. The narrative of Bride and Prejudice perfectly aligns this framework with Lalita’s project of cultural nationalism, which functions purely at the personal/familial level, but which is framed at both ends of the film by a visual conjoining of marriage and the marketplace, both of which are ultimately outside Lalita’s control. Chadha’s attempt to appropriate and transform British “Pride” through subversive postcolonial mimicry, thus, ultimately results only in replacing it with an Indian “Bride,” with a “star” product (Aishwarya Rai / Bride and Prejudice / India as Bollywood) in a splendid package, ready for exchange and consumption within the global marketplace. All glittering surface and little substance, Bride and Prejudice proves, once again, that postmodern pastiche cannot automatically double as politically enabling postcolonial hybridity (Sangari, 23-4). References Adarsh, Taran. “Balle Balle! 
From Amritsar to L.A.” IndiaFM Movie Review 8 Oct. 2004. 19 Feb. 2007 <http://indiafm.com/movies/review/7211/index.html>.
Austen, Jane. Pride and Prejudice. 1813. New Delhi: Rupa and Co., 1999.
Bhabha, Homi. “Of Mimicry and Man: The Ambivalence of Colonial Discourse.” The Location of Culture. New York: Routledge, 1994. 85-92.
Bhaskaran, Gautam. “Classic Made Trivial.” The Hindu 15 Oct. 2004. 19 Feb. 2007 <http://www.hinduonnet.com/thehindu/fr/2004/10/15/stories/2004101502220100.htm>.
Boyum, Joy Gould. Double Exposure: Fiction into Film. Calcutta: Seagull Books, 1989.
Bride and Prejudice. Dir. Gurinder Chadha. Perf. Aishwarya Rai and Martin Henderson. Miramax, 2004.
Deshpande, Sudhanva. “The Consumable Hero of Globalized India.” Bollyworld: Popular Indian Cinema through a Transnational Lens. Eds. Raminder Kaur and Ajay J. Sinha. New Delhi: Sage, 2005. 186-203.
Gangoli, Geetanjali. “Sexuality, Sensuality and Belonging: Representations of the ‘Anglo-Indian’ and the ‘Western’ Woman in Hindi Cinema.” Bollyworld: Popular Indian Cinema through a Transnational Lens. Eds. Raminder Kaur and Ajay J. Sinha. New Delhi: Sage, 2005. 143-162.
Jaikumar, Priya. “Bollywood Spectaculars.” World Literature Today 77.3/4 (2003): n. pag.
Jha, Subhash K. “Bride and Prejudice is not a K3G.” The Rediff Interview 30 Aug. 2004. 19 Feb. 2007 <http://in.rediff.com/movies/2004/aug/30finter.htm>.
Mandal, Somdatta. Film and Fiction: Word into Image. New Delhi: Rawat Publications, 2005.
Prasad, M. Madhava. Ideology of the Hindi Film: A Historical Construction. New Delhi: Oxford UP, 1998.
Sangari, Kumkum. Politics of the Possible: Essays on Gender, History, Narratives, Colonial English. New Delhi: Tulika, 1999.
Uberoi, Patricia. Freedom and Destiny: Gender, Family, and Popular Culture in India. New Delhi: Oxford UP, 2006.
Virdi, Jyotika. The Cinematic Imagination: Indian Popular Films as Social History. Delhi: Permanent Black, 2003.
Wray, James. “Gurinder Chadha Talks Bride and Prejudice.” Movie News 7 Feb. 2005. 19 Feb. 2007 <http://movies.monstersandcritics.com/news/article_4163.php/Gurinder_Chadha_Talks_Bride_and_Prejudice>.
Citation reference for this article
MLA Style
Mathur, Suchitra. “From British ‘Pride’ to Indian ‘Bride’: Mapping the Contours of a Globalised (Post?)Colonialism.” M/C Journal 10.2 (2007). <http://journal.media-culture.org.au/0705/06-mathur.php>.
APA Style
Mathur, S. (2007, May). “From British ‘Pride’ to Indian ‘Bride’: Mapping the Contours of a Globalised (Post?)Colonialism.” M/C Journal, 10(2). Retrieved from <http://journal.media-culture.org.au/0705/06-mathur.php>.
APA, Harvard, Vancouver, ISO, and other citation styles
22

Blakey, Heather. "Designing Player Intent through 'Playful' Interaction". M/C Journal 24, No. 4 (12 Aug. 2021). http://dx.doi.org/10.5204/mcj.2802.

The full content of the source
Annotation:
The contemporary video game market is as recognisable for its brands as it is for the characters that populate their game worlds, from franchise-leading characters like Garrus Vakarian (Mass Effect original trilogy), Princess Zelda (The Legend of Zelda franchise) and Cortana (HALO franchise) to more recent game icons like Miles Morales (Marvel's Spider-Man game franchise) and Judy Alvarez (Cyberpunk 2077). Interactions with these casts of characters enhance the richness of games and their playable worlds, giving a sense of weight and meaning to player actions, emphasising thematic interests, and in some cases acting as buffers to (or indeed hindrances of) different aspects of gameplay itself. As Jordan Erica Webber writes in her essay The Road to Journey, “videogames are often examined through the lens of what you do and what you feel” (14). For many games, the design of interactions between the player and other beings in the world—whether they be intrinsic to the world (non-playable characters or NPCs) or other live players—is a bridging aspect between what you do and how you feel, and is thus central to the communication of more cohesive and focussed work. This essay will discuss two examples of game design techniques present in Transistor by Supergiant Games and Journey by thatgamecompany. It will consider how the design of “playful” interactions between the player and other characters in the game world (both non-player characters and other player characters) can be used as a tool to align a player’s experience of “intent” with the thematic objectives of the designer. These games have been selected as both utilise design techniques that allow for this “playful” interaction (observed in this essay as interactions that do not contribute to “progression” in the traditional sense).
By looking closely at specific aspects of game design, it aims to develop an accessible examination by “focusing on the dimensions of involvement the specific game or genre of games affords” (Calleja, 222). The discussion defines “intent”, in the context of game design, through a synthesis of definitions from two works by game designers. The first being Greg Costikyan’s definition of game structure from his 2002 presentation I Have No Words and I Must Design, a paper subsequently referenced by numerous prominent game scholars including Ian Bogost and Jesper Juul. The second is Steven Swink’s definition of intent in relation to video games, from his 2009 book Game Feel: A Game Designer’s Guide to Virtual Sensation—an extensive reference text of game design concepts, with a particular focus on the concept of “game feel” (the meta-sensation of involvement with a game). This exploratory essay suggests that examining these small but impactful design techniques, through the lens of their contribution to overall intent, is a useful tool for undertaking more holistic studies of how games are affective. I align with the argument that understanding “playfulness” in game design is useful in understanding user engagement with other digital communication platforms. In particular, platforms where the presentation of user identity is relational or performative to others—a case explored in Playful Identities: The Ludification of Digital Media Cultures (Frissen et al.). Intent in Game Design Intent, in game design, is generated by a complex, interacting economy, ecosystem, or “game structure” (Costikyan 21) of thematic ideas and gameplay functions that do not dictate outcomes, but rather guide behaviour and progression forward through the need to achieve a goal (Costikyan 21). Intent brings player goals in line with the intrinsic goals of the player character, and the thematic or experiential goals the game designer wants to convey through the act of play. 
Intent makes it easier to invest in the game’s narrative and spatial context—its role is to “motivate action in game worlds” (Swink 67). Steven Swink writes that it is the role of game design to create compelling intent from “a seemingly arbitrary collection of abstracted variables” (Swink 67). He continues that whether it is good or bad is a broader question, but that “most games do have in-born intentionality, and it is the game designer who creates it” (67). This echoes Costikyan’s point: game designers “must consciously set out to decide what kind of experiences [they] want to impart to players and create systems that enable those experiences” (20). Swink uses Mario 64 as one simple example of intent creation through design—if collecting 100 coins did not restore Mario’s health, players would simply not collect them. Not having health restricts the ability for players to fulfil the overarching intent of progression by defeating the game’s main villain (what he calls the “explicit” intent), and collecting coins also provides a degree of interactivity that makes the exploration itself feel more fulfilling (the “implicit” intent). This motivation for action may be functional, or it may be more experiential—how a designer shapes variables into particular forms to encourage the particular kinds of experience that they want a player to have during the act of play (such as in Journey, explored in the latter part of this essay). This essay is interested in the design of this compelling thematic intent—and the role “playful” interactions have as a variable that contributes to aligning player behaviours and experience to the thematic or experiential goals of game design. “Playful” Communication and Storytelling in Transistor Transistor is the second release from independent studio Supergiant Games and has received over 100 industry accolades (Kasavin) since its publication in 2014. 
Transistor incorporates the suspense of turn-based gameplay into an action role-playing game—neatly matching its style of gameplay to the suspense of its cyber noir narrative. The game is also distinctly “artful”. The city of Cloudbank, where the game takes place, is a cyberpunk landscape richly inspired by art nouveau and art deco style. There is some indication that Cloudbank may not be a real city at all—but rather a virtual city, with an abundance of computer-related motifs and player combat abilities named as if they were programming functions. At release, Transistor was broadly recognised in the industry press for its strength in “combining its visuals and music to powerfully convey narrative information and tone” (Petit). If intent in games in part stems from a unification of goals between the player and design, the interactivity between player input and the actions of the player character furthers this sense of “togetherness”. This articulation and unity of hand movement and visual response in games are what Kirkpatrick identified in his 2011 work Aesthetic Theory and the Video Game as the point at which videogames “broke from the visual entertainment culture of the last two centuries” (Kirkpatrick 88). The player character mediates access to the space by which all other game information is given context and allows the player a degree of self-expression that is unique to games. Swink describes it as an amplified impression of virtual proprioception, that is, “an impression of space created by illusory means but is experienced as real by the senses … the effects of motion, sound, visuals, and responsive effects combine” (Swink 28).
If we extend Swink’s point about creating an “impression of space” to also include an “impression of purpose”, we can utilise this observation to further understand how the design of the playful interactions in Transistor works to develop and align the player’s experience of intent with the overarching narrative goal (or, “explicit” intent) of the game—to tell a compelling “science-fiction love story in a cyberpunk setting, without the gritty backdrop” (Wallace) through the medium of gameplay. At the centre of any “love story” is the dynamic of a relationship, and in Transistor playful interaction is a means for conveying the significance and complexity of those dynamics in relation to the central characters. Transistor’s exposition asks players to figure out what happened to Red and her partner, The Boxer (a name he is identified by in the game files), while progressing through various battles with an entity called The Process to uncover more information. Transistor commences with the player-character, Red, standing next to the body of The Boxer, whose consciousness and voice have been uploaded into the same device that impaled him: the story’s eponymous Transistor. The event that resulted in this strange circumstance has also caused Red to lose her ability to speak, though she is still able to hum. The first action that the player must complete to progress the game is to pull the Transistor from The Boxer’s body. From this point The Boxer, speaking through the Transistor, becomes the sole narrator of the game. The Boxer’s first lines of dialogue are responsive to player action, and position Red’s character in the world:

‘Together again. Heh, sort of …’

[Upon walking towards an exit a unit of The Process will appear]

‘Yikes … found us already. They want you back I bet. Well so do I.’

[Upon defeating The Process]

‘Unmarked alley, east of the bay. I think I know where we are.’ (Supergiant Games)

This brief exchange and feedback to player movement, in medias res, limits the player’s possible points of attention and establishes The Boxer’s voice and “character” as the reference point for interacting with the game world. Actions, the surrounding world, and gameplay objectives are given meaning and context by being part of a system of intent derived from the significance of his character to the player character (Red) as both a companion and information-giver. The player may not necessarily feel what an individual in Red’s position would feel, but their expository position is aligned with Red’s narrative, and their scope of interaction with the world is intrinsically tied to the “explicit” intent of finding out what happened to The Boxer. Transistor continues to establish a loop between Red’s exploration of the world and the dialogue and narration of The Boxer. In the context of gameplay, player movement functions as the other half of a conversation and brings the player’s control of Red closer to how Red herself (who cannot communicate vocally) might converse with The Boxer gesturally. The Boxer’s conversational narration is scripted to occur as Red moves through specific parts of the world and achieves certain objectives. Significantly, The Boxer will also speak to Red in response to specific behaviours that only occur should the player choose to do them and that don’t necessarily contribute to “progressing” the game in the mechanical sense. There are multiple points where this is possible, but I will draw on two examples to demonstrate. Firstly, The Boxer will have specific reactions to a player who stands idle for too long, or who performs a repetitive action.
Jumping repeatedly from platform to platform will trigger several variations of playful and exasperated dialogue from The Boxer (who has, at this point, no choice but to be carried around by Red):

[Upon repeatedly jumping between the same platform]

‘Round and round.’

‘Okay that’s enough.’

‘I hate you.’ (Supergiant Games)

The second is when Red “hums” (an activity initiated by the player by holding down R1 on a PlayStation console). At certain points of play, when making Red hum, The Boxer will chime in and sing the lyrics to the song she is humming. This musical harmonisation helps to articulate a particular kind of intimacy and flow between Red and The Boxer—accentuated by Red’s animation when humming: she is bathed in golden light and holds the Transistor close, swaying side to side, as if embracing or dancing with a lover. This is a playful, exploratory interaction. It technically doesn’t serve any “purpose” in terms of finishing the game—but is an action a player might perform while exploring the controls and possibilities of interactivity, in turn exploring what it is to “be” Red in relation to the game world, the story being conveyed, and The Boxer. It delivers a more emotional and affective thematic idea about a relationship that nonetheless relies just as much on mechanical input and output as engaging in movement, exploration, and combat in the game world. It’s a mechanic that provides texture to the experience of inhabiting Red’s identity during play, showcasing a more individual complexity to her story, driven by interactivity. In techniques like this, Transistor directly unifies its method for information-giving, interactivity, progression, and theme into a single design language.
To once again nod to Swink and Costikyan, it is a complex, interacting economy or ecosystem of thematic ideas and gameplay structures that guides behaviour and progression forward through the need to achieve a single goal (Costikyan 21), guiding the player towards the game’s “explicit” intent of investment in its “science fiction love story”.

Companionship and Collaboration in Journey

Journey is regularly praised in many circles of game review and discussion for its powerful, pared-back story conveyed through its exceptional game design. It has won a wide array of awards, including multiple British Academy Games Awards and Game Developers Choice Awards, and has been featured in highly regarded international galleries such as the Victoria and Albert Museum in London. Its director, Jenova Chen, articulated that the goal of the game (and thus, in the context of this essay, the intent) was “to create a game where people who interact with each other in an online community can connect at an emotional level, regardless of their gender, age, ethnicity, and social status” (Webber 14). In Journey, the player controls a small robed figure moving through a vast desert—the only choices for movement are to slide gracefully through the sand or to jump into the air by pressing the X button (on a PlayStation console) and float gently down to the ground. You cannot attack anything or defend yourself from the elements or hostile beings. Each player will “periodically find another individual in the landscape” (Isbister 121) of similar design to the player and can only communicate with them by experimenting with simple movements and via short chirping noises. As the landscape itself is vast and unknown, it is what one player referred to as a sense of “reliance on one another” that makes the game so captivating (Isbister 12).
Much like The Boxer in Transistor, the other figure in Journey stands out as a reference point and imbues a sense of collaboration and connection that makes the goal to reach the pinprick of light in the distance more meaningful. It is only after the player has finished the game that the screen reveals the other individual is a real person, another player, by displaying their gamer tag. One player, playing the game in 2017 (several years after its original release in 2012), wrote:

I went through most of the game by myself, and when I first met my companion, it was right as I walked into the gate transitioning to the snow area. And I was SO happy that there was someone else in this desolate place. I felt like it added so much warmth to the game, so much added value. The companion and I stuck together 100% of the way. When one of us would fall the slightest bit behind, the other would wait for them. I remember saying out loud how I thought that my companion was the best programmed AI that I had ever seen. In the way that he waited for me to catch up, it almost seemed like he thanked me for waiting for him … We were always side-by-side which I was doing to the "AI" for "cinematic-effect". From when I first met him up to the very very end, we were side-by-side. (Peace_maybenot)

Other players indicate a similar bond even when their companion is perhaps less competent:

I thought my traveller was a crap AI. He kept getting launched by the flying things and was crap at staying behind cover … But I stuck with him because I was like, this is my buddy in the game. Same thing, we were communicating the whole time and I stuck with him. I finish and I see a gamer tag and my mind was blown. That was awesome. (kerode4791)

Although there is a definite point of difference in that Transistor is narrated and single-player while Journey is not, there are some defined correlations between the way Supergiant Games and thatgamecompany encourage players to feel a sense of investment and intent aligned with another individual within the game to further thematic intent. Interactive mechanics are designed to allow players a means of playful and gestural communication as an extension of their kinetic interaction with the game; travellers in Journey can chirp and call out to other players—not always for an intrinsic goal but often to express joy, or just to experience a sense of connectivity or emotional warmth. In Transistor, the ability to hum and hear The Boxer’s harmony, and the animation of Red holding the Transistor close as she does so, implying a sense of protectiveness and affection, says more in the context of “play” than a literal declaration of love between the two characters. Graeme Kirkpatrick uses dance as a suitable metaphor for this kind of experience in games, in that both are characterised by a certainty that communication has occurred despite the “eschewal of overt linguistic elements and discursive meanings” (120). There is also a sense of finite temporality in these moments. Unlike scripted actions, or words on a page, they occur within a moment of being that largely belongs to the player and their actions alone. Kirkpatrick writes that there is “an inherent ephemerality about this vanishing and that this very transience is somehow essential” (120). This imbuing of a sense of time is important because it implies that even if one were to play the game again, repeating the interaction is impossible. The communication of narrative within these games is not a static form, but an experience that hangs unique at that moment and space of play.
Thatgamecompany discussed in their 2017 interviews with Webber, published as part of her essay for the Victoria & Albert’s Video Games: Design/Play/Disrupt exhibition, how by creating and restricting the kind of playful interaction available to players within the world, they could encourage the kind of emotional, collaborative, and thoughtful intent they desired to portray (Webber 14). They articulate how in the development process they prioritised giving the player a variety of responses for even the smallest of actions and how that positive feedback, in turn, encouraged play and prevented players from being “bored” (Webber 22). Meanwhile, the team reduced responsiveness for interactions they didn’t want to encourage. Chen describes the approach as “maximising feedback for things you want and minimising it for things you don’t want” (Webber 27). In her essay, Webber writes that Chen describes “a person who enters a virtual world, leaving behind the value system they’ve learned from real life, as like a baby banging their spoon to get attention” (27):

initially players could push each other, and when one baby [player] pushed the other baby [player] off the cliff that person died. So, when we tested the gameplay, even our own developers preferred killing each other because of the amount of feedback they would get, whether it’s visual feedback, audio feedback, or social feedback from the players in the room. For quite a while I was disappointed at our own developers’ ethics, but I was able to talk to a child psychologist and she was able to clarify why these people are doing what they are doing. She said, ‘If you want to train a baby not to knock the spoon, you should minimise the feedback. Either just leave them alone, and after a while they’re bored and stop knocking, or give them a spoon that does not make a sound.’ (27)

The developers then made it impossible for players to kill, steal resources from, or even speak to each other.
Players were encouraged to stay close to each other using high-feedback action and responsiveness for doing so (Webber 27). By using feedback design techniques to encourage players to behave a certain way towards other beings in the world—both by providing and restricting playful interactivity—thatgamecompany encourage a resonance between players and the overarching design intent of the project. Chen’s observations about the behaviour of his team while playing different iterations of the game also support the argument (acknowledged from different perspectives by various scholarship, including Costikyan and Bogost) that in the act of gameplay, real-life personal ethics are to a degree re-prioritised by the interactivity and context of that interactivity in the game world.

Intent and the “Actualities of (Game) Existence”

Continuing and evolving explorations of “intent” (and other parallel terms) in games through interaction design are of interest for scholars of game studies; they are also an important endeavour when considering influential relationships between games and other digital mediums where user identity is performative or relational to others. This influence was examined from several perspectives in the aforementioned collection Playful Identities: The Ludification of Digital Media Cultures, which also examined “the process of ludification that seems to penetrate every cultural domain” of modern life, including leisure time, work, education, politics, and even warfare (Frissen et al. 9). Such studies affirm the “complex relationship between play, media, and identity in contemporary culture” and are motivated “not only by the dominant role that digital media plays in our present culture but also by the intuition that ‘play is central … to media experience’” (Frissen et al. 10).
Undertaking close examinations of specific “playful” design techniques in video games, and how they may factor into the development of intent, can help to develop nuanced lines of questioning about how we engage with “playfulness” in other digital communication platforms in an accessible, comparative way. We continue to exist in a world where “ludification is penetrating the cultural domain”. In the first few months of the global COVID-19 pandemic, Nintendo released Animal Crossing: New Horizons. With an almost global population in lockdown, Animal Crossing became host to professional meetings (Espiritu), weddings (Garst), and significantly, a media channel for brands to promote content and products (Deighton). TikTok, panoramically, is a platform where “playful” user trends—dances, responding to videos, the “Tell Me … Without Telling Me” challenge—occur in the context of an extremely complex algorithm that, while automated, is created by people—and is thus unavoidably embedded with bias (Dias et al.; Noble). This is not to say that game design techniques and broader “playful” design techniques in other digital communication platforms are interchangeable by any measure, or that intent in a game design sense and intent or bias in a commercial sense should be examined through the same lens. Rather, there is a useful, interdisciplinary resource of knowledge that can further illuminate questions we might ask about this state of “ludification” in both the academic and public spheres. We might ask, for example: what would the implications be of introducing an intent design methodology similar to Journey’s, but using it for commercial gain? Or social activism? Has it already happened? There is a quotation from Nathan Jurgensen’s 2016 essay “Fear of Screens” (published in The New Inquiry) that often comes to my mind when thinking about interaction design in video games in this way.
In his response to Sherry Turkle’s book Reclaiming Conversation, Jurgensen writes:

each time we say “IRL,” “face-to-face,” or “in person” to mean connection without screens, we frame what is “real” or who is a person in terms of their geographic proximity rather than other aspects of closeness — variables like attention, empathy, affect, erotics, all of which can be experienced at a distance. We should not conceptually preclude or discount all the ways intimacy, passion, love, joy, pleasure, closeness, pain, suffering, evil and all the visceral actualities of existence pass through the screen. “Face to face” should mean more than breathing the same air. (Jurgensen)

While Jurgensen is not talking about communication in games specifically, there are comparisons to be drawn between his “variables” and “visceral actualities of existence” as the drivers of social meaning-making, and the methodology of games communicating intent and purpose through Swink’s “seemingly arbitrary collection of abstracted variables” (67). When players interact with other characters in a game world (whether they be NPCs or other players), they are inhabiting a shared virtual space, and how designers articulate and present the variables of “closeness”, as Jurgensen defines it, can shape player alignment with the overarching design intent. These design techniques take the place of Jurgensen’s “visceral actualities of existence”. While they may not intrinsically share an overarching purpose, their experiential qualities have the ability to align ethics, priorities, and values between individuals. Interactivity means game design has the potential to facilitate a particular kind of engagement for the player (as demonstrated in Journey) or give opportunities for players to explore a sense of what an emotion might feel like by aligning it with progression or playful activity (as discussed in relation to Transistor).
Players may not “feel” exactly what their player-characters do, or care for other characters in the world in the same way a game might encourage them to, but through thoughtful intent design, something of recognition or unity of belief might pass through the screen.

References

Bogost, Ian. Persuasive Games: The Expressive Power of Video Games. MIT P, 2007.
Calleja, Gordon. “Ludic Identities and the Magic Circle.” Playful Identities: The Ludification of Digital Media Cultures. Eds. Valerie Frissen et al. Amsterdam UP, 2015. 211–224.
Costikyan, Greg. “I Have No Words & I Must Design: Toward a Critical Vocabulary for Games.” Computer Games and Digital Cultures Conference Proceedings 2002. Ed. Frans Mäyrä. Tampere UP. 9-33.
Deighton, Katie. “Animal Crossing Is Emerging as a Media Channel for Brands in Lockdown.” The Drum, 21 Apr. 2020. <https://www.thedrum.com/news/2020/04/21/animal-crossing-emerging-media-channel-brands-lockdown>.
Dias, Avani, et al. “The TikTok Spiral.” ABC News, 26 July 2021. <https://www.abc.net.au/news/2021-07-26/tiktok-algorithm-dangerous-eating-disorder-content-censorship/100277134>.
Espiritu, Abby. “Japanese Company Attempts to Work Remotely in Animal Crossing: New Horizons.” The Gamer, 29 Mar. 2020. <https://www.thegamer.com/animal-crossing-new-horizons-work-remotely/>.
Frissen, Valerie, et al., eds. Playful Identities: The Ludification of Digital Media Cultures. Amsterdam UP, 2015.
Garst, Aron. “The Pandemic Canceled Their Wedding. So They Held It in Animal Crossing.” The Washington Post, 2 Apr. 2020. <https://www.washingtonpost.com/video-games/2020/04/02/animal-crossing-wedding-coronavirus/>.
Isbister, Katherine. How Games Move Us: Emotion by Design. MIT P, 2016.
Journey. thatgamecompany. 2012.
Jurgensen, Nathan. “Fear of Screens.” The New Inquiry, 25 Jan. 2016. <https://thenewinquiry.com/fear-of-screens/>.
Kasavin, Greg. “Transistor Earns More than 100+ Industry Accolades, Sells More than 600k Copies.” Supergiant Games, 8 Jan. 2015. <https://www.supergiantgames.com/blog/transistor-earns60-industry-accolades-sells-more-than-600k-copies/>.
kerode4791. “Wanted to Share My First Experience with the Game, It Was That Awesome.” Reddit, 22 Mar. 2017. <https://www.reddit.com/r/JourneyPS3/comments/60u0am/wanted_to_share_my_first_experience_with_the_game/>.
Kirkpatrick, Graeme. Aesthetic Theory and the Video Game. Manchester UP, 2011.
Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York UP, 2018.
peace_maybenot. “Wanted to Share My First Experience with the Game, It Was That Awesome.” Reddit, 22 Mar. 2017. <https://www.reddit.com/r/JourneyPS3/comments/60u0am/wanted_to_share_my_first_experience_with_the_game/>.
Petit, Carolyn. “Ghosts in the Machine.” Gamespot, 20 May 2014. <https://www.gamespot.com/reviews/transistor-review/1900-6415763/>.
Swink, Steve. Game Feel: A Game Designer’s Guide to Virtual Sensation. Amsterdam: Morgan Kaufmann Publishers/Elsevier, 2009.
Transistor. Supergiant Games. 2014.
Wallace, Kimberley. “The Story behind Supergiant Games’ Transistor.” Gameinformer, 20 May 2021. <https://www.gameinformer.com/2021/05/20/the-story-behind-supergiant-games-transistor>.
Webber, Jordan Erica. “The Road to Journey.” Videogames: Design/Play/Disrupt. Eds. Marie Foulston and Kristian Volsing. V&A Publishing, 2018. 14–31.
23

Collins, Steve. “Recovering Fair Use.” M/C Journal 11, no. 6 (28 Nov. 2008). http://dx.doi.org/10.5204/mcj.105.

Full text of the source
Annotation:
Introduction

The Internet (especially in the so-called Web 2.0 phase), digital media and file-sharing networks have thrust copyright law under public scrutiny, provoking discourses questioning what is fair in the digital age. Accessible hardware and software have led to prosumerism – creativity blending media consumption with media production to create new works that are freely disseminated online via popular video-sharing Web sites such as YouTube or genre-specific music sites like GYBO (“Get Your Bootleg On”) amongst many others. The term “prosumer” is older than the Web, and the conceptual convergence of producer and consumer roles is certainly not new, for “at electric speeds the consumer becomes producer as the public becomes participant role player” (McLuhan 4). Similarly, Toffler’s “Third Wave” challenges “old power relationships” and promises to “heal the historic breach between producer and consumer, giving rise to the ‘prosumer’ economics” (27). Prosumption blurs the traditionally separate consumer and producer, creating a new creative era of mass customisation of artefacts culled from the (copyrighted) media landscape (Tapscott 62-3). Simultaneously, corporate interests dependent upon the protections provided by copyright law lobby for augmented rights and actively defend their intellectual property through law suits, takedown notices and technological reinforcement. Despite a lack of demonstrable economic harm in many cases, the propertarian approach is winning and frequently leading to absurd results (Collins). The balance between private and public interests in creative works is facilitated by the doctrine of fair use (as codified in the United States Copyright Act 1976, section 107). The majority of copyright laws contain “fair” exceptions to claims of infringement, but fair use is characterised by a flexible, open-ended approach that allows the law to flex with the times.
Until recently the defence was unique to the U.S., but on 2 January Israel amended its copyright laws to include a fair use defence. (For an overview of the new Israeli fair use exception, see Efroni.) Despite its flexibility, fair use has been systematically eroded by ever-encroaching copyrights. This paper argues that copyright enforcement has spun out of control and the raison d’être of the law has shifted from being “an engine of free expression” (Harper & Row, Publishers, Inc. v. Nation Enterprises 471 U.S. 539, 558 (1985)) towards a “legal regime for intellectual property that increasingly looks like the law of real property, or more properly an idealized construct of that law, one in which courts seeks out and punish virtually any use of an intellectual property right by another” (Lemley 1032). Although the copyright landscape appears bleak, two recent cases suggest that fair use has not fallen by the wayside and may well recover. This paper situates fair use as an essential legal and cultural mechanism for optimising creative expression.

A Brief History of Copyright

The law of copyright extends back to eighteenth-century England when the Statute of Anne (1710) was enacted. Whilst the length of this paper precludes an in-depth analysis of the law and its export to the U.S., it is important to stress the goals of copyright. “Copyright in the American tradition was not meant to be a ‘property right’ as the public generally understands property. It was originally a narrow federal policy that granted a limited trade monopoly in exchange for universal use and access” (Vaidhyanathan 11). Copyright was designed as a right limited in scope and duration to ensure that culturally important creative works were not the victims of monopolies and were free (as later mandated in the U.S.
Constitution) “to promote the progress.” During the 18th-century English copyright discourse Lord Camden warned against propertarian approaches lest “all our learning will be locked up in the hands of the Tonsons and the Lintons of the age, who will set what price upon it their avarice chooses to demand, till the public become as much their slaves, as their own hackney compilers are” (Donaldson v. Becket 17 Cobbett Parliamentary History, col. 1000). Camden’s sentiments found favour in subsequent years with members of the North American judiciary reiterating that copyright was a limited right in the interests of society—the law’s primary beneficiary (see for example, Wheaton v. Peters 33 US 591 [1834]; Fox Film Corporation v. Doyal 286 US 123 [1932]; US v. Paramount Pictures 334 US 131 [1948]; Mazer v. Stein 347 US 201, 219 [1954]; Twentieth Century Music Corp. v. Aitken 422 U.S. 151 [1975]; Aronson v. Quick Point Pencil Co. 440 US 257 [1979]; Dowling v. United States 473 US 207 [1985]; Harper & Row, Publishers, Inc. v. Nation Enterprises 471 U.S. 539 [1985]; Luther R. Campbell a.k.a. Luke Skyywalker, et al. v. Acuff-Rose Music, Inc. 510 U.S 569 [1994]).

Putting the “Fair” in Fair Use

In Folsom v. Marsh 9 F. Cas. 342 (C.C.D. Mass. 1841) (No. 4,901) Justice Story formulated the modern shape of fair use from a wealth of case law extending back to 1740 and across the Atlantic. Over the course of one hundred years the English judiciary developed a relatively cohesive set of principles governing the use of a first author’s work by a subsequent author without consent. Story’s synthesis of these principles proved so comprehensive that later English courts would look to his decision for guidance (Scott v. Stanford L.R. 3 Eq. 718, 722 (1867)). Patry explains fair use as integral to the social utility of copyright to “encourage . . .
learned men to compose and write useful books” by allowing a second author to use, under certain circumstances, a portion of a prior author’s work, where the second author would himself produce a work promoting the goals of copyright (Patry 4-5). Fair use is a safety valve on copyright law to prevent oppressive monopolies, but some scholars suggest that fair use is less a defence and more a right that subordinates copyrights. Lange and Lange Anderson argue that the doctrine is not fundamentally about copyright or a system of property, but is rather concerned with the recognition of the public domain and its preservation from the ever-encroaching advances of copyright (2001). Fair use should not be understood as subordinate to the exclusive rights of copyright owners. Rather, as Lange and Lange Anderson claim, the doctrine should stand in the superior position: the complete spectrum of ownership through copyright can only be determined pursuant to a consideration of what is required by fair use (Lange and Lange Anderson 19). The language of section 107 suggests that fair use is not subordinate to the bundle of rights enjoyed by copyright ownership: “Notwithstanding the provisions of sections 106 and 106A, the fair use of a copyrighted work . . . is not an infringement of copyright” (Copyright Act 1976, s.107). Fair use is not merely about the marketplace for copyright works; it is concerned with what Weinreb refers to as “a community’s established practices and understandings” (1151-2). This argument boldly suggests that judicial application of fair use has consistently erred through subordinating the doctrine to copyright and considering simply the effect of the appropriation on the marketplace for the original work. The emphasis on economic factors has led courts to sympathise with copyright owners, leading to a propertarian or Blackstonian approach to copyright (Collins; Travis) propagating the myth that any use of copyrighted materials must be licensed.
Law and media reports alike are peppered with examples. For example, in Bridgeport Music, Inc., et al v. Dimension Films et al 383 F. 3d 400 (6th Cir. 2004) the Sixth Circuit Court of Appeals held that the transformative use of a three-note guitar sample infringed copyrights and that musicians must obtain a licence from copyright owners for every appropriated audio fragment regardless of duration or recognisability. Similarly, in 2006 Christopher Knight self-produced a one-minute television advertisement to support his campaign to be elected to the board of education for Rockingham County, North Carolina. As a fan of Star Wars, Knight used a makeshift Death Star and lightsaber in his clip, capitalising on the imagery of the Jedi Knight opposing the oppressive regime of the Empire to protect the people. According to an interview in The Register, the advertisement was well received by local audiences, prompting Knight to upload it to his YouTube channel. Several months later, Knight’s clip appeared on Web Junk 2.0, a cable show broadcast by VH1, a channel owned by media conglomerate Viacom. Although his permission was not sought, Knight was pleased with the exposure, after all “how often does a local school board ad wind up on VH1?” (Metz). Uploading the segment of Web Junk 2.0 featuring the advertisement to YouTube, however, led Viacom to quickly issue a take-down notice citing copyright infringement. Knight expressed his confusion at the apparent unfairness of the situation: “Viacom says that I can’t use my clip showing my commercial, claiming copy infringement? As we say in the South, that’s ass-backwards” (Metz). The current state of copyright law is, as Patry says, “depressing”:

We are well past the healthy dose stage and into the serious illness stage ... things are getting worse, not better. Copyright law has abandoned its reason for being: to encourage learning and the creation of new works.
Instead, its principal functions now are to preserve existing failed business models, to suppress new business models and technologies, and to obtain, if possible, enormous windfall profits from activity that not only causes no harm, but which is beneficial to copyright owners. Like Humpty-Dumpty, the copyright law we used to know can never be put back together.

The erosion of fair use by encroaching private interests represented by copyrights has led to strong critiques levelled at the judiciary and legislators by Lessig, McLeod and Vaidhyanathan. “Free culture” proponents warn that an overly strict copyright regime unbalanced by an equally prevalent fair use doctrine is dangerous to creativity, innovation, culture and democracy. After all, “few, if any, things ... are strictly original throughout. Every book in literature, science and art, borrows, and must necessarily borrow, and use much which was well known and used before. No man creates a new language for himself, at least if he be a wise man, in writing a book. He contents himself with the use of language already known and used and understood by others” (Emerson v. Davis, 8 F. Cas. 615, 619 (No. 4,436) (CCD Mass. 1845), qtd. in Campbell v. Acuff-Rose, 62 U.S.L.W. at 4171 (1994)). The rise of the Web 2.0 phase with its emphasis on end-user created content has led to an unrelenting wave of creativity, and much of it incorporates or “mashes up” copyright material. As Negativland observes, free appropriation is “inevitable when a population bombarded with electronic media meets the hardware [and software] that encourages them to capture it” and creatively express themselves through appropriated media forms (251). The current state of copyright and fair use is bleak, but not beyond recovery.
Two recent cases suggest a resurgence of the ideology underpinning the doctrine of fair use and the role played by copyright.

Let’s Go Crazy

In “Let’s Go Crazy #1” on YouTube, Holden Lenz (then eighteen months old) is caught bopping to a barely recognisable recording of Prince’s “Let’s Go Crazy” in his mother’s Pennsylvanian kitchen. The twenty-nine-second video was viewed a mere twenty-eight times by family and friends before Stephanie Lenz received an email from YouTube informing her of its compliance with a Digital Millennium Copyright Act (DMCA) take-down notice issued by Universal, copyright owner of Prince’s recording (McDonald). Lenz has since filed a counterclaim against Universal and YouTube has reinstated the video. Ironically, the media exposure surrounding Lenz’s situation has led to the video being viewed 633,560 times at the time of writing. Comments associated with the video indicate a less than reverential opinion of Prince and Universal and support the fairness of using the song. On 8 Aug. 2008 a Californian District Court denied Universal’s motion to dismiss Lenz’s counterclaim. The question at the centre of the court judgment was whether copyright owners should consider “the fair use doctrine in formulating a good faith belief that use of the material in the manner complained of is not authorized by the copyright owner, its agent, or the law.” The court ultimately found in favour of Lenz and also reaffirmed the position of fair use in relation to copyright. Universal rested its argument on two key points. First, that copyright owners cannot be expected to consider fair use prior to issuing take-down notices because fair use is a defence, invoked after the act, rather than a use authorised by the copyright owner or the law.
Second, that because the DMCA does not mention fair use, there should be no requirement to consider it, or at the very least, it should not be considered until it is raised in legal defence.

In rejecting both arguments the court accepted Lenz’s argument that fair use is an authorised use of copyrighted materials because the doctrine of fair use is embedded in the Copyright Act 1976. The court substantiated the point by emphasising the language of section 107. Although fair use is absent from the DMCA, the court reiterated that it is part of the Copyright Act and that “notwithstanding the provisions of sections 106 and 106A” a fair use “is not an infringement of copyright” (s.107, Copyright Act 1976). Overzealous rights holders frequently abuse the DMCA as a means to quash all use of copyrighted materials without considering fair use. This decision reaffirms that fair use “should not be considered a bizarre, occasionally tolerated departure from the grand conception of the copyright design” but something that is integral to the constitution of copyright law and essential in ensuring that copyright’s goals can be fulfilled (Leval 1100).

Unlicensed musical sampling has never fared well in the courtroom. Three decades of rejection and admonishment by judges culminated in Bridgeport Music, Inc., et al v. Dimension Films et al 383 F. 3d 400 (6th Cir. 2004): “Get a license or do not sample. We do not see this stifling creativity in any significant way” was the ruling on an action brought against an unlicensed use of a three-note guitar sample under section 114, an audio piracy provision. The Bridgeport decision sounded a death knell for unlicensed sampling, ensuring that only artists with sufficient capital to pay the piper could legitimately be creative with the wealth of recorded music available. The cost of licensing samples can often outweigh the creative merit of the act itself, as discussed by McLeod (86) and Beaujon (25).
In August 2008 the Supreme Court of New York heard EMI v. Premise Media, in which EMI sought an injunction against an unlicensed fifteen-second excerpt of John Lennon’s “Imagine” featured in Expelled: No Intelligence Allowed, a controversial documentary canvassing the alleged chilling of intelligent design proponents in academic circles. (The family of John Lennon and EMI had previously failed to persuade a Manhattan federal court in a similar action.) The court upheld Premise Media’s arguments for fair use and rejected the Bridgeport approach on which EMI had rested its entire complaint. Justice Lowe criticised the Bridgeport court for its failure to examine the legislative intent of section 114, suggesting that courts should look to the black letter of the law rather than blindly accept propertarian arguments. This decision is of particular importance because it establishes that fair use applies to unlicensed use of sound recordings and re-establishes de minimis use.

Conclusion

This paper was partly inspired by the final entry on eminent copyright scholar William Patry’s personal copyright law blog (1 Aug. 2008). A copyright lawyer for over 25 years, Patry articulated his belief that copyright law has swung too far away from its initial objectives and that balance could never be restored. The two cases presented in this paper demonstrate that fair use – and therefore balance – can be recovered in copyright. The federal Supreme Court and lower courts have stressed that copyright was intended to promote creativity and have upheld the fair use doctrine, but in order for balance to exist in copyright law, cases must come before the courts; copyright myth must be challenged. As McLeod states, “the real-world problems occur when institutions that actually have the resources to defend themselves against unwarranted or frivolous lawsuits choose to take the safe route, thus eroding fair use” (146-7).

References

Beaujon, Andrew.
“It’s Not the Beat, It’s the Mocean.” CMJ New Music Monthly. April 1999.
Collins, Steve. “Good Copy, Bad Copy: Covers, Sampling and Copyright.” M/C Journal 8.3 (2005). 26 Aug. 2008 ‹http://journal.media-culture.org.au/0507/02-collins.php›.
———. “‘Property Talk’ and the Revival of Blackstonian Copyright.” M/C Journal 9.4 (2006). 26 Aug. 2008 ‹http://journal.media-culture.org.au/0609/5-collins.php›.
Donaldson v. Becket 17 Cobbett Parliamentary History, col. 953.
Efroni, Zohar. “Israel’s Fair Use.” The Center for Internet and Society (2008). 26 Aug. 2008 ‹http://cyberlaw.stanford.edu/node/5670›.
Lange, David, and Jennifer Lange Anderson. “Copyright, Fair Use and Transformative Critical Appropriation.” Conference on the Public Domain, Duke Law School. 2001. 26 Aug. 2008 ‹http://www.law.duke.edu/pd/papers/langeand.pdf›.
Lemley, Mark. “Property, Intellectual Property, and Free Riding.” Texas Law Review 83 (2005): 1031.
Lessig, Lawrence. The Future of Ideas. New York: Random House, 2001.
———. Free Culture. New York: Penguin, 2004.
Leval, Pierre. “Toward a Fair Use Standard.” Harvard Law Review 103 (1990): 1105.
McDonald, Heather. “Holden Lenz, 18 Months, versus Prince and Universal Music Group.” About.com: Music Careers 2007. 26 Aug. 2008 ‹http://musicians.about.com/b/2007/10/27/holden-lenz-18-months-versus-prince-and-universal-music-group.htm›.
McLeod, Kembrew. “How Copyright Law Changed Hip Hop: An Interview with Public Enemy’s Chuck D and Hank Shocklee.” Stay Free 2002. 26 Aug. 2008 ‹http://www.stayfreemagazine.org/archives/20/public_enemy.html›.
———. Freedom of Expression: Overzealous Copyright Bozos and Other Enemies of Creativity. United States: Doubleday, 2005.
McLuhan, Marshall, and Barrington Nevitt. Take Today: The Executive as Dropout. Ontario: Longman Canada, 1972.
Metz, Cade. “Viacom Slaps YouTuber for Behaving like Viacom.” The Register 2007. 26 Aug. 2008 ‹http://www.theregister.co.uk/2007/08/30/viacom_slaps_pol/›.
Negativland, ed.
Fair Use: The Story of the Letter U and the Numeral 2. Concord: Seeland, 1995.
Patry, William. The Fair Use Privilege in Copyright Law. Washington, DC: Bureau of National Affairs, 1985.
———. “End of the Blog.” The Patry Copyright Blog. 1 Aug. 2008. 27 Aug. 2008 ‹http://williampatry.blogspot.com/2008/08/end-of-blog.html›.
Tapscott, Don. The Digital Economy: Promise and Peril in the Age of Networked Intelligence. New York: McGraw Hill, 1996.
Toffler, Alvin. The Third Wave. London, Glasgow, Sydney, Auckland, Toronto, Johannesburg: William Collins, 1980.
Travis, Hannibal. “Pirates of the Information Infrastructure: Blackstonian Copyright and the First Amendment.” Berkeley Technology Law Journal 15 (2000): 777.
Vaidhyanathan, Siva. Copyrights and Copywrongs: The Rise of Intellectual Property and How It Threatens Creativity. New York; London: New York UP, 2003.
APA, Harvard, Vancouver, ISO, and other citation styles
24

Stamm, Emma. “Anomalous Forms in Computer Music”. M/C Journal 23, No. 5 (7 Oct. 2020). http://dx.doi.org/10.5204/mcj.1682.

The full text of the source
Annotation:
Introduction

For Gilles Deleuze, computational processes cannot yield the anomalous, or that which is unprecedented in form and content. He suggests that because computing functions are mechanically standardised, they always share the same ontic character. M. Beatrice Fazi claims that the premises of his critique are flawed. Her monograph Contingent Computation: Abstraction, Experience, and Indeterminacy in Computational Aesthetics presents an integrative reading of thinkers including Henri Bergson, Alfred North Whitehead, Kurt Gödel, Alan Turing, and Georg Cantor. From this eclectic basis, Fazi demonstrates that computers differ from humans in their modes of creation, yet still produce qualitative anomaly. This article applies her research to the cultural phenomenon of live-coded music. Live coding artists improvise music by writing audio computer functions which produce sound in real time. I draw from Fazi’s reading of Deleuze and Bergson to investigate the aesthetic mechanisms of live coding. In doing so, I give empirical traction to her argument for the generative properties of computers.

Part I: Reconciling the Discrete and the Continuous

In his book Difference and Repetition, Deleuze defines “the new” as that which radically differs from the known and familiar (136). Deleuzean novelty bears unpredictable creative potential; as he puts it, the “new” “calls forth forces in thought which are not the forces of recognition” (136). These forces issue from a space of alterity which he describes as a “terra incognita” and a “completely other model” (136). Fazi writes that Deleuze’s conception of novelty informs his aesthetic philosophy. She notes that Deleuze follows the etymological origins of the word “aesthetic”, which lie in the Ancient Greek term aisthēsis, or perception from senses and feelings (Fazi, “Digital Aesthetics” 5).
Deleuze observes that senses, feelings, and cognition are interwoven, and suggests that creative processes beget new links between these faculties. In Fazi’s words, Deleuzean aesthetic research “opposes any existential modality that separates life, thought, and sensation” (5). Here, aesthetics does not denote a theory of art and is not concerned with such traditional topics as beauty, taste, and genre. Aesthetics-as-aisthēsis investigates the conditions which make it possible to sense, cognise, and create anomalous phenomena, or that which has no recognisable forebear.

Fazi applies Deleuzean aesthetics towards an ontological account of computation. Towards this end, she challenges Deleuze’s precept that computers cannot produce the aesthetic “new”. As she explains, Deleuze denies this ability to computers on the grounds that computation operates on discrete variables, or data which possess a quantitatively finite array of possible values (6). Deleuze understands discreteness as both a quantitative and ontic condition, and implies that computation cannot surpass this originary state. In his view, only continuous phenomena are capable of aisthēsis as the function which yields ontic novelty (5). Moreover, he maintains that continuous entities cannot be represented, interpreted, symbolised, or codified. The codified discreteness of computation is therefore “problematic” within his aesthetic framework “inasmuch it exemplifies yet another development of the representational”, or a repetition of sameness (6). The Deleuzean act of aisthēsis does not compute, repeat, or iterate what has come before. It yields nothing less than absolute difference.

Deleuze’s theory of creation as differentiation is prefigured by Bergson’s research on multiplicity, difference and time.
Bergson holds that the state of being multiple is ultimately qualitative rather than quantitative, and that multiplicity is constituted by qualitative incommensurability, or difference in kind as opposed to degree (Deleuze, Bergsonism 42). Qualia are multiple when they cannot be equated through a common substrate. Hence, entities that comprise discrete data, including all products and functions of digital computation, cannot aspire to true multiplicity or difference. In The Creative Mind, Bergson considers the concept of time from this vantage point. As he indicates, time is normally understood as numerable and measurable, especially by mathematicians and scientists (13). He sets out to show that this conception is an illusion, and that time is instead a process by which continuous qualia differentiate and self-actualise as unique instances of pure time, or what he calls “duration as duration”. As he puts it,

the measuring of time never deals with duration as duration; what is counted is only a certain number of extremities of intervals, or moments, in short, virtual halts in time. To state that an incident will occur at the end of a certain time t, is simply to say that one will have counted, from now until then, a number t of simultaneities of a certain kind. In between these simultaneities anything you like may happen. (12-13)

The in-between space where “anything you like may happen” inspired Deleuze’s notion of ontic continua, or entities whose quantitative limitlessness connects with their infinite aesthetic potentiality. For Bergson, those who believe that time is finite and measurable “cannot succeed in conceiving the radically new and unforeseeable”, a sentiment which also appears to have influenced Deleuze (The Creative Mind 17).

The legacy of Bergson and Deleuze is traceable to the present era, where the alleged irreconcilability of the discrete and the continuous fuels debates in digital media studies.
Deleuze is not the only thinker to explore this tension: scholars in the traditions of phenomenology, critical theory, and post-Marxism have positioned the continuousness of thought and feeling against the discreteness of computation (Fazi, “Digital Aesthetics” 7). Fazi contributes to this discourse by establishing that the ontic character of computation is not wholly predicated on quantitatively discrete elements. Drawing from Turing’s theory of computability, she claims that computing processes incorporate indeterminable and uncomputable forces in open-ended processes that “determine indeterminacy” (Fazi, Contingent Computation 1). She also marshals philosopher Stamatia Portanova, whose book Moving Without a Body: Digital Philosophy and Choreographic Thoughts indicates that discrete and continuous components merge in processes that digitise bodily motion (Portanova 3). In a similar but more expansive maneuver, Fazi declares that the discrete and continuous coalesce in all computational operations. Although Fazi’s work applies to all forms of computing, it casts new light on specific devices, methodologies, and human-computer interfaces. In the next section, I use her reading of Bergsonian elements in Deleuze to explore the contemporary artistic practice of live coding. My reading situates live coding in the context of studies on improvisation and creative indeterminacy.

Part II: Live Coding as Contingent Improvisational Practice

The term “live coding” describes an approach to programming where computer functions immediately render as images and/or sound. Live coding interfaces typically feature two windows: one for writing source code and another which displays code outcomes, for example as graphic visualisations or audio. The practice supports the rapid evaluation, editing, and exhibition of code in progress (“A History of Live Programming”).
Although it encompasses many different activities, the phrase “live coding” is most often used in the context of computer music. In live coding performances or “AlgoRaves”, musicians write programs on stage in front of audiences. The programming process might be likened to playing an instrument. Typically, the coding interface is projected on a large screen, allowing audiences to see the musical score as it develops (Magnusson, “Improvising with the Threnoscope” 19). Technologists, scholars, and educators have embraced live coding as both a creative method and an object of study. Because it provides immediate feedback, it is especially useful as a pedagogical aide. Sonic Pi, a user-friendly live coding language, was originally designed to teach programming basics to children. It has since been adopted by professional musicians across the world (Aaron). Despite its conspicuousness in educational and creative settings, scholars have rarely explored live coding in the context of improvisation studies. Programmers Gordan Kreković and Antonio Pošćić claim that this is a notable oversight, as improvisation is its “most distinctive feature”. In their view, live coding is most simply defined as an improvisational method, and its strong emphasis on chance sets it apart from other approaches to computer music (Kreković and Pošćić). My interest with respect to live coding lies in how its improvisational mechanisms blend computational discreteness and continuous “real time”. I do not mean to suggest that live coding is the only implement for improvising music with computers. Any digital instrument can be used to spontaneously play, produce, and record sound. What makes live coding unique is that it merges the act of playing with the process of writing notation: musicians play for audiences in the very moment that they produce a written score.
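The practice described above can be made concrete with a short sketch. The following is a minimal live-coded score in Sonic Pi’s Ruby-based notation; it runs only inside the Sonic Pi environment, which supplies the `live_loop`, `sample`, and `play` primitives, and the loop names and sound choices here are my own illustrative picks rather than material from any performance discussed in the article:

```ruby
# A minimal live-coded score in Sonic Pi.
# In performance, the coder edits this text while it plays:
# re-running the buffer redefines each live_loop without stopping
# the music, so the written score and the audible performance
# evolve together.

live_loop :beat do
  sample :bd_haus          # built-in kick-drum sample
  sleep 0.5                # advance musical time by half a beat
end

live_loop :melody do
  use_synth :tb303         # acid-bass style synthesiser
  play scale(:e3, :minor_pentatonic).choose, release: 0.3
  sleep 0.25
end
```

Changing a value (say, a `sleep` duration) and re-evaluating the buffer is the moment of improvisation: the notation and the performance coincide, which is precisely the fusion of playing and score-writing that the paragraph describes.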
The process fuses the separate functions of performing, playing, seeing, hearing, and writing music in a patently Deleuzean act of aisthēsis. Programmer Thor Magnusson writes that live coding is the “offspring” of two very different creative practices: first, “the formalization and encoding of music”; second, “open work resisting traditional forms of encoding” (“Algorithms as Scores” 21). By “traditional forms of encoding”, Magnusson refers to computer programs which function only insofar as source code files are static and immutable. By contrast, live coding relies on the real-time elaboration of new code. As an improvisational art, the process and product of live coding do not exist without continuous interventions from external forces.

My use of the phrase “real time” evokes Bergson’s concept of “pure time” or “duration as duration”. “Real time” phenomena are understood to occur instantaneously, that is, at no degree of temporal removal from those who produce and experience them. However, Bergson suggests that instantaneity is a myth. By his account, there always exists some degree of removal between events as they occur and as they are perceived, even if this gap is imperceptibly small. Regardless of size, the indelible space in time has important implications for theories of improvisation. For Deleuze and Bergson, each continuous particle of time is a germinal seed for the new. Fazi uses the word “contingent” to describe this ever-present, infinite potentiality (Contingent Computation 1). Improvisation studies scholar Dan DiPiero claims that the concept of contingency not only qualifies future possibilities, but also describes past events that “could have been otherwise” (2). He explains his reasoning as follows:

before the event, the outcome is contingent as in not-yet-known; after the event, the result is contingent as in could-have-been-otherwise.
What appears at first blush a frustrating theoretical ambiguity actually points to a useful insight: at any given time in any given process, there is a particular constellation of openings and closures, of possibilities and impossibilities, that constitute a contingent situation. Thus, the contingent does not reference either the open or the already decided but both at once, and always. (2)

Deleuze might argue that only continuous phenomena are contingent, and that because they are quantitatively finite, the structures of computational media — including the sound and notation of live coding scores — can never “be otherwise” or contingent as such. Fazi intervenes by indicating the role of quantitative continuousness in all computing functions. Moreover, she aligns her project with emerging theories of computing which “focus less on internal mechanisms and more on external interaction”, or interfaces with continuous, non-computational contexts (“Digital Aesthetics” 19). She takes computational interactions with external environments, such as human programmers and observers, as “the continuous directionality of composite parts” (19).

To this point, it matters that discrete objects always exist in relation to continuous environments, and that discrete objects make up continuous fluxes when mobilised as part of continuous temporal processes. It is for this reason that Portanova uses the medium of dance to explore the entanglement of discreteness and temporal contingency. As with music, the art of dance depends on the continuous unfolding of time. Fazi writes that Portanova’s study of choreography reveals “the unlimited potential that every numerical bit of a program, or every experiential bit of a dance (every gesture and step), has to change and be something else” (Contingent Computation 39). As with the zeroes and ones of a binary computing system, the footfalls of a dance materialise as discrete parts which inhabit and constitute continuous vectors of time.
Per Deleuzean aesthetics-as-aisthēsis, these parts yield new connections between sound, space, cognition, and feeling. DiPiero indicates that in the case of improvised artworks, the ontic nature of these links defies anticipation. In his words, improvisation forces artists and audiences to “think contingency”. “It is not that discrete, isolated entities connect themselves to form something greater”, he explains, “but rather that the distance between the musician as subject and the instrument as object is not clearly defined” (3). So, while live coder and code persist as separate phenomena, the coding/playing/performing process highlights the qualitative indeterminacy of the space between them. Each moment might beget the unrecognisable — and this ineluctable, ever-present surprise is essential to the practice.

To be sure, there are elements of predetermination in live coding practices. For example, musicians often save and return to specific functions in the midst of performances. But as Kreković and Pošćić point out, all modes of improvisation rely on patterning and standardisation, including analog and non-computational techniques. Here, they cite composer John Cage’s claim that there exists no “true” improvisation because artists “always find themselves in routines” (Kreković and Pošćić). In a slight twist on Cage, Kreković and Pošćić insist that repetition does not make improvisation “untrue”, but rather that it points to an expanded role for indeterminacy in all forms of composition. As they write,

[improvisation] can both be viewed as spontaneous composition and, when distilled to its core processes, a part of each compositional approach. Continuous and repeated improvisation can become ingrained, classified, and formalised. Or, if we reverse the flow of information, we can consider composition to be built on top of quiet, non-performative improvisations in the mind of the composer.
(Kreković and Pošćić)

This commentary echoes Deleuze’s thoughts on creativity and ontic continuity. To paraphrase Kreković and Pošćić, the aisthēsis of sensing, feeling, and thinking yields quiet, non-performative improvisations that play continuously in each individual mind. Fazi’s reading of Deleuze endows computable phenomena with this capacity. She does not endorse a computational theory of cognition that would permit computers to think and feel in the same manner as humans. Instead, she proposes a Deleuzean aesthetic capacity proper to computation. Live coding exemplifies the creative potential of computers as articulated by Fazi in Contingent Computation. Her research has allowed me to indicate live coding as an embodiment of Deleuze and Bergson’s theories of difference and creativity. Importantly, live coding affirms their philosophical premises not in spite of its technologised discreteness — which they would have considered problematic — but because it leverages discreteness in service of the continuous aesthetic act. My essay might also serve as a prototype for studies on digitality which likewise aim to supersede the divide between discrete and continuous media. As I have hopefully demonstrated, Fazi’s framework allows scholars to apprehend all forms of computation with enhanced clarity and openness to new possibilities.

Coda: From Aesthetics to Politics

By way of a coda, I will reflect on the relevance of Fazi’s work to contemporary political theory. In “Digital Aesthetics”, she makes reference to emerging “oppositions to the mechanization of life” from “post-structuralist, postmodernist and post-Marxist” perspectives (7). One such argument comes from philosopher Bernard Stiegler, whose theory of psychopower conceives “the capture of attention by technological means” as a political mechanism (“Biopower, Psychopower and the Logic of the Scapegoat”). Stiegler is chiefly concerned with the psychic impact of discrete technological devices.
As he argues, the habitual use of these instruments advances “a proletarianization of the life of the mind” (For a New Critique of Political Economy 27). For Stiegler, human thought is vulnerable to discretisation processes, which effect the loss of knowledge and quality of life. He considers this process to be a form of political hegemony (34).

Philosopher Antoinette Rouvroy proposes a related theory called “algorithmic governmentality” to describe the political effects of algorithmic prediction mechanisms. As she claims, predictive algorithms erode “the excess of the possible on the probable”, or all that cannot be accounted for in advance by statistical probabilities. In her words,

all these events that can occur and that we cannot predict, it is the excess of the possible on the probable, that is everything that escapes it, for instance the actuarial reality with which we try precisely to make the world more manageable in reducing it to what is predictable … we have left this idea of the actuarial reality behind for what I would call a “post-actuarial reality” in which it is no longer about calculating probabilities but to account in advance for what escapes probability and thus the excess of the possible on the probable. (8)

In the past five years, Stiegler and Rouvroy have collaborated on research into the politics of technological determinacy. The same issue concerned Deleuze almost three decades ago: his 1992 essay “Postscript on the Societies of Control” warns that future subjugation will proceed as technological prediction and enclosure. He writes of a dystopian society which features a “numerical language of control … made of codes that mark access to information, or reject it” (5). The society of control reduces individuals to “dividuals”, or homogenised and interchangeable numeric fractions (5).
These accounts of political power equate digital discreteness with ontic finitude, and suggest that ubiquitous digital computing threatens individual agency and societal diversity. Stiegler and Deleuze envision a sort of digital reification of human subjectivity; Rouvroy puts forth the idea that algorithmic development will reduce the possibilities inherent in social life to mere statistical likelihoods. While Fazi’s work does not completely discredit these notions, it might instead be used to scrutinise their assumptions. If computation is not ontically finite, then political allegations against it must consider its opposition to human life with greater nuance and rigor.

References

Aaron, Sam. “Programming as Performance.” Tedx Talks. YouTube, 22 July 2015. <https://www.youtube.com/watch?v=TK1mBqKvIyU&t=333s>.
“A History of Live Programming.” Live Prog Blog. 13 Jan. 2013. <liveprogramming.github.io/liveblog/2013/01/a-history-of-live-programming/>.
Bergson, Henri. The Creative Mind: An Introduction to Metaphysics. Trans. Mabelle L. Andison. New York City: Carol Publishing Group, 1992.
———. Time and Free Will: An Essay on the Immediate Data of Consciousness. Trans. F.L. Pogson. Mineola: Dover Publications, 2001.
Deleuze, Gilles. Difference and Repetition. Trans. Paul Patton. New York City: Columbia UP, 1994.
———. “Postscript on the Societies of Control.” October 59 (1992): 3-7.
———. Bergsonism. Trans. Hugh Tomlinson and Barbara Habberjam. New York City: Zone Books, 1991.
DiPiero, Dan. “Improvisation as Contingent Encounter, Or: The Song of My Toothbrush.” Critical Studies in Improvisation / Études Critiques en Improvisation 12.2 (2018). <https://www.criticalimprov.com/index.php/csieci/article/view/4261>.
Fazi, M. Beatrice. Contingent Computation: Abstraction, Experience, and Indeterminacy in Computational Aesthetics. London: Rowman & Littlefield International, 2018.
———.
“Digital Aesthetics: The Discrete and the Continuous.” Theory, Culture & Society 36.1 (2018): 3-26.
Fortune, Stephen. “What on Earth Is Livecoding?” Dazed Digital, 14 May 2013. <https://www.dazeddigital.com/artsandculture/article/16150/1/what-on-earth-is-livecoding>.
Kreković, Gordan, and Antonio Pošćić. “Modalities of Improvisation in Live Coding.” Proceedings of xCoaX 2019, the 7th Conference on Computation, Communication, Aesthetics & X. Fabbrica del Vapore, Milan, Italy, 5 July 2019.
Magnusson, Thor. “Algorithms as Scores: Coding Live Music.” Leonardo Music Journal 21 (2011): 19-23.
———. “Improvising with the Threnoscope: Integrating Code, Hardware, GUI, Network, and Graphic Scores.” Proceedings of the International Conference on New Interfaces for Musical Expression. Goldsmiths, University of London, London, England, 1 July 2014.
Portanova, Stamatia. Moving without a Body: Digital Philosophy and Choreographic Thoughts. Cambridge, MA: The MIT P, 2013.
Rouvroy, Antoinette. “The Digital Regime of Truth: From the Algorithmic Governmentality to a New Rule of Law.” Trans. Anaïs Nony and Benoît Dillet. La Deleuziana: Online Journal of Philosophy 3 (2016). <http://www.ladeleuziana.org/wp-content/uploads/2016/12/Rouvroy-Stiegler_eng.pdf>.
Stiegler, Bernard. For a New Critique of Political Economy. Malden: Polity Press, 2012.
———. “Biopower, Psychopower and the Logic of the Scapegoat.” Ars Industrialis (no date given). <www.arsindustrialis.org/node/2924>.
APA, Harvard, Vancouver, ISO, and other citation styles
25

Meikle, Graham. “Indymedia and The New Net News”. M/C Journal 6, No. 2 (1 Apr. 2003). http://dx.doi.org/10.5204/mcj.2153.

The full text of the source
Annotation:
Scores of farm workers on hunger strike in the US. A campaigner for affordable housing abducted in Cape Town. Tens of thousands of anti-war demonstrators marching in Istanbul. None of those stories made my daily paper — instead, I read them all this morning on the global Indymedia network. Developments in communication technologies have often enabled new approaches to the production, distribution and reception of news. In this article, using Carey’s analysis of the impacts of the telegraph (1989) and Burnett and Marshall’s discussion of “informational news” (2003) as starting points, I want to offer some examples from the brief history of the Indymedia movement to show how the Net is making possible a significant shift in who gets to make the news. The telegraph offers a number of useful perspectives from which to consider the impacts of the Net, and there are some striking parallels between the dot.com boom of the 1990s and the dot.dash boom of the 19th century. Telegraphy, writes James Carey, “permitted for the first time the effective separation of communication from transportation” (203). The telegraph was not only an instrument of business, but “a thing to think with, an agency for the alteration of ideas” (204). And a consideration of the telegraph offers a number of examples of the relationships between technological form and the nature of news. One such example, in Carey’s analysis, was the impact of the telegraph on the language and nature of journalism. “If the same story were to be understood in the same way from Maine to California,” he writes, “language had to be flattened out and standardised” (210). Local colour was bleached out of news reports to make them saleable in a market unconstrained by geography. “The origins of objectivity,” Carey argues, “may be sought, therefore, in the necessity of stretching language in space over the long lines of Western Union” (210). 
The telegraph didn’t just affect the quality of news — it greatly increased the quantity of it as well, forcing greater attention to be paid to the management of newsrooms. News became a commodity; not only that, just like cattle or wheat, news was now subject to all the vagaries of any other commodity business, from contracts and price gouging to outright theft (211). And in Western Union, the telegraph made possible the prototype of today’s transnational media firms (201). As the telegraph solved problems of communicating across space, it opened up time as a new arena for expansion. In this sense, the gradual emergence of 24-hour broadcasting schedules is traceable to the impact of the telegraph (Carey 228). A key legacy of this impact is the rise to primacy of CNN and its imitators, offering round-the-clock news coverage made possible by satellite transmission. This too changed the nature of news. As McKenzie Wark has pointed out, a 24-hour continuous news service is not ideally compatible with the established narrative strategies of news. Rather than cutting and shaping events to fit familiar narrative forms, CNN instead introduced an emphasis on what Wark calls “the queer concept of ‘live’ news coverage — an instant audiovisual presence on the site of an event” (38). This focus on speed and immediacy, on being the first on the scene, leads to news that is all event and no process. More than this, it leads at times to revealing moments when CNN-style coverage becomes obvious as a component part of the event it purports to cover. In his analysis of the Tiananmen Square crisis of 1989 Wark argues that the media event appeared as “a positive feedback loop” (22). 
The Beijing students’ perceptions of Western accounts of their demands and motives became caught up in the students’ own accounts of their own motives, their own demands: Western interpretations of what was happening in Beijing, Wark writes, “fed back into the event itself via a global loop encompassing radio, telephone, and fax vectors. They impacted back on the further unfolding of the event itself” (22). Both the telegraph and the satellite contributed to major shifts in the production, distribution and reception of news. And both made possible new types of media institution, from Western Union and Reuters to CNN. This is not to argue that technologies determine the nature of news or of news organisations, but rather that certain developments are made possible by both the adoption and the adaptation of new technologies. Institutional and cultural factors, of course, affect the nature of news, but technology also both enables and constrains. The medium might not be the message — but it does matter. So with such precedents as those above in mind, what might be the key impacts of the Net on the nature of news? In an important analysis of the online news environment, Robert Burnett and P. David Marshall introduce the concept of “informational news,” defined as “the transformation of journalism and news in Web culture where there is a greater involvement of the user and news hierarchies are in flux” (206). News, they argue, has become “a subset of a wider search for information by Web users” (206) and this “has led to a shift in how we recontextualise news around a much larger search for information” (152). In this analysis, audience members are transformed into researchers. These researchers become comfortable with getting their news from a broader range of sources, while at the same time searching for new ways to hierarchise those sources, to establish some as more legitimate than others. 
Adding to the complexity are Burnett and Marshall’s observations that new media forms offer enhanced flexibility (with, for example, archival access to news databases, including audio and video, available 24 hours a day), and that online news fosters and caters for new global communities of interest (161-7). When these phenomena are taken together, the result for Burnett and Marshall is “a shifted boundary of what constitutes news” (167). But this concept of informational news is largely cast in terms of reception and consumption: the practices of the new informational news researchers are discussed in terms of information retrieval, not production — even newsgroups and Weblogs are considered as additional sources for information retrieval, rather than as new avenues for new kinds of journalists to develop and publish new kinds of news. Burnett and Marshall are, I believe, right in their identification of changes to the nature of news, and their analysis is an important contribution. But what I want to emphasise in this article is that there is also a corresponding ongoing shift in the boundary of what constitutes newsmakers. The Indymedia movement offers clear examples of this, in its spectacular growth and in its promotion of open publishing models. As a forum for non-professional journalists of all stripes, Indymedia’s development is a vivid example of the shifting boundary around who gets to make the news. By now, many readers of M/C will perhaps be familiar with Indymedia to some degree. But it’s worth briefly reviewing both the scope of the movement and the speed with which it’s developed. The first Indymedia Website was established for the Seattle demonstrations against the World Trade Organisation meeting in November 1999. Its key feature was offering news coverage supplied by anyone who wanted to contribute, using free software and ideas from the Australian activists who had created the Active network.
As events in Seattle gathered pace, the nascent Indymedia drew a claimed 1.5 million hits; this success led to the site being refocussed around several subsequent protests, before local collectives began to appear and form their own Indymedia centres. Within a year, this original Indymedia site was just one of a new network of more than 30. At the time of writing, a little over three years on from the movement’s inception, there are more than 100 Indymedia centres around the world — there are both Israeli and Palestinian Indymedia; Indymedia is established in Mumbai, Jakarta and Buenos Aires; there are centres in Poland, Colombia and South Africa. By any measure, this is a remarkable achievement for a decentralised project run entirely by volunteers and donations. Like any other complex phenomenon, the story of this development can be told in many different ways, each adding a different dimension. Three are especially relevant here. The first version would centre around the Active software developed by Sydney’s Catalyst tech collective. This was devised to create the Active Sydney site, an online hub for Sydney activists to promote events from direct actions to screenings and seminars. Launched in January 1999, Active Sydney was to become a prototype for Indymedia — part events calendar, part meeting place, part street paper. For June of that year, the Active team revised the system for the J18 global day of action. Using this system, anyone could now upload a report, a video clip, a photo or an audio file, and see it instantly added to the emerging narrative of events. It was as easy as sending email. And it ran on open source code. With Catalyst members collaborating online with organisers in Seattle to establish the first site, this system became the basis for Indymedia. While the Active software is no longer the only platform used for Indymedia sites, it made a huge contribution to the movement’s explosive growth (see Arnison, 2001; Meikle, 2002). 
Another version of the story would place Indymedia within the long traditions of alternative media. John Downing’s work is important here, and his definition of “politically dissident media that offer radical alternatives to mainstream debate” is useful (240). To tell the Indymedia story from this perspective would be to highlight its independence and self-management, and the autonomy of each local editorial collective in running each Indymedia centre. It would be to emphasise Indymedia as a forum for viewpoints which are not usually expressed within the established media’s consensus about what is and isn’t news. And, perhaps most importantly, to tell the Indymedia story as one in the alternative media tradition would be to focus on the extent to which this movement fosters horizontal connections and open participation, in contrast to the vertical flows of the established broadcast and print media (Downing, 1995). A third version would approach Indymedia as part of what cultural studies academic George McKay terms “DiY Culture.” McKay defines this as “a youth-centred and -directed cluster of interests and practices around green radicalism, direct action politics, new musical sounds and experiences”(2). For this version of the story, a useful analogy would be with punk — not with the music so much as with its DIY access principle (“here’s three chords, now form a band”). DIY was the key to Richard Hell’s much-misunderstood lyric “I belong to the blank generation” — the idea of the blank was that you were supposed to fill it in for yourself, rather than sign up to someone else’s agenda. To consider Indymedia as part of this DIY spirit would be to see it as the expression of a blank generation in this fine original sense — not a vacant generation, but one prepared to offer their own self-definitions and to create their own media networks to do it. 
More than this, it would also be to place Indymedia within the frameworks of independent production and distribution which were the real impact of punk — independent record labels changed music more than any of their records, while photocopied zines opened up new possibilities for self-expression. Just as the real importance of punk wasn't in the individual songs, the importance of Indymedia isn't in this or that news story posted to this or that site. Instead, it's in its DIY ethos and its commitment to establishing new networks. What these three versions of the Indymedia story share is that each highlights an emphasis on access and participation; each stresses new avenues and methods for new people to create news; each shifts the boundary of who gets to speak. And where these different stories intersect is in the concept of open publishing. This is the Net making possible a shift in the production of news, as well as in its reception. Matthew Arnison of Catalyst, who played a key role in developing the Active software, offers a working definition of open publishing which is worth quoting in full: “Open publishing means that the process of creating news is transparent to the readers. They can contribute a story and see it instantly appear in the pool of stories publicly available. Those stories are filtered as little as possible to help the readers find the stories they want. Readers can see editorial decisions being made by others. They can see how to get involved and help make editorial decisions. If they can think of a better way for the software to help shape editorial decisions, they can copy the software because it is free and change it and start their own site. If they want to redistribute the news, they can, preferably on an open publishing site.” (Arnison, 2001) Open publishing has undoubtedly been a big part of the appeal of Indymedia for its many contributors. 
In fact, one of Indymedia’s slogans is “everyone is a journalist.” If this is a provocation, who and what is it meant to provoke? Obviously, “everyone” is not a journalist — at least not if journalists are seen as employees of news institutions and news businesses, employees with some kind of training in research methods and narrative construction. But to say that “everyone is a journalist” is not to claim that everyone has such institutional affiliation, or that everyone has such training or expertise. Instead, the tactic here seems to be to inflate something out of all proportion in order to draw attention to the core smaller truth that may otherwise go unnoticed. Specifically in this case, what authorises some to be story-tellers and not others? From this perspective, the slogan reads like a claim for difference, a claim that other kinds of expertise and other kinds of know-how also have valid claims on our attention, and that these too can make valid contributions to the more plural media environment made possible — but not guaranteed — by the Net. It’s a claim that the licence to tell stories should be shared around. But developments to this core element — open publishing — point both to an ongoing challenge for the Indymedia movement, and to a possible future which might enable a further significant shift in the nature of Net news. In March 2002, a proposal was circulated to remove the open publishing newswire from the front page of the main site at http://www.indymedia.org/, replacing this with features sourced from local sites around the world. While this was said to have the objective of promoting those local sites to a broader audience, it should also be seen as acknowledgement that Indymedia was struggling against limits to growth. 
One issue was the large number of items being posted to sites, which meant that even especially well-researched or significant stories would be replaced quickly on the front page; another issue was the persistent trolls and spam which plagued some Indymedia sites. In April 2002, after a voting process in which 15 Indymedia collectives from Brazil to Barcelona voted unanimously in favour of the reform, the open publishing newswire was taken off the front page. Many local Indymedia sites followed suit. Even the Sydney site, which, perhaps because of the history and involvement of the Catalyst group, promotes open publishing rather more than some other Indymedia sites, adopted a features-based front page in August 2002, stating that “promoting certain issues above others” would make the site “more effective.” These developments might signal the eventual demise of the open publishing component. Indymedia might instead become ‘professionalised,’ with greater reliance on de facto staff reporters and more stringent editing, moving closer to existing alternative media outlets. But the new centrality of its news features might also open Indymedia up to a new level of involvement, because those features are given prominence in the site’s central column and can remain on the front page for some weeks. This offers the potential for what Arnison terms “automated open-editing”. This would involve creating the facility for audience members to contribute to sub-editing stories on an Indymedia site: they might, for instance, check facts or add sources; edit spelling, grammar or formatting; nominate a topic area within which a given story could be archived; or translate the story from one language or style to another (Arnison, 2001). Open publishing is one phenomenon in which we can see the Net enabling changes to the nature of news and newsmakers. If open editing were also to work, then it would need to be as simple to operate as the original open publishing newswire. 
But if this were possible, then open editing might involve not only more new people in the development of informational news, but involve them in new ways, catering for a broader range of abilities and aptitudes than open publishing alone. Like earlier communication technologies, the Net could facilitate new types of media institution — ones built on an open model, which enable a new, more plural, news environment.
Works Cited
Arnison, Matthew. “Open Publishing Is the Same as Free Software.” 2001. 21 Feb. 2003 <http://www.cat.org.au/maffew/cat/openpub.php>.
Arnison, Matthew. “Open Editing: A Crucial Part of Open Publishing.” 2002. 21 Feb. 2003 <http://www.cat.org.au/maffew/cat/openedit.php>.
Burnett, Robert, and P. David Marshall. Web Theory: An Introduction. London & New York: Routledge, 2003.
Carey, James. Communication as Culture. New York & London: Routledge, 1989.
Downing, John. “Alternative Media and the Boston Tea Party.” Questioning The Media. Eds. John Downing, Ali Mohammadi and Annabelle Sreberny-Mohammadi. Thousand Oaks: Sage, 1995. 238-52.
McKay, George. “DiY Culture: Notes towards an Intro.” DiY Culture: Party & Protest in Nineties Britain. Ed. George McKay. London: Verso, 1998. 1-53.
Meikle, Graham. Future Active: Media Activism and the Internet. New York & London: Routledge, and Annandale: Pluto Press, 2002.
Wark, McKenzie. Virtual Geography: Living with Global Media Events. Bloomington: Indiana UP, 1994.
26

Lemos Morais, Renata. „The Hybrid Breeding of Nanomedia“. M/C Journal 17, Nr. 5 (25.10.2014). http://dx.doi.org/10.5204/mcj.877.

The full text of the source
Annotation:
Introduction
If human beings have become a geophysical force, capable of impacting the very crust and atmosphere of the planet, and if geophysical forces become objects of study, presences able to be charted over millions of years—one of our many problems is a 'naming' problem. - Bethany Nowviskie
The anthropocene "denotes the present time interval, in which many geologically significant conditions and processes are profoundly altered by human activities" (S.Q.S.). Although the narrative and terminology of the anthropocene have not been officially legitimized by the scientific community as a whole, they have been adopted worldwide by a plethora of social and cultural studies. The challenges of the anthropocene demand interdisciplinary efforts and actions. New contexts, situations and environments call for original naming propositions: new terminologies are always illegitimate at the moment of their first appearance in the world.
Against the background of the naming challenges of the anthropocene, we will map the emergence and tell the story of a tiny world within the world of media studies: the world of the term 'nanomedia' and its hyphenated sister 'nano-media'. While we tell the story of the uses of this term, its various meanings and applications, we will provide yet another possible interpretation and application of the term, one that we believe might be helpful to interdisciplinary media studies in the context of the anthropocene. Contemporary media terminologies are usually born out of fortuitous exchanges between communication technologies and their various social appropriations: hypodermic media, interactive media, social media, and so on. These terminologies are either recognised as the offspring of legitimate scientific endeavours by the media theory community, or are widely discredited and therefore rendered illegitimate.
Scientific legitimacy comes from the broad recognition and embrace of a certain term and its inclusion in the canon of an epistemology. Illegitimate processes of theoretical enquiry, and the study of the kinds of deviations that might deem a theory unacceptable, have scarcely been addressed (Delborne). Rejected terminologies and theories are marginalised and gain the status of bastard epistemologies of media, considered irrelevant and unworthy of mention and recognition. Within these margins, however, different streams of media theory which involve conceptual hybridizations can be found: creole encounters between high culture and low culture (James), McLuhan's hybrid that comes from the 'meeting of two media' (McLuhan 55), or even 'bastard spaces' of cultural production (Bourdieu). Once in a while a new media epistemology arises that is categorised as a bastard not because of plain rejection or criticism, but because of its alien origins, formations and shape. New theories are currently emerging out of interdisciplinary and transdisciplinary thinking which are, in many ways, bearers of strange features and characteristics that might render their meaning elusive and obscure to a monodisciplinary perspective. Radical transdisciplinary thinking is often alien and alienated. It results from unconventional excursions into uncharted territories of enquiry: bastard epistemologies arise from such exchanges. Being itself a product of a mestizo process of thinking, this article takes a look at the term nanomedia (or nano-media): a marginal terminology within media theory. This term is not to be confused with the term biomedia, coined by Eugene Thacker (2004). (The theory of biomedia has acquired a great level of scientific legitimacy; however, it refers to the moist realities of the human body, and is more concerned with cyborg and post-human epistemologies.
The term nanomedia, on the contrary, is currently being used according to multiple interpretations which are mostly marginal, and we argue, in this paper, that such uses might be considered illegitimate). 'Nanomedia' was coined outside the communications area. It was first used by scientific researchers in the field of optics and physics (Rand et al), in relation to flows of media via nanoparticles and optical properties of nanomaterials. This term would only be used in media studies a couple of years later, with a completely different meaning, without any acknowledgment of its scientific origins and context. The structure of this narrative is thus illegitimate, and as such does not fit into traditional modalities of written expression: there are bits and pieces of information and epistemologies glued together as a collage of nano fragments which combine philology, scientific literature, digital ethnography and technology reviews.
Transgressions
Illegitimate theories might be understood in terms of hybrid epistemologies that intertwine disciplines and perspectives, rendering their outcomes inter or transdisciplinary, and therefore prone to being considered marginal by disciplinary communities. Such theories might also be considered illegitimate due to social and political power struggles which aim to maintain territory by reproducing specific epistemologies within a certain field. Scientific legitimacy is a social and political process, which has been widely addressed. Pierre Bourdieu, in particular, has dedicated most of his work to deciphering the intricacies of academic wars around the legitimacy or illegitimacy of theories and terminologies. Legitimacy also plays a role in determining the degree to which a certain theory will be regarded as relevant or irrelevant:
"Researchers’ tendency to concentrate on those problems regarded as the most important ones (e.g. because they have been constituted as such by producers endowed with a high degree of legitimacy) is explained by the fact that a contribution or discovery relating to those questions will tend to yield greater symbolic profit." (Bourdieu 22)
Exploring areas of enquiry which are outside the boundaries of mainstream scientific discourses is a dangerous affair. Mixing different epistemologies in the search for transversal grounds of knowledge might result in unrecognisable theories, which are born out of a combination of various processes of hybridisation: social, technological, cultural and material.
Material mutations are happening that call for new epistemologies, due to the implications of current technological possibilities which might redefine our understanding of mediation, and expand it to include molecular forms of communication. A new terminology is necessary that takes into account the scientific and epistemological implications of nanotechnology applied to communication (and that also goes beyond cyborg metaphors of a marriage between biology and cybernetics). Nanomedia and nanomediations are the terminologies proposed in this article as conceptual tools to allow these further explorations. Nanomedia is here understood as the combination of different nanotechnological media of communication that are able to create and disseminate meaning via molecular exchange and/or assembly. Nanomediation is here defined as the process of active transmission and reception of signs and meaning using nanotechnologies. These terminologies might help us in conducting interdisciplinary research and observations that go deeper into matter itself and take into account its molecular spaces of mediation - moving from metaphor into pragmatics.
Nanomedia(s)
Within the humanities, the term 'nano-media' was first proposed by Mojca Pajnik and John Downing, referring to small media interventions that communicate social meaning in independent ways.
Their use of the term 'nano-media' is proposed as a revised alternative to the plethora of terms that categorise such media actions, such as alternative media, community media, tactical media, participatory media, etc. The metaphor of smallness implied in the term nano-media is used to categorise the many fragments and complexities of political appropriations of independent media. Historical examples of the kind of 'nano' social interferences listed by Downing (2) include the flyers (Flugblätter) of the Protestant Reformation in Germany; the jokes, songs and ribaldry of François Rabelais’ marketplace ... the internet links of the global social justice (otromundialista) movement; the worldwide community radio movement; the political documentary movement in country after country. John Downing applies the meaning of the prefix nano (coming from the Greek word nanos, 'dwarf') to independent media interventions. His concept is rooted in an analysis of the social actions performed by local movements scattered around the world, politically engaged and tactically positioned. A similar, but still unique, proposition for the use of the term 'nano-media' appeared two years later in the work of Graham St John (442): "If ‘mass media’ consists of regional and national print and television news, ‘niche media’ includes scene specific publications, and ‘micro media’ includes event flyers and album cover art (that which Eshun [1998] called ‘conceptechnics’), and ‘social media’ refers to virtual social networks, then the sampling of popular culture (e.g. cinema and documentary sources) using the medium of the programmed music itself might be considered nano-media." Nano-media, according to Graham St John, "involves the remediation of samples from popular sources (principally film) as part of the repertoire of electronic musicians in their efforts to create a distinct liminalized socio-aesthetic" (St John 445).
While Downing proposes to use the term nano-media as a way to "shake people free of their obsession with the power of macro-media, once they consider the enormous impact of nano-technologies on our contemporary world" (Downing 1), Graham St John uses the term to categorise media practices specific to a subculture (psytrance). Since the use of the term 'nano-media' in relation to culture seems to be characterised by the study of marginalised social movements, portraying a hybrid remix of conceptual references that, if not completely illegitimate, would be located on the border of legitimacy within media theories, I am hereby proposing yet another bastard version of the concept of nanomedia (without a hyphen). Given that neither of the previous uses of the term 'nano-media' within the discipline of media studies takes into account the technological use of the prefix nano, it is time to redefine the term in direct relation to nanotechnologies and communication devices. Let us start by taking a look at nanoradios. Nanoradios are carbon nanotubes connected in such a way that when electrons flow through the nanotubes, various electrical signals recover the audio signals encoded by the radio wave being received (Service). Nanoradios are examples of the many ways in which nanotechnologies are converging with and transforming our present information and communication technologies. From molecular manufacturing (Drexler) to quantum computing (Deutsch), we now have a wide spectrum of emerging and converging technologies that can act as nanomedia - molecular structures built specifically to act as communication devices.
Nanomediations
Beyond literal attempts to replicate traditional media artifacts using nanotechnologies, we find deep processes of mediation which are being called nanocommunication (Hara et al.)
- mediation that takes place through the exchange of signals between molecules: "Nanocommunication networks (nanonetworks) can be used to coordinate tasks and realize them in a distributed manner, covering a greater area and reaching unprecedented locations. Molecular communication is a novel and promising way to achieve communication between nanodevices by encoding messages inside molecules" (Abadal & Akyildiz). Nature is nanotechnological. Living systems are precise mechanisms of physical engineering: our molecules obey our DNA and fall into place according to biological codes that are mysteriously written in our every cell. Bodies are perfectly mediated - biological systems of molecular communication and exchange. Humans have always tried to emulate or to replace natural processes by artificial ones. Nanotechnology is not an exception. Many nanotechnological applications try to replicate natural systems, for example: replicas of nanostructures found in lotus flowers are now being used in waterproof fabrics; nanocrystals, responsible for the resistance of cobwebs, are being artificially replicated for use in resistant materials; and various proteins are being artificially replicated as well (NNI 05). In recent decades, the methods of manipulation and engineering of nanoparticles have been perfected by scientists, and hundreds of nanotechnological products are now being marketed. Such nano material levels are now accessible because our digital technologies were advanced enough to allow scientific visualization and manipulation at the atomic level. The Scanning Tunneling Microscopes (STMs), by Gerd Binnig and Heinrich Rohrer (1986), might be considered the first kind of nanomedia devices ever built. STMs use quantum-mechanical principles to capture information about the surface of atoms and molecules, allowing digital imaging and visualization of atomic surfaces.
Digital visualization of atomic surfaces led to the discovery of buckyballs and nanotubes (buckytubes), structures that are celebrated today and received their names in honor of Buckminster Fuller. Nanotechnologies were developed as a direct consequence of the advancement of digital technologies in the fields of scientific visualisation and imaging. Nonetheless, a direct causal relationship between nano and digital technologies is not the only correlation between these two fields. Much in the same manner in which digital technologies allow infinite manipulation and replication of data, nanotechnologies would allow infinite manipulation and replication of molecules. Nanocommunication could be as revolutionary as digital communication in regard to its possible outcomes concerning new media. Full implementation of the new possibilities of nanomedia would be as revolutionary as, or even more revolutionary than, digital networks are today. Nanotechnology operates at an intermediate scale at which the laws of classical physics are mixed with the laws of quantum physics (Holister). The relationship between digital technologies and nanotechnologies is not just instrumental; it is also conceptual. We might compare the possibilities of nanotechnology to hypertext: in the same way that a word processor allows the expression of any type of textual structure, so nanotechnology could allow, in principle, for a sort of "3-D printing" of any material structure. Nanotechnologies are essentially media technologies. Nanomedia is now a reality because digital technologies made possible the visualization and computational simulation of the behavior of atomic particles at the nano level. Nanomachines that can build any type of molecular structure by atomic manufacturing could also build perfect replicas of themselves. Obviously, such a powerful technology poses medical and ecological dangers inherent to atomic manipulation.
Although this type of concern has been present in the global debate about the social implications of nanotechnology, its full implications are not yet entirely understood. A general scientific consensus seems to exist, however, around the idea that molecules could become a new type of material alphabet which, theoretically, would make possible the reconfiguration of the physical structures of any type of matter through molecular manufacturing. Matter becomes digital through molecular communication. Although the uses given to the term nano-media in the context of cultural and social studies are merely metaphorical (the prefix nano is used by humanists as an allegorical reference to a combination of 'small' and 'contemporary'), once the technological and scientific realities of nanomedia present themselves as a new realm of mediation, populated with its own kind of molecular devices, it will no longer be possible to ignore their full range of implications. A complexifying media ecosystem calls for a more nuanced and interdisciplinary approach to media studies.

Conclusion

This article narrates the different uses of the term nanomedia as an illustration of the way in which disciplinarity determines the level of legitimacy or illegitimacy of an emerging term. We then presented another possible use of the term in the field of media studies, one that is more closely aligned with its scientific origins. The importance and relevance of this narrative is connected to the present challenges we face in the anthropocene. The reality of the anthropocene makes painfully evident the full extent of the impact our technologies have had on the present condition of our planet's ecosystems.
For as long as we refuse to engage directly with the technologies themselves, to speak the language of science and technology in order to fully understand their wider consequences and implications, our theories will be reduced to fancy metaphors and aesthetic explorations which circulate around the critical issues of our times without penetrating them. The level of interdisciplinarity required by the challenges of the anthropocene has to go beyond anthropocentrism. Traditional theories of media are anthropocentric: we seem to be willing to engage only with that which we are able to recognise and relate to. Going beyond anthropocentrism requires that we become familiar with interdisciplinary discussions and perspectives around common terminologies, so that we might reach a consensus about the use of a shared term. For scientists, nanomedia is an information and communication technology which is simultaneously a tool for material engineering. For media artists and theorists, nano-media is a cultural practice of active social interference and artistic exploration. However, neither of the two approaches is able to fully grasp the magnitude of such an inter- and transdisciplinary encounter: when communication becomes molecular engineering, what are the legitimate boundaries of media theory? If matter becomes not only a medium but also a language, what would be the conceptual tools needed to rethink our very understanding of mediation? Would this new media epistemology be considered legitimate or illegitimate? Be it legitimate or illegitimate, a new media theory must arise that challenges and overcomes the walls which separate science and culture, physics and semiotics, on the grounds that it is a transdisciplinary change in the inner workings of media itself which now becomes our vector of epistemological and empirical transformation.
A new media theory which not only speaks the language of molecular technologies but might also be translated into material programming is the only media theory equipped to handle the challenges of the anthropocene.

References

Abadal, Sergi, and Ian F. Akyildiz. "Bio-Inspired Synchronization for Nanocommunication Networks." Global Telecommunications Conference (GLOBECOM), 2011.
Borisenko, V. E., and S. Ossicini. What Is What in the Nanoworld: A Handbook on Nanoscience and Nanotechnology. Weinheim: Wiley-VCH, 2005.
Bourdieu, Pierre. "The Specificity of the Scientific Field and the Social Conditions of the Progress of Reason." Social Science Information 14 (Dec. 1975): 19-47.
---. La Distinction: Critique Sociale du Jugement. Paris: Editions de Minuit, 1979.
Delborne, Jason A. "Transgenes and Transgressions: Scientific Dissent as Heterogeneous Practice." Social Studies of Science 38 (2008): 509.
Deutsch, David. The Beginning of Infinity. London: Penguin, 2011.
Downing, John. "Nanomedia: 'Community' Media, 'Network' Media, 'Social Movement' Media: Why Do They Matter? And What's in a Name? Mitjans Comunitaris, Moviments Socials i Xarxes." InCom-UAB. Barcelona: Cidob, 15 March 2010.
Drexler, E.K. "Modular Molecular Composite Nanosystems." Metamodern 10 Nov. 2008.
Epstein, Steven. Impure Science: AIDS, Activism, and the Politics of Knowledge. Vol. 7. U of California P, 1996.
Hara, S., et al. "New Paradigms in Wireless Communication Systems." Wireless Personal Communications 37.3-4 (May 2006): 233-241.
Holister, P. "Nanotech: The Tiny Revolution." CMP Cientifica July 2002.
James, Daniel. Bastardising Technology as a Critical Mode of Cultural Practice. PhD Thesis. Wellington, New Zealand: Massey University, 2010.
Jensen, K., J. Weldon, H. Garcia, and A. Zetti. "Nanotube Radio." Nano Letters 7.11 (2007): 3508-3511.
Lee, C.H., S.W. Lee, and S.S. Lee. "A Nanoradio Utilizing the Mechanical Resonance of a Vertically Aligned Nanopillar Array." Nanoscale 6.4 (2014): 2087-93.
Maasen. Governing Future Technologies: Nanotechnology and the Rise of an Assessment Regime. Berlin: Springer, 2010. 121-4.
Milburn, Colin. "Digital Matters: Video Games and the Cultural Transcoding of Nanotechnology." Governing Future Technologies: Nanotechnology and the Rise of an Assessment Regime. Eds. Mario Kaiser, Monika Kurath, Sabine Maasen, and Christoph Rehmann-Sutter. Berlin: Springer, 2009.
Miller, T.R., T.D. Baird, C.M. Littlefield, G. Kofinas, F. Chapin III, and C.L. Redman. "Epistemological Pluralism: Reorganizing Interdisciplinary Research." Ecology and Society 13.2 (2008): 46.
National Nanotechnology Initiative (NNI). Big Things from a Tiny World. 2008.
Nowviskie, Bethany. "Digital Humanities in the Anthropocene." Nowviskie.org. 15 Sep. 2014.
Pajnik, Mojca, and John Downing. "Introduction: The Challenges of 'Nano-Media'." Alternative Media and the Politics of Resistance: Perspectives and Challenges. Eds. M. Pajnik and J. Downing. Ljubljana, Slovenia: Peace Institute, 2008. 7-16.
Qarehbaghi, Reza, Hao Jiang, and Bozena Kaminska. "Nano-Media: Multi-Channel Full Color Image with Embedded Covert Information Display." ACM SIGGRAPH 2014 Posters. New York: ACM, 2014.
Rand, Stephen C., Costa Soukolis, and Diederik Wiersma. "Localization, Multiple Scattering, and Lasing in Random Nanomedia." JOSA B 21.1 (2004): 98-98.
Service, Robert F. "TF10: Nanoradio." MIT Technology Review April 2008.
Shanken, Edward A. "Artists in Industry and the Academy: Collaborative Research, Interdisciplinary Scholarship and the Creation and Interpretation of Hybrid Forms." Leonardo 38.5 (Oct. 2005): 415-418.
St John, Graham. "Freak Media: Vibe Tribes, Sampledelic Outlaws and Israeli Psytrance." Continuum: Journal of Media and Cultural Studies 26.3 (2012): 437-447.
Subcommission on Quaternary Stratigraphy (SQS). "What Is the Anthropocene?" Quaternary.stratigraphy.org.
Thacker, Eugene. Biomedia. Minneapolis: University of Minnesota Press, 2004.
Toffoli, Tommaso, and Norman Margolus. "Programmable Matter: Concepts and Realization." Physica D 47 (1991): 263-272.
Vanderbeeken, Robrecht, Christel Stalpaert, Boris Debackere, and David Depestel. Bastard or Playmate? On Adapting Theatre, Mutating Media and the Contemporary Performing Arts. Amsterdam: Amsterdam University, 2012.
Wark, McKenzie. "Climate Science as Sensory Infrastructure." Extract from Molecular Red, forthcoming. The White Review 20 Sep. 2014.
Wilson, Matthew W. "Cyborg Geographies: Towards Hybrid Epistemologies." Gender, Place and Culture 16.5 (2009): 499-515.
27

Wilken, Rowan. "Walkie-Talkies, Wandering, and Sonic Intimacy." M/C Journal 22, No. 4 (14 Aug. 2019). http://dx.doi.org/10.5204/mcj.1581.

The full content of the source
Annotation:
Introduction

This short article examines contemporary artistic use of walkie-talkies across two projects: Saturday (2002) by Sabrina Raaf and Walk That Sound (2014) by Lukatoyboy. Drawing on Dominic Pettman’s notion of sonic intimacy, I argue that both artists incorporate walkie-talkies as part of their explorations of mediated wandering, and in ways that seek to capture sonic ambiances and intimacies. One thing that is striking about both these works is that they rethink what’s possible with walkie-talkies; both artists use them not just as low-tech, portable devices for one-to-one communication over distance, but also—and more strikingly—as (covert) recording equipment for capturing, while wandering, snippets of intimate conversation between passers-by and the “voice” of the surrounding environment. Both artworks strive to make the familiar strange. They prompt us to question our preconceived perceptions of, and affective engagements with, the people and places around us, to listen more attentively to the voices of others (and the “Other”), and to aurally inhabit in new ways the spaces and places we find ourselves in and routinely pass through.

The walkie-talkie is an established, simple communication device, consisting of a two-way radio transceiver with a speaker and microphone (in some cases, the speaker is also used as the microphone) and an antenna (Wikipedia). Walkie-talkies are half-duplex communication devices, meaning that they use a single radio channel: only one radio on the channel can transmit at a time, but many can listen; when a user wishes to talk, they must turn off the receiver and turn on the transmitter by pressing a push-to-talk button (Wikipedia). In some models, static—known as squelch—is produced each time the push-to-talk button is depressed. The push-to-talk button is a feature of both projects: in Saturday, it transforms the walkie-talkie into a cheap, portable recorder-transmitter.
In Walk That Sound, rapid fire exchanges of conversation using the push-to-talk button feature strongly.Interestingly, walkie-talkies were developed during World War Two. While they continue to be used within certain industrial settings, they are perhaps best known as a “quaint” household toy and “fun tool” (Smith). Early print ads for walkie-talkie toys marketed them as a form of both spyware for kids (with the Gabriel Toy Co. releasing a 007-themed walkie-talkie set) and as a teletechnology for communication over distance—“how thrilling to ‘speak through space!’”, states one ad (Statuv “New!”). What is noteworthy about these early ads is that they actively promote experimental use of walkie-talkies. For instance, a 1953 ad for Vibro-Matic “Space Commander” walkie-talkies casts them as media transmission devices, suggesting that, with them, one can send and receive “voice – songs – music” (Statuv “New!”). In addition, a 1962 ad for the Knight-Kit walkie-talkie imagines “you’ll find new uses for this exciting walkie-talkie every day” (Statuv “Details”). Resurgent interest in walkie-talkies has seen them also promoted more recently as intimate tools “for communication without asking permission to communicate” (“Nextel”); this is to say that they have been marketed as devices for synchronous or immediate communication that overcome the limits of asynchronous communication, such as texting, where there might be substantial delays between the sending of a message and receipt of a response. Within this context, it is not surprising that Snapchat and Instagram have also since added “walkie-talkie” features to their messaging services. 
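As an aside for readers unfamiliar with radio, the half-duplex channel logic described above (one shared frequency, any number of listeners, at most one keyed transmitter, and a transmitter that cannot hear while keyed) can be sketched as a small toy model. The class and method names below are purely illustrative and are not drawn from any of the projects discussed:

```python
# Toy model (illustrative only) of a half-duplex walkie-talkie channel:
# many radios share one channel, all can listen, but only the radio
# currently holding the push-to-talk key may transmit.

class Channel:
    """A single shared radio channel."""
    def __init__(self):
        self.radios = []
        self.keyed = None  # radio currently holding push-to-talk, if any

class WalkieTalkie:
    def __init__(self, name, channel):
        self.name = name
        self.channel = channel
        self.heard = []  # transmissions received while listening
        channel.radios.append(self)

    def press_talk(self):
        """Key the transmitter; fails if another radio holds the channel."""
        if self.channel.keyed is None:
            self.channel.keyed = self
            return True
        return False

    def say(self, message):
        """Transmit to every other radio on the channel (if keyed)."""
        if self.channel.keyed is not self:
            return False
        for radio in self.channel.radios:
            if radio is not self:  # while transmitting, we cannot receive
                radio.heard.append((self.name, message))
        return True

    def release_talk(self):
        """Release the channel so others may transmit."""
        if self.channel.keyed is self:
            self.channel.keyed = None

ch = Channel()
a, b, c = (WalkieTalkie(n, ch) for n in "ABC")
a.press_talk()
blocked = b.press_talk()  # False: only one transmitter at a time
a.say("over")             # heard by B and C, but not by A itself
a.release_talk()
```

The model captures why covert recording is possible at all: anyone tuned to the channel receives every transmission, whether or not the transmitter intended them as an addressee.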
The Nextel byline, emphasising “without asking permission”, also speaks to the possibilities of using walkie-talkies as rudimentary forms of spyware.

Within art practice that explores mediated forms of wandering—that is, walking while using media and various “remote transmission technologies” (Duclos 233)—walkie-talkies hold appeal for a number of reasons, including their particular aesthetic qualities, such as the crackling or static sound (squelch) that one encounters when using them; their portability; their affordability; and the fact that, while they can be operated on multiple channels, they tend to be regarded primarily as devices that permit two-way, one-to-one (and therefore intimate, if not secure) remote communication. As we will see below, however, contemporary artists, much like the early advertisers mentioned above, have also been very attentive to the device’s experimental possibilities. Perhaps the best known (if possibly apocryphal) example of artistic use of walkie-talkies is by the Situationist International as part of their explorations in urban wandering (a revolutionary strategy called dérive). In the Situationist text from 1960, Die Welt als Labyrinth (Anon.), there is a detailed account of how walkie-talkies were to form part of a planned dérive, which was organised by the Dutch section of the Situationist International, through the city of Amsterdam, but which never went ahead: Two groups, each containing three situationists, would dérive for three days, on foot or eventually by boat (sleeping in hotels along the way) without leaving the center of Amsterdam.
By means of the walkie-talkies with which they would be equipped, these groups would remain in contact, with each other, if possible, and in any case with the radio-truck of the cartographic team, from where the director of the dérive—in this case Constant [Nieuwenhuys]—moving around so as to maintain contact, would define their routes and sometimes give instructions (it was also the director of the dérive’s responsibility to prepare experiments at certain locations and secretly arranged events.) (Anon.) This proposed dérive formed part of Situationist experiments in unitary urbanism, a process that consisted of “making different parts of the city communicate with one another.” Their ambition was to create new situations informed by, among other things, encounters and atmospheres that were registered through dérive in order to reconnect parts of the city that were separated spatially (Lefebvre quoted in Lefebvre and Ross 73). In an interview with Kristin Ross, Henri Lefebvre insists that the Situationists “did have their experiments; I didn’t participate. They used all kinds of means of communication—I don’t know when exactly they were using walkie-talkies. But I know they were used in Amsterdam and in Strasbourg” (Lefebvre quoted in Lefebvre and Ross 73). However, as Rebecca Duclos points out, such use “is, in fact, not well documented”, and “none of the more well-known reports on situationist activity […] specifically mentions the use of walkie-talkies within their descriptive narratives” (Duclos 233). In the early 2000s, walkie-talkies also figured prominently, alongside other media devices, in at least two location-based gaming projects by renowned British art collective Blast Theory, Can You See Me Now? (2001) and You Get Me (2008). 
In the first of these projects, participants in the game (“online players”) competed against members of Blast Theory (“runners”), tracking them through city streets via a GPS-enabled handheld computer that runners carried with them. The goal for online players was to move an avatar they created through a virtual map of the city as multiple runners “pursued their avatar’s geographical coordinates in real-time” (Leorke). As Dale Leorke explains, “Players could see the locations of the runners and other players and exchange text messages with other players” (Leorke 27), and runners could “read players’ messages and communicate directly with each other through a walkie-talkie” (28). An audio stream from these walkie-talkie conversations allowed players to eavesdrop on their pursuers (Blast Theory, Can You See Me Now?).You Get Me was similarly structured, with online players and “runners” (eight teenagers who worked with Blast Theory on the game). Remotely situated online players began the game by listening to the “personal geography” of the runners over a walkie-talkie stream (Blast Theory, You Get Me). They then selected one runner, and tracked them down by navigating their own avatar, without being caught, through a virtual version of Mile End Park in London, in pursuit of their chosen runner who was moving about the actual Mile End Park. Once their chosen runner was contacted, the player had to respond to a question that the runner posed to them. If the runner was satisfied with the player’s answer, conversation switched to “the privacy of a mobile phone” in order to converse further; if not, the player was thrown back into the game (Blast Theory, You Get Me). A key aim of Blast Theory’s work, as I have argued elsewhere (Wilken), is the fostering of interactions and fleeting intimacies between relative and complete strangers. 
The walkie-talkie is a key tool in both the aforementioned Blast Theory projects for facilitating these interactions and intimacies.

Beyond these well-known examples, walkie-talkies have been employed in productive and exploratory ways by other artists. The focus in this article is on two specific projects: the first by US-based sound artist Sabrina Raaf, called Saturday (2002), and the second by Serbian sound designer Lukatoyboy (Luka Ivanović), titled Walk That Sound (2014).

Sonic Intimacies

The concept that gives shape and direction to the analysis of the art projects by Raaf and Lukatoyboy and their use of walkie-talkies is that of sonic intimacy. This is a concept of emerging critical interest across media and sound studies and geography (see, for example, James; Pettman; Gallagher and Prior). Sonic intimacy, as Dominic Pettman explains, is composed of two simultaneous yet opposing orientations. On the one hand, sonic intimacy involves a “turning inward, away from the wider world, to more private and personal experiences and relationships” (79). On the other hand, it also involves a turning outward, to seek and heed “the voice of the world” (79)—or what Pettman refers to as the “vox mundi” (66). Pettman conceives of the “vox mundi” as an “ecological voice”, whereby “all manner of creatures, agents, entities, objects, and phenomena” (79) have the opportunity to speak to us, if only we were prepared to listen to our surroundings in new and different ways. In a later passage, he also refers to the “vox mundi” as a “carrier of potentially enlightening alterity” (83). Voices, Pettman writes, “transgress the neat divisions we make between ‘us’ and ‘them’, at all scales and junctures” (6).
Thus, Pettman’s suggestion is that “by listening to the ‘voices’ that lie dormant in the surrounding world […] we may in turn foster a more sustainable relationship with [the] local matrix of specific existences” (85), be they human or otherwise.

This formulation of sonic intimacy provides a productive conceptual frame for thinking through Raaf’s and Lukatoyboy’s use of walkie-talkies. The contention in this article is that these two projects are striking for the way that they both use walkie-talkies to explore, simultaneously, this double articulation or dual orientation of sonic intimacy—a turning inwards to capture more private and personal experiences and conversations, and a turning outwards to capture the vox mundi. Employing Pettman’s notion of sonic intimacy as a conceptual frame, I trace below the different ways that these two projects incorporate walkie-talkies in order to develop mediated forms of wandering that seek to capture place-based sonic ambiances and sonic intimacies.

Sabrina Raaf, Saturday (2002)

US sound artist Sabrina Raaf’s Saturday (2002) is a sound installation built from recordings of “stolen conversations” that Raaf gathered over many Saturdays in Humboldt Park, Chicago. Raaf’s work harks back to the early marketing of walkie-talkie toys as spyware. In Raaf’s hands, this device is used not for engaging in intimate one-to-one conversation, but for listening in on, and capturing, the intimate conversations of others. In other words, she uses this device, as the Nextel slogan goes, for “communication without asking permission to communicate” (“Nextel”). Raaf’s inspiration for the piece was twofold. First, she has noted that “with the overuse of radio frequency bands for wireless communications, there comes the increased occurrence of crossed lines where a private conversation becomes accidentally shared” (Raaf).
Reminiscent of Francis Ford Coppola’s film The Conversation (1974), in which surveillance expert Harry Caul (Gene Hackman) records the conversation of a couple as they walk through crowded Union Square in San Francisco, Raaf used a combination of walkie-talkies, CB radios, and “various other forms of consumer spy […] technology in order to actively harvest such communication leaks” (Raaf). The second source of inspiration was noticing the “sheer quantity of non-phone, low tech, radio transmissions that were constantly being sent around [the] neighbourhood”, transmissions that were easily intercepted. These conversations were eclectic in composition and character: The transmissions included communications between gang members on street corners nearby and group conversations between friends talking about changes in the neighbourhood and their families. There were raw, intimate conversations and often even late night sex talk between potential lovers. (Raaf)

What struck Raaf about these conversations, these transmissions, was that there was “a furtive quality” to most of them, and “a particular daringness to their tone”. During her Saturday wanderings, Raaf complemented her recordings of stolen snippets of conversation with recordings of the “voice” of the surrounding neighbourhood—“the women singing out their windows to their radios, the young men in their low rider cars circling the block, the children, the ice cream carts, etc. These are the sounds that are mixed into the piece” (Raaf).

Audience engagement with Saturday involves a kind of austere intimacy of its own that seems befitting of a surveillance-inspired sonic portrait of urban and private life. The piece is accessed via an interactive glove. This glove is white in colour and about the size of a large gardening glove, with a Velcro strap that fastens across the hand, like a cycling glove.
The glove, which only has coverings for the thumb and first two fingers (it is missing the ring and little fingers), is wired into and rests on top of a roughly A4-sized white rectangular box. This box, which is mounted onto the wall of an all-white gallery space at the short end, serves as a small shelf. The displayed glove is illuminated by a discrete, bent-arm desk lamp that protrudes from the shelf near the gallery wall. Above the shelf are a series of wall-mounted colour images that relate to the project. In order to hear the soundtrack of Saturday, gallery visitors approach the shelf, put on the glove, and “magically just press their fingertips to their forehead [to] hear the sound without the use of their ears” (Raaf). The glove, Raaf explains, “is outfitted with leading edge audio electronic devices called ‘bone transducers’ […]. These transducers transmit sound in a very unusual fashion. They translate sound into vibration patterns which resonate through bone” (Raaf).

This technique, Raaf explains, “permits a new way of listening”: The user places their fingers to their forehead—in a gesture akin to Rodin’s The Thinker or of a clairvoyant—in order to tap into the lives of strangers. Pressing different combinations of fingers to the temple yield plural viewpoints and group conversations. These sounds are literally mixed in the bones of the listener. (Raaf)

The result is a (literally and figuratively) touching sonic portrait of Humboldt Park, its residents, and the “voice” of its surrounding neighbourhoods. Through the unique technosomatic (Richardson) apparatus—combinations of gestures that convey the soundscape directly through the bones and body—those engaging with Saturday get to hear voices in/of/around Humboldt Park. It is a portrait that combines sonic intimacy in the two forms described earlier in this article.
In its inward-focused form, the gallery visitor-listener is positioned as a voyeur of sorts, listening in on stolen snippets of private and personal relationships, experiences, and interactions. And, in its outward-focused form, the gallery visitor-listener encounters a soundscape in which an array of agents, entities, and objects are also given a voice. Additional work performed by this piece, it seems to me, is to be found in the intermingling of these two forms of sonic intimacy—the personal and the environmental—and the way that they prompt reflection on mediation, place, urban life, others, and intimacy. That is to say that, beyond its particular sonic portrait of Humboldt Park, Saturday works in “clearing some conceptual space” in the mind of the departing gallery visitor such that they might “listen for, if not precisely to, the collective, polyphonic ‘voice of the world’” (Pettman 6) as they go about their day-to-day lives.

Lukatoyboy, Walk That Sound (2014)

The second project, Walk That Sound, by Serbian sound artist Lukatoyboy, was completed for the 2014 CTM festival. CTM is an annual festival event that is staged in Berlin and dedicated to “adventurous music and art” (CTM Festival, “About”). A key project within the festival is CTM Radio Lab. The Lab supports works, commissioned by CTM Festival and Deutschlandradio Kultur – Hörspiel/Klangkunst (among other partnering organisations), that seek to pair and explore the “specific artistic possibilities of radio with the potentials of live performance or installation” (CTM Festival, “Projects”). Lukatoyboy’s Walk That Sound was one of two commissioned pieces for the 2014 CTM Radio Lab. The project used the “commonplace yet often forgotten walkie-talkie” (CTM Festival, “Projects”) to create a moving urban sound portrait in the area around the Kottbusser Tor U-Bahn station in Berlin-Kreuzberg. Walk That Sound recruited participants—“mobile scouts”—to rove around the Kottbusser Tor area (CTM Festival, “Projects”).
Armed with walkie-talkies, and playing with “the array of available and free frequencies, and the almost unlimited amount of users that can interact over these different channels”, the project captured each participant’s walkie-talkie dispatches (CTM Festival, “Projects”). The resultant recording of Walk That Sound—which was aired on Deutschlandradio (see Lukatoyboy), part of a long tradition of transmitting experimental music and sound art on German radio (Cory)—forms an eclectic soundscape.

The work juxtaposes snippets of dialogue shared between the mobile scouts, overheard mobile phone conversations, and moments of relative quietude, where the subdued soundtrack is formed by the ambient sounds—the “voice”—of the Kottbusser Tor area. This voice includes distant traffic, the distinctive auditory ticking of pedestrian lights, and moments of tumult and agitation, such as the sounds of construction work, car horns, emergency services vehicle sirens, a bottle bouncing on the pavement, and various other repetitive yet difficult-to-identify industrial sounds. This voice trails off towards the end of the recording into extended walkie-talkie-produced static or squelch. The topics covered within the “crackling dialogues” (CTM Festival, “Projects”) of the mobile scouts ranged widely. There were banal observations (“I just stepped on a used tissue”; “people are crossing the street”; “there are 150 trains”)—wonderings that bear strong similarities with French writer Georges Perec’s well-known experimental descriptions of everyday Parisian life in the 1970s (Perec “An Attempt”).
There were also intimate, confiding, flirtatious remarks (“Do you want to come to Turkey with me?”), as well as a number of playfully paranoid observations and quips (“I like to lie”; “I can see you”; “do you feel like you are being recorded?”; “I’m being followed”) that seem to speak to the fraught history of Berlin in particular as well as the complicated character of urban life in general—as Pettman asks, “what does ‘together’ signify in a socioeconomic system so efficient in producing alienation and isolation?” (92).

In sum, Walk That Sound is a strangely moving exploration of sonic intimacy, one that shifts between many different registers and points of focus—much like urban wandering itself. As a work, it is variously funny, smart, paranoid, intimate, expansive, difficult to decipher, and, at times, even difficult to listen to. Pettman argues that, “thanks in large part to the industrialization of the human ear […], we have lost the capacity to hear the vox mundi, which is […] the sum total of cacophonous, heterogeneous, incommensurate, and unsynthesizable sounds of the postnatural world” (8). Walk That Sound functions almost like a response to this dilemma. One comes away from listening to it with a heightened awareness of, appreciation for, and aural connection to the rich messiness of the polyphonic contemporary urban vox mundi.

Conclusion

The argument of this article is that Sabrina Raaf’s Saturday and Lukatoyboy’s Walk That Sound are two projects that both incorporate walkie-talkies in order to develop mediated forms of wandering that seek to capture place-based sonic ambiances and sonic intimacies. Drawing on Pettman’s notion of “sonic intimacy”, examination of these projects has opened consideration around voice, analogue technology, and what Nick Couldry refers to as “an obligation to listen” (Couldry 580).
In order to be heard, Pettman remarks, “in order to be considered a voice at all”, and therefore as “something worth heeding”, the vox mundi “must arrive intimately, or else it is experienced as noise or static” (Pettman 83). In both the projects discussed here—Saturday and Walk That Sound—the walkie-talkie provides this means of “intimate arrival”. As half-duplex communication devices, walkie-talkies have always fulfilled a double function: communicating and listening. This dual functionality is exploited in new ways by Raaf and Lukatoyboy. In their projects, both artists turn the microphone outwards, such that the walkie-talkie becomes not just a device for communicating while in the field, but also—and more strikingly—a field recording device. The result is that this simple, “playful” communication device is utilised in these two projects in two ways: on the one hand, as a “carrier of potentially enlightening alterity” (Pettman 83), a means of encouraging “potential encounters” (89) with strangers who have been thrown together and who cross paths, and, on the other hand, as a means of fostering “an environmental awareness” (89) of the world around us. In developing these prompts, Raaf and Lukatoyboy build potential bridges between Pettman’s work on sonic intimacy, their own work, and the work of other experimental artists. For instance, in relation to potential encounters, there are clear points of connection with Blast Theory, a group who, as noted earlier, have utilised walkie-talkies and sound-based and other media technologies to explore issues around urban encounters with strangers that promote reflection on ideas and experiences of otherness and difference (see Wilken)—issues that are also implicit in the two works examined.
In relation to environmental awareness, their work—as well as Pettman’s calls for greater sonic intimacy—brings renewed urgency to Georges Perec’s encouragement to “question the habitual” and to account for, and listen carefully to, “the common, the ordinary, the infraordinary, the background noise” (Perec “Approaches” 210). Walkie-talkies, for Raaf and Lukatoyboy, when reimagined as field recording devices as much as remote transmission technologies, thus “allow new forms of listening, which in turn afford new forms of being together” (Pettman 92), new forms of being in the world, and new forms of sonic intimacy. Both these artworks engage with, and explore, what’s at stake in a politics and ethics of listening. Pettman prompts us, as urban dweller-wanderers, to think about how we might “attend to the act of listening itself, rather than to a specific sound” (Pettman 1). His questioning, as this article has explored, is answered by the works of Raaf and Lukatoyboy in effective style and technique, setting up opportunities for aural attentiveness and experiential learning. However, it is up to us whether we are prepared to listen carefully and to open ourselves to such intimate sonic contact with others and with the environments in which we live.

References

Anon. “Die Welt als Labyrinth.” Internationale Situationiste 4 (Jan. 1960). International Situationist Online, 19 June 2019 <https://www.cddc.vt.edu/sionline/si/diewelt.html>.
Blast Theory. “Can You See Me Now?” Blast Theory, 19 June 2019 <https://www.blasttheory.co.uk/projects/can-you-see-me-now/>.
———. “You Get Me.” Blast Theory, 19 June 2019 <https://www.blasttheory.co.uk/projects/you-get-me/>.
Cory, Mark E. “Soundplay: The Polyphonous Tradition of German Radio Art.” Wireless Imagination: Sound, Radio, and the Avant-garde. Eds. Douglas Kahn and Gregory Whitehead. Cambridge, MA: MIT P, 1992. 331–371.
Couldry, Nick. “Rethinking the Politics of Voice.” Continuum 23.4 (2009): 579–582.
CTM Festival. “About.” CTM Festival, 2019. 19 June 2019 <https://www.ctm-festival.de/about/ctm-festival/>.
———. “Projects – CTM Radio Lab.” CTM Festival, 2019. 19 June 2019 <https://www.ctm-festival.de/projects/ctm-radio-lab/>.
Duclos, Rebecca. “Reconnaissance/Méconnaissance: The Work of Janet Cardiff and George Bures Miller.” Articulate Objects: Voice, Sculpture and Performance. Eds. Aura Satz and Jon Wood. Bern: Peter Lang, 2009. 221–246.
Gallagher, Michael, and Jonathan Prior. “Sonic Geographies: Exploring Phonographic Methods.” Progress in Human Geography 38.2 (2014): 267–284.
James, Malcom. Sonic Intimacy: The Study of Sound. London: Bloomsbury, forthcoming.
Lefebvre, Henri, and Kristin Ross. “Lefebvre on the Situationists: An Interview.” October 79 (Winter 1997): 69–83.
Leorke, Dale. Location-Based Gaming: Play in Public Space. Singapore: Palgrave Macmillan, 2019.
Lukatoyboy. “Walk That Sound – Deutschlandradiokultur Klangkunst Broadcast 14.02.2014.” SoundCloud. 19 June 2019 <https://soundcloud.com/lukatoyboy/walk-that-sound-deutschlandradiokultur-broadcast-14022014>.
“Nextel: Couple. Walkie Talkies Are Good for Something More.” AdAge, 6 June 2012. 18 July 2019 <https://adage.com/creativity/work/couple/27993>.
Perec, Georges. An Attempt at Exhausting a Place in Paris. Trans. Marc Lowenthal. Cambridge, MA: Wakefield Press, 2010.
———. “Approaches to What?” Species of Spaces and Other Pieces. Rev. ed. Ed. and trans. John Sturrock. Harmondsworth, Middlesex: Penguin, 1999. 209–211.
Pettman, Dominic. Sonic Intimacy: Voice, Species, Technics (Or, How to Listen to the World). Stanford, CA: Stanford UP, 2017.
Raaf, Sabrina. “Saturday.” Sabrina Raaf :: New Media Artist, 2002. 19 June 2019 <http://raaf.org/projects.php?pcat=2&proj=10>.
Richardson, Ingrid. “Mobile Technosoma: Some Phenomenological Reflections on Itinerant Media Devices.” The Fibreculture Journal 6 (2005). <http://six.fibreculturejournal.org/fcj-032-mobile-technosoma-some-phenomenological-reflections-on-itinerant-media-devices/>.
Smith, Ernie. “Roger That: A Short History of the Walkie Talkie.” Vice, 23 Sep. 2017. 19 June 2019 <https://www.vice.com/en_us/article/vb7vk4/roger-that-a-short-history-of-the-walkie-talkie>.
Statuv. “Details about Allied Radio Knight-Kit C-100 Walkie Talkie CB Radio Vtg Print Ad.” Statuv, 4 Jan. 2016. 18 July 2019 <https://statuv.com/media/74802043788985511>.
———. “New! 1953 ‘Space Commander’ Vibro-Matic Walkie-Talkies.” Statuv, 4 Jan. 2016. 18 July 2019 <https://statuv.com/media/74802043788985539>.
Wikipedia. “Walkie-Talkie.” Wikipedia, 3 July 2019. 18 July 2019 <https://en.wikipedia.org/wiki/Walkie-talkie>.
Wilken, Rowan. “Proximity and Alienation: Narratives of City, Self, and Other in the Locative Games of Blast Theory.” The Mobile Story: Narrative Practices with Locative Technologies. Ed. Jason Farman. New York: Routledge, 2014. 175–191.
APA, Harvard, Vancouver, ISO, and other citation styles
28

Burns, Alex. „Select Issues with New Media Theories of Citizen Journalism“. M/C Journal 10, Nr. 6 (01.04.2008). http://dx.doi.org/10.5204/mcj.2723.

Full text of the source
Annotation:
“Journalists have to begin a new type of journalism, sometimes being the guide on the side of the civic conversation as well as the filter and gatekeeper.” (Kolodzy 218) “In many respects, citizen journalism is simply public journalism removed from the journalism profession.” (Barlow 181) 1. Citizen Journalism — The Latest Innovation? New Media theorists such as Dan Gillmor, Henry Jenkins, Jay Rosen and Jeff Howe have recently touted Citizen Journalism (CJ) as the latest innovation in 21st century journalism. “Participatory journalism” and “user-driven journalism” are other terms to describe CJ, which its proponents argue is a disruptive innovation (Christensen) to the agenda-setting media institutions, news values and “objective” reportage. In this essay I offer a “contrarian” view, informed by two perspectives: (1) a three-stage model of theory-building (Carlile & Christensen) to evaluate the claims made about CJ; and (2) self-reflexive research insights (Etherington) from editing the US-based news site Disinformation between November 1999 and February 2008. New media theories can potentially create “cognitive dissonance” (Festinger) when their explanations of CJ practices are compared with what actually happens (Feyerabend). First I summarise Carlile & Christensen’s model and the dangers of “bad theory” (Ghoshal). Next I consider several problems in new media theories about CJ: the notion of ‘citizen’, new media populism, parallels in event-driven and civic journalism, and mergers and acquisitions. Two ‘self-reflexive’ issues are considered: ‘pro-ams’ or ‘professional amateurs’ as a challenge to professional journalists, and CJ’s deployment in new media operations and production environments. Finally, some exploratory questions are offered for future researchers. 2. An Evaluative Framework for New Media Theories on Citizen Journalism Paul Carlile and Clayton M. Christensen’s model offers one framework with which to evaluate new media theories on CJ. 
This framework is used below to highlight select issues and gaps in CJ’s current frameworks and theories. Carlile & Christensen suggest that robust theory-building emerges via three stages: Descriptive, Categorisation and Normative (Carlile & Christensen). There are three sub-stages in Descriptive theory-building; namely, the observation of phenomena, inductive classification into schemas and taxonomies, and correlative relationships to develop models (Carlile & Christensen 2-5). Once causation is established, Normative theory evolves through deductive logic which is subject to Kuhnian paradigm shifts and Popperian falsifiability (Carlile & Christensen 6). Its proponents situate CJ as a Categorisation or new journalism agenda that poses a Normative challenge and a Kuhnian paradigm shift to traditional journalism. Existing CJ theories jump from the Descriptive phase of observations like “smart mobs” in Japanese youth subcultures (Rheingold) to broad Categorisation claims, such as positioning IndyMedia, blogs and wiki publishing systems as new media alternatives to traditional media. CJ theories then underpin normative beliefs, values and worldviews. Correlative relationships are also used to differentiate CJ from the demand side of microeconomic analysis and from the top-down editorial models of traditional media outlets, and to adopt a vanguard stance. To support this, CJ proponents cite research on emergent collective behaviour such as the “wisdom of crowds” hypothesis (Surowiecki) or peer-to-peer network “swarms” (Pesce) to provide scientific justification for their Normative theories. However, further evaluative research is needed for three reasons: the emergent collective behaviour hypothesis may not actually inform CJ practices, existing theories may have “correlation not cause” errors, and the link may be due to citation network effects between CJ theorists.
Collectively, this research base also frames CJ as an “ought to” Categorisation and then proceeds to Normative theory-building (Carlile & Christensen 7). However, I argue below that this Categorisation may be premature: its observations and correlative relationships might reinforce a ‘weak’ Normative theory with limited generalisation. CJ proponents seem to imply that it can be applied anywhere and under any condition—a “statement of causality” that almost makes it a fad (Carlile & Christensen 8). CJ that relies on Classification and Normative claims will be problematic without a strong grounding in Descriptive observation. To understand what’s potentially at stake for CJ’s future, consider the parallel debate about curricula renewal for the Masters of Business Administration in the wake of high-profile corporate collapses such as Enron, Worldcom, HIH and OneTel. The MBA evolved as a sociological and institutional construct to justify management as a profession that is codified, differentiated and has entry barriers (Khurana). This process might partly explain the pushback that some media professionals have against CJ as one alternative. MBA programs faced criticism if they had student cohorts with little business know-how or experiential learning (Mintzberg). Enron’s collapse illustrated the ethical dilemmas and unintended consequences that occurred when “bad theories” were implemented (Ghoshal). Professional journalists are aware of this: MBA-educated managers challenged the “craft” tradition in the early 1980s (Underwood). This meant that journalism’s ‘self-image’ (Morgan; Smith) is intertwined with managerial anxieties about media conglomerates in highly competitive markets. Ironically, as noted below, Citizen Journalists who adopt a vanguard position vis-à-vis media professionals step into a more complex game with other players.
However, current theories have a naïve idealism about CJ’s promise of normative social change in the face of Machiavellian agency in business, the media and politics. 3. Citizen Who? Who is the “citizen” in CJ? What is their self-awareness as a political agent? CJ proponents who use the ‘self-image’ of ‘citizen’ draw on observations from the participatory vision of open source software, peer-to-peer networks, and case studies such as Howard Dean’s 2004 bid for the Democratic Party nomination in the US Presidential election campaign (Trippi). Recent theorists note Alexander Hamilton’s tradition of civic activism (Barlow 178), which links contemporary bloggers with the Federalist Papers and early newspaper pamphlets. One unsurfaced assumption in these observations and correlations is that most bloggers will adopt a coherent political philosophy as informed citizens: a variation on Lockean utilitarianism, Rawlsian liberalism or Nader consumer activism. To date there is little discussion about how political philosophy could deepen CJ’s ‘self-image’: how to critically evaluate sources, audit and investigation processes, or strategies to deal with elites, deterrence and power. For example, although bloggers kept Valerie Plame’s ‘outing’ as a covert intelligence operative highly visible in the issues-attention cycle, it was agenda-setting media like The New York Times that the Bush Administration targeted to silence (Pearlstine). To be viable, CJ needs to evolve beyond a new media populism, perhaps into a constructivist model of agency, norms and social change (Finnemore). 4. Citizen Journalism as New Media Populism Several “precursor trends” foreshadowed CJ, notably the mid-1990s interest in “cool-hunting” by new media analysts and subculture marketeers (Gibson; Gladwell). Whilst this audience focus waned with the 1995-2000 dotcom bubble, it resurfaced in CJ and publisher Tim O’Reilly’s Web 2.0 vision.
Thus, CJ might be viewed as new media populism that has flourished with the Web 2.0 boom. Yet if the boom becomes a macroeconomic bubble (Gross; Spar) then CJ could be written off as a “silver bullet” that ultimately failed to deliver on its promises (Brooks, Jr.). The reputations of uncritical proponents who adopted a “true believer” stance would also be damaged (Hoffer). This risk is evident if CJ is compared with a parallel trend that shares its audience focus and populist view: day traders and technical analysts who speculate on financial markets. This parallel trend provides an alternative discipline in which the populism surfaced in an earlier form (Carlile & Christensen 12). Fidelity’s Peter Lynch argues that stock pickers can use their Main Street knowledge to beat Wall Street by exploiting information asymmetries (Lynch & Rothchild). Yet Lynch’s examples came from the mid-1970s to early 1980s, when indexed mutual fund strategies worked, before deregulation and macroeconomic volatility. A change in the Web 2.0 boom might similarly trigger a reconsideration of Citizen Journalism. Hedge fund maven Victor Niederhoffer contends that investors who rely on technical analysis are practicing a Comtean religion (Niederhoffer & Kenner 72-74), unlike Efficient Market Hypothesis traders who use statistical arbitrage to deal with ‘random walks’ or Behavioural Finance experts who build on Amos Tversky and Daniel Kahneman’s Prospect Theory (Kahneman & Tversky). Niederhoffer’s deeper point is that technical analysts’ belief that the “trend is your friend” is no match for the other schools, despite a mini-publishing industry and computer trading systems. There are also ontological and epistemological differences between the schools. Similarly, CJ proponents who adopt a ‘Professional Amateur’ or ‘Pro-Am’ stance (Leadbeater & Miller) may face a similar gulf when making comparisons with professional journalists and the production environments in media organisations.
CJ also thrives as new media populism because of institutional vested interests. When media conglomerates cut back on cadetships and internships, CJ might fill the market demand as one alternative. New media programs at New York University and others can use CJ to differentiate themselves from “hyperlocal” competitors (Christensen; Slywotzky; Christensen, Curtis & Horn). This transforms CJ from new media populism to new media institution. 5. Parallels: Event-driven & Civic Journalism For new media programs, CJ builds on two earlier traditions: the Event-driven journalism of crises like the 1991 Gulf War (Wark) and the Civic Journalism school that emerged amid the social upheavals of the 1960s. Civic Journalism’s awareness of minorities and social issues provides the character ethic and political philosophy for many Citizen Journalists. Jay Rosen and others suggest that CJ is the next-generation heir to Civic Journalism, tracing a thread from the 1968 Chicago Democratic Convention to IndyMedia’s coverage of the 1999 “Battle in Seattle” (Rosen). Rosen’s observation could yield an interesting historiography or genealogy. Events such as the Southeast Asian tsunami on 26 December 2004 or Al Qaeda’s London bombings on 7 July 2005 are cited as examples of CJ as event-driven journalism and “pro-am collaboration” (Kolodzy 229-230). Having covered these events and Al Qaeda’s attacks on 11 September 2001, I have a slightly different view: this was more a variation on “first responder” status and handicam video footage that journalists have sourced for the past three decades when covering major disasters. This different view means that the “salience of categories” used to justify CJ and “pro-am collaboration” in these events does not completely hold. Furthermore, when Citizen Journalism proponents tout Flickr and Wikipedia as models of real-time media they are building on a broader phenomenon that includes CNN’s Gulf War coverage and Bloomberg’s dominance of financial news (Loomis). 6.
The Mergers & Acquisitions Scenario CJ proponents often express anxieties about the resilience of their outlets in the face of predatory venture capital firms who initiate Mergers & Acquisitions (M&A) activities. Ironically, these venture capital firms have core competencies and expertise in the event-driven infrastructure and real-time media that CJ aspires to. Sequoia Capital and other venture capital firms have evaluative frameworks that likely surpass Carlile & Christensen in sophistication, and they exploit parallels, information asymmetries and market populism. Furthermore, although venture capital firms such as Union Square Ventures have funded Web 2.0 firms, they are absent from the explanations of some theorists, whose examples of Citizen Journalism and Web 2.0 success may be the result of survivorship bias. Thus, the venture capital market remains an untapped data source for researchers who want to evaluate the impact of CJ outlets and institutions. The M&A scenario further problematises CJ in several ways. First, CJ is framed as “oppositional” to traditional media, yet this may be used as a stratagem in a game theory framework with multiple stakeholders. Drexel Burnham Lambert’s financier Michael Milken used market populism to sell ‘high-yield’ or ‘junk’ bonds to investors whilst disrupting the Wall Street establishment in the late 1980s (Curtis), and CJ could fulfil a similar tactical purpose. Second, the M&A goal of some Web 2.0 firms could undermine the participatory goals of a site’s community if post-merger integration fails. Jason Calacanis’s sale of Weblogs, Inc. to America Online in 2005 and MSNBC’s acquisition of Newsvine on 5 October 2007 (Newsvine) might be success stories. However, this raises issues of digital “property rights” if you contribute to a community that is then sold in an M&A transaction—an outcome closer to business process outsourcing.
Third, media “buzz” can create an unrealistic vision when a CJ site fails to grow beyond its start-up phase. Backfence.com’s demise as a “hyperlocal” initiative (Caverly) is one cautionary event that recalls the 2000 dotcom crash. The M&A scenarios outlined above are market dystopias for CJ purists. The major lesson for CJ proponents is to include other market players in hypotheses about causation and correlation factors. 7. ‘Pro-Ams’ & Professional Journalism’s Crisis CJ emerged during a period when Professional Journalism faced a major crisis of ‘self-image’. The Demos report The Pro-Am Revolution (Leadbeater & Miller) popularised the notion of ‘professional amateurs’ which some CJ theorists adopt to strengthen their categorisation. In turn, this triggers a response from cultural theorists who fear bloggers are new media’s barbarians (Keen). I concede Leadbeater and Miller have identified an important category. However, how some CJ theorists then generalise from ‘Pro-Ams’ illustrates the danger of ‘weak’ theory referred to above. Leadbeater and Miller’s categorisation does not really include a counter-view on the strengths of professionals, as illustrated in humanistic consulting (Block), professional service firms (Maister; Maister, Green & Galford), and software development (McConnell). The signs of professionalism these authors mention include a commitment to learning and communal verification, mastery of a discipline and domain application, awareness of methodology creation, participation in mentoring, and cultivation of ethical awareness. Two key differences are discernment and quality of attention, as illustrated in how the legendary Hollywood film editor Walter Murch used Apple’s Final Cut Pro software to edit the 2003 film Cold Mountain (Koppelman). ‘Pro-Ams’ might not aspire to these criteria but Citizen Journalists shouldn’t throw out these standards, either. 
Doing so would be making the same mistake of overconfidence that technical analysts make against statistical arbitrageurs. Key processes—fact-checking, sub-editing and editorial decision-making—are invisible to the end-user, even if traceable in a blog or wiki publishing system, because of the judgments involved. One post-mortem insight from Assignment Zero was that these processes were vital to create the climate of authenticity and trust to sustain a Citizen Journalist community (Howe). CJ’s trouble with “objectivity” might also overlook some complexities, including the similarity of many bloggers to “noise traders” in financial markets and to op-ed columnists. Methodologies and reportage practices have evolved to deal with the objections that CJ proponents raise, from New Journalism’s radical subjectivity and creative non-fiction techniques (Wolfe & Johnson) to Precision Journalism that used descriptive statistics (Meyer). Finally, journalism frameworks could be updated with current research on how phenomenological awareness shapes our judgments and perceptions (Thompson). 8. Strategic Execution For me, one of CJ’s major weaknesses as a new media theory is its lack of “rich description” (Geertz) about the strategic execution of projects. As Disinfo.com site editor I encountered situations ranging from ‘denial of service’ attacks and spam to site migration, publishing systems that go offline, and ensuring an editorial consistency. Yet the messiness of these processes is missing from CJ theories and accounts. Theories that included this detail as “second-order interactions” (Carlile & Christensen 13) would offer a richer view of CJ. Many CJ and Web 2.0 projects fall into the categories of mini-projects, demonstration prototypes and start-ups, even when using a programming language such as Ajax or Ruby on Rails. Whilst the “bootstrap” process is a benefit, more longitudinal analysis and testing needs to occur, to ensure these projects are scalable and sustainable. 
For example, South Korea’s OhmyNews is cited as an exemplar that started with “727 citizen reporters and 4 editors” and now has “38,000 citizen reporters” and “a dozen editors” (Kolodzy 231). How does OhmyNews’s mix of hard and soft news change over time? Or, how does OhmyNews deal with a complex issue that might require major resources, such as security negotiations between North and South Korea? Such examples could do with further research. We need to go beyond “the vision thing” and look at the messiness of execution for deeper observations and counterintuitive correlations, to build new descriptive theories. 9. Future Research This essay argues that CJ needs re-evaluation. Its immediate legacy might be to splinter ‘journalism’ into micro-trends: Washington University’s Steve Boriss proclaims “citizen journalism is dead. Expert journalism is the future.” (Boriss; Mensching). The half-lives of such micro-trends demand new categorisations, which in turn prematurely feeds the theory-building cycle. Instead, future researchers could reinvigorate 21st century journalism if they ask deeper questions and return to the observation stage of building descriptive theories. In closing, below are some possible questions that future researchers might explore: Where are the “rich descriptions” of journalistic experience—“citizen”, “convergent”, “digital”, “Pro-Am” or otherwise in new media? How could practice-based approaches inform this research instead of relying on espoused theories-in-use? What new methodologies could be developed for CJ implementation? What role can the “heroic” individual reporter or editor have in “the swarm”? Do the claims about OhmyNews and other sites stand up to longitudinal observation? Are the theories used to justify Citizen Journalism’s normative stance (Rheingold; Surowiecki; Pesce) truly robust generalisations for strategic execution or do they reflect the biases of their creators? 
How could developers tap the conceptual dimensions of information technology innovation (Shasha) to create the next Facebook, MySpace or Wikipedia?

References

Argyris, Chris, and Donald Schon. Theory in Practice. San Francisco: Jossey-Bass Publishers, 1976.
Barlow, Aaron. The Rise of the Blogosphere. Westport, CN: Praeger Publishers, 2007.
Block, Peter. Flawless Consulting. 2nd ed. San Francisco, CA: Jossey-Bass/Pfeiffer, 2000.
Boriss, Steve. “Citizen Journalism Is Dead. Expert Journalism Is the Future.” The Future of News. 28 Nov. 2007. 20 Feb. 2008 <http://thefutureofnews.com/2007/11/28/citizen-journalism-is-dead-expert-journalism-is-the-future/>.
Brooks, Jr., Frederick P. The Mythical Man-Month: Essays on Software Engineering. Rev. ed. Reading, MA: Addison-Wesley Publishing Company, 1995.
Campbell, Vincent. Information Age Journalism: Journalism in an International Context. New York: Arnold, 2004.
Carlile, Paul R., and Clayton M. Christensen. “The Cycles of Building Theory in Management Research.” Innosight working paper draft 6. 6 Jan. 2005. 19 Feb. 2008 <http://www.innosight.com/documents/Theory%20Building.pdf>.
Caverly, Doug. “Hyperlocal News Site Takes A Hit.” WebProNews.com 6 July 2007. 19 Feb. 2008 <http://www.webpronews.com/topnews/2007/07/06/hyperlocal-news-sites-take-a-hit>.
Chenoweth, Neil. Virtual Murdoch: Reality Wars on the Information Superhighway. Sydney: Random House Australia, 2001.
Christensen, Clayton M. The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press, 1997.
Christensen, Clayton M., Curtis Johnson, and Michael Horn. Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns. New York: McGraw-Hill, 2008.
Curtis, Adam. The Mayfair Set. London: British Broadcasting Corporation, 1999.
Etherington, Kim. Becoming a Reflexive Researcher: Using Ourselves in Research. London: Jessica Kingsley Publishers, 2004.
Festinger, Leon. A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press, 1962.
Feyerabend, Paul. Against Method. 3rd ed. London: Verso, 1993.
Finnemore, Martha. National Interests in International Society. Ithaca, NY: Cornell University Press, 1996.
Geertz, Clifford. The Interpretation of Cultures. New York: Basic Books, 1973.
Ghoshal, Sumantra. “Bad Management Theories Are Destroying Good Management Practices.” Academy of Management Learning & Education 4.1 (2005): 75-91.
Gibson, William. Pattern Recognition. London: Viking, 2003.
Gladwell, Malcolm. “The Cool-Hunt.” The New Yorker Magazine 17 March 1997. 20 Feb. 2008 <http://www.gladwell.com/1997/1997_03_17_a_cool.htm>.
Gross, Daniel. Pop! Why Bubbles Are Great for the Economy. New York: Collins, 2007.
Hoffer, Eric. The True Believer. New York: Harper, 1951.
Howe, Jeff. “Did Assignment Zero Fail? A Look Back, and Lessons Learned.” Wired News 16 July 2007. 19 Feb. 2008 <http://www.wired.com/techbiz/media/news/2007/07/assignment_zero_final?currentPage=all>.
Kahneman, Daniel, and Amos Tversky. Choices, Values and Frames. Cambridge: Cambridge UP, 2000.
Keen, Andrew. The Cult of the Amateur. New York: Doubleday Currency, 2007.
Khurana, Rakesh. From Higher Aims to Hired Hands. Princeton, NJ: Princeton UP, 2007.
Kolodzy, Janet. Convergence Journalism: Writing and Reporting across the News Media. Oxford: Rowman & Littlefield, 2006.
Koppelman, Charles. Behind the Seen: How Walter Murch Edited Cold Mountain Using Apple’s Final Cut Pro and What This Means for Cinema. Upper Saddle River, NJ: New Rider, 2004.
Leadbeater, Charles, and Paul Miller. “The Pro-Am Revolution.” London: Demos, 24 Nov. 2004. 19 Feb. 2008 <http://www.demos.co.uk/publications/proameconomy>.
Loomis, Carol J. “Bloomberg’s Money Machine.” Fortune 5 April 2007. 20 Feb. 2008 <http://money.cnn.com/magazines/fortune/fortune_archive/2007/04/16/8404302/index.htm>.
Lynch, Peter, and John Rothchild. Beating the Street. Rev. ed. New York: Simon & Schuster, 1994.
Maister, David. True Professionalism. New York: The Free Press, 1997.
Maister, David, Charles H. Green, and Robert M. Galford. The Trusted Advisor. New York: The Free Press, 2004.
Mensching, Leah McBride. “Citizen Journalism on Its Way Out?” SFN Blog, 30 Nov. 2007. 20 Feb. 2008 <http://www.sfnblog.com/index.php/2007/11/30/940-citizen-journalism-on-its-way-out>.
Meyer, Philip. Precision Journalism. 4th ed. Lanham, MD: Rowman & Littlefield, 2002.
McConnell, Steve. Professional Software Development. Boston, MA: Addison-Wesley, 2004.
Mintzberg, Henry. Managers Not MBAs. San Francisco, CA: Berrett-Koehler, 2004.
Morgan, Gareth. Images of Organisation. Rev. ed. Thousand Oaks, CA: Sage, 2006.
Newsvine. “Msnbc.com Acquires Newsvine.” 7 Oct. 2007. 20 Feb. 2008 <http://blog.newsvine.com/_news/2007/10/07/1008889-msnbccom-acquires-newsvine>.
Niederhoffer, Victor, and Laurel Kenner. Practical Speculation. New York: John Wiley & Sons, 2003.
Pearlstine, Norman. Off the Record: The Press, the Government, and the War over Anonymous Sources. New York: Farrar, Straus & Giroux, 2007.
Pesce, Mark D. “Mob Rules (The Law of Fives).” The Human Network 28 Sep. 2007. 20 Feb. 2008 <http://blog.futurestreetconsulting.com/?p=39>.
Rheingold, Howard. Smart Mobs: The Next Social Revolution. Cambridge, MA: Basic Books, 2002.
Rosen, Jay. What Are Journalists For? Princeton, NJ: Yale UP, 2001.
Shasha, Dennis Elliott. Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists. New York: Copernicus, 1995.
Slywotzky, Adrian. Value Migration: How to Think Several Moves Ahead of the Competition. Boston, MA: Harvard Business School Press, 1996.
Smith, Steve. “The Self-Image of a Discipline: The Genealogy of International Relations Theory.” International Relations Theory Today. Eds. Steve Smith and Ken Booth. Cambridge, UK: Polity Press, 1995. 1-37.
Spar, Debora L. Ruling the Waves: Cycles of Discovery, Chaos and Wealth from the Compass to the Internet. New York: Harcourt, 2001.
Surowiecki, James. The Wisdom of Crowds. New York: Doubleday, 2004.
Thompson, Evan. Mind in Life: Biology, Phenomenology, and the Sciences of Mind. Cambridge, MA: Belknap Press, 2007.
Trippi, Joe. The Revolution Will Not Be Televised. New York: ReganBooks, 2004.
Underwood, Doug. When MBAs Rule the Newsroom. New York: Columbia University Press, 1993.
Wark, McKenzie. Virtual Geography: Living with Global Media Events. Bloomington, IN: Indiana UP, 1994.
Wolfe, Tom, and E.W. Johnson. The New Journalism. New York: Harper & Row, 1973.

Citation reference for this article

MLA Style: Burns, Alex. "Select Issues with New Media Theories of Citizen Journalism." M/C Journal 10.6/11.1 (2008). <http://journal.media-culture.org.au/0804/10-burns.php>.

APA Style: Burns, A. (Apr. 2008). "Select Issues with New Media Theories of Citizen Journalism," M/C Journal, 10(6)/11(1). Retrieved from <http://journal.media-culture.org.au/0804/10-burns.php>.
APA, Harvard, Vancouver, ISO, and other citation styles
29

Burns, Alex. „Select Issues with New Media Theories of Citizen Journalism“. M/C Journal 11, Nr. 1 (01.06.2008). http://dx.doi.org/10.5204/mcj.30.

Full text of the source
Annotation:
“Journalists have to begin a new type of journalism, sometimes being the guide on the side of the civic conversation as well as the filter and gatekeeper.” (Kolodzy 218) “In many respects, citizen journalism is simply public journalism removed from the journalism profession.” (Barlow 181) 1. Citizen Journalism — The Latest Innovation? New Media theorists such as Dan Gillmor, Henry Jenkins, Jay Rosen and Jeff Howe have recently touted Citizen Journalism (CJ) as the latest innovation in 21st century journalism. “Participatory journalism” and “user-driven journalism” are other terms to describe CJ, which its proponents argue is a disruptive innovation (Christensen) to the agenda-setting media institutions, news values and “objective” reportage. In this essay I offer a “contrarian” view, informed by two perspectives: (1) a three-stage model of theory-building (Carlile & Christensen) to evaluate the claims made about CJ; and (2) self-reflexive research insights (Etherington) from editing the US-based news site Disinformation between November 1999 and February 2008. New media theories can potentially create “cognitive dissonance” (Festinger) when their explanations of CJ practices are compared with what actually happens (Feyerabend). First I summarise Carlile & Christensen’s model and the dangers of “bad theory” (Ghoshal). Next I consider several problems in new media theories about CJ: the notion of ‘citizen’, new media populism, parallels in event-driven and civic journalism, and mergers and acquisitions. Two ‘self-reflexive’ issues are considered: ‘pro-ams’ or ‘professional amateurs’ as a challenge to professional journalists, and CJ’s deployment in new media operations and production environments. Finally, some exploratory questions are offered for future researchers. 2. An Evaluative Framework for New Media Theories on Citizen Journalism Paul Carlile and Clayton M. Christensen’s model offers one framework with which to evaluate new media theories on CJ. 
This framework is used below to highlight select issues and gaps in CJ’s current frameworks and theories. Carlile & Christensen suggest that robust theory-building emerges via three stages: Descriptive, Categorisation and Normative (Carlile & Christensen). There are three sub-stages in Descriptive theory-building; namely, the observation of phenomena, inductive classification into schemas and taxonomies, and correlative relationships to develop models (Carlile & Christensen 2-5). Once causation is established, Normative theory evolves through deductive logic which is subject to Kuhnian paradigm shifts and Popperian falsifiability (Carlile & Christensen 6). Its proponents situate CJ as a Categorisation or new journalism agenda that poses a Normative challenged and Kuhnian paradigm shift to traditional journalism. Existing CJ theories jump from the Descriptive phase of observations like “smart mobs” in Japanese youth subcultures (Rheingold) to make broad claims for Categorisation such as that IndyMedia, blogs and wiki publishing systems as new media alternatives to traditional media. CJ theories then underpin normative beliefs, values and worldviews. Correlative relationships are also used to differentiate CJ from the demand side of microeconomic analysis, from the top-down editorial models of traditional media outlets, and to adopt a vanguard stance. To support this, CJ proponents cite research on emergent collective behaviour such as the “wisdom of crowds” hypothesis (Surowiecki) or peer-to-peer network “swarms” (Pesce) to provide scientific justification for their Normative theories. However, further evaluative research is needed for three reasons: the emergent collective behaviour hypothesis may not actually inform CJ practices, existing theories may have “correlation not cause” errors, and the link may be due to citation network effects between CJ theorists. 
Collectively, this research base also frames CJ as an “ought to” Categorisation and then proceeds to Normative theory-building (Carlile & Christensen 7). However, I argue below that this Categorisation may be premature: its observations and correlative relationships might reinforce a ‘weak’ Normative theory with limited generalisation. CJ proponents seem to imply that it can be applied anywhere and under any condition, a “statement of causality” that almost makes it a fad (Carlile & Christensen 8). CJ that relies on Classification and Normative claims will be problematic without a strong grounding in Descriptive observation. To understand what’s potentially at stake for CJ’s future, consider the parallel debate about curricula renewal for the Master of Business Administration in the wake of high-profile corporate collapses such as Enron, Worldcom, HIH and OneTel. The MBA evolved as a sociological and institutional construct to justify management as a profession that is codified, differentiated and has entry barriers (Khurana). This process might partly explain the pushback that some media professionals have against CJ as one alternative. MBA programs faced criticism if they had student cohorts with little business know-how or experiential learning (Mintzberg). Enron’s collapse illustrated the ethical dilemmas and unintended consequences that occurred when “bad theories” were implemented (Ghoshal). Professional journalists are aware of this: MBA-educated managers challenged the “craft” tradition in the early 1980s (Underwood). This meant that journalism’s ‘self-image’ (Morgan; Smith) became intertwined with managerial anxieties about media conglomerates in highly competitive markets. Ironically, as noted below, Citizen Journalists who adopt a vanguard position vis-à-vis media professionals step into a more complex game with other players.
However, current theories display a naïve idealism about CJ’s promise of normative social change in the face of Machiavellian agency in business, the media and politics.

3. Citizen Who?

Who is the “citizen” in CJ? What is their self-awareness as a political agent? CJ proponents who use the ‘self-image’ of ‘citizen’ draw on observations from the participatory vision of open source software, peer-to-peer networks, and case studies such as Howard Dean’s 2004 bid for the Democratic Party nomination in the US Presidential election campaign (Trippi). Recent theorists note Alexander Hamilton’s tradition of civic activism (Barlow 178), which links contemporary bloggers with the Federalist Papers and early newspaper pamphlets. One unsurfaced assumption in these observations and correlations is that most bloggers will adopt a coherent political philosophy as informed citizens: a variation on Lockean utilitarianism, Rawlsian liberalism or Nader-style consumer activism. To date there is little discussion about how political philosophy could deepen CJ’s ‘self-image’: how to critically evaluate sources, conduct audit and investigation processes, or devise strategies to deal with elites, deterrence and power. For example, although bloggers kept Valerie Plame’s ‘outing’ as a covert intelligence operative highly visible in the issues-attention cycle, it was agenda-setting media like The New York Times whom the Bush Administration targeted to silence (Pearlstine). To be viable, CJ needs to evolve beyond a new media populism, perhaps into a constructivist model of agency, norms and social change (Finnemore).

4. Citizen Journalism as New Media Populism

Several “precursor trends” foreshadowed CJ, notably the mid-1990s interest in “cool-hunting” by new media analysts and subculture marketeers (Gibson; Gladwell). Whilst this audience focus waned with the 1995-2000 dotcom bubble, it resurfaced in CJ and publisher Tim O’Reilly’s Web 2.0 vision.
Thus, CJ might be viewed as new media populism that has flourished with the Web 2.0 boom. Yet if the boom becomes a macroeconomic bubble (Gross; Spar), then CJ could be written off as a “silver bullet” that ultimately failed to deliver on its promises (Brooks, Jr.). The reputations of uncritical proponents who adopted a “true believer” stance would also be damaged (Hoffer). This risk is evident if CJ is compared with a parallel trend that shares its audience focus and populist view: day traders and technical analysts who speculate on financial markets. This parallel trend provides an alternative discipline in which the populism surfaced in an earlier form (Carlile & Christensen 12). Fidelity’s Peter Lynch argues that stock pickers can use their Main Street knowledge to beat Wall Street by exploiting information asymmetries (Lynch & Rothchild). Yet Lynch’s examples came from the mid-1970s to early 1980s, when indexed mutual fund strategies worked, before deregulation and macroeconomic volatility. A change in the Web 2.0 boom might similarly trigger a reconsideration of Citizen Journalism. Hedge fund maven Victor Niederhoffer contends that investors who rely on technical analysis are practising a Comtean religion (Niederhoffer & Kenner 72-74), unlike Efficient Market Hypothesis traders who use statistical arbitrage to deal with ‘random walks’ or Behavioural Finance experts who build on Amos Tversky and Daniel Kahneman’s Prospect Theory (Kahneman & Tversky). Niederhoffer’s deeper point is that technical analysts’ belief that the “trend is your friend” is no match for the other schools, despite a mini-publishing industry and computer trading systems. There are also ontological and epistemological differences between the schools. Similarly, CJ proponents who adopt a ‘Professional Amateur’ or ‘Pro-Am’ stance (Leadbeater & Miller) may face a similar gulf when making comparisons with professional journalists and the production environments in media organisations.
CJ also thrives as new media populism because of institutional vested interests. When media conglomerates cut back on cadetships and internships, CJ might fill the market demand as one alternative. New media programs at New York University and others can use CJ to differentiate themselves from “hyperlocal” competitors (Christensen; Slywotzky; Christensen, Curtis & Horn). This transforms CJ from new media populism to new media institution.

5. Parallels: Event-driven & Civic Journalism

For new media programs, CJ builds on two earlier traditions: the Event-driven journalism of crises like the 1991 Gulf War (Wark) and the Civic Journalism school that emerged in the 1960s social upheavals. Civic Journalism’s awareness of minorities and social issues provides the character ethic and political philosophy for many Citizen Journalists. Jay Rosen and others suggest that CJ is the next-generation heir to Civic Journalism, tracing a thread from the 1968 Chicago Democratic Convention to IndyMedia’s coverage of the 1999 “Battle in Seattle” (Rosen). Rosen’s observation could yield an interesting historiography or genealogy. Events such as the Southeast Asian tsunami on 26 December 2004 or Al Qaeda’s London bombings on 7 July 2005 are cited as examples of CJ as event-driven journalism and “pro-am collaboration” (Kolodzy 229-230). Having covered these events and Al Qaeda’s attacks on 11 September 2001, I have a slightly different view: this was more a variation on “first responder” status and handicam video footage that journalists have sourced for the past three decades when covering major disasters. This different view means that the “salience of categories” used to justify CJ and “pro-am collaboration” in these events does not completely hold. Furthermore, when Citizen Journalism proponents tout Flickr and Wikipedia as models of real-time media, they are building on a broader phenomenon that includes CNN’s Gulf War coverage and Bloomberg’s dominance of financial news (Loomis).

6. The Mergers & Acquisitions Scenario

CJ proponents often express anxieties about the resilience of their outlets in the face of predatory venture capital firms who initiate Mergers & Acquisitions (M&A) activities. Ironically, these venture capital firms have core competencies and expertise in the event-driven infrastructure and real-time media that CJ aspires to. Sequoia Capital and other venture capital firms have evaluative frameworks that likely surpass Carlile & Christensen in sophistication, and they exploit parallels, information asymmetries and market populism. Furthermore, although venture capital firms such as Union Street Ventures have funded Web 2.0 firms, they are absent from the explanations of some theorists, whose examples of Citizen Journalism and Web 2.0 success may be the result of survivorship bias. Thus, the venture capital market remains an untapped data source for researchers who want to evaluate the impact of CJ outlets and institutions. The M&A scenario further problematises CJ in several ways. First, CJ is framed as “oppositional” to traditional media, yet this framing may be used as a stratagem in a game theory framework with multiple stakeholders. Drexel Burnham Lambert’s financier Michael Milken used market populism to sell ‘high-yield’ or ‘junk’ bonds to investors whilst disrupting the Wall Street establishment in the late 1980s (Curtis), and CJ could fulfil a similar tactical purpose. Second, the M&A goal of some Web 2.0 firms could undermine the participatory goals of a site’s community if post-merger integration fails. Jason Calacanis’s sale of Weblogs, Inc. to America Online in 2005 and MSNBC’s acquisition of Newsvine on 5 October 2007 (Newsvine) might be success stories. However, this raises issues of digital “property rights” if you contribute to a community that is then sold in an M&A transaction, an outcome closer to business process outsourcing.
Third, media “buzz” can create an unrealistic vision when a CJ site fails to grow beyond its start-up phase. Backfence.com’s demise as a “hyperlocal” initiative (Caverly) is one cautionary event that recalls the 2000 dotcom crash. The M&A scenarios outlined above are market dystopias for CJ purists. The major lesson for CJ proponents is to include other market players in hypotheses about causation and correlation factors.

7. ‘Pro-Ams’ & Professional Journalism’s Crisis

CJ emerged during a period when Professional Journalism faced a major crisis of ‘self-image’. The Demos report The Pro-Am Revolution (Leadbeater & Miller) popularised the notion of ‘professional amateurs’, which some CJ theorists adopt to strengthen their categorisation. In turn, this triggers a response from cultural theorists who fear bloggers are new media’s barbarians (Keen). I concede that Leadbeater and Miller have identified an important category. However, how some CJ theorists then generalise from ‘Pro-Ams’ illustrates the danger of ‘weak’ theory referred to above. Leadbeater and Miller’s categorisation does not really include a counter-view on the strengths of professionals, as illustrated in humanistic consulting (Block), professional service firms (Maister; Maister, Green & Galford), and software development (McConnell). The signs of professionalism these authors mention include a commitment to learning and communal verification, mastery of a discipline and domain application, awareness of methodology creation, participation in mentoring, and cultivation of ethical awareness. Two key differences are discernment and quality of attention, as illustrated in how the legendary Hollywood film editor Walter Murch used Apple’s Final Cut Pro software to edit the 2003 film Cold Mountain (Koppelman). ‘Pro-Ams’ might not aspire to these criteria, but Citizen Journalists shouldn’t throw out these standards, either.
Doing so would repeat the same mistake of overconfidence that technical analysts make against statistical arbitrageurs. Key processes—fact-checking, sub-editing and editorial decision-making—are invisible to the end-user, even if traceable in a blog or wiki publishing system, because of the judgments involved. One post-mortem insight from Assignment Zero was that these processes were vital to create the climate of authenticity and trust needed to sustain a Citizen Journalist community (Howe). CJ’s trouble with “objectivity” might also overlook some complexities, including the similarity of many bloggers to “noise traders” in financial markets and to op-ed columnists. Methodologies and reportage practices have evolved to deal with the objections that CJ proponents raise, from New Journalism’s radical subjectivity and creative non-fiction techniques (Wolfe & Johnson) to Precision Journalism’s use of descriptive statistics (Meyer). Finally, journalism frameworks could be updated with current research on how phenomenological awareness shapes our judgments and perceptions (Thompson).

8. Strategic Execution

For me, one of CJ’s major weaknesses as a new media theory is its lack of “rich description” (Geertz) about the strategic execution of projects. As Disinfo.com site editor I encountered situations ranging from ‘denial of service’ attacks and spam to site migration, publishing systems that go offline, and ensuring editorial consistency. Yet the messiness of these processes is missing from CJ theories and accounts. Theories that included this detail as “second-order interactions” (Carlile & Christensen 13) would offer a richer view of CJ. Many CJ and Web 2.0 projects fall into the categories of mini-projects, demonstration prototypes and start-ups, even when built with technologies such as Ajax or Ruby on Rails. Whilst the “bootstrap” process is a benefit, more longitudinal analysis and testing needs to occur to ensure these projects are scalable and sustainable.
For example, South Korea’s OhmyNews is cited as an exemplar that started with “727 citizen reporters and 4 editors” and now has “38,000 citizen reporters” and “a dozen editors” (Kolodzy 231). How does OhmyNews’s mix of hard and soft news change over time? How does OhmyNews deal with a complex issue that might require major resources, such as security negotiations between North and South Korea? Such examples could do with further research. We need to go beyond “the vision thing” and look at the messiness of execution for deeper observations and counterintuitive correlations, to build new descriptive theories.

9. Future Research

This essay has argued that CJ needs re-evaluation. Its immediate legacy might be to splinter ‘journalism’ into micro-trends: Washington University’s Steve Boriss proclaims that “citizen journalism is dead. Expert journalism is the future” (Boriss; Mensching). The half-lives of such micro-trends demand new categorisations, which in turn prematurely feed the theory-building cycle. Instead, future researchers could reinvigorate 21st century journalism if they ask deeper questions and return to the observation stage of building descriptive theories.
In closing, below are some possible questions that future researchers might explore:

Where are the “rich descriptions” of journalistic experience—“citizen”, “convergent”, “digital”, “Pro-Am” or otherwise—in new media?
How could practice-based approaches inform this research instead of relying on espoused theories-in-use?
What new methodologies could be developed for CJ implementation?
What role can the “heroic” individual reporter or editor have in “the swarm”?
Do the claims about OhmyNews and other sites stand up to longitudinal observation?
Are the theories used to justify Citizen Journalism’s normative stance (Rheingold; Surowiecki; Pesce) truly robust generalisations for strategic execution, or do they reflect the biases of their creators?
How could developers tap the conceptual dimensions of information technology innovation (Shasha) to create the next Facebook, MySpace or Wikipedia?

References

Argyris, Chris, and Donald Schon. Theory in Practice. San Francisco: Jossey-Bass Publishers, 1976.
Barlow, Aaron. The Rise of the Blogosphere. Westport, CT: Praeger Publishers, 2007.
Block, Peter. Flawless Consulting. 2nd ed. San Francisco, CA: Jossey-Bass/Pfeiffer, 2000.
Boriss, Steve. “Citizen Journalism Is Dead. Expert Journalism Is the Future.” The Future of News. 28 Nov. 2007. 20 Feb. 2008 <http://thefutureofnews.com/2007/11/28/citizen-journalism-is-dead-expert-journalism-is-the-future/>.
Brooks, Jr., Frederick P. The Mythical Man-Month: Essays on Software Engineering. Rev. ed. Reading, MA: Addison-Wesley Publishing Company, 1995.
Campbell, Vincent. Information Age Journalism: Journalism in an International Context. New York: Arnold, 2004.
Carlile, Paul R., and Clayton M. Christensen. “The Cycles of Building Theory in Management Research.” Innosight working paper draft 6. 6 Jan. 2005. 19 Feb. 2008 <http://www.innosight.com/documents/Theory%20Building.pdf>.
Caverly, Doug. “Hyperlocal News Site Takes a Hit.” WebProNews.com 6 July 2007. 19 Feb. 2008 <http://www.webpronews.com/topnews/2007/07/06/hyperlocal-news-sites-take-a-hit>.
Chenoweth, Neil. Virtual Murdoch: Reality Wars on the Information Superhighway. Sydney: Random House Australia, 2001.
Christensen, Clayton M. The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press, 1997.
Christensen, Clayton M., Curtis Johnson, and Michael Horn. Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns. New York: McGraw-Hill, 2008.
Curtis, Adam. The Mayfair Set. London: British Broadcasting Corporation, 1999.
Etherington, Kim. Becoming a Reflexive Researcher: Using Ourselves in Research. London: Jessica Kingsley Publishers, 2004.
Festinger, Leon. A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press, 1962.
Feyerabend, Paul. Against Method. 3rd ed. London: Verso, 1993.
Finnemore, Martha. National Interests in International Society. Ithaca, NY: Cornell University Press, 1996.
Geertz, Clifford. The Interpretation of Cultures. New York: Basic Books, 1973.
Ghoshal, Sumantra. “Bad Management Theories Are Destroying Good Management Practices.” Academy of Management Learning & Education 4.1 (2005): 75-91.
Gibson, William. Pattern Recognition. London: Viking, 2003.
Gladwell, Malcolm. “The Cool-Hunt.” The New Yorker Magazine 17 March 1997. 20 Feb. 2008 <http://www.gladwell.com/1997/1997_03_17_a_cool.htm>.
Gross, Daniel. Pop! Why Bubbles Are Great for the Economy. New York: Collins, 2007.
Hoffer, Eric. The True Believer. New York: Harper, 1951.
Howe, Jeff. “Did Assignment Zero Fail? A Look Back, and Lessons Learned.” Wired News 16 July 2007. 19 Feb. 2008 <http://www.wired.com/techbiz/media/news/2007/07/assignment_zero_final?currentPage=all>.
Kahneman, Daniel, and Amos Tversky. Choices, Values and Frames. Cambridge: Cambridge UP, 2000.
Keen, Andrew. The Cult of the Amateur. New York: Doubleday Currency, 2007.
Khurana, Rakesh. From Higher Aims to Hired Hands. Princeton, NJ: Princeton UP, 2007.
Kolodzy, Janet. Convergence Journalism: Writing and Reporting across the News Media. Oxford: Rowman & Littlefield, 2006.
Koppelman, Charles. Behind the Seen: How Walter Murch Edited Cold Mountain Using Apple’s Final Cut Pro and What This Means for Cinema. Upper Saddle River, NJ: New Riders, 2004.
Leadbeater, Charles, and Paul Miller. “The Pro-Am Revolution.” London: Demos, 24 Nov. 2004. 19 Feb. 2008 <http://www.demos.co.uk/publications/proameconomy>.
Loomis, Carol J. “Bloomberg’s Money Machine.” Fortune 5 April 2007. 20 Feb. 2008 <http://money.cnn.com/magazines/fortune/fortune_archive/2007/04/16/8404302/index.htm>.
Lynch, Peter, and John Rothchild. Beating the Street. Rev. ed. New York: Simon & Schuster, 1994.
Maister, David. True Professionalism. New York: The Free Press, 1997.
Maister, David, Charles H. Green, and Robert M. Galford. The Trusted Advisor. New York: The Free Press, 2004.
McConnell, Steve. Professional Software Development. Boston, MA: Addison-Wesley, 2004.
Mensching, Leah McBride. “Citizen Journalism on Its Way Out?” SFN Blog 30 Nov. 2007. 20 Feb. 2008 <http://www.sfnblog.com/index.php/2007/11/30/940-citizen-journalism-on-its-way-out>.
Meyer, Philip. Precision Journalism. 4th ed. Lanham, MD: Rowman & Littlefield, 2002.
Mintzberg, Henry. Managers Not MBAs. San Francisco, CA: Berrett-Koehler, 2004.
Morgan, Gareth. Images of Organisation. Rev. ed. Thousand Oaks, CA: Sage, 2006.
Newsvine. “Msnbc.com Acquires Newsvine.” 7 Oct. 2007. 20 Feb. 2008 <http://blog.newsvine.com/_news/2007/10/07/1008889-msnbccom-acquires-newsvine>.
Niederhoffer, Victor, and Laurel Kenner. Practical Speculation. New York: John Wiley & Sons, 2003.
Pearlstine, Norman. Off the Record: The Press, the Government, and the War over Anonymous Sources. New York: Farrar, Straus & Giroux, 2007.
Pesce, Mark D. “Mob Rules (The Law of Fives).” The Human Network 28 Sep. 2007. 20 Feb. 2008 <http://blog.futurestreetconsulting.com/?p=39>.
Rheingold, Howard. Smart Mobs: The Next Social Revolution. Cambridge, MA: Basic Books, 2002.
Rosen, Jay. What Are Journalists For? New Haven, CT: Yale UP, 2001.
Shasha, Dennis Elliott. Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists. New York: Copernicus, 1995.
Slywotzky, Adrian. Value Migration: How to Think Several Moves Ahead of the Competition. Boston, MA: Harvard Business School Press, 1996.
Smith, Steve. “The Self-Image of a Discipline: The Genealogy of International Relations Theory.” Eds. Steve Smith and Ken Booth. International Relations Theory Today. Cambridge, UK: Polity Press, 1995. 1-37.
Spar, Debora L. Ruling the Waves: Cycles of Discovery, Chaos and Wealth from the Compass to the Internet. New York: Harcourt, 2001.
Surowiecki, James. The Wisdom of Crowds. New York: Doubleday, 2004.
Thompson, Evan. Mind in Life: Biology, Phenomenology, and the Sciences of Mind. Cambridge, MA: Belknap Press, 2007.
Trippi, Joe. The Revolution Will Not Be Televised. New York: ReganBooks, 2004.
Underwood, Doug. When MBAs Rule the Newsroom. New York: Columbia University Press, 1993.
Wark, McKenzie. Virtual Geography: Living with Global Media Events. Bloomington, IN: Indiana UP, 1994.
Wolfe, Tom, and E.W. Johnson. The New Journalism. New York: Harper & Row, 1973.