Journal articles on the topic 'Research Subject Categories – SOCIAL SCIENCES – Statistics, computer and systems science'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 24 journal articles for your research on the topic 'Research Subject Categories – SOCIAL SCIENCES – Statistics, computer and systems science.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Hu, Jiming, and Yin Zhang. "Measuring the interdisciplinarity of Big Data research: a longitudinal study." Online Information Review 42, no. 5 (September 10, 2018): 681–96. http://dx.doi.org/10.1108/oir-12-2016-0361.

Abstract:
Purpose: The purpose of this paper is to measure the degree of interdisciplinary collaboration in Big Data research based on the co-occurrences of subject categories, using Stirling's diversity index and the specialization index.
Design/methodology/approach: Interdisciplinarity was measured utilizing the descriptive statistics of disciplines, network indicators showing relationships between disciplines and within individual disciplines, interdisciplinary communities, Stirling's diversity index and the specialization index, and a strategic diagram revealing the development status and trends of discipline communities.
Findings: Comprehensively considering all results, the degree of interdisciplinarity of Big Data research is increasing over time, particularly after 2013. There is a high level of interdisciplinarity in Big Data research involving a large number of disciplines, but it is unbalanced in distribution. The interdisciplinary collaborations are not intensive on the whole; most disciplines are aggregated into a few distinct communities with computer science, business and economics, mathematics, and biotechnology and applied microbiology as the core. Four major discipline communities in Big Data research represent different directions with different development statuses and trends. Community 1, with computer science as the core, is the most mature and central to the whole interdisciplinary network. Accounting for all network indicators, computer science, engineering, business and economics, social sciences, and mathematics are the most important disciplines in Big Data research.
Originality/value: This study deepens our understanding of the degree and trend of interdisciplinary collaboration in Big Data research through a longitudinal study and quantitative measures based on two indexes. It has practical implications for studying and revealing the interdisciplinary phenomenon and characteristics of related developments of a specific research area, or for conducting comparative studies between different research areas.
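
To make the measure concrete: Stirling's diversity index (also known as Rao-Stirling diversity) combines the variety, balance and disparity of the subject categories attached to a body of papers. The sketch below is a minimal illustration with invented category shares and a hypothetical dissimilarity matrix; it is not the authors' code or data.

```python
import numpy as np

def rao_stirling(p, d):
    """Rao-Stirling diversity: sum over category pairs (i != j) of p_i * p_j * d_ij.

    p : 1-D array of category proportions (should sum to 1)
    d : square matrix of pairwise category dissimilarities in [0, 1]
    """
    p = np.asarray(p, dtype=float)
    d = np.asarray(d, dtype=float)
    off_diagonal = ~np.eye(len(p), dtype=bool)   # exclude the i == j terms
    return float(np.sum(np.outer(p, p)[off_diagonal] * d[off_diagonal]))

# Hypothetical example: three subject categories with given shares and distances
p = [0.5, 0.3, 0.2]
d = np.array([[0.0, 0.8, 0.6],
              [0.8, 0.0, 0.4],
              [0.6, 0.4, 0.0]])
print(rao_stirling(p, d))   # higher values indicate greater interdisciplinarity
```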
2

Lyubimov, I. I., N. N. Yakunin, N. V. Yakunina, and Sh M. Minatullaev. "The results of the study of the relationship between the number of rolling stock units of passenger road transport and gross regional products." Intellect. Innovations. Investments, no. 6 (2022): 88–98. http://dx.doi.org/10.25198/2077-7175-2022-6-88.

Abstract:
Passenger road transport plays an important role in the functioning of the state. This thesis can be substantiated by the fact that passenger road transport performs an important social task — the transportation of passengers. Passenger road transport is fully integrated into the transport system of the country and its subjects. The article considers three main types of activities in which passenger road transport is involved. The authors refer the commercial transportation of passengers to the first type, the transportation of passengers for the own needs of the carrier company to the second, and the use of passenger road transport as a means of individual mobility to the third. If we consider these groups from the point of view of the openness of information on them in various databases, then the first type of activity is the most informative, and the second and third are the least informative. Currently, there is no tool for qualitative and quantitative assessment of the impact of passenger road transport on gross regional products (GRP) and their components. This leads to inadequate reactions in the control system, for example in the field of passenger transportation. Increasing the validity of the development of passenger road transport systems by improving the accuracy of assessing its impact on the socio-economic development of regions is an urgent scientific and practical task. The working hypothesis is that passenger road transport is one of the key factors influencing the development of the subjects of the Russian Federation. Finding the relationship between the qualitative and quantitative assessment of the impact of passenger road transport on GRP and their components will create an understandable and informative tool for assessing the socio-economic condition of the subject in particular, and the country as a whole. The work uses methods of system analysis, mathematical statistics, in particular, correlation and regression analysis. When analyzing the data and processing the results, the authors used the discovery of information databases. The aim of the work is to create theoretical prerequisites for an analytical platform for managing the structure of the passenger vehicle fleet in the regions. The scientific novelty of the study lies in the revealed relationship between the number of passenger vehicles and the gross regional products of the constituent entities of the Russian Federation (correlation coefficients from 0.57 to 0.79), as well as parts of gross regional products by industry (correlation coefficients from 0.07 to 0.9). For the first time, an indicator of the number of conditional passenger motor transport units in the region is proposed, which gives an integral assessment of the methods of passenger motor transport correspondence, taking into account the categories and classes of vehicles. The practical significance of the study lies in the creation of a modern tool for managing passenger transportation in the constituent entities of the Russian Federation by improving the process of operating passenger road transport and managing the number, categories and classes of passenger vehicles. Directions for further research are to apply the above methodology to the integrated management of transport systems of the constituent entities of the Russian Federation.
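
The correlation and regression analysis described in the abstract can be illustrated with standard statistical tooling. The figures below are invented for illustration only; the study's regional data are not reproduced here.

```python
from scipy import stats

# Hypothetical regional data: rolling-stock units vs gross regional product (bn RUB)
vehicles = [1200, 950, 1800, 700, 1500, 2100, 820, 1300]
grp      = [410,  300,  620, 250,  540,  700, 280,  450]

# Strength of the linear relationship between fleet size and GRP
r, p_value = stats.pearsonr(vehicles, grp)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")

# A simple least-squares fit gives the regression line used to quantify the link
slope, intercept, r_value, p_reg, stderr = stats.linregress(vehicles, grp)
print(f"GRP ~ {intercept:.1f} + {slope:.3f} * vehicles")
```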
3

Jesenko, Berndt, and Christian Schlögl. "The effect of web of science subject categories on clustering: the case of data-driven methods in business and economic sciences." Scientometrics 126, no. 8 (June 23, 2021): 6785–801. http://dx.doi.org/10.1007/s11192-021-04060-4.

Abstract:
The primary goal of this article is to identify the research fronts on the application of data-driven methods in business and economics. For this purpose, the research literature of the business and economic sciences Subject Categories from the Web of Science is mapped using BibExcel and VOSviewer. Since the assignment to subject categories is done at the journal level and since a journal is often assigned to several subject categories in Web of Science, two mappings are performed: one without considering multiple assignments (broad view) and one considering only those (articles from) journals that have been assigned exclusively to the business and economic sciences subject categories and no others (narrow view). A further aim of this article is therefore to identify differences between the two mappings. Surprisingly, engineering sciences play a major role in the broad mapping, in addition to the economic sciences. In the narrow mapping, however, only the following clusters with a clear business-management focus emerge: (i) Data-driven methods in management in general and data-driven supply chain management in particular, (ii) Data-driven operations research analyses with different business administration/management focuses, (iii) Data-driven methods and processes in economics and finance, and (iv) Data-driven methods in Information Systems. One limitation of the narrow mapping is that many relevant documents are not covered, since the journals in which they appear are assigned to multiple subject categories in WoS. The paper comes to the conclusion that the multiple assignment of subject categories in Web of Science may lead to massive changes in the results. Adjacent subject areas (in this specific case, the application of data-driven methods in engineering and more mathematically oriented contributions in economics, i.e. econometrics) are considered in the broad mapping, which does not exclude subject categories from neighbouring disciplines, and are even over-represented compared to the core areas of business and economics. If a mapping should consider only the core aspects of particular research fields, this use case shows that excluding Web of Science subject categories that do not belong to the core areas because of multiple assignments (the narrow view) may be a valuable alternative. Finally, it is up to the reader to decide which mapping is more beneficial to them.
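
The difference between the broad and narrow views comes down to how records whose journals carry multiple Web of Science subject categories are filtered. A schematic sketch with hypothetical records and category names (not the authors' pipeline):

```python
# Each record lists the WoS subject categories assigned to its journal (invented examples).
records = [
    {"id": "A", "categories": {"Business", "Management"}},
    {"id": "B", "categories": {"Economics", "Engineering, Industrial"}},
    {"id": "C", "categories": {"Business, Finance"}},
]

CORE = {"Business", "Business, Finance", "Economics", "Management"}

# Broad view: keep any record with at least one core category.
broad = [r for r in records if r["categories"] & CORE]

# Narrow view: keep only records whose journal is assigned exclusively to core categories.
narrow = [r for r in records if r["categories"] <= CORE]

print([r["id"] for r in broad])   # ['A', 'B', 'C']
print([r["id"] for r in narrow])  # ['A', 'C']
```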
4

Keramatfar, Abdalsamad, and Hossein Amirkhani. "Bibliometrics of sentiment analysis literature." Journal of Information Science 45, no. 1 (March 19, 2018): 3–15. http://dx.doi.org/10.1177/0165551518761013.

Abstract:
This article provides a bibliometric study of the sentiment analysis literature based on Web of Science (WoS) until the end of 2016 to evaluate current research trends, quantitatively and qualitatively. We concentrate on the analysis of scientific documents, distribution of subject categories, languages of documents and languages that have been more investigated in sentiment analysis, most prolific and impactful authors and institutions, venues of publications and their geographic distribution, most cited and hot documents, trends of keywords and future works. Our investigations demonstrate that the most frequent subject categories in this field are computer science, engineering, telecommunications, linguistics, operations research and management science, information science and library science, business and economics, automation and control systems, robotics and social sciences. In addition, the most active venue of publication in this field is Lecture Notes in Computer Science (LNCS). The United States, China and Singapore have the most prolific or impactful institutions. A keyword analysis demonstrates that sentiment analysis is a more accepted term than opinion mining. Twitter is the most used social network for sentiment analysis and Support Vector Machine (SVM) is the most used classification method. We also present the most cited and hot documents in this field and authors’ suggestions for future works.
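
The keyword findings reported here (for example, that "sentiment analysis" is a more accepted term than "opinion mining") rest on frequency counts over the author-keyword field. A minimal sketch on invented records, not the WoS corpus:

```python
from collections import Counter

# Hypothetical author-keyword lists extracted from bibliographic records
records = [
    ["sentiment analysis", "twitter", "svm"],
    ["opinion mining", "sentiment analysis", "deep learning"],
    ["sentiment analysis", "support vector machine", "twitter"],
]

# Count how often each keyword appears across the record set
counts = Counter(kw.lower() for keywords in records for kw in keywords)
for keyword, n in counts.most_common(5):
    print(f"{keyword}: {n}")
```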
5

Taskin, Zehra, and Umut Al. "Natural language processing applications in library and information science." Online Information Review 43, no. 4 (August 12, 2019): 676–90. http://dx.doi.org/10.1108/oir-07-2018-0217.

Abstract:
Purpose: With the recent developments in information technologies, natural language processing (NLP) practices have made tasks in many areas easier and more practical. Nowadays, especially when big data are used in most research, NLP provides fast and easy methods for processing these data. The purpose of this paper is to identify subfields of library and information science (LIS) where NLP can be used and to provide a guide based on bibliometrics and social network analyses for researchers who intend to study this subject.
Design/methodology/approach: Within the scope of this study, 6,607 publications, including NLP methods published in the field of LIS, are examined and visualized by social network analysis methods.
Findings: After evaluating the obtained results, the subject categories of publications, frequently used keywords in these publications and the relationships between these words are revealed. Finally, the core journals and articles are classified thematically for researchers working in the field of LIS and planning to apply NLP in their research.
Originality/value: The results of this paper draw a general framework for the LIS field and guide researchers on new techniques that may be useful in the field.
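
The keyword relationships reported here are typically derived from a co-word (co-occurrence) network built over the publication set. A minimal sketch with networkx, using invented keywords rather than the 6,607-publication dataset:

```python
import itertools
import networkx as nx

# Hypothetical author-keyword sets, one per publication
papers = [
    {"natural language processing", "text mining", "libraries"},
    {"natural language processing", "topic modeling"},
    {"text mining", "topic modeling", "libraries"},
]

G = nx.Graph()
for keywords in papers:
    for a, b in itertools.combinations(sorted(keywords), 2):
        # Increment the edge weight each time two keywords co-occur in a paper
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

# Rank keyword pairs by co-occurrence strength
print(sorted(G.edges(data="weight"), key=lambda e: -e[2])[:3])
```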
6

Sorz, Johannes, Wolfgang Glänzel, Ursula Ulrych, Christian Gumpenberger, and Juan Gorraiz. "Research strengths identified by esteem and bibliometric indicators: a case study at the University of Vienna." Scientometrics 125, no. 2 (September 3, 2020): 1095–116. http://dx.doi.org/10.1007/s11192-020-03672-6.

Abstract:
The identification of one’s own research strengths is of crucial importance for research administration at universities. In this case study, two different approaches were applied to the University of Vienna. The first relies on funding and rankings information as well as on other esteem indicators. The second is based on a bibliometric analysis of the publication output. We used two alternative clusterings of publications for the bibliometric analysis: Web of Science subject categories and lists of researchers associated with esteem indicators. Both esteem indicators and bibliometric analysis proved to be useful in identifying research strengths, led to similar results and are meant to be used together to complement each other. We found that the greatest hindrance in the bibliometric approach lies in the inherent limitations of journal-assignment-based classification systems and the considerable time and effort required for more accurate researcher-based publication analyses. Further investigation on this subject, including new and alternative metrics, is needed and will be conducted in the future. However, the preliminary results already demonstrate the benefits of using esteem factors together with bibliometric analyses for defining the research strengths of universities.
7

Su, Fangli, Yin Zhang, and Zachary Immel. "Digital humanities research: interdisciplinary collaborations, themes and implications to library and information science." Journal of Documentation 77, no. 1 (September 3, 2020): 143–61. http://dx.doi.org/10.1108/jd-05-2020-0072.

Abstract:
Purpose: The purpose of this paper is to examine the structure, patterns and themes of interdisciplinary collaborations in digital humanities (DH) research through the application of social network analysis and visualization tools.
Design/methodology/approach: The sample includes articles containing DH research in the Web of Science Core Collection as of December 2018. First, co-occurrence data representing collaborations among disciplines were extracted from the subject categories. Second, the descriptive statistics, network indicators and interdisciplinary communities were calculated. Third, the research topics of different interdisciplinary collaboration communities were detected based on system keywords, author keywords, titles and abstracts.
Findings: The findings reveal that while the scope of disciplines involved in DH research is broad and evolving over time, most interdisciplinary collaborations are concentrated among several disciplines, including computer science, library and information science, linguistics and literature. The study further uncovers some communities based on closely collaborating disciplines and the evolving nature of such interdisciplinary collaboration communities over time. To better understand the close collaboration ties, the study traces and analyzes the research topics and themes of the interdisciplinary communities. Finally, the implications of the findings for DH research are discussed.
Originality/value: This study applied various informetric methods and tools to reveal the collaboration structure, patterns and themes among disciplines in DH research.
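
The interdisciplinary communities described above can be recovered from a weighted discipline co-occurrence network with a standard community-detection routine. A minimal sketch with networkx and invented co-occurrence counts, not the authors' data:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical weighted co-occurrence counts between subject categories
edges = [
    ("Computer Science", "Library & Information Science", 12),
    ("Computer Science", "Linguistics", 9),
    ("Linguistics", "Literature", 7),
    ("Library & Information Science", "Literature", 3),
    ("History", "Literature", 5),
]

G = nx.Graph()
G.add_weighted_edges_from(edges)

# Greedy modularity maximisation groups closely collaborating disciplines
for community in greedy_modularity_communities(G, weight="weight"):
    print(sorted(community))
```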
8

Joorabchi, Arash, Michael English, and Abdulhussain E. Mahdi. "Text mining stackoverflow." Journal of Enterprise Information Management 29, no. 2 (March 7, 2016): 255–75. http://dx.doi.org/10.1108/jeim-11-2014-0109.

Abstract:
Purpose: The use of social media, and in particular community question answering (Q&A) websites, by learners has increased significantly in recent years. The vast amounts of data posted on these sites provide an opportunity to investigate the topics under discussion and those receiving most attention. The purpose of this paper is to automatically analyse the content of a popular computer programming Q&A website, StackOverflow (SO), determine the exact topics of posted Q&As, and narrow down their categories to help determine the subject difficulties of learners. By doing so, the authors have been able to rank identified topics and categories according to their frequencies and, therefore, mark the most asked-about subjects and, hence, identify the most difficult and challenging topics commonly faced by learners of computer programming and software development.
Design/methodology/approach: In this work the authors have adopted a heuristic research approach combined with a text mining approach to investigate the topics and categories of Q&A posts on the SO website. Almost 186,000 Q&A posts were analysed and their categories refined using Wikipedia as a crowd-sourced classification system. After identifying and counting the occurrence frequency of all the topics and categories, their semantic relationships were established. These data were then presented as a rich graph which could be visualized using graph visualization software such as Gephi.
Findings: The reported results and corresponding discussion indicate that the insight gained from the process can be further refined and potentially used by instructors, teachers, and educators to pay more attention to and focus on the commonly occurring topics/subjects when designing their course material, delivery, and teaching methods.
Research limitations/implications: The proposed approach limits the scope of the analysis to a subset of Q&As which contain one or more links to Wikipedia. Therefore, developing more sophisticated text mining methods capable of analysing a larger portion of the available data would improve the accuracy and generalizability of the results.
Originality/value: The application of text mining and data analytics technologies in education has created a new interdisciplinary field of research between the education and information sciences, called Educational Data Mining (EDM). The work presented in this paper falls under this field of research; it is an early attempt at investigating the practical applications of text mining technologies in the area of computer science (CS) education.
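
The core step, extracting Wikipedia links from post bodies and counting how often each linked topic occurs, can be sketched as follows (hypothetical post texts, not the StackOverflow dump or the authors' code):

```python
import re
from collections import Counter

# Hypothetical Q&A post bodies containing links to Wikipedia articles
posts = [
    "See https://en.wikipedia.org/wiki/Recursion for the basics.",
    "This is a classic https://en.wikipedia.org/wiki/Dynamic_programming problem.",
    "Compare with https://en.wikipedia.org/wiki/Recursion and memoisation.",
]

# Capture the article title that follows /wiki/ in each link
topic_pattern = re.compile(r"en\.wikipedia\.org/wiki/([\w()%-]+)")

topics = Counter(
    match.replace("_", " ")
    for body in posts
    for match in topic_pattern.findall(body)
)
print(topics.most_common())  # most frequently referenced topics first
```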
9

Papi, Zeinab, Saeid Rezaei Sharifabadi, Sedigheh Mohammadesmaeil, and Nadjla Hariri. "Technical requirements for copyright protection of electronic theses and dissertations in INSTD." Electronic Library 35, no. 1 (February 6, 2017): 21–35. http://dx.doi.org/10.1108/el-11-2015-0226.

Abstract:
Purpose: This study aims to determine the technical requirements for copyright protection of theses and dissertations in order to propose a model for application in Iran’s National System for Theses and Dissertations (INSTD).
Design/methodology/approach: This study used a mixed research methodology. Grounded theory was used in the qualitative phase, and a researcher-made checklist was applied in the quantitative phase to survey the status of the INSTD. The research population included the INSTD as well as six information specialists and copyright experts. Data were analysed using open, axial and selective coding.
Findings: Based on the data extracted from the completed checklists, some technical requirements had already been provided in the system. The technical requirements that interviewees pointed out comprised two main classes, technical components and technical-software infrastructures, explored in the grounded theory phase. The individual categories included access control, copy control, technical-software challenges, protection standards, hypertext transfer protocol secure, certificate authority, documentation of thesis and dissertation information, the use of digital object identifiers, copy detection systems, thesis and dissertation integrated systems, digital rights management systems and electronic copyright management systems.
Research limitations/implications: Considering the subject of this study, only the technical aspect was investigated, and other aspects were not included. In addition, electronic theses and dissertation (ETD) providers were not well aware of copyright issues.
Practical implications: Using technical requirements with high security in the INSTD is effective in gaining the trust of authors and encouraging them to deposit their ETDs.
Social implications: The increased use of the system encourages authors to be more innovative in conducting their research.
Originality/value: Considering the continued violation of copyright in electronic databases, applying technical requirements for copyright protection and regulating users’ access to the information of theses and dissertations are needed in the INSTD.
10

Kabera, Samuel. "Impact of Good Governance on Social and Economic Development in Rwanda." International Journal of Scientific Research and Management 10, no. 05 (May 9, 2022): 2322–46. http://dx.doi.org/10.18535/ijsrm/v10i5.el04.

Abstract:
The purpose of this study was to assess the impact of good governance on social and economic development in Rwanda. The specific objectives were: to determine the impact of good governance on social and economic development brought about by the Rwanda Governance Board; to analyze the challenges facing good governance for social and economic development at the Rwanda Governance Board; and to establish the relationship between good governance and socio-economic development in Rwanda. The study was designed as a case study of the Rwanda Governance Board using the survey method; the case study involved an analysis of the impact of good governance on social and economic development in Rwanda, assuming that the researcher acquired knowledge of the subject under review from an in-depth exploration of a single case. It was a qualitative analysis involving careful observation of a situation. Respondents were drawn from different sectors of the population to answer the research questionnaires, and the researcher used questionnaires to collect data. The population comprised 145 employees, administrators and policy makers from the Rwanda Governance Board; the target population of a study is the point of focus from which a generalization is made regarding the research findings. Thus, a sample size of 60 respondents, comprising citizens and staff from three sectors, was considered representative of the total population. Primary and secondary data were used to obtain all the information needed for this study, and the quantitative data were analyzed using descriptive and inferential statistics after running the data collected through the Statistical Package for the Social Sciences (SPSS). The first research objective was to determine the impact of good governance on social and economic development brought about by the Rwanda Governance Board; the findings indicated that good governance has contributed in a major way to the development of the other socio-economic sectors in the country. However, the results also show that civil servants have taken part in bad governance, which led to the destruction of social development, and that responsibility for this lies with government workers. The second research objective was to analyze the challenges facing good governance for social and economic development at the Rwanda Governance Board; the findings revealed many consequences of weak or bad governance, including corruption, fraud and embezzlement, as well as a loss of public confidence in government institutions. Finally, the third research objective was to understand the relationship between good governance and socio-economic development in Rwanda. The nature and administrative requirements of good governance may be poorly understood both by top government organizations and by civil servants; in addition, a general weakness of accountability and a lack of transparency are challenges to good governance at the RGB. Nevertheless, most politicians, civil servants and community members acknowledged the importance of applying the principles of good governance to enhance economic and social development. Firstly, legislative bodies and civil society should maintain and implement the principles of good governance in order to rebuild and develop the country and its people.
Secondly, civil servants and politicians should change the attitudes that fuel bad governance in the country, in order to eradicate poverty and create employment opportunities. Thirdly, since the principles of good governance are lacking, the government should establish and implement quality control systems that assure transparency and efficiency, and, to increase economic development, improve the economic mechanisms, renewing mechanisms and policies so as to release productive forces and expand domestic and foreign markets in order to improve infrastructure.
11

Olawuyi, Seyi Olalekan, and Abbyssinia Mushunje. "Access to special COVID-19 relief from distress grant and livelihood outcome of livestock farming households in Eastern Cape Province, South Africa." AIMS Agriculture and Food 8, no. 2 (2023): 598–614. http://dx.doi.org/10.3934/agrfood.2023033.

Abstract:
Unexpected events and shocks constitute greater threats to the attainment of zero-hunger targets in Africa and the world over, and in the extreme case lead to total collapse of the global food system and food supply chain. Consequently, this causes significant loss of critical income sources, renders individuals vulnerable, and further deteriorates households' livelihood outcomes and welfare state. Therefore, the need for social protection programs to mitigate the impact of distress and unexpected events, as well as extreme occurrences, cannot be over-emphasized. This research used a dataset of 1499 households captured in the 2021 South African General Household Survey to investigate whether access to a special relief from distress grant has an effect on livestock farming households' food security status in the Eastern Cape Province of South Africa. Descriptive statistics, cross-tabulation, a two-sample t-test, a food insecurity experience-based scale technique, and a fractional outcome model were used to analyze the datasets. Based on access to the grant, households in the non-beneficiary group are significantly distinguishable from their beneficiary counterparts, such that the beneficiary households out-performed the non-beneficiary households in the food break-even and food surplus categories. The findings further indicated the possibility of transition of the beneficiary households under the transitory food insecurity category to either chronic food insecurity status or food break-even status, subject to the effectiveness of the food security policy to which they are exposed. The fractional outcome model also indicated that non-metropolitan resident households (p < 0.05), access to the special grant (p < 0.01), access to health facilities (p < 0.01), age of household heads (p < 0.01), the coloured, Indian and white population groups (p < 0.01), as well as access to remittances (p < 0.01) made significant contributions to the households' food security status. The Wald test indicated that access to the special relief grant had a significant effect on the households' food security status in the study area. The study therefore recommends accelerated investment in various social investment programs as sustained responses to expected and unexpected shocks and occurrences, in order to induce progress and realize more resilient food systems.
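
A fractional outcome model of the kind mentioned above is commonly estimated as a generalized linear model with a binomial family and logit link on a response bounded between 0 and 1 (the fractional logit approach). The sketch below uses simulated data with covariates loosely named after those in the abstract; it is not the General Household Survey data or the authors' specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500

# Simulated household covariates (names are illustrative only)
df = pd.DataFrame({
    "grant_access": rng.integers(0, 2, n),   # 1 = received the special relief grant
    "head_age": rng.integers(20, 80, n),
    "remittance": rng.integers(0, 2, n),
})

# Simulated food-security score as a fraction in (0, 1)
linear = -0.5 + 0.8 * df["grant_access"] + 0.01 * df["head_age"] + 0.4 * df["remittance"]
df["food_security"] = 1 / (1 + np.exp(-(linear + rng.normal(0, 0.5, n))))

# Fractional logit: binomial-family GLM with a logit link and robust standard errors
X = sm.add_constant(df[["grant_access", "head_age", "remittance"]])
result = sm.GLM(df["food_security"], X, family=sm.families.Binomial()).fit(cov_type="HC1")
print(result.summary())
```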
12

Pinto, Maria, Cristina Pouliot, and José Antonio Cordón-García. "E-book reading among Spanish university students." Electronic Library 32, no. 4 (July 29, 2014): 473–92. http://dx.doi.org/10.1108/el-05-2012-0048.

Abstract:
Purpose: This paper aims to show data about Spanish higher-education students’ usage, habits and perceptions regarding reading on new digital media, in order to show the potential future of electronic books (e-books) and mobile reading devices (e-readers, tablets, cell phones, etc.) in academia. It explores whether demographic and academic factors might influence e-book reading habits and attitudes, and university students’ opinions about e-books vs print books. REWIL 2.0, a purpose-built research tool, was applied to measure students’ opinions about digital reading in different media and formats, considering their academic context, at the confluence of analog and digital materials and learning. Likewise, REWIL 2.0 detects who is an e-book reader (eBR) and who is not, and produces a statistical indicator to identify five categories of eBRs by their frequency of e-book reading. This research gathered 745 online surveys between April and July 2010 in 15 degree programs at the University of Granada: Spanish philology, English philology, history, mathematics, chemistry, environmental sciences, education, library and information science, law, medicine, biology, dentistry, computer systems, architecture and civil engineering.
Design/methodology/approach: The present study is a transversal applied research project in which 745 students from 15 different academic disciplines offered at the University of Granada (Spain), representing the five main discipline areas, were surveyed. The survey was carried out by means of a structured online questionnaire, using the REWIL 2.0 research tool. To ensure internal consistency of the correlation between two different survey items designed to measure e-book reading frequency, Pearson’s r reliability test was applied. Likewise, Pearson’s chi-squared statistics were applied to test the hypotheses and to detect whether a significant correlation existed between academic disciplines and e-book reading frequency, measured through a Likert scale.
Findings: The present research is motivated by our interest in discovering what effect the current technological maelstrom and the rapid growth of new portable digital reading devices in the Spanish university environment are having on students’ lives, and the extent to which students have adopted new reading technologies. The first aim is to establish who is reading e-books in the university. A second aim is to answer the following question: is the academic discipline a determinant factor in e-book reading habits and students’ attitudes towards them? The authors began by considering the following hypotheses: university students’ attitudes to e-book reading and the way they use e-books will be determined by the scientific discipline they study; students of humanities, social sciences and law will prefer to read traditional-format books (printed paper), while students of experimental sciences, health and technical courses will prefer reading e-books; and students’ preferences will be determined by their previous reading experiences.
Originality/value: The main objective of the present study is to learn whether there are any notable differences among university students from distinct disciplines with regard to their attitude and behavior toward e-books.
The authors, therefore, set out to identify the segment of the student population that does not read e-books yet (non-eBRs) from those who have already read at least one (eBRs), and within this segment, the readers that have read e-books recently (recent eBRs); find out how frequently university students are reading in different formats (paper and digital), document types (book, written press, etc.) and languages (textual, multimodal, etc.) identify what channels are used to access e-books; find out university students’ opinions on the advantages and disadvantages of reading e-books as compared to traditional print books; and identify the types of improvements or changes to the design–production–distribution–reception chain that students consider might help extend e-book reading.
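
The Pearson chi-squared test of independence between academic discipline and e-book reading frequency can be illustrated as follows. The contingency table is invented for the example; it is not the Granada survey data.

```python
from scipy.stats import chi2_contingency

# Rows: discipline areas; columns: e-book reading frequency (never / occasional / frequent)
contingency = [
    [40, 35, 25],   # humanities
    [30, 40, 30],   # social sciences & law
    [20, 35, 45],   # experimental / health / technical sciences
]

chi2, p, dof, expected = chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A small p-value would indicate that reading frequency is associated with discipline
```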
13

Thor, Andreas, Lutz Bornmann, Robin Haunschild, and Loet Leydesdorff. "Which are the influential publications in the Web of Science subject categories over a long period of time? CRExplorer software used for big-data analyses in bibliometrics." Journal of Information Science, April 23, 2020, 016555152091381. http://dx.doi.org/10.1177/0165551520913817.

Abstract:
What are the landmark papers in scientific disciplines? Which papers are indispensable for scientific progress? These are typical questions which are of interest not only for researchers (who frequently know the answers, or think they know them) but also for the interested general public. Citation counts can be used to identify very useful papers, since they reflect the wisdom of the crowd – in this case, the scientists using published results for their research. In this study, we identified landmark publications in nearly all Web of Science subject categories (WoS-SCs) with recently developed methods for the program CRExplorer. These are publications which belong to the top 1‰ of their subject area more frequently than other publications across the citing years. As examples, we show the results of five subject categories: ‘Information Science & Library Science’, ‘Computer Science, Information Systems’, ‘Computer Science, Software Engineering’, ‘Psychology, Social’ and ‘Chemistry, Physical’. The results for the other WoS-SCs can be found online at http://crexplorer.net. An analyst of the results should keep in mind that the identification of landmark papers depends on the methods and data used. Small differences in methods and/or data may lead to different results.
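
The underlying selection idea, flagging cited references that repeatedly reach the top 1‰ of citation impact within individual citing years, can be sketched roughly as follows. The records are hypothetical, and CRExplorer's actual procedure is more elaborate than this illustration.

```python
import pandas as pd

# Hypothetical cited-reference records: one row per (cited paper, citing year, citations)
df = pd.DataFrame({
    "paper":       ["P1", "P2", "P3", "P1", "P2", "P4", "P1", "P3", "P5"],
    "citing_year": [2010, 2010, 2010, 2011, 2011, 2011, 2012, 2012, 2012],
    "citations":   [120,   15,    8,  140,   20,   11,  160,   13,    9],
})

# Within each citing year, mark papers at or above the 99.9th citation percentile
threshold = df.groupby("citing_year")["citations"].transform(lambda s: s.quantile(0.999))
df["top_1_permille"] = df["citations"] >= threshold

# Landmark candidates: papers that reach the top band in many citing years
print(df[df["top_1_permille"]].groupby("paper").size().sort_values(ascending=False))
```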
14

Livingstone, Randall M. "Let’s Leave the Bias to the Mainstream Media: A Wikipedia Community Fighting for Information Neutrality." M/C Journal 13, no. 6 (November 23, 2010). http://dx.doi.org/10.5204/mcj.315.

Abstract:
Although I'm a rich white guy, I'm also a feminist anti-racism activist who fights for the rights of the poor and oppressed. (Carl Kenner)Systemic bias is a scourge to the pillar of neutrality. (Cerejota)Count me in. Let's leave the bias to the mainstream media. (Orcar967)Because this is so important. (CuttingEdge)These are a handful of comments posted by online editors who have banded together in a virtual coalition to combat Western bias on the world’s largest digital encyclopedia, Wikipedia. This collective action by Wikipedians both acknowledges the inherent inequalities of a user-controlled information project like Wikpedia and highlights the potential for progressive change within that same project. These community members are taking the responsibility of social change into their own hands (or more aptly, their own keyboards).In recent years much research has emerged on Wikipedia from varying fields, ranging from computer science, to business and information systems, to the social sciences. While critical at times of Wikipedia’s growth, governance, and influence, most of this work observes with optimism that barriers to improvement are not firmly structural, but rather they are socially constructed, leaving open the possibility of important and lasting change for the better.WikiProject: Countering Systemic Bias (WP:CSB) considers one such collective effort. Close to 350 editors have signed on to the project, which began in 2004 and itself emerged from a similar project named CROSSBOW, or the “Committee Regarding Overcoming Serious Systemic Bias on Wikipedia.” As a WikiProject, the term used for a loose group of editors who collaborate around a particular topic, these editors work within the Wikipedia site and collectively create a social network that is unified around one central aim—representing the un- and underrepresented—and yet they are bound by no particular unified set of interests. The first stage of a multi-method study, this paper looks at a snapshot of WP:CSB’s activity from both content analysis and social network perspectives to discover “who” geographically this coalition of the unrepresented is inserting into the digital annals of Wikipedia.Wikipedia and WikipediansDeveloped in 2001 by Internet entrepreneur Jimmy Wales and academic Larry Sanger, Wikipedia is an online collaborative encyclopedia hosting articles in nearly 250 languages (Cohen). The English-language Wikipedia contains over 3.2 million articles, each of which is created, edited, and updated solely by users (Wikipedia “Welcome”). At the time of this study, Alexa, a website tracking organisation, ranked Wikipedia as the 6th most accessed site on the Internet. Unlike the five sites ahead of it though—Google, Facebook, Yahoo, YouTube (owned by Google), and live.com (owned by Microsoft)—all of which are multibillion-dollar businesses that deal more with information aggregation than information production, Wikipedia is a non-profit that operates on less than $500,000 a year and staffs only a dozen paid employees (Lih). Wikipedia is financed and supported by the WikiMedia Foundation, a charitable umbrella organisation with an annual budget of $4.6 million, mainly funded by donations (Middleton).Wikipedia editors and contributors have the option of creating a user profile and participating via a username, or they may participate anonymously, with only an IP address representing their actions. 
Despite the option for total anonymity, many Wikipedians have chosen to visibly engage in this online community (Ayers, Matthews, and Yates; Bruns; Lih), and researchers across disciplines are studying the motivations of these new online collectives (Kane, Majchrzak, Johnson, and Chenisern; Oreg and Nov). The motivations of open source software contributors, such as UNIX programmers and programming groups, have been shown to be complex and tied to both extrinsic and intrinsic rewards, including online reputation, self-satisfaction and enjoyment, and obligation to a greater common good (Hertel, Niedner, and Herrmann; Osterloh and Rota). Investigation into why Wikipedians edit has indicated multiple motivations as well, with community engagement, task enjoyment, and information sharing among the most significant (Schroer and Hertel). Additionally, Wikipedians seem to be taking up the cause of generativity (a concern for the ongoing health and openness of the Internet’s infrastructures) that Jonathan Zittrain notably called for in The Future of the Internet and How to Stop It. Governance and ControlAlthough the technical infrastructure of Wikipedia is built to support and perhaps encourage an equal distribution of power on the site, Wikipedia is not a land of “anything goes.” The popular press has covered recent efforts by the site to reduce vandalism through a layer of editorial review (Cohen), a tightening of control cited as a possible reason for the recent dip in the number of active editors (Edwards). A number of regulations are already in place that prevent the open editing of certain articles and pages, such as the site’s disclaimers and pages that have suffered large amounts of vandalism. Editing wars can also cause temporary restrictions to editing, and Ayers, Matthews, and Yates point out that these wars can happen anywhere, even to Burt Reynold’s page.Academic studies have begun to explore the governance and control that has developed in the Wikipedia community, generally highlighting how order is maintained not through particular actors, but through established procedures and norms. Konieczny tested whether Wikipedia’s evolution can be defined by Michels’ Iron Law of Oligopoly, which predicts that the everyday operations of any organisation cannot be run by a mass of members, and ultimately control falls into the hands of the few. Through exploring a particular WikiProject on information validation, he concludes:There are few indicators of an oligarchy having power on Wikipedia, and few trends of a change in this situation. The high level of empowerment of individual Wikipedia editors with regard to policy making, the ease of communication, and the high dedication to ideals of contributors succeed in making Wikipedia an atypical organization, quite resilient to the Iron Law. (189)Butler, Joyce, and Pike support this assertion, though they emphasise that instead of oligarchy, control becomes encapsulated in a wide variety of structures, policies, and procedures that guide involvement with the site. A virtual “bureaucracy” emerges, but one that should not be viewed with the negative connotation often associated with the term.Other work considers control on Wikipedia through the framework of commons governance, where “peer production depends on individual action that is self-selected and decentralized rather than hierarchically assigned. Individuals make their own choices with regard to resources managed as a commons” (Viegas, Wattenberg and McKeon). 
The need for quality standards and quality control largely dictate this commons governance, though interviewing Wikipedians with various levels of responsibility revealed that policies and procedures are only as good as those who maintain them. Forte, Larco, and Bruckman argue “the Wikipedia community has remained healthy in large part due to the continued presence of ‘old-timers’ who carry a set of social norms and organizational ideals with them into every WikiProject, committee, and local process in which they take part” (71). Thus governance on Wikipedia is a strong representation of a democratic ideal, where actors and policies are closely tied in their evolution. Transparency, Content, and BiasThe issue of transparency has proved to be a double-edged sword for Wikipedia and Wikipedians. The goal of a collective body of knowledge created by all—the “expert” and the “amateur”—can only be upheld if equal access to page creation and development is allotted to everyone, including those who prefer anonymity. And yet this very option for anonymity, or even worse, false identities, has been a sore subject for some in the Wikipedia community as well as a source of concern for some scholars (Santana and Wood). The case of a 24-year old college dropout who represented himself as a multiple Ph.D.-holding theology scholar and edited over 16,000 articles brought these issues into the public spotlight in 2007 (Doran; Elsworth). Wikipedia itself has set up standards for content that include expectations of a neutral point of view, verifiability of information, and the publishing of no original research, but Santana and Wood argue that self-policing of these policies is not adequate:The principle of managerial discretion requires that every actor act from a sense of duty to exercise moral autonomy and choice in responsible ways. When Wikipedia’s editors and administrators remain anonymous, this criterion is simply not met. It is assumed that everyone is behaving responsibly within the Wikipedia system, but there are no monitoring or control mechanisms to make sure that this is so, and there is ample evidence that it is not so. (141) At the theoretical level, some downplay these concerns of transparency and autonomy as logistical issues in lieu of the potential for information systems to support rational discourse and emancipatory forms of communication (Hansen, Berente, and Lyytinen), but others worry that the questionable “realities” created on Wikipedia will become truths once circulated to all areas of the Web (Langlois and Elmer). With the number of articles on the English-language version of Wikipedia reaching well into the millions, the task of mapping and assessing content has become a tremendous endeavour, one mostly taken on by information systems experts. Kittur, Chi, and Suh have used Wikipedia’s existing hierarchical categorisation structure to map change in the site’s content over the past few years. Their work revealed that in early 2008 “Culture and the arts” was the most dominant category of content on Wikipedia, representing nearly 30% of total content. People (15%) and geographical locations (14%) represent the next largest categories, while the natural and physical sciences showed the greatest increase in volume between 2006 and 2008 (+213%D, with “Culture and the arts” close behind at +210%D). 
This data may indicate that contributing to Wikipedia, and thus spreading knowledge, is growing amongst the academic community while maintaining its importance to the greater popular culture-minded community. Further work by Kittur and Kraut has explored the collaborative process of content creation, finding that too many editors on a particular page can reduce the quality of content, even when a project is well coordinated.Bias in Wikipedia content is a generally acknowledged and somewhat conflicted subject (Giles; Johnson; McHenry). The Wikipedia community has created numerous articles and pages within the site to define and discuss the problem. Citing a survey conducted by the University of Würzburg, Germany, the “Wikipedia:Systemic bias” page describes the average Wikipedian as:MaleTechnically inclinedFormally educatedAn English speakerWhiteAged 15-49From a majority Christian countryFrom a developed nationFrom the Northern HemisphereLikely a white-collar worker or studentBias in content is thought to be perpetuated by this demographic of contributor, and the “founder effect,” a concept from genetics, linking the original contributors to this same demographic has been used to explain the origins of certain biases. Wikipedia’s “About” page discusses the issue as well, in the context of the open platform’s strengths and weaknesses:in practice editing will be performed by a certain demographic (younger rather than older, male rather than female, rich enough to afford a computer rather than poor, etc.) and may, therefore, show some bias. Some topics may not be covered well, while others may be covered in great depth. No educated arguments against this inherent bias have been advanced.Royal and Kapila’s study of Wikipedia content tested some of these assertions, finding identifiable bias in both their purposive and random sampling. They conclude that bias favoring larger countries is positively correlated with the size of the country’s Internet population, and corporations with larger revenues work in much the same way, garnering more coverage on the site. The researchers remind us that Wikipedia is “more a socially produced document than a value-free information source” (Royal & Kapila).WikiProject: Countering Systemic BiasAs a coalition of current Wikipedia editors, the WikiProject: Countering Systemic Bias (WP:CSB) attempts to counter trends in content production and points of view deemed harmful to the democratic ideals of a valueless, open online encyclopedia. WP:CBS’s mission is not one of policing the site, but rather deepening it:Generally, this project concentrates upon remedying omissions (entire topics, or particular sub-topics in extant articles) rather than on either (1) protesting inappropriate inclusions, or (2) trying to remedy issues of how material is presented. Thus, the first question is "What haven't we covered yet?", rather than "how should we change the existing coverage?" (Wikipedia, “Countering”)The project lays out a number of content areas lacking adequate representation, geographically highlighting the dearth in coverage of Africa, Latin America, Asia, and parts of Eastern Europe. WP:CSB also includes a “members” page that editors can sign to show their support, along with space to voice their opinions on the problem of bias on Wikipedia (the quotations at the beginning of this paper are taken from this “members” page). 
At the time of this study, 329 editors had self-selected and self-identified as members of WP:CSB, and this group constitutes the population sample for the current study. To explore the extent to which WP:CSB addressed these self-identified areas for improvement, each editor’s last 50 edits were coded for their primary geographical country of interest, as well as the conceptual category of the page itself (“P” for person/people, “L” for location, “I” for idea/concept, “T” for object/thing, or “NA” for indeterminate). For example, edits to the Wikipedia page for a single person like Tony Abbott (Australian federal opposition leader) were coded “Australia, P”, while an edit for a group of people like the Manchester United football team would be coded “England, P”. Coding was based on information obtained from the header paragraphs of each article’s Wikipedia page. After coding was completed, corresponding information on each country’s associated continent was added to the dataset, based on the United Nations Statistics Division listing.
A total of 15,616 edits were coded for the study. Nearly 32% (n = 4962) of these edits were on articles for persons or people (see Table A for complete coding results). From within this sub-sample of edits, a majority of the people (68.67%) represented are associated with North America and Europe (Figure A). If we break these statistics down further, nearly half of WP:CSB’s edits concerning people were associated with the United States (36.11%) and England (10.16%), with India (3.65%) and Australia (3.35%) following at a distance. These figures make sense for the English-language Wikipedia; over 95% of the population in the three Westernised countries speak English, and while India is still often regarded as a developing nation, its colonial British roots and the emergence of a market economy with large, technology-driven cities are logical explanations for its representation here (and some estimates make India the largest English-speaking nation by population on the globe today).
Table A: Coding Results
Total edits: 15616
(I) Ideas: 2881 (18.45%)
(L) Location: 2240 (14.34%)
NA: 333 (2.13%)
(T) Thing: 5200 (33.30%)
(P) People: 4962 (31.78%)
People by continent:
Africa: 315 (6.35%)
Asia: 827 (16.67%)
Australia: 175 (3.53%)
Europe: 1411 (28.44%)
NA: 110 (2.22%)
North America: 1996 (40.23%)
South America: 128 (2.58%)
The areas of the globe of main concern to WP:CSB proved to be much less represented by the coalition itself. Asia, far and away the most populous continent with more than 60% of the globe’s people (GeoHive), was represented in only 16.67% of edits. Africa (6.35%) and South America (2.58%) were equally underrepresented compared to both their real-world populations (15% and 9% of the globe’s population respectively) and the aforementioned dominance of the advanced Westernised areas. However, while these percentages may seem low, in aggregate they do meet the quota set on the WP:CSB Project Page calling for one out of every twenty edits to be “a subject that is systematically biased against the pages of your natural interests.” By this standard, the coalition is indeed making headway in adding content that strategically counterbalances the natural biases of Wikipedia’s average editor.
Figure A
Social network analysis allows us to visualise multifaceted data in order to identify relationships between actors and content (Vego-Redondo; Watts).
Similar to Davis’s well-known sociological study of Southern American socialites in the 1930s (Scott), our Wikipedia coalition can be conceptualised as individual actors united by common interests, and a network of relations can be constructed with software such as UCINET. A mapping algorithm that considers both the relationship between all sets of actors and each actor to the overall collective structure produces an image of our network. This initial network is bimodal, as both our Wikipedia editors and their edits (again, coded for country of interest) are displayed as nodes (Figure B). Edge-lines between nodes represents a relationship, and here that relationship is the act of editing a Wikipedia article. We see from our network that the “U.S.” and “England” hold central positions in the network, with a mass of editors crowding around them. A perimeter of nations is then held in place by their ties to editors through the U.S. and England, with a second layer of editors and poorly represented nations (Gabon, Laos, Uzbekistan, etc.) around the boundaries of the network.Figure BWe are reminded from this visualisation both of the centrality of the two Western powers even among WP:CSB editoss, and of the peripheral nature of most other nations in the world. But we also learn which editors in the project are contributing most to underrepresented areas, and which are less “tied” to the Western core. Here we see “Wizzy” and “Warofdreams” among the second layer of editors who act as a bridge between the core and the periphery; these are editors with interests in both the Western and marginalised nations. Located along the outer edge, “Gallador” and “Gerrit” have no direct ties to the U.S. or England, concentrating all of their edits on less represented areas of the globe. Identifying editors at these key positions in the network will help with future research, informing interview questions that will investigate their interests further, but more significantly, probing motives for participation and action within the coalition.Additionally, we can break the network down further to discover editors who appear to have similar interests in underrepresented areas. Figure C strips down the network to only editors and edits dealing with Africa and South America, the least represented continents. From this we can easily find three types of editors again: those who have singular interests in particular nations (the outermost layer of editors), those who have interests in a particular region (the second layer moving inward), and those who have interests in both of these underrepresented regions (the center layer in the figure). This last group of editors may prove to be the most crucial to understand, as they are carrying the full load of WP:CSB’s mission.Figure CThe End of Geography, or the Reclamation?In The Internet Galaxy, Manuel Castells writes that “the Internet Age has been hailed as the end of geography,” a bold suggestion, but one that has gained traction over the last 15 years as the excitement for the possibilities offered by information communication technologies has often overshadowed structural barriers to participation like the Digital Divide (207). Castells goes on to amend the “end of geography” thesis by showing how global information flows and regional Internet access rates, while creating a new “map” of the world in many ways, is still closely tied to power structures in the analog world. The Internet Age: “redefines distance but does not cancel geography” (207). 
The work of WikiProject: Countering Systemic Bias emphasises the importance of place and representation in the information environment that continues to be constructed in the online world. This study looked at only a small portion of this coalition’s efforts (~16,000 edits)—a snapshot of their labor frozen in time—which itself is only a minute portion of the information being dispatched through Wikipedia on a daily basis (~125,000 edits). Further analysis of WP:CSB’s work over time, as well as qualitative research into the identities, interests and motivations of this collective, is needed to understand more fully how information bias is understood and challenged in the Internet galaxy. The data here indicates this is a fight worth fighting for at least a growing few.ReferencesAlexa. “Top Sites.” Alexa.com, n.d. 10 Mar. 2010 ‹http://www.alexa.com/topsites>. Ayers, Phoebe, Charles Matthews, and Ben Yates. How Wikipedia Works: And How You Can Be a Part of It. San Francisco, CA: No Starch, 2008.Bruns, Axel. Blogs, Wikipedia, Second Life, and Beyond: From Production to Produsage. New York: Peter Lang, 2008.Butler, Brian, Elisabeth Joyce, and Jacqueline Pike. Don’t Look Now, But We’ve Created a Bureaucracy: The Nature and Roles of Policies and Rules in Wikipedia. Paper presented at 2008 CHI Annual Conference, Florence.Castells, Manuel. The Internet Galaxy: Reflections on the Internet, Business, and Society. Oxford: Oxford UP, 2001.Cohen, Noam. “Wikipedia.” New York Times, n.d. 12 Mar. 2010 ‹http://www.nytimes.com/info/wikipedia/>. Doran, James. “Wikipedia Chief Promises Change after ‘Expert’ Exposed as Fraud.” The Times, 6 Mar. 2007 ‹http://technology.timesonline.co.uk/tol/news/tech_and_web/article1480012.ece>. Edwards, Lin. “Report Claims Wikipedia Losing Editors in Droves.” Physorg.com, 30 Nov 2009. 12 Feb. 2010 ‹http://www.physorg.com/news178787309.html>. Elsworth, Catherine. “Fake Wikipedia Prof Altered 20,000 Entries.” London Telegraph, 6 Mar. 2007 ‹http://www.telegraph.co.uk/news/1544737/Fake-Wikipedia-prof-altered-20000-entries.html>. Forte, Andrea, Vanessa Larco, and Amy Bruckman. “Decentralization in Wikipedia Governance.” Journal of Management Information Systems 26 (2009): 49-72.Giles, Jim. “Internet Encyclopedias Go Head to Head.” Nature 438 (2005): 900-901.Hansen, Sean, Nicholas Berente, and Kalle Lyytinen. “Wikipedia, Critical Social Theory, and the Possibility of Rational Discourse.” The Information Society 25 (2009): 38-59.Hertel, Guido, Sven Niedner, and Stefanie Herrmann. “Motivation of Software Developers in Open Source Projects: An Internet-Based Survey of Contributors to the Linex Kernel.” Research Policy 32 (2003): 1159-1177.Johnson, Bobbie. “Rightwing Website Challenges ‘Liberal Bias’ of Wikipedia.” The Guardian, 1 Mar. 2007. 8 Mar. 2010 ‹http://www.guardian.co.uk/technology/2007/mar/01/wikipedia.news>. Kane, Gerald C., Ann Majchrzak, Jeremaih Johnson, and Lily Chenisern. A Longitudinal Model of Perspective Making and Perspective Taking within Fluid Online Collectives. Paper presented at the 2009 International Conference on Information Systems, Phoenix, AZ, 2009.Kittur, Aniket, Ed H. Chi, and Bongwon Suh. What’s in Wikipedia? Mapping Topics and Conflict Using Socially Annotated Category Structure. Paper presented at the 2009 CHI Annual Conference, Boston, MA.———, and Robert E. Kraut. Harnessing the Wisdom of Crowds in Wikipedia: Quality through Collaboration. 
Paper presented at the 2008 Association for Computing Machinery’s Computer Supported Cooperative Work Annual Conference, San Diego, CA. Konieczny, Piotr. “Governance, Organization, and Democracy on the Internet: The Iron Law and the Evolution of Wikipedia.” Sociological Forum 24 (2009): 162-191. ———. “Wikipedia: Community or Social Movement?” Interface: A Journal for and about Social Movements 1 (2009): 212-232. Langlois, Ganaele, and Greg Elmer. “Wikipedia Leeches? The Promotion of Traffic through a Collaborative Web Format.” New Media & Society 11 (2009): 773-794. Lih, Andrew. The Wikipedia Revolution. New York, NY: Hyperion, 2009. McHenry, Robert. “The Real Bias in Wikipedia: A Response to David Shariatmadari.” OpenDemocracy.com, 2006. 8 Mar. 2010 ‹http://www.opendemocracy.net/media-edemocracy/wikipedia_bias_3621.jsp›. Middleton, Chris. “The World of Wikinomics.” Computer Weekly, 20 Jan. 2009: 22-26. Oreg, Shaul, and Oded Nov. “Exploring Motivations for Contributing to Open Source Initiatives: The Roles of Contribution, Context and Personal Values.” Computers in Human Behavior 24 (2008): 2055-2073. Osterloh, Margit, and Sandra Rota. “Trust and Community in Open Source Software Production.” Analyse & Kritik 26 (2004): 279-301. Royal, Cindy, and Deepina Kapila. “What’s on Wikipedia, and What’s Not…?: Assessing Completeness of Information.” Social Science Computer Review 27 (2008): 138-148. Santana, Adele, and Donna J. Wood. “Transparency and Social Responsibility Issues for Wikipedia.” Ethics and Information Technology 11 (2009): 133-144. Schroer, Joachim, and Guido Hertel. “Voluntary Engagement in an Open Web-Based Encyclopedia: Wikipedians and Why They Do It.” Media Psychology 12 (2009): 96-120. Scott, John. Social Network Analysis. London: Sage, 1991. Vega-Redondo, Fernando. Complex Social Networks. Cambridge: Cambridge UP, 2007. Viegas, Fernanda B., Martin Wattenberg, and Matthew M. McKeon. “The Hidden Order of Wikipedia.” Online Communities and Social Computing (2007): 445-454. Watts, Duncan. Six Degrees: The Science of a Connected Age. New York, NY: W. W. Norton & Company, 2003. Wikipedia. “About.” n.d. 8 Mar. 2010 ‹http://en.wikipedia.org/wiki/Wikipedia:About›. ———. “Welcome to Wikipedia.” n.d. 8 Mar. 2010 ‹http://en.wikipedia.org/wiki/Main_Page›. ———. “WikiProject: Countering Systemic Bias.” n.d. 12 Feb. 2010 ‹http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Countering_systemic_bias#Members›. Zittrain, Jonathan. The Future of the Internet and How to Stop It. New Haven, CT: Yale UP, 2008.
APA, Harvard, Vancouver, ISO, and other styles
15

Burns, Alex. "Oblique Strategies for Ambient Journalism." M/C Journal 13, no. 2 (April 15, 2010). http://dx.doi.org/10.5204/mcj.230.

Full text
Abstract:
Alfred Hermida recently posited ‘ambient journalism’ as a new framework for para- and professional journalists, who use social networks like Twitter for story sources, and as a news delivery platform. Beginning with this framework, this article explores the following questions: How does Hermida define ‘ambient journalism’ and what is its significance? Are there alternative definitions? What lessons do current platforms provide for the design of future, real-time platforms that ‘ambient journalists’ might use? What lessons does the work of Brian Eno provide–the musician and producer who coined the term ‘ambient music’ over three decades ago? My aim here is to formulate an alternative definition of ambient journalism that emphasises craft, skills acquisition, and the mental models of professional journalists, which are the foundations more generally for journalism practices. Rather than Hermida’s participatory media context I emphasise ‘institutional adaptiveness’: how journalists and newsrooms in media institutions rely on craft and skills, and how emerging platforms can augment these foundations, rather than replace them. Hermida’s Ambient Journalism and the Role of Journalists Hermida describes ambient journalism as: “broad, asynchronous, lightweight and always-on communication systems [that] are creating new kinds of interactions around the news, and are enabling citizens to maintain a mental model of news and events around them” (Hermida 2). His ideas appear to have two related aspects. He conceives ambient journalism as an “awareness system” between individuals that functions as a collective intelligence or kind of ‘distributed cognition’ at a group level (Hermida 2, 4-6). Facebook, Twitter and other online social networks are examples. Hermida also suggests that such networks enable non-professionals to engage in ‘communication’ and ‘conversation’ about news and media events (Hermida 2, 7). In a helpful clarification, Hermida observes that ‘para-journalists’ are like the paralegals or non-lawyers who provide administrative support in the legal profession and, in academic debates about journalism, are more commonly known as ‘citizen journalists’. Thus, Hermida’s ambient journalism appears to be: (1) an information systems model of new platforms and networks, and (2) a normative argument that these tools empower ‘para-journalists’ to engage in journalism and real-time commentary. Hermida’s thesis is intriguing and worthy of further discussion and debate. As currently formulated however it risks sharing the blind-spots and contradictions of the academic literature that Hermida cites, which suffers from poor theory-building (Burns). A major reason is that the participatory media context on which Hermida often builds his work has different mental models and normative theories than the journalists or media institutions that are the target of critique. Ambient journalism would be a stronger and more convincing framework if these incorrect assumptions were jettisoned. Others may also potentially misunderstand what Hermida proposes, because the academic debate is often polarised between para-journalists and professional journalists, due to different views about institutions, the politics of knowledge, decision heuristics, journalist training, and normative theoretical traditions (Christians et al. 126; Cole and Harcup 166-176). 
In the academic debate, para-journalists or ‘citizen journalists’ may be said to have a communitarian ethic and desire more autonomous solutions to journalists who are framed as uncritical and reliant on official sources, and to media institutions who are portrayed as surveillance-like ‘monitors’ of society (Christians et al. 124-127). This is however only one of a range of possible relationships. Sole reliance on para-journalists could be a premature solution to a more complex media ecology. Journalism craft, which does not rely just on official sources, also has a range of practices that already provides the “more complex ways of understanding and reporting on the subtleties of public communication” sought (Hermida 2). Citizen- and para-journalist accounts may overlook micro-studies in how newsrooms adopt technological innovations and integrate them into newsgathering routines (Hemmingway 196). Thus, an examination of the realities of professional journalism will help to cast a better light on how ambient journalism can shape the mental models of para-journalists, and provide more rigorous analysis of news and similar events. Professional journalism has several core dimensions that para-journalists may overlook. Journalism’s foundation as an experiential craft includes guidance and norms that orient the journalist to information, and that includes practitioner ethics. This craft is experiential; the basis for journalism’s claim to “social expertise” as a discipline; and more like the original Linux and Open Source movements which evolved through creative conflict (Sennett 9, 25-27, 125-127, 249-251). There are learnable, transmissible skills to contextually evaluate, filter, select and distil the essential insights. This craft-based foundation and skills informs and structures the journalist’s cognitive witnessing of an event, either directly or via reconstructed, cultivated sources. The journalist publishes through a recognised media institution or online platform, which provides communal validation and verification. There is far more here than the academic portrayal of journalists as ‘gate-watchers’ for a ‘corporatist’ media elite. Craft and skills distinguish the professional journalist from Hermida’s para-journalist. Increasingly, media institutions hire journalists who are trained in other craft-based research methods (Burns and Saunders). Bethany McLean who ‘broke’ the Enron scandal was an investment banker; documentary filmmaker Errol Morris first interviewed serial killers for an early project; and Neil Chenoweth used ‘forensic accounting’ techniques to investigate Rupert Murdoch and Kerry Packer. Such expertise allows the journalist to filter information, and to mediate any influences in the external environment, in order to develop an individualised, ‘embodied’ perspective (Hofstadter 234; Thompson; Garfinkel and Rawls). Para-journalists and social network platforms cannot replace this expertise, which is often unique to individual journalists and their research teams. Ambient Journalism and Twitter Current academic debates about how citizen- and para-journalists may augment or even replace professional journalists can often turn into legitimation battles whether the ‘de facto’ solution is a social media network rather than a media institution. For example, Hermida discusses Twitter, a micro-blogging platform that allows users to post 140-character messages that are small, discrete information chunks, for short-term and episodic memory. 
Twitter enables users to monitor other users, to group other messages, and to search for terms specified by a hashtag. Twitter thus illustrates how social media platforms can make data more transparent and explicit to non-specialists like para-journalists. In fact, Twitter is suitable for five different categories of real-time information: news, pre-news, rumours, the formation of social media and subject-based networks, and “molecular search” using granular data-mining tools (Leinweber 204-205). In this model, the para-journalist acts as a navigator and “way-finder” to new information (Morville, Findability). Jaron Lanier, an early designer of ‘virtual reality’ systems, is perhaps the most vocal critic of relying on groups of non-experts and tools like Twitter, instead of individuals who have professional expertise. For Lanier, what underlies debates about citizen- and para-journalists is a philosophy of “cybernetic totalism” and “digital Maoism” which exalts the Internet collective at the expense of truly individual views. He is deeply critical of Hermida’s chosen platform, Twitter: “A design that shares Twitter’s feature of providing ambient continuous contact between people could perhaps drop Twitter’s adoration of fragments. We don’t really know, because it is an unexplored design space” [emphasis added] (Lanier 24). In part, Lanier’s objection is traceable back to an unresolved debate on human factors and design in information science. Influenced by the post-war research into cybernetics, J.C.R. Licklider proposed a cyborg-like model of “man-machine symbiosis” between computers and humans (Licklider). In turn, Licklider’s framework influenced Douglas Engelbart, who shaped the growth of human-computer interaction, and the design of computer interfaces, the mouse, and other tools (Engelbart). In taking a system-level view of platforms Hermida builds on the strength of Licklider and Engelbart’s work. Yet because he focuses on para-journalists, and does not appear to include the craft and skills-based expertise of professional journalists, it is unclear how he would answer Lanier’s fears about how reliance on groups for news and other information is superior to individual expertise and judgment. Hermida’s two case studies point to this unresolved problem. Both cases appear to show how Twitter provides quicker and better forms of news and information, thereby increasing the effectiveness of para-journalists to engage in journalism and real-time commentary. However, alternative explanations may exist that raise questions about Twitter as a new platform, and thus these cases might actually reveal circumstances in which ambient journalism may fail. Hermida alludes to how para-journalists now fulfil the earlier role of ‘first responders’ and stringers, in providing the “immediate dissemination” of non-official information about disasters and emergencies (Hermida 1-2; Haddow and Haddow 117-118). Whilst important, this is really a specific role. In fact, disaster and emergency reporting occurs within well-established practices, professional ethics, and institutional routines that may involve journalists, government officials, and professional communication experts (Moeller). Officials and emergency management planners are concerned that citizen- or para-journalism is equated with the craft and skills of professional journalism. 
The experience of these officials and planners in 2005’s Hurricane Katrina in the United States, and in 2009’s Black Saturday bushfires in Australia, suggests that whilst para-journalists might be ‘first responders’ in a decentralised, complex crisis, they are perceived to spread rumours and potential social unrest when people need reliable information (Haddow and Haddow 39). These terms of engagement between officials, planners and para-journalists are still to be resolved. Hermida readily acknowledges that Twitter and other social network platforms are vulnerable to rumours (Hermida 3-4; Sunstein). However, his other case study, Iran’s 2009 election crisis, further complicates the vision of ambient journalism, and always-on communication systems in particular. Hermida discusses several events during the crisis: the US State Department request to halt a server upgrade, how the Basij’s shooting of bystander Neda Soltan was captured on a mobile phone camera, the spread across social network platforms, and the high-velocity number of ‘tweets’ or messages during the first two weeks of Iran’s electoral uncertainty (Hermida 1). The US State Department was interested in how Twitter could be used for non-official sources, and to inform people who were monitoring the election events. Twitter’s perceived ‘success’ during Iran’s 2009 election now looks rather different when other factors are considered such as: the dynamics and patterns of Tehran street protests; Iran’s clerics who used Soltan’s death as propaganda; claims that Iran’s intelligence services used Twitter to track down and to kill protestors; the ‘black box’ case of what the US State Department and others actually did during the crisis; the history of neo-conservative interest in a Twitter-like platform for strategic information operations; and the Iranian diaspora’s incitement of Tehran student protests via satellite broadcasts. Iran’s 2009 election crisis has important lessons for ambient journalism: always-on communication systems may create noise and spread rumours; ‘mirror-imaging’ of mental models may occur, when other participants have very different worldviews and ‘contexts of use’ for social network platforms; and the new kinds of interaction may not lead to effective intervention in crisis events. Hermida’s combination of news and non-news fragments is the perfect environment for psychological operations and strategic information warfare (Burns and Eltham). Lessons of Current Platforms for Ambient Journalism We have discussed some unresolved problems for ambient journalism as a framework for journalists, and as mental models for news and similar events. Hermida’s goal of an “awareness system” faces a further challenge: the phenomenological limitations of human consciousness to deal with information complexity and ambiguous situations, whether by becoming ‘entangled’ in abstract information or by developing new, unexpected uses for emergent technologies (Thackara; Thompson; Hofstadter 101-102, 186; Morville, Findability, 55, 57, 158). The recursive and reflective capacities of human consciousness imposes its own epistemological frames. It’s still unclear how Licklider’s human-computer interaction will shape consciousness, but Douglas Hofstadter’s experiments with art and video-based group experiments may be suggestive. Hofstadter observes: “the interpenetration of our worlds becomes so great that our worldviews start to fuse” (266). 
Current research into user experience and information design provides some validation of Hofstadter’s experience, such as how Google is now the ‘default’ search engine, and how its interface design shapes the user’s subjective experience of online search (Morville, Findability; Morville, Search Patterns). Several models of Hermida’s awareness system already exist that build on Hofstadter’s insight. Within the information systems field, on-going research into artificial intelligence–‘expert systems’ that can model expertise as algorithms and decision rules, genetic algorithms, and evolutionary computation–has attempted to achieve Hermida’s goal. What these systems share are mental models of cognition, learning and adaptiveness to new information, often with forecasting and prediction capabilities. Such systems work in journalism areas such as finance and sports that involve analytics, data-mining and statistics, and in related fields such as health informatics where there are clear, explicit guidelines on information and international standards. After a mid-1980s investment bubble (Leinweber 183-184) these systems now underpin the technology platforms of global finance and news intermediaries. Bloomberg LP’s ubiquitous dual-screen computers, proprietary network and data analytics (www.bloomberg.com), and its competitors such as Thomson Reuters (www.thomsonreuters.com and www.reuters.com), illustrate how financial analysts and traders rely on an “awareness system” to navigate global stock-markets (Clifford and Creswell). For example, a Bloomberg subscriber can access real-time analytics from exchanges, markets, and from data vendors such as Dow Jones, NYSE Euronext and Thomson Reuters. They can use portfolio management tools to evaluate market information, to make allocation and trading decisions, to monitor ‘breaking’ news, and to integrate this information. Twitter is perhaps the para-journalist equivalent to how professional journalists and finance analysts rely on Bloomberg’s platform for real-time market and business information. Already, hedge funds like PhaseCapital are data-mining Twitter’s ‘tweets’ or messages for rumours, shifts in stock-market sentiment, and to analyse potential trading patterns (Pritchett and Palmer). The US-based Securities and Exchange Commission, and researchers like David Gelernter and Paul Tetlock, have also shown the benefits of applied data-mining for regulatory market supervision, in particular to uncover analysts who provide ‘whisper numbers’ to online message boards, and who have access to material, non-public information (Leinweber 60, 136, 144-145, 208, 219, 241-246). Hermida’s framework might be developed further for such regulatory supervision. Hermida’s awareness system may also benefit from the algorithms found in high-frequency trading (HFT) systems that Citadel Group, Goldman Sachs, Renaissance Technologies, and other quantitative financial institutions use. Rather than human traders, HFT uses co-located servers and complex algorithms, to make high-volume trades on stock-markets that take advantage of microsecond changes in prices (Duhigg). HFT capabilities are shrouded in secrecy, and became the focus of regulatory attention after several high-profile investigations of traders alleged to have stolen the software code (Bray and Bunge). 
One public example is Streambase (www.streambase.com), a ‘complex event processing’ (CEP) platform that can be used in HFT, and commercialised from the Project Aurora research collaboration between Brandeis University, Brown University, and Massachusetts Institute of Technology. CEP and HFT may be the ‘killer apps’ of Hermida’s awareness system. Alternatively, they may confirm Jaron Lanier’s worst fears: your data-stream and user-generated content can be harvested by others–for their gain, and your loss! Conclusion: Brian Eno and Redefining ‘Ambient Journalism’ On the basis of the above discussion, I suggest a modified definition of Hermida’s thesis: ‘Ambient journalism’ is an emerging analytical framework for journalists, informed by cognitive, cybernetic, and information systems research. It ‘sensitises’ the individual journalist, whether professional or ‘para-professional’, to observe and to evaluate their immediate context. In doing so, ‘ambient journalism’, like journalism generally, emphasises ‘novel’ information. It can also inform the design of real-time platforms for journalistic sources and news delivery. Individual ‘ambient journalists’ can learn much from the career of musician and producer Brian Eno. His personal definition of ‘ambient’ is “an atmosphere, or a surrounding influence: a tint,” that relies on the co-evolution of the musician, creative horizons, and studio technology as a tool, just as para-journalists use Twitter as a platform (Sheppard 278; Eno 293-297). Like para-journalists, Eno claims to be a “self-educated but largely untrained” musician and yet also a craft-based producer (McFadzean; Tamm 177; 44-50). Perhaps Eno would frame the distinction between para-journalist and professional journalist as “axis thinking” (Eno 298, 302) which is needlessly polarised due to different normative theories, stances, and practices. Furthermore, I would argue that Eno’s worldview was shaped by similar influences to Licklider and Engelbart, who appear to have informed Hermida’s assumptions. These influences include the mathematician and game theorist John von Neumann and biologist Richard Dawkins (Eno 162); musicians Eric Satie, John Cage and his book Silence (Eno 19-22, 162; Sheppard 22, 36, 378-379); and the field of self-organising systems, in particular cyberneticist Stafford Beer (Eno 245; Tamm 86; Sheppard 224). Eno summed up the central lesson of this theoretical corpus during his collaborations with New York’s ‘No Wave’ scene in 1978, of “people experimenting with their lives” (Eno 253; Reynolds 146-147; Sheppard 290-295). Importantly, he developed a personal view of normative theories through practice-based research, on a range of projects, and with different creative and collaborative teams. Rather than a technological solution, Eno settled on a way to encode his craft and skills into a quasi-experimental, transmittable method—an aim of practitioner development in professional journalism. Even if only a “founding myth,” the story of Eno’s 1975 street accident with a taxi, and how he conceived ‘ambient music’ during his hospital stay, illustrates how ambient journalists might perceive something new in specific circumstances (Tamm 131; Sheppard 186-188). 
More tellingly, this background informed his collaboration with the late painter Peter Schmidt, to co-create the Oblique Strategies deck of aphorisms: aleatory, oracular messages that appeared dependent on chance, luck, and randomness, but that in fact were based on Eno and Schmidt’s creative philosophy and work guidelines (Tamm 77-78; Sheppard 178-179; Reynolds 170). In short, Eno was engaging with the kind of reflective practices that underpin exemplary professional journalism. He was able to encode this craft and skills into a quasi-experimental method, rather than a technological solution. Journalists and practitioners who adopt Hermida’s framework could learn much from the published accounts of Eno’s practice-based research, in the context of creative projects and collaborative teams. In particular, these detail the contexts and choices of Eno’s early ambient music recordings (Sheppard 199-200); Eno’s duels with David Bowie during ‘Sense of Doubt’ for the Heroes album (Tamm 158; Sheppard 254-255); troubled collaborations with Talking Heads and David Byrne (Reynolds 165-170; Sheppard; 338-347, 353); a curatorial, mentor role on U2’s The Unforgettable Fire (Sheppard 368-369); the ‘grand, stadium scale’ experiments of U2’s 1991-93 ZooTV tour (Sheppard 404); the Zorn-like games of Bowie’s Outside album (Eno 382-389); and the ‘generative’ artwork 77 Million Paintings (Eno 330-332; Tamm 133-135; Sheppard 278-279; Eno 435). Eno is clearly a highly flexible maker and producer. Developing such flexibility would ensure ambient journalism remains open to novelty as an analytical framework that may enhance the practitioner development and work of professional journalists and para-journalists alike.Acknowledgments The author thanks editor Luke Jaaniste, Alfred Hermida, and the two blind peer reviewers for their constructive feedback and reflective insights. References Bray, Chad, and Jacob Bunge. “Ex-Goldman Programmer Indicted for Trade Secrets Theft.” The Wall Street Journal 12 Feb. 2010. 17 March 2010 ‹http://online.wsj.com/article/SB10001424052748703382904575059660427173510.html›. Burns, Alex. “Select Issues with New Media Theories of Citizen Journalism.” M/C Journal 11.1 (2008). 17 March 2010 ‹http://journal.media-culture.org.au/index.php/mcjournal/article/view/30›.———, and Barry Saunders. “Journalists as Investigators and ‘Quality Media’ Reputation.” Record of the Communications Policy and Research Forum 2009. Eds. Franco Papandrea and Mark Armstrong. Sydney: Network Insight Institute, 281-297. 17 March 2010 ‹http://eprints.vu.edu.au/15229/1/CPRF09BurnsSaunders.pdf›.———, and Ben Eltham. “Twitter Free Iran: An Evaluation of Twitter’s Role in Public Diplomacy and Information Operations in Iran’s 2009 Election Crisis.” Record of the Communications Policy and Research Forum 2009. Eds. Franco Papandrea and Mark Armstrong. Sydney: Network Insight Institute, 298-310. 17 March 2010 ‹http://eprints.vu.edu.au/15230/1/CPRF09BurnsEltham.pdf›. Christians, Clifford G., Theodore Glasser, Denis McQuail, Kaarle Nordenstreng, and Robert A. White. Normative Theories of the Media: Journalism in Democratic Societies. Champaign, IL: University of Illinois Press, 2009. Clifford, Stephanie, and Julie Creswell. “At Bloomberg, Modest Strategy to Rule the World.” The New York Times 14 Nov. 2009. 17 March 2010 ‹http://www.nytimes.com/2009/11/15/business/media/15bloom.html?ref=businessandpagewanted=all›.Cole, Peter, and Tony Harcup. Newspaper Journalism. Thousand Oaks, CA: Sage Publications, 2010. Duhigg, Charles. 
“Stock Traders Find Speed Pays, in Milliseconds.” The New York Times 23 July 2009. 17 March 2010 ‹http://www.nytimes.com/2009/07/24/business/24trading.html?_r=2andref=business›. Engelbart, Douglas. “Augmenting Human Intellect: A Conceptual Framework, 1962.” Ed. Neil Spiller. Cyber Reader: Critical Writings for the Digital Era. London: Phaidon Press, 2002. 60-67. Eno, Brian. A Year with Swollen Appendices. London: Faber and Faber, 1996. Garfinkel, Harold, and Anne Warfield Rawls. Toward a Sociological Theory of Information. Boulder, CO: Paradigm Publishers, 2008. Haddow, George D., and Kim S. Haddow. Disaster Communications in a Changing Media World. Burlington, MA: Butterworth-Heinemann, 2009. Hemmingway, Emma. Into the Newsroom: Exploring the Digital Production of Regional Television News. Milton Park: Routledge, 2008. Hermida, Alfred. “Twittering the News: The Emergence of Ambient Journalism.” Journalism Practice 4.3 (2010): 1-12. Hofstadter, Douglas. I Am a Strange Loop. New York: Perseus Books, 2007. Lanier, Jaron. You Are Not a Gadget: A Manifesto. London: Allen Lane, 2010. Leinweber, David. Nerds on Wall Street: Math, Machines and Wired Markets. Hoboken, NJ: John Wiley and Sons, 2009. Licklider, J.C.R. “Man-Machine Symbiosis, 1960.” Ed. Neil Spiller. Cyber Reader: Critical Writings for the Digital Era. London: Phaidon Press, 2002. 52-59. McFadzean, Elspeth. “What Can We Learn from Creative People? The Story of Brian Eno.” Management Decision 38.1 (2000): 51-56. Moeller, Susan. Compassion Fatigue: How the Media Sell Disease, Famine, War and Death. New York: Routledge, 1998. Morville, Peter. Ambient Findability. Sebastopol, CA: O’Reilly Press, 2005. ———. Search Patterns. Sebastopol, CA: O’Reilly Press, 2010. Pritchett, Eric, and Mark Palmer. “Following the Tweet Trail.” CNBC 11 July 2009. 17 March 2010 ‹http://www.casttv.com/ext/ug0p08›. Reynolds, Simon. Rip It Up and Start Again: Postpunk 1978-1984. London: Penguin Books, 2006. Sennett, Richard. The Craftsman. London: Penguin Books, 2008. Sheppard, David. On Some Faraway Beach: The Life and Times of Brian Eno. London: Orion Books, 2008. Sunstein, Cass. On Rumours: How Falsehoods Spread, Why We Believe Them, What Can Be Done. New York: Farrar, Straus and Giroux, 2009. Tamm, Eric. Brian Eno: His Music and the Vertical Colour of Sound. New York: Da Capo Press, 1995. Thackara, John. In the Bubble: Designing in a Complex World. Boston, MA: The MIT Press, 1995. Thompson, Evan. Mind in Life: Biology, Phenomenology, and the Science of Mind. Boston, MA: Belknap Press, 2007.
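As an editorial aside to the article above: Leinweber's five categories of real-time Twitter information, and the idea of filtering a stream by hashtag, can be made concrete with a small, hypothetical sketch. The hashtag, keyword cues and sample messages are invented for illustration; a working "awareness system" would need live API access and far richer classifiers, which the sketch deliberately avoids.

```python
# Minimal, hypothetical sketch: filter a message stream by hashtag and bucket
# items into Leinweber-style categories (news, pre-news, rumours). Keyword
# cues and sample messages are invented; the remaining two categories (network
# formation, "molecular search") need richer signals than keyword matching.
from collections import defaultdict

KEYWORDS = {
    "rumour":   ("unconfirmed", "allegedly", "hearing that"),
    "pre-news": ("press conference", "due to announce", "expected later"),
    "news":     ("confirmed", "official", "reports"),
}

def categorise(text):
    lowered = text.lower()
    for category, cues in KEYWORDS.items():
        if any(cue in lowered for cue in cues):
            return category
    return "other"

def monitor(messages, hashtag="#election"):
    buckets = defaultdict(list)
    for msg in messages:
        if hashtag in msg.lower():
            buckets[categorise(msg)].append(msg)
    return buckets

sample = [
    "Officials confirmed the result at a briefing #election",
    "Hearing that ballots were recounted overnight, unconfirmed #election",
    "Minister due to announce turnout figures at 5pm #election",
]
for category, msgs in monitor(sample).items():
    print(category, len(msgs))
```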
APA, Harvard, Vancouver, ISO, and other styles
16

Teng, Yun, Boyuan Pang, and Xiangyu Guo. "Study on the quality improvement on black land in Northeast China under the environment of sustainable agricultural development." Kybernetes, December 13, 2021. http://dx.doi.org/10.1108/k-07-2021-0608.

Full text
Abstract:
Purpose The authors are committed to providing the Chinese government with a foundation for making decisions that will protect black land and ensure long-term agricultural development. Design/methodology/approach Using the grounded theory approach, this study investigates the influencing factors affecting the quality of black land in Northeast China and proposes a hypothetical model for the mechanism of the influencing factors on the quality of black land in Northeast China. Findings The factors influencing the quality of black land include not only soil quality, ecological quality and environmental quality, but also economic quality and management quality, and can be classified into five categories. There are complex influence relationships between various factors and black land quality, with soil quality, ecological quality, environmental quality and management quality having a positive influence on economic quality. Soil quality, ecological quality and environmental quality are all improved as a result of good management. Black land quality is influenced positively by environmental quality, economic quality and management quality. Research limitations/implications The quality of black land is a major concern in terms of food production and long-term agricultural development. The black land in Northeast China was chosen as the subject of this study, and the research findings have some limitations. The next step will be to expand from studying the black land in Northeast China to the black land worldwide. Originality/value In Northeast China, the quality of the five dimensions of black land must be improved in a coordinated and consistent manner.
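The hypothesised influence structure summarised in the Findings can be written down as a small directed graph. The sketch below is an editorial illustration only: the edge list is inferred from the wording of the abstract rather than taken from the authors' model, and networkx is simply a convenient way to query it.

```python
# Hypothetical rendering of the influence relationships described in the
# abstract above; the edge list is inferred from the Findings wording.
import networkx as nx

model = nx.DiGraph()
model.add_edges_from([
    ("soil quality", "economic quality"),
    ("ecological quality", "economic quality"),
    ("environmental quality", "economic quality"),
    ("management quality", "economic quality"),
    ("management quality", "soil quality"),
    ("management quality", "ecological quality"),
    ("management quality", "environmental quality"),
    ("environmental quality", "black land quality"),
    ("economic quality", "black land quality"),
    ("management quality", "black land quality"),
])

# Factors hypothesised to feed directly into overall black land quality.
print(sorted(model.predecessors("black land quality")))
# The factor with the widest direct reach in this reading of the model.
print(max(model.nodes, key=model.out_degree))
```

Querying the graph this way makes the mediating position of economic quality and the broad reach of management quality explicit, which is consistent with the abstract's call for coordinated improvement across the five dimensions.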
APA, Harvard, Vancouver, ISO, and other styles
17

Latifian, Ahmad. "How does cloud computing help businesses to manage big data issues." Kybernetes, January 6, 2022. http://dx.doi.org/10.1108/k-05-2021-0432.

Full text
Abstract:
Purpose Big data has posed problems for businesses, the Information Technology (IT) sector and the science community. These problems can be effectively addressed using cloud computing and associated distributed computing technology. Cloud computing and big data are two significant developments of recent years that allow high-efficiency, competitive computing tools to be delivered as IT services. The paper aims to examine the role of the cloud as a tool for managing big data in various aspects to help businesses. Design/methodology/approach This paper delivers solutions in the cloud for storing, compressing, analyzing and processing big data. Hence, the reviewed articles were divided into four categories: articles on big data storage, articles on big data processing, articles on big data analysis and, finally, articles on data compression in cloud computing. This article is based on a systematic literature review of 19 published papers on big data. Findings From the results, it can be inferred that cloud computing technology has features that can be useful for big data management. Challenging issues are raised in each section. For example, in storing big data, privacy and security issues are challenging. Research limitations/implications There were limitations to this systematic review. The first limitation is that only English-language articles were reviewed. The second is that only articles matching the search keywords were included. Finally, only authoritative articles were reviewed; slides and tutorials were excluded. Practical implications The research presents new insight into the business value of cloud computing in interfirm collaborations. Originality/value Previous research has often examined other aspects of big data in the cloud. This article takes a new approach to the subject, allowing big data researchers to comprehend the various aspects of big data management in the cloud. In addition, by setting an agenda for future research, it saves time and effort for readers searching for topics within big data.
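As a purely illustrative aside on the "data compression in cloud computing" category mentioned above, the sketch below shows the kind of client-side compression step such articles typically discuss before data is pushed to object storage. It uses only the Python standard library; the record batch and the commented-out upload call are invented placeholders, not an API from any of the reviewed papers.

```python
# Illustrative only: compress a batch of records client-side before a
# hypothetical upload to cloud object storage. Data and names are invented.
import gzip
import json

records = [{"sensor": i, "value": i * 0.5} for i in range(10_000)]
raw = json.dumps(records).encode("utf-8")
compressed = gzip.compress(raw, compresslevel=6)

print(f"raw: {len(raw)} bytes, compressed: {len(compressed)} bytes")
# upload_to_object_store("bucket/records.json.gz", compressed)  # placeholder, not a real API
```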
APA, Harvard, Vancouver, ISO, and other styles
18

Humphry, Justine. "Making an Impact: Cultural Studies, Media and Contemporary Work." M/C Journal 14, no. 6 (November 18, 2011). http://dx.doi.org/10.5204/mcj.440.

Full text
Abstract:
Cultural Studies has tended to prioritise the domain of leisure and consumption over work as an area for meaning making, in many ways defining everyday life in opposition to work. Greg Noble, a cultural researcher who examined work in the context of the early computerisation of Australian universities made the point that "discussions of everyday life often make the mistake of assuming that everyday life equates with home and family life, or leisure" (87). This article argues for the need within Cultural Studies to focus on work and media as a research area of everyday life. With the growth of flexible and creative labour and the widespread uptake of an array of new media technologies used for work, traditional ways to identify and measure the space and time of work have become increasingly flawed, with implications for how we account for work and negotiate its boundaries. New approaches are needed to address the complex media environments and technological practices that are an increasing part of contemporary working life. Cultural Studies can make a significant impact towards this research agenda by offering new ways to analyse the complex interrelations of space, time and technology in everyday work practice. To further this goal, a new material practices account of work termed Officing is introduced, developed through my doctoral research on professionals' daily use of information and communication technology (ICT). This approach builds on the key cultural concepts of "bricolage" and "appropriation" combined with the idea of "articulation work" proposed by Anselm Strauss, to support the analysis of the office workplace as a contingent and provisional arrangement or process. Officing has a number of benefits as a framework for analysing the nature of work in a highly mediated world. Highlighting the labour that goes into stabilising work platforms makes it possible to assess the claims of productivity and improved work-life balance brought about by new mobile media technologies; to identify previously unidentified sources of time pressure, overwork and intensification and ultimately, to contribute to the design of more sustainable work environments. The Turn Away from Work Work held a central position in social and cultural analysis in the first half of the twentieth century but as Strangleman observed, there was a marked shift away from the study of work from the mid 1970s (3.1). Much of the impulse for this shift came from critiques of the over-emphasis on relations of production and the workplace as the main source of meaning and value (5.1). In line with this position, feminist researchers challenged the traditional division of labour into paid and unpaid work, arguing that this division sustained the false perception of domestic work as non-productive (cf. Delphy; Folbre). Accompanying these critiques were significant changes in work itself, as traditional jobs literally began to disappear with the decline of manufacturing in industrialised countries (6.1). With the turn away from work in academia and the changes in the nature of work, attention shifted to the realm of the market and consumption. One of the important contributions of Cultural Studies has been the focus on the role of the consumer in driving social and technological change and processes of identity formation. 
Yet, it is a major problem that work is largely marginalised in cultural research of everyday life, especially since, in most industrialised nations, we are working in new ways, in rapidly changing conditions and more than ever before. Research shows that in Australia there has been a steady increase in the average hours of paid work and Australians are working harder (cf. Watson, Buchanan, Campbell and Briggs; Edwards and Wajcman). In the 2008 Australian Work and Life Index (AWALI) Skinner and Pocock found around 55 per cent of employees frequently felt rushed or pressed for time and this was associated with long working hours, work overload and an overall poor work–life interaction (8). These trends have coincided with long-term changes in the type and location of work. In Australia, like many other developed countries, information-based occupations have taken over manufacturing jobs and there has been an increase in part-time and casual work (cf. Watson et al.). Many employees now conduct work outside of the traditional workplace, with the ABS reporting that in 2008, 24 per cent of employees worked at least some hours at home. Many social analysts have explained the rise of casual and flexible labour as related to the transition to global capitalism driven by the expansion of networked information processes (cf. Castells; Van Dijk). This shift is not simply that more workers are producing ideas and information but that the previously separated spheres of production and consumption have blurred (cf. Ritzer and Jurgenson). With this, entirely new industries have sprung up, predicated on the often unpaid for creative labour of individuals, including users of media technologies. A growing chorus of writers are now pointing out that a fragmented, polarised and complex picture is emerging of this so-called "new economy", with significant implications for the quality of work (cf. Edwards and Wajcman; Fudge and Owens; Huws). Indeed, some claim that new conditions of insecure and poor quality employment or "precarious work" are fast becoming the norm. Moreover, this longer-term pattern runs parallel to the production of a multitude of new mobile media technologies, first taken up by professionals and then by the mainstream, challenging the notion that activities are bound to any particular place or time. Reinvigorating Work in Social and Cultural Analysis There are moves to reposition social and cultural analysis to respond to these various trends. Work-life balance is an example of a research and policy area that has emerged since the 1990s. The boundary between the household and the outside world has also been subject to scrutiny by cultural researchers, and these critically examine the intersection between work and consumption, gender and care (cf. Nippert-Eng; Sorenson and Lie; Noble and Lupton, "Consuming" and "Mine"; Lally). These responses are examples of a shift away from what Urry has dubbed "structures and stable organisations" to a concern with flows, movements and the blurring of boundaries between life spheres (5). In a similar vein, researchers recently have proposed alternative ways to describe the changing times and places of employment. In their study of UK professionals, Felstead, Jewson and Walters proposed a model of "plural workscapes" to explain a major shift in the spatial organisation of work (23). 
Mobility theorists Sheller and Urry have called for the need to "develop a more dynamic conceptualisation of the fluidities and mobilities that have increasingly hybridised the public and private" (113). All of this literature has reinforced a growing concern that in the face of new patterns of production and consumption and with the rise of complex media environments, traditional models and measures of space and time are inadequate to account for contemporary work. Analyses that rely on conventional measures of work based on hourly units clearly point to an increase in the volume of work, the speed of work and to the collision (cf. Pocock) of work and life but fall down in accounting for the complex and often contradictory role of technology. Media technologies are "Janus-faced" as Michael Arnold has suggested, referring to the two-faced Roman god to foreground the contradictory effects at the centre of all technologies (232). Wajcman notes this paradox in her research on mobile media and time, pointing out that mobile phones are just as likely to "save" time as to "consume" it (15). It was precisely this problematic of the complex interactions of the space, time and technology of work that was at stake in my research on the daily use of ICT by professional workers. In the context of changes to the location, activity and meaning of work, and with the multiplying array of old and new media technologies used by workers, how can the boundary and scope of work be determined? What are the implications of these shifting grounds for the experience and quality of work? Officing: A Material Practices Account of Office Work In the remaining article I introduce some of the key ideas and principles of a material practices account developed in my PhD, Officing: Professionals' Daily ICT Use and the Changing Space and Time of Work. This research took place between 2006 and 2007 focusing in-depth on the daily technology practices of twenty professional workers in a municipal council in Sydney and a unit of a global telecommunication company taking part in a trial of a new smart phone. Officing builds on efforts to develop a more accurate account of the space and time of work bringing into play the complex and highly mediated environment in which work takes place. It extends more recent practice-based, actor-network and cultural approaches that have, for some time, been moving towards a more co-constitutive and process-oriented approach to media and technology in society. Turning first to "bricolage" from the French bricole meaning something small and handmade, bricolage refers to the ways that individuals and groups borrow from existing cultural forms and meanings to create new uses, meanings and identities. Initially proposed by Levi-Strauss and then taken up by de Certeau, bricolage has been a useful concept within subculture and lifestyle studies to reveal the creative work performed on signs and meaning systems in forming cultural identities (cf. O'Sullivan et al.). Bricolage is also an important concept for understanding how meanings and uses are inscribed into forms in use rather than being read or activated off their design. This is the process of appropriation, through which both the object and the person are mutually shaped and users gain a sense of control and ownership (cf. Noble and Lupton; Lally; Silverstone and Haddon). The concept of bricolage highlights the improvisational qualities of appropriation and its status as work. 
A bricoleur is thus a person who constructs new meanings and forms by drawing on and assembling a wide range of resources at hand, sourced from multiple spheres of life. One of the problems with how bricolage and appropriation has been applied to date, notwithstanding the priority given to the domestic sphere, is the tendency to grant individuals and collectives too much control to stabilise the meanings and purposes of technologies. This problem is evident in the research drawing on the framework of "domestication" (cf. Silverstone and Haddon). In practice, the sheer volume of technologically-related issues encountered on a daily basis and the accompanying sense of frustration indicates there is no inevitable drift towards stability, nor are problems merely aberrational or trivial. Instead, daily limits to agency and attempts to overcome these are points at which meanings as well as uses are re-articulated and potentially re-invented. This is where "articulation work" comes in. Initially put forward by Anselm Strauss in 1985, articulation work has become an established analytical tool for informing technology design processes in such fields as Computer-Supported Cooperative Work (CSCW) and Workplace Studies. In these, articulation work is narrowly defined to refer to the real time activities of cooperative work. It includes dealing with contingencies, keeping technologies and systems working and making adjustments to accommodate for problems (Suchman "Supporting", 407). In combination with naturalistic investigations, this concept has facilitated engagement with the increasingly complex technological and media environments of work. It has been a powerful tool for highlighting practices deemed unimportant but which are nevertheless crucial for getting work done. Articulation work, however, has the potential to be applied in a broader sense to explain the significance of the instability of technologies and the efforts to overcome these as transformative in themselves, part of the ongoing process of appropriation that goes well beyond individual tasks or technologies. With clear correspondences to actor-network theory, this expanded definition provides the basis for a new understanding of the office as a temporary and provisional condition of stability achieved through the daily creative and improvisational activities of workers. The office, then, is dependent on and inextricably bound up in its ongoing articulation and crucially, is not bound to a particular place or time. In the context of the large-scale transformations in work already discussed, this expanded definition of articulation work helps to; firstly, address how work is re-organised and re-rationalised through changes to the material conditions of work; secondly, identify the ongoing articulations that this entails and thirdly; understand the role of these articulations in the construction of the space and time of work. This expanded definition is achieved in the newly developed concept of officing. Officing describes a form of labour directed towards the production of a stable office platform. Significantly, one of the main characteristics of this work is that it often goes undetected by organisations as well as by the workers that perform it. As explained later, its "invisibility" is in part a function of its embodiment but also relates to the boundless nature of officing, taking place both inside and outside the workplace, in or out of work time. 
Officing is made up of a set of interwoven activities of three main types: connecting, synchronising and configuring. Connecting can be understood as aligning technical and social relations for the performance of work at a set time. Synchronising brings together and coordinates different times and temporal demands, for example, the time of "work" with "life" or the time "out in the field" with time "in the workplace". Configuring prepares the space of work, making a single technology or media environment work to some planned action or existing pattern of activity. To give an example of connecting: in the Citizens' Service Centre of the Council, Danielle's morning rituals involved a series of connections even before her work of advising customers begins: My day: get in, sit down, turn on the computer and then slowly open each software program that I will need to use…turn on the phone, key in my password, turn on the headphones and sit there and wait for the calls! (Humphry Officing, 123) These connections not only set up and initiate the performance of work but also mark Danielle's presence in her office. Through these activities, which in practice overlap and blur, the space and time of the office comes to appear as a somewhat separate and mostly invisible structure or infrastructure. The work that goes into making the office stable takes place around the boundary of work with implications for how this boundary is constituted. These efforts do not cluster around boundaries in any simple sense but become part of the process of boundary making, contributing to the construction of categories such as "work" and "life". So, for example, for staff in the smart phone trial, the phone had become their main source of information and communication. Turning their smart phone off, or losing connectivity had ramifications that cascaded throughout their lifeworld. On the one hand, this lead to the breakdown of the distinction between "work" and "life" and a sense of "ever-presence", requiring constant and vigilant "boundary work" (cf. Nippert-Eng). On the other hand, this same state also enabled workers to respond to demands in their own time and across multiple boundaries, giving workers a sense of flexibility, control and of being "in sync". Connecting, configuring and synchronising are activities performed by bodies, producing an embodied transformation. In the tradition of phenomenology, most notably in the works of Heidegger, Merleau-Ponty and more recently Ihde, embodiment is used to explain the relationship between subjects and objects. This concept has since been developed to be understood as not residing in the body but as spread through social, material and discursive arrangements (cf. Haraway, "Situated" and Simians; Henke; Suchman, "Figuring"). Tracing efforts towards making the office stable is thus a way of uncovering how the body, as a constitutive part of a larger arrangement or network, is formed through embodiment, how it gains its competencies, social meanings and ultimately, how workers gain a sense of what it means to be a professional. So, in the smart phone trial, staff managed their connections by replying immediately to their voice, text and data messages. This immediacy not only acted as proof of their presence in the office. It also signalled their commitment to their office: their active participation and value to the organisation and their readiness to perform when called on. 
Importantly, this embodied transformation also helps to explain how officing becomes an example of "invisible work" (cf. Star and Strauss). Acts of connecting, synchronising and configuring become constituted and forgotten in and through bodies, spaces and times. Through their repeated performance these acts become habits, a transparent means through which the environment of work is navigated in the form of skills and techniques, configurations and routines. In conclusion, researching work in contemporary societies means confronting its marginalisation within cultural research and developing ways to comprehend and measure the interaction of space, time and the ever-multiplying array of media technologies. Officing provides a way to do this by shifting to an understanding of the workplace as a contingent product of work itself. The strength of this approach is that it highlights the creative and ongoing work of individuals on their media infrastructures. It also helps to identify and describe work activities that are not neatly contained in a workplace, thus adding to their invisibility. The invisibility of these practices can have significant impacts on workers: magnifying feelings of time pressure and a need to work faster, longer and harder even as discrete technologies are utilised to save time. In this way, officing exposes some of the additional contributions to the changing experience and quality of work as well as to the construction of everyday domains. Officing supports an evaluation of claims of productivity and work-life balance in relation to new media technologies. In the smart phone trial, contrary to an assumed increase in productivity, mobility of work was achieved at the expense of productivity. Making the mobile office stable—getting it up and running, keeping it working in changing environments and meeting expectations of speed and connectivity—took up time, resulting in an overall productivity loss and demanding more "boundary work". In spite of their adaptability and flexibility, staff tended to overwork to counteract this loss. This represented a major shift in the burden of effort in the production of office forms away from the organisation and towards the individual. Finally, though not addressed here in any detail, officing could conceivably have practical uses for designing more sustainable office environments that better support the work process and the balance of work and life. Thus, by accounting more accurately for the resource requirements of work, organisations can reduce the daily effort, space and time taken up by employees on their work environments. In any case, what is clear, is the ongoing need to continue a cultural research agenda on work—to address the connections between transformations in work and the myriad material practices that individuals perform in going about their daily work. References Arnold, Michael. "On the Phenomenology of Technology: The 'Janus-Faces' of Mobile Phones." Information and Organization 13.4 (2003): 231–56. Australian Bureau of Statistics. "6275.0 - Locations of Work, Nov 2008." Australian Bureau of Statistics, 8 May 2009. 20 May 2009 ‹http://www.abs.gov.au/ausstats/abs@.nsf/mf/6275.0›. Bauman, Zygmunt. Freedom. Minneapolis: U of Minnesota P, 1989. Castells, Manuel. The Rise of the Network Society. Malden, Massachusetts: Blackwell, 1996. Chesters, Jennifer, Janeen Baxter, and Mark Western. "Paid and Unpaid Work in Australian Households: Towards an Understanding of the New Gender Division of Labour." 
Familes through Life - 10th Australian Institute of Families Studies Conference, 9-11th July 2008, Melbourne: AIFS, 2008. Delphy, Christine. Close to Home: A Materialist Analysis of Women's Oppression. Amherst MA: U of Massachusetts, 1984. Edwards, Paul, and Judy Wajcman. The Politics of Working Life. Oxford: Oxford UP, 2005. Felstead, Alan, Nick Jewson, and Sally Walters. Changing Places of Work. New York: Palgrave Macmillan, 2005. Folbre, Nancy. "Exploitation Comes Home: A Critique of the Marxian Theory of Family Labor." Cambridge Journal of Economics 6.4 (1982): 317-29. Haraway, Donna. "Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective." Feminist Studies 14.3 (1988): 575-99. –––. Simians, Cyborgs, and Women: The Reinvention of Nature. London, Free Association Books, 1991. Henke, Christopher. "The Mechanics of Workplace Order: Toward a Sociology of Repair." Berkeley Journal of Sociology 44 (2000): 55-81. Humphry, Justine. Officing: Professionals' Daily ICT Use and the Changing Space and Time of Work. Dissertation, University of Western Sydney. 2010. Lally, Elaine. At Home with Computers. Oxford, New York: Berg, 2002. Nippert-Eng, Christena E. Home and Work: Negotiating Boundaries through Everyday Life. Chicago: U of Chicago P, 1996. Noble, Greg. "Everyday Work." Interpreting Everyday Culture. Ed. Fran Martin. New York: Hodder Arnold, 2004. 87-102. Noble, Greg, and Deborah Lupton. "Consuming Work: Computers, Subjectivity and Appropriation in the University Workplace." The Sociological Review 46.4 (1998): 803-27. –––. "Mine/Not Mine: Appropriating Personal Computers in the Academic Workplace." Journal of Sociology 38.1 (2002): 5-23. O'Sullivan, Tim, John Hartley, Danny Saunders, Martin Montgomery, and John Fiske. Key Concepts in Communication and Cultural Studies. London: Routledge, 1994. Pocock, Barbara. The Work/Life Collision: What Work Is Doing to Australians and What to Do about It. Sydney: The Federation P, 2003. Ritzer, George, and Nathan Jurgenson. "Production, Consumption, Prosumption." Journal of Consumer Culture 10.1 (2010): 13-36. Sheller, Mimi, and John Urry. "Mobile Transformations of 'Public' and 'Private' Life." Theory, Culture & Society 20.3 (2003): 107-25. Silverstone, Roger, and Leslie Haddon. "Design and the Domestication of Information and Communication Technologies: Technical Change and Everyday Life." Communication by Design: The Politics of Information and Communication Technologies. Eds. Roger Silverstone and Robin Mansell. Oxford: U of Oxford P, 1996. 44-74. Skinner, Natalie, and Barbara Pocock. "Work, Life and Workplace Culture: The Australian Work and Life Index (AWALI) 2008." Adelaide: The Centre for Work and Life, Hawke Research Institute, University of South Australia 2008 ‹http://www.unisa.edu.au/hawkeinstitute/cwl/default.asp›.Sorenson, Knut H., and Merete Lie. Making Technology Our Own? Domesticating Technologies into Everyday Life. Oslo: Scandinavian UP, 1996.Star, Susan L. "The Sociology of the Invisible: The Primacy of Work in the Writings of Anselm Strauss." Social Organization and Social Process: Essays in Honor of Anselm Strauss. New York: Walter de Gruyter, 1991. 265-83. Star, Susan L., and Anselm Strauss. "Layers of Silence, Arenas of Voice: The Ecology of Visible and Invisible Work." Computer Supported Cooperative Work 8 (1999): 9-30. Strangleman, Timothy. "Sociological Futures and the Sociology of Work." Sociological Research Online 10.4 (2005). 5 Nov. 
2005 ‹http://www.socresonline.org.uk/10/4/strangleman.html›. Strauss, Anselm. "Work and the Division of Labor." The Sociological Quarterly 26 (1985): 1-19. Suchman, Lucy A. "Figuring Personhood in Sciences of the Artificial." Department of Sociology, Lancaster University. 1 Nov. 2004. 18 Jun. 2005 ‹http://www.lancs.ac.uk/fass/sociology/papers/suchman-figuring-personhood.pdf›. –––. "Supporting Articulation Work." Computerization and Controversy: Value Conflicts and Social Choices. Ed. Rob Kling. San Diego: Academic P, 1995. 407-423. Urry, John. Sociology beyond Societies: Mobilities for the Twenty-First Century. London: Routledge, 2000. Van Dijk, Jan. The Network Society: Social Aspects of New Media. London: Sage, 2006. Wajcman, Judy. "Life in the Fast Lane? Towards a Sociology of Technology and Time." The British Journal of Sociology 59.1 (2008): 59-77. Watson, Ian, John Buchanan, Iain Campbell, and Chris Briggs. Fragmented Futures: New Challenges in Working Life. Sydney: Federation P, 2003.
APA, Harvard, Vancouver, ISO, and other styles
19

Burns, Alex. "Select Issues with New Media Theories of Citizen Journalism." M/C Journal 10, no. 6 (April 1, 2008). http://dx.doi.org/10.5204/mcj.2723.

Full text
Abstract:
“Journalists have to begin a new type of journalism, sometimes being the guide on the side of the civic conversation as well as the filter and gatekeeper.” (Kolodzy 218) “In many respects, citizen journalism is simply public journalism removed from the journalism profession.” (Barlow 181) 1. Citizen Journalism — The Latest Innovation? New Media theorists such as Dan Gillmor, Henry Jenkins, Jay Rosen and Jeff Howe have recently touted Citizen Journalism (CJ) as the latest innovation in 21st century journalism. “Participatory journalism” and “user-driven journalism” are other terms to describe CJ, which its proponents argue is a disruptive innovation (Christensen) to the agenda-setting media institutions, news values and “objective” reportage. In this essay I offer a “contrarian” view, informed by two perspectives: (1) a three-stage model of theory-building (Carlile & Christensen) to evaluate the claims made about CJ; and (2) self-reflexive research insights (Etherington) from editing the US-based news site Disinformation between November 1999 and February 2008. New media theories can potentially create “cognitive dissonance” (Festinger) when their explanations of CJ practices are compared with what actually happens (Feyerabend). First I summarise Carlile & Christensen’s model and the dangers of “bad theory” (Ghoshal). Next I consider several problems in new media theories about CJ: the notion of ‘citizen’, new media populism, parallels in event-driven and civic journalism, and mergers and acquisitions. Two ‘self-reflexive’ issues are considered: ‘pro-ams’ or ‘professional amateurs’ as a challenge to professional journalists, and CJ’s deployment in new media operations and production environments. Finally, some exploratory questions are offered for future researchers. 2. An Evaluative Framework for New Media Theories on Citizen Journalism Paul Carlile and Clayton M. Christensen’s model offers one framework with which to evaluate new media theories on CJ. This framework is used below to highlight select issues and gaps in CJ’s current frameworks and theories. Carlile & Christensen suggest that robust theory-building emerges via three stages: Descriptive, Categorisation and Normative (Carlile & Christensen). There are three sub-stages in Descriptive theory-building; namely, the observation of phenomena, inductive classification into schemas and taxonomies, and correlative relationships to develop models (Carlile & Christensen 2-5). Once causation is established, Normative theory evolves through deductive logic which is subject to Kuhnian paradigm shifts and Popperian falsifiability (Carlile & Christensen 6). Its proponents situate CJ as a Categorisation or new journalism agenda that poses a Normative challenge and a Kuhnian paradigm shift to traditional journalism. Existing CJ theories jump from the Descriptive phase of observations like “smart mobs” in Japanese youth subcultures (Rheingold) to make broad claims for Categorisation, such as positioning IndyMedia, blogs and wiki publishing systems as new media alternatives to traditional media. CJ theories then underpin normative beliefs, values and worldviews. Correlative relationships are also used to differentiate CJ from the demand side of microeconomic analysis, from the top-down editorial models of traditional media outlets, and to adopt a vanguard stance. 
To support this, CJ proponents cite research on emergent collective behaviour such as the “wisdom of crowds” hypothesis (Surowiecki) or peer-to-peer network “swarms” (Pesce) to provide scientific justification for their Normative theories. However, further evaluative research is needed for three reasons: the emergent collective behaviour hypothesis may not actually inform CJ practices, existing theories may have “correlation not cause” errors, and the link may be due to citation network effects between CJ theorists. Collectively, this research base also frames CJ as an “ought to” Categorisation and then proceeds to Normative theory-building (Carlile & Christensen 7). However, I argue below that this Categorisation may be premature: its observations and correlative relationships might reinforce a ‘weak’ Normative theory with limited generalisation. CJ proponents seem to imply that it can be applied anywhere and under any condition—a “statement of causality” that almost makes it a fad (Carlile & Christensen 8). CJ that relies on Classification and Normative claims will be problematic without a strong grounding in Descriptive observation. To understand what’s potentially at stake for CJ’s future, consider the parallel debate about curricula renewal for the Masters of Business Administration in the wake of high-profile corporate collapses such as Enron, WorldCom, HIH and OneTel. The MBA evolved as a sociological and institutional construct to justify management as a profession that is codified, differentiated and has entry barriers (Khurana). This process might partly explain the pushback that some media professionals have to CJ as one alternative. MBA programs faced criticism if they had student cohorts with little business know-how or experiential learning (Mintzberg). Enron’s collapse illustrated the ethical dilemmas and unintended consequences that occurred when “bad theories” were implemented (Ghoshal). Professional journalists are aware of this: MBA-educated managers challenged the “craft” tradition in the early 1980s (Underwood). This meant that journalism’s ‘self-image’ (Morgan; Smith) is intertwined with managerial anxieties about media conglomerates in highly competitive markets. Ironically, as noted below, Citizen Journalists who adopt a vanguard position vis-a-vis media professionals step into a more complex game with other players. However, current theories have a naïve idealism about CJ’s promise of normative social change in the face of Machiavellian agency in business, the media and politics. 3. Citizen Who? Who is the “citizen” in CJ? What is their self-awareness as a political agent? CJ proponents who use the ‘self-image’ of ‘citizen’ draw on observations from the participatory vision of open source software, peer-to-peer networks, and case studies such as Howard Dean’s 2004 bid for the Democratic Party nomination in the US Presidential election campaign (Trippi). Recent theorists note Alexander Hamilton’s tradition of civic activism (Barlow 178) which links contemporary bloggers with the Federalist Papers and early newspaper pamphlets. One unsurfaced assumption in these observations and correlations is that most bloggers will adopt a coherent political philosophy as informed citizens: a variation on Lockean utilitarianism, Rawlsian liberalism or Nader consumer activism. 
To date there is little discussion about how political philosophy could deepen CJ’s ‘self-image’: how to critically evaluate sources, audit and investigation processes, or strategies to deal with elites, deterrence and power. For example, although bloggers kept Valerie Plame’s ‘outing’ as a covert intelligence operative highly visible in the issues-attention cycle, it was agenda-setting media like The New York Times who the Bush Administration targeted to silence (Pearlstine). To be viable, CJ needs to evolve beyond a new media populism, perhaps into a constructivist model of agency, norms and social change (Finnemore). 4. Citizen Journalism as New Media Populism Several “precursor trends” foreshadowed CJ notably the mid-1990s interest in “cool-hunting” by new media analysts and subculture marketeers (Gibson; Gladwell). Whilst this audience focus waned with the 1995-2000 dotcom bubble it resurfaced in CJ and publisher Tim O’Reilly’s Web 2.0 vision. Thus, CJ might be viewed as new media populism that has flourished with the Web 2.0 boom. Yet if the boom becomes a macroeconomic bubble (Gross; Spar) then CJ could be written off as a “silver bullet” that ultimately failed to deliver on its promises (Brooks, Jr.). The reputations of uncritical proponents who adopted a “true believer” stance would also be damaged (Hoffer). This risk is evident if CJ is compared with a parallel trend that shares its audience focus and populist view: day traders and technical analysts who speculate on financial markets. This parallel trend provides an alternative discipline in which the populism surfaced in an earlier form (Carlile & Christensen 12). Fidelity’s Peter Lynch argues that stock pickers can use their Main Street knowledge to beat Wall Street by exploiting information asymmetries (Lynch & Rothchild). Yet Lynch’s examples came from the mid-1970s to early 1980s when indexed mutual fund strategies worked, before deregulation and macroeconomic volatility. A change in the Web 2.0 boom might similarly trigger a reconsideration of Citizen Journalism. Hedge fund maven Victor Niederhoffer contends that investors who rely on technical analysis are practicing a Comtean religion (Niederhoffer & Kenner 72-74) instead of Efficient Market Hypothesis traders who use statistical arbitrage to deal with ‘random walks’ or Behavioural Finance experts who build on Amos Tversky and Daniel Kahneman’s Prospect Theory (Kahneman & Tversky). Niederhoffer’s deeper point is that technical analysts’ belief that the “trend is your friend” is no match for the other schools, despite a mini-publishing industry and computer trading systems. There are also ontological and epistemological differences between the schools. Similarly, CJ proponents who adopt a ‘Professional Amateur’ or ‘Pro-Am’ stance (Leadbeater & Miller) may face a similar gulf when making comparisons with professional journalists and the production environments in media organisations. CJ also thrives as new media populism because of institutional vested interests. When media conglomerates cut back on cadetships and internships CJ might fill the market demand as one alternative. New media programs at New York University and others can use CJ to differentiate themselves from “hyperlocal” competitors (Christensen; Slywotzky; Christensen, Curtis & Horn). This transforms CJ from new media populism to new media institution. 5. 
Parallels: Event-driven & Civic Journalism For new media programs, CJ builds on two earlier traditions: the Event-driven journalism of crises like the 1991 Gulf War (Wark) and the Civic Journalism school that emerged in the 1960s social upheavals. Civic Journalism’s awareness of minorities and social issues provides the character ethic and political philosophy for many Citizen Journalists. Jay Rosen and others suggest that CJ is the next-generation heir to Civic Journalism, tracing a thread from the 1968 Chicago Democratic Convention to IndyMedia’s coverage of the 1999 “Battle in Seattle” (Rosen). Rosen’s observation could yield an interesting historiography or genealogy. Events such as the Southeast Asian tsunami on 26 December 2004 or Al Qaeda’s London bombings on 7 July 2005 are cited as examples of CJ as event-driven journalism and “pro-am collaboration” (Kolodzy 229-230). Having covered these events and Al Qaeda’s attacks on 11th September 2001, I have a slightly different view: this was more a variation on “first responder” status and handicam video footage that journalists have sourced for the past three decades when covering major disasters. This different view means that the “salience of categories” used to justify CJ and “pro-am collaboration” in these events does not completely hold. Furthermore, when Citizen Journalism proponents tout Flickr and Wikipedia as models of real-time media they are building on a broader phenomenon that includes CNN’s Gulf War coverage and Bloomberg’s dominance of financial news (Loomis). 6. The Mergers & Acquisitions Scenario CJ proponents often express anxieties about the resilience of their outlets in the face of predatory venture capital firms who initiate Mergers & Acquisitions (M&A) activities. Ironically, these venture capital firms have core competencies and expertise in the event-driven infrastructure and real-time media that CJ aspires to. Sequoia Capital and other venture capital firms have evaluative frameworks that likely surpass Carlile & Christensen in sophistication, and they exploit parallels, information asymmetries and market populism. Furthermore, although venture capital firms such as Union Street Ventures have funded Web 2.0 firms, they are absent from the explanations of some theorists, whose examples of Citizen Journalism and Web 2.0 success may be the result of survivorship bias. Thus, the venture capital market remains an untapped data source for researchers who want to evaluate the impact of CJ outlets and institutions. The M&A scenario further problematises CJ in several ways. First, CJ is framed as “oppositional” to traditional media, yet this may be used as a stratagem in a game theory framework with multiple stakeholders. Drexel Burnham Lambert’s financier Michael Milken used market populism to sell ‘high-yield’ or ‘junk’ bonds to investors whilst disrupting the Wall Street establishment in the late 1980s (Curtis), and CJ could fulfil a similar tactical purpose. Second, the M&A goal of some Web 2.0 firms could undermine the participatory goals of a site’s community if post-merger integration fails. Jason Calacanis’s sale of Weblogs, Inc to America Online in 2005 and MSNBC’s acquisition of Newsvine on 5 October 2007 (Newsvine) might be success stories. However, this raises issues of digital “property rights” if you contribute to a community that is then sold in an M&A transaction—an outcome closer to business process outsourcing. 
Third, media “buzz” can create an unrealistic vision when a CJ site fails to grow beyond its start-up phase. Backfence.com’s demise as a “hyperlocal” initiative (Caverly) is one cautionary event that recalls the 2000 dotcom crash. The M&A scenarios outlined above are market dystopias for CJ purists. The major lesson for CJ proponents is to include other market players in hypotheses about causation and correlation factors. 7. ‘Pro-Ams’ & Professional Journalism’s Crisis CJ emerged during a period when Professional Journalism faced a major crisis of ‘self-image’. The Demos report The Pro-Am Revolution (Leadbeater & Miller) popularised the notion of ‘professional amateurs’ which some CJ theorists adopt to strengthen their categorisation. In turn, this triggers a response from cultural theorists who fear bloggers are new media’s barbarians (Keen). I concede Leadbeater and Miller have identified an important category. However, how some CJ theorists then generalise from ‘Pro-Ams’ illustrates the danger of ‘weak’ theory referred to above. Leadbeater and Miller’s categorisation does not really include a counter-view on the strengths of professionals, as illustrated in humanistic consulting (Block), professional service firms (Maister; Maister, Green & Galford), and software development (McConnell). The signs of professionalism these authors mention include a commitment to learning and communal verification, mastery of a discipline and domain application, awareness of methodology creation, participation in mentoring, and cultivation of ethical awareness. Two key differences are discernment and quality of attention, as illustrated in how the legendary Hollywood film editor Walter Murch used Apple’s Final Cut Pro software to edit the 2003 film Cold Mountain (Koppelman). ‘Pro-Ams’ might not aspire to these criteria but Citizen Journalists shouldn’t throw out these standards, either. Doing so would be making the same mistake of overconfidence that technical analysts make against statistical arbitrageurs. Key processes—fact-checking, sub-editing and editorial decision-making—are invisible to the end-user, even if traceable in a blog or wiki publishing system, because of the judgments involved. One post-mortem insight from Assignment Zero was that these processes were vital to create the climate of authenticity and trust to sustain a Citizen Journalist community (Howe). CJ’s trouble with “objectivity” might also overlook some complexities, including the similarity of many bloggers to “noise traders” in financial markets and to op-ed columnists. Methodologies and reportage practices have evolved to deal with the objections that CJ proponents raise, from New Journalism’s radical subjectivity and creative non-fiction techniques (Wolfe & Johnson) to Precision Journalism that used descriptive statistics (Meyer). Finally, journalism frameworks could be updated with current research on how phenomenological awareness shapes our judgments and perceptions (Thompson). 8. Strategic Execution For me, one of CJ’s major weaknesses as a new media theory is its lack of “rich description” (Geertz) about the strategic execution of projects. As Disinfo.com site editor I encountered situations ranging from ‘denial of service’ attacks and spam to site migration, publishing systems that go offline, and ensuring an editorial consistency. Yet the messiness of these processes is missing from CJ theories and accounts. 
Theories that included this detail as “second-order interactions” (Carlile & Christensen 13) would offer a richer view of CJ. Many CJ and Web 2.0 projects fall into the categories of mini-projects, demonstration prototypes and start-ups, even when using a programming language such as Ajax or Ruby on Rails. Whilst the “bootstrap” process is a benefit, more longitudinal analysis and testing needs to occur, to ensure these projects are scalable and sustainable. For example, South Korea’s OhmyNews is cited as an exemplar that started with “727 citizen reporters and 4 editors” and now has “38,000 citizen reporters” and “a dozen editors” (Kolodzy 231). How does OhmyNews’s mix of hard and soft news change over time? Or, how does OhmyNews deal with a complex issue that might require major resources, such as security negotiations between North and South Korea? Such examples could do with further research. We need to go beyond “the vision thing” and look at the messiness of execution for deeper observations and counterintuitive correlations, to build new descriptive theories. 9. Future Research This essay argues that CJ needs re-evaluation. Its immediate legacy might be to splinter ‘journalism’ into micro-trends: Washington University’s Steve Boriss proclaims “citizen journalism is dead. Expert journalism is the future.” (Boriss; Mensching). The half-lives of such micro-trends demand new categorisations, which in turn prematurely feeds the theory-building cycle. Instead, future researchers could reinvigorate 21st century journalism if they ask deeper questions and return to the observation stage of building descriptive theories. In closing, below are some possible questions that future researchers might explore: Where are the “rich descriptions” of journalistic experience—“citizen”, “convergent”, “digital”, “Pro-Am” or otherwise in new media? How could practice-based approaches inform this research instead of relying on espoused theories-in-use? What new methodologies could be developed for CJ implementation? What role can the “heroic” individual reporter or editor have in “the swarm”? Do the claims about OhmyNews and other sites stand up to longitudinal observation? Are the theories used to justify Citizen Journalism’s normative stance (Rheingold; Surowiecki; Pesce) truly robust generalisations for strategic execution or do they reflect the biases of their creators? How could developers tap the conceptual dimensions of information technology innovation (Shasha) to create the next Facebook, MySpace or Wikipedia? References Argyris, Chris, and Donald Schon. Theory in Practice. San Francisco: Jossey-Bass Publishers, 1976. Barlow, Aaron. The Rise of the Blogosphere. Westport, CN: Praeger Publishers, 2007. Block, Peter. Flawless Consulting. 2nd ed. San Francisco, CA: Jossey-Bass/Pfeiffer, 2000. Boriss, Steve. “Citizen Journalism Is Dead. Expert Journalism Is the Future.” The Future of News. 28 Nov. 2007. 20 Feb. 2008 http://thefutureofnews.com/2007/11/28/citizen-journalism-is-dead- expert-journalism-is-the-future/>. Brooks, Jr., Frederick P. The Mythical Man-Month: Essays on Software Engineering. Rev. ed. Reading, MA: Addison-Wesley Publishing Company, 1995. Campbell, Vincent. Information Age Journalism: Journalism in an International Context. New York: Arnold, 2004. Carlile, Paul R., and Clayton M. Christensen. “The Cycles of Building Theory in Management Research.” Innosight working paper draft 6. 6 Jan. 2005. 19 Feb. 2008 http://www.innosight.com/documents/Theory%20Building.pdf>. Caverly, Doug. 
“Hyperlocal News Site Takes A Hit.” WebProNews.com 6 July 2007. 19 Feb. 2008 http://www.webpronews.com/topnews/2007/07/06/hyperlocal-news- sites-take-a-hit>. Chenoweth, Neil. Virtual Murdoch: Reality Wars on the Information Superhighway. Sydney: Random House Australia, 2001. Christensen, Clayton M. The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press, 1997. Christensen, Clayton M., Curtis Johnson, and Michael Horn. Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns. New York: McGraw-Hill, 2008. Curtis, Adam. The Mayfair Set. London: British Broadcasting Corporation, 1999. Etherington, Kim. Becoming a Reflexive Researcher: Using Ourselves in Research. London: Jessica Kingsley Publishers, 2004. Festinger, Leon. A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press, 1962. Feyerabend, Paul. Against Method. 3rd ed. London: Verso, 1993. Finnemore, Martha. National Interests in International Society. Ithaca, NY: Cornell University Press, 1996. Geertz, Clifford. The Interpretation of Cultures. New York: Basic Books, 1973. Ghoshal, Sumantra. “Bad Management Theories Are Destroying Good Management Practices.” Academy of Management Learning & Education 4.1 (2005): 75-91. Gibson, William. Pattern Recognition. London: Viking, 2003. Gladwell, Malcolm. “The Cool-Hunt.” The New Yorker Magazine 17 March 1997. 20 Feb. 2008 http://www.gladwell.com/1997/1997_03_17_a_cool.htm>. Gross, Daniel. Pop! Why Bubbles Are Great for the Economy. New York: Collins, 2007. Hoffer, Eric. The True Believer. New York: Harper, 1951. Howe, Jeff. “Did Assignment Zero Fail? A Look Back, and Lessons Learned.” Wired News 16 July 2007. 19 Feb. 2008 http://www.wired.com/techbiz/media/news/2007/07/assignment_ zero_final?currentPage=all>. Kahneman, Daniel, and Amos Tversky. Choices, Values and Frames. Cambridge: Cambridge UP, 2000. Keen, Andrew. The Cult of the Amateur. New York: Doubleday Currency, 2007. Khurana, Rakesh. From Higher Aims to Hired Hands. Princeton, NJ: Princeton UP, 2007. Kolodzy, Janet. Convergence Journalism: Writing and Reporting across the News Media. Oxford: Rowman & Littlefield, 2006. Koppelman, Charles. Behind the Seen: How Walter Murch Edited Cold Mountain Using Apple’s Final Cut Pro and What This Means for Cinema. Upper Saddle River, NJ: New Rider, 2004. Leadbeater, Charles, and Paul Miller. “The Pro-Am Revolution”. London: Demos, 24 Nov. 2004. 19 Feb. 2008 http://www.demos.co.uk/publications/proameconomy>. Loomis, Carol J. “Bloomberg’s Money Machine.” Fortune 5 April 2007. 20 Feb. 2008 http://money.cnn.com/magazines/fortune/fortune_archive/2007/04/16/ 8404302/index.htm>. Lynch, Peter, and John Rothchild. Beating the Street. Rev. ed. New York: Simon & Schuster, 1994. Maister, David. True Professionalism. New York: The Free Press, 1997. Maister, David, Charles H. Green, and Robert M. Galford. The Trusted Advisor. New York: The Free Press, 2004. Mensching, Leah McBride. “Citizen Journalism on Its Way Out?” SFN Blog, 30 Nov. 2007. 20 Feb. 2008 http://www.sfnblog.com/index.php/2007/11/30/940-citizen-journalism- on-its-way-out>. Meyer, Philip. Precision Journalism. 4th ed. Lanham, MD: Rowman & Littlefield, 2002. McConnell, Steve. Professional Software Development. Boston, MA: Addison-Wesley, 2004. Mintzberg, Henry. Managers Not MBAs. San Francisco, CA: Berrett-Koehler, 2004. Morgan, Gareth. Images of Organisation. Rev. ed. Thousand Oaks, CA: Sage, 2006. Newsvine. “Msnbc.com Acquires Newsvine.” 7 Oct. 
2007. 20 Feb. 2008 http://blog.newsvine.com/_news/2007/10/07/1008889-msnbccom- acquires-newsvine>. Niederhoffer, Victor, and Laurel Kenner. Practical Speculation. New York: John Wiley & Sons, 2003. Pearlstine, Norman. Off the Record: The Press, the Government, and the War over Anonymous Sources. New York: Farrar, Straus & Giroux, 2007. Pesce, Mark D. “Mob Rules (The Law of Fives).” The Human Network 28 Sep. 2007. 20 Feb. 2008 http://blog.futurestreetconsulting.com/?p=39>. Rheingold, Howard. Smart Mobs: The Next Social Revolution. Cambridge MA: Basic Books, 2002. Rosen, Jay. What Are Journalists For? Princeton NJ: Yale UP, 2001. Shasha, Dennis Elliott. Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists. New York: Copernicus, 1995. Slywotzky, Adrian. Value Migration: How to Think Several Moves Ahead of the Competition. Boston, MA: Harvard Business School Press, 1996. Smith, Steve. “The Self-Image of a Discipline: The Genealogy of International Relations Theory.” Eds. Steve Smith and Ken Booth. International Relations Theory Today. Cambridge, UK: Polity Press, 1995. 1-37. Spar, Debora L. Ruling the Waves: Cycles of Discovery, Chaos and Wealth from the Compass to the Internet. New York: Harcourt, 2001. Surowiecki, James. The Wisdom of Crowds. New York: Doubleday, 2004. Thompson, Evan. Mind in Life: Biology, Phenomenology, and the Sciences of Mind. Cambridge, MA: Belknap Press, 2007. Trippi, Joe. The Revolution Will Not Be Televised. New York: ReganBooks, 2004. Underwood, Doug. When MBA’s Rule the Newsroom. New York: Columbia University Press, 1993. Wark, McKenzie. Virtual Geography: Living with Global Media Events. Bloomington IN: Indiana UP, 1994. Wolfe, Tom, and E.W. Johnson. The New Journalism. New York: Harper & Row, 1973.
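As a rough illustration of the scalability question raised in the abstract above, a worked ratio using only the OhmyNews staffing figures quoted from Kolodzy (and no additional data) shows how sharply the implied number of citizen reporters per editor rises:

\[
\frac{727\ \text{reporters}}{4\ \text{editors}} \approx 182
\qquad\longrightarrow\qquad
\frac{38\,000\ \text{reporters}}{12\ \text{editors}} \approx 3167
\]

That is a roughly seventeen-fold increase in contributors per editor, which is exactly the kind of editorial load that the abstract's call for longitudinal analysis of scalability and sustainability would need to track.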
APA, Harvard, Vancouver, ISO, and other styles
20

Burns, Alex. "Select Issues with New Media Theories of Citizen Journalism." M/C Journal 11, no. 1 (June 1, 2008). http://dx.doi.org/10.5204/mcj.30.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Ibrahim, Yasmin. "Commodifying Terrorism." M/C Journal 10, no. 3 (June 1, 2007). http://dx.doi.org/10.5204/mcj.2665.

Full text
Abstract:
Introduction [Figure 1] The counter-Terrorism advertising campaign of London’s Metropolitan Police commodifies some everyday items such as mobile phones, computers, passports and credit cards as having the potential to sustain terrorist activities. The process of ascribing cultural values and symbolic meanings to some everyday technical gadgets objectifies and situates Terrorism within everyday life. The police, in urging people to look out for ‘the unusual’ in their normal day-to-day lives, juxtapose the everyday with the unusual, where day-to-day consumption, routines and flows of human activity can seemingly house insidious and atavistic elements. This again is reiterated in the Met police press release: Terrorists live within our communities making their plans whilst doing everything they can to blend in, and trying not to raise suspicions about their activities. (MPA Website) The commodification of Terrorism through uncommon and everyday objects situates Terrorism as a phenomenon which occupies a liminal space within the everyday. It resides, breathes and co-exists within the taken-for-granted routines and objects of ‘the everyday’ where it has the potential to explode and disrupt without warning. Since 9/11 and the 7/7 bombings, Terrorism has been narrated through the disruption of mobility, whether in mid-air or in the deep recesses of the Underground. The resonant thread of disruption to human mobility evokes a powerful meta-narrative where acts of Terrorism can halt human agency amidst the backdrop of the metropolis, which is often a metaphor for speed and accelerated activities. If globalisation and the interconnected nature of the world are understood through discourses of risk, Terrorism bears the same footprint in urban spaces of modernity, narrating the vulnerability of the human condition in an inter-linked world where ideological struggles and resistance are manifested through inexplicable violence and destruction of lives, where the everyday is suspended to embrace the unexpected. As a consequence, ambient fear “saturates the social spaces of everyday life” (Hubbard 2). The commodification of Terrorism through everyday items of consumption inevitably creates an intertextuality with real and media events, which constantly corrode the security of the metropolis. Paddy Scannell alludes to a doubling of place in our mediated world where “public events now occur simultaneously in two different places; the place of the event itself and that in which it is watched and heard. The media then vacillates between the two sites and creates experiences of simultaneity, liveness and immediacy” (qtd. in Moores 22). The doubling of place through media constructs a pervasive environment of risk and fear. Mark Danner (qtd. in Bauman 106) points out that the most powerful weapon of the 9/11 terrorists was that innocuous and “most American of technological creations: the television set” which provided a global platform to constantly replay and remember the dreadful scenes of the day, enabling the terrorist to appear invincible and to narrate fear as ubiquitous and omnipresent. Philip Abrams argues that ‘big events’ (such as 9/11 and 7/7) do make a difference in the social world for such events function as a transformative device between the past and future, forcing society to alter or transform its perspectives. 
David Altheide points out that since September 11 and the ensuing war on terror, a new discourse of Terrorism has emerged as a way of expressing how the world has changed and defining a state of constant alert through a media logic and format that shapes the nature of discourse itself. Consequently, the intensity and centralisation of surveillance in Western countries increased dramatically, placing the emphasis on expanding the forms of the already existing range of surveillance processes and practices that circumscribe and help shape our social existence (Lyon, Terrorism 2). Normalisation of Surveillance The role of technologies, particularly information and communication technologies (ICTs), and other infrastructures to unevenly distribute access to the goods and services necessary for modern life, while facilitating data collection on and control of the public, are significant characteristics of modernity (Reiman; Graham and Marvin; Monahan). The embedding of technological surveillance into spaces and infrastructures not only augment social control but also redefine data as a form of capital which can be shared between public and private sectors (Gandy, Data Mining; O’Harrow; Monahan). The scale, complexity and limitations of omnipresent and omnipotent surveillance, nevertheless, offer room for both subversion as well as new forms of domination and oppression (Marx). In surveillance studies, Foucault’s analysis is often heavily employed to explain lines of continuity and change between earlier forms of surveillance and data assemblage and contemporary forms in the shape of closed-circuit television (CCTV) and other surveillance modes (Dee). It establishes the need to discern patterns of power and normalisation and the subliminal or obvious cultural codes and categories that emerge through these arrangements (Fopp; Lyon, Electronic; Norris and Armstrong). In their study of CCTV surveillance, Norris and Armstrong (cf. in Dee) point out that when added to the daily minutiae of surveillance, CCTV cameras in public spaces, along with other camera surveillance in work places, capture human beings on a database constantly. The normalisation of surveillance, particularly with reference to CCTV, the popularisation of surveillance through television formats such as ‘Big Brother’ (Dee), and the expansion of online platforms to publish private images, has created a contradictory, complex and contested nature of spatial and power relationships in society. The UK, for example, has the most developed system of both urban and public space cameras in the world and this growth of camera surveillance and, as Lyon (Surveillance) points out, this has been achieved with very little, if any, public debate as to their benefits or otherwise. There may now be as many as 4.2 million CCTV cameras in Britain (cf. Lyon, Surveillance). That is one for every fourteen people and a person can be captured on over 300 cameras every day. An estimated £500m of public money has been invested in CCTV infrastructure over the last decade but, according to a Home Office study, CCTV schemes that have been assessed had little overall effect on crime levels (Wood and Ball). In spatial terms, these statistics reiterate Foucault’s emphasis on the power economy of the unseen gaze. Michel Foucault in analysing the links between power, information and surveillance inspired by Bentham’s idea of the Panopticon, indicated that it is possible to sanction or reward an individual through the act of surveillance without their knowledge (155). 
It is this unseen and unknown gaze of surveillance that is fundamental to the exercise of power. The design and arrangement of buildings can be engineered so that the “surveillance is permanent in its effects, even if it is discontinuous in its action” (Foucault 201). Lyon (Terrorism), in tracing the trajectory of surveillance studies, points out that much of the surveillance literature has focused on understanding it as a centralised bureaucratic relationship between the powerful and the governed. Invisible forms of surveillance have also been viewed as a class weapon in some societies. With the advancements in and proliferation of surveillance technologies, as well as their convergence with other technologies, Lyon argues that it is no longer feasible to view surveillance as a linear or centralised process. In our contemporary globalised world, there is a need to reconcile the dialectical strands that mediate surveillance as a process. In acknowledging this, Gilles Deleuze and Felix Guattari have constructed surveillance as a rhizome that defies linearity to appropriate a more convoluted and malleable form where the coding of bodies and data can be enmeshed to produce intricate power relationships and hierarchies within societies. Latour draws on the notion of assemblage by propounding that data is amalgamated from scattered centres of calculation; these can range from state and commercial institutions to scientific laboratories which scrutinise data to conceive governance and control strategies. Both the Latourian and Deleuzian ideas of surveillance highlight the disparate arrays of people, technologies and organisations that become connected to make “surveillance assemblages”, in contrast to the static, unidirectional Panopticon metaphor (Ball, “Organization” 93). In a similar vein, Gandy (Panoptic) infers that it is misleading to assume that surveillance in practice is as complete and totalising as the Panoptic ideal type would have us believe. Co-optation of Millions The Metropolitan Police’s counter-Terrorism strategy seeks to co-opt millions, where the corporeal body can complement the landscape of technological surveillance that already co-exists within modernity. In its press release, the role of civilian bodies in ensuring the security of the city is stressed: Keeping Londoners safe from Terrorism is not a job solely for governments, security services or police. If we are to make London the safest major city in the world, we must mobilise against Terrorism not only the resources of the state, but also the active support of the millions of people who live and work in the capital. (MPA Website) Surveillance is increasingly simulated through the millions of corporeal entities where seeing in advance is the goal even before technology records and codes these images (William). Bodies understand and code risk and images through the cultural narratives which circulate in society. Compared to CCTV technology images, which require cultural and political interpretations and interventions, bodies as surveillance organisms implicitly code other bodies and activities. The travel bag in the Metropolitan Police poster reinforces the images of the 7/7 bombers and the renewed attempts to bomb the London Underground on the 21st of July. It reiterates the CCTV footage revealing images of the bombers wearing rucksacks. The image of the rucksack embodies both the everyday and the potential for evil in everyday objects.
It also inevitably reproduces the cultural biases and prejudices where the rucksack is subliminally associated with a specific type of body. The rucksack in these terms is a laden image which symbolically captures the context and culture of risk discourses in society. The co-optation of the population as a surveillance entity also recasts new forms of social responsibility within the democratic polity, where privacy is increasingly mediated by the greater need to monitor, trace and record the activities of one another. Nikolas Rose, in discussing the increasing ‘responsibilisation’ of individuals in modern societies, describes the process in which the individual accepts responsibility for personal actions across a wide range of fields of social and economic activity as in the choice of diet, savings and pension arrangements, health care decisions and choices, home security measures and personal investment choices (qtd. in Dee). While surveillance in individualistic terms is often viewed as a threat to privacy, Rose argues that the state of ‘advanced liberalism’ within modernity and post-modernity requires considerable degrees of self-governance, regulation and surveillance whereby the individual is constructed both as a ‘new citizen’ and a key site of self management. By co-opting and recasting the role of the citizen in the age of Terrorism, the citizen to a degree accepts responsibility for both surveillance and security. In our sociological imagination the body is constructed both as lived as well as a social object. Erving Goffman uses the word ‘umwelt’ to stress that human embodiment is central to the constitution of the social world. Goffman defines ‘umwelt’ as “the region around an individual from which signs of alarm can come” and employs it to capture how people as social actors perceive and manage their settings when interacting in public places (252). Goffman’s ‘umwelt’ can be traced to Immanuel Kant’s idea that it is the a priori categories of space and time that make it possible for a subject to perceive a world (Umiker-Sebeok; qtd. in Ball, “Organization”). Anthony Giddens adapted the term Umwelt to refer to “a phenomenal world with which the individual is routinely ‘in touch’ in respect of potential dangers and alarms which then formed a core of (accomplished) normalcy with which individuals and groups surround themselves” (244). Benjamin Smith, in considering the body as an integral component of the link between our consciousness and our material world, observes that the body is continuously inscribed by culture. These inscriptions, he argues, encompass a wide range of cultural practices and will imply knowledge of a variety of social constructs. The inscribing of the body will produce cultural meanings as well as create forms of subjectivity while locating and situating the body within a cultural matrix (Smith). Drawing on Derrida’s work, Pugliese employs the term ‘Somatechnics’ to conceptualise the body as a culturally intelligible construct and to address the techniques in and through which the body is formed and transformed (qtd. in Osuri). These techniques can encompass signification systems such as race and gender and equally technologies which mediate our sense of reality. These technologies of thinking, seeing, hearing, signifying, visualising and positioning produce the very conditions for the cultural intelligibility of the body (Osuri). The body is then continuously inscribed and interpreted through mediated signifying systems. 
Similarly, Hayles, while not intending to impose a Cartesian dichotomy between the physical body and its cognitive presence, contends that the use and interactions with technology incorporate the body as a material entity but it also equally inscribes it by marking, recording and tracing its actions in various terrains. According to Gayatri Spivak (qtd. in Ball, “Organization”) new habits and experiences are embedded into the corporeal entity which then mediates its reactions and responses to the social world. This means one’s body is not completely one’s own and the presence of ideological forces or influences then inscribe the body with meanings, codes and cultural values. In our modern condition, the body and data are intimately and intricately bound. Outside the home, it is difficult for the body to avoid entering into relationships that produce electronic personal data (Stalder). According to Felix Stalder our physical bodies are shadowed by a ‘data body’ which follows the physical body of the consuming citizen and sometimes precedes it by constructing the individual through data (12). Before we arrive somewhere, we have already been measured and classified. Thus, upon arrival, the citizen will be treated according to the criteria ‘connected with the profile that represents us’ (Gandy, Panoptic; William). Following September 11, Lyon (Terrorism) reveals that surveillance data from a myriad of sources, such as supermarkets, motels, traffic control points, credit card transactions records and so on, was used to trace the activities of terrorists in the days and hours before their attacks, confirming that the body leaves data traces and trails. Surveillance works by abstracting bodies from places and splitting them into flows to be reassembled as virtual data-doubles, and in the process can replicate hierarchies and centralise power (Lyon, Terrorism). Mike Dee points out that the nature of surveillance taking place in modern societies is complex and far-reaching and in many ways insidious as surveillance needs to be situated within the broadest context of everyday human acts whether it is shopping with loyalty cards or paying utility bills. Physical vulnerability of the body becomes more complex in the time-space distanciated surveillance systems to which the body has become increasingly exposed. As such, each transaction – whether it be a phone call, credit card transaction, or Internet search – leaves a ‘data trail’ linkable to an individual person or place. Haggerty and Ericson, drawing from Deleuze and Guattari’s concept of the assemblage, describe the convergence and spread of data-gathering systems between different social domains and multiple levels (qtd. in Hier). They argue that the target of the generic ‘surveillance assemblage’ is the human body, which is broken into a series of data flows on which surveillance process is based. The thrust of the focus is the data individuals can yield and the categories to which they can contribute. These are then reapplied to the body. In this sense, surveillance is rhizomatic for it is diverse and connected to an underlying, invisible infrastructure which concerns interconnected technologies in multiple contexts (Ball, “Elements”). The co-opted body in the schema of counter-Terrorism enters a power arrangement where it constitutes both the unseen gaze as well as the data that will be implicated and captured in this arrangement. 
It is capable of producing surveillance data for those in power while creating new data through its transactions and movements in its everyday life. The body is unequivocally constructed through this data and is also entrapped by it in terms of representation and categorisation. The corporeal body is therefore part of the machinery of surveillance while being vulnerable to its discriminatory powers of categorisation and victimisation. As Hannah Arendt (qtd. in Bauman 91) had warned, “we terrestrial creatures bidding for cosmic significance will shortly be unable to comprehend and articulate the things we are capable of doing” Arendt’s caution conveys the complexity, vulnerability as well as the complicity of the human condition in the surveillance society. Equally it exemplifies how the corporeal body can be co-opted as a surveillance entity sustaining a new ‘banality’ (Arendt) in the machinery of surveillance. Social Consequences of Surveillance Lyon (Terrorism) observed that the events of 9/11 and 7/7 in the UK have inevitably become a prism through which aspects of social structure and processes may be viewed. This prism helps to illuminate the already existing vast range of surveillance practices and processes that touch everyday life in so-called information societies. As Lyon (Terrorism) points out surveillance is always ambiguous and can encompass genuine benefits and plausible rationales as well as palpable disadvantages. There are elements of representation to consider in terms of how surveillance technologies can re-present data that are collected at source or gathered from another technological medium, and these representations bring different meanings and enable different interpretations of life and surveillance (Ball, “Elements”). As such surveillance needs to be viewed in a number of ways: practice, knowledge and protection from threat. As data can be manipulated and interpreted according to cultural values and norms it reflects the inevitability of power relations to forge its identity in a surveillance society. In this sense, Ball (“Elements”) concludes surveillance practices capture and create different versions of life as lived by surveilled subjects. She refers to actors within the surveilled domain as ‘intermediaries’, where meaning is inscribed, where technologies re-present information, where power/resistance operates, and where networks are bound together to sometimes distort as well as reiterate patterns of hegemony (“Elements” 93). While surveillance is often connected with technology, it does not however determine nor decide how we code or employ our data. New technologies rarely enter passive environments of total inequality for they become enmeshed in complex pre-existing power and value systems (Marx). With surveillance there is an emphasis on the classificatory powers in our contemporary world “as persons and groups are often risk-profiled in the commercial sphere which rates their social contributions and sorts them into systems” (Lyon, Terrorism 2). Lyon (Terrorism) contends that the surveillance society is one that is organised and structured using surveillance-based techniques recorded by technologies, on behalf of the organisations and governments that structure our society. This information is then sorted, sifted and categorised and used as a basis for decisions which affect our life chances (Wood and Ball). 
The emergence of pervasive, automated and discriminatory mechanisms for risk profiling and social categorising constitutes a significant means of reproducing and reinforcing social, economic and cultural divisions in information societies. Such automated categorisation, Lyon (Terrorism) warns, has consequences for everyone, especially in the face of the new anti-terror measures enacted after September 11. In tandem with this, Bauman points out that a few suicidal murderers on the loose will be quite enough to recycle thousands of innocents into the “usual suspects”. In no time, a few iniquitous individual choices will be reprocessed into the attributes of a “category”; a category easily recognisable by, for instance, a suspiciously dark skin or a suspiciously bulky rucksack (the kind of object which CCTV cameras are designed to note and passers-by are told to be vigilant about). And passers-by are keen to oblige. Since the terrorist atrocities on the London Underground, the volume of incidents classified as “racist attacks” rose sharply around the country. (122; emphasis added) Bauman, drawing on Lyon, asserts that the understandable desire for security combined with the pressure to adopt different kinds of systems “will create a culture of control that will colonise more areas of life with or without the consent of the citizen” (123). This means that the inhabitants of the urban space, whether citizens, workers or consumers with no terrorist ambitions whatsoever, will discover that their opportunities are more circumscribed by the subject positions or categories which are imposed on them. Bauman cautions that for some these categories may be extremely prejudicial, restricting them from consumer choices because of credit ratings, or, more insidiously, relegating them to second-class status because of their colour or ethnic background (124). Joseph Pugliese, in linking visual regimes of racial profiling and the shooting of Jean Charles de Menezes in the aftermath of the 7/7 bombings in London, suggests that the discursive relations of power and visuality are inextricably bound. Pugliese argues that racial profiling creates a regime of visuality which fundamentally inscribes our physiology of perceptions with stereotypical images. He applies this analogy to Menezes running down the platform, where the retina transforms him into the “hallucinogenic figure of an Asian Terrorist” (Pugliese 8). With globalisation and the proliferation of ICTs, borders and boundaries are no longer sacrosanct and, as such, risks are managed by enacting ‘smart borders’ through new technologies, with huge databases behind the scenes processing information about individuals and their journeys through the profiling of body parts with, for example, iris scans (Wood and Ball 31). Such body profiling technologies are used to create watch lists of dangerous passengers or identity groups who might be of greater ‘risk’. The body in a surveillance society can be dissected into parts and profiled and coded through technology. These disparate codings of body parts can be assembled (or selectively omitted) to construct and represent whole bodies in our information society to ascertain risk. The selection and circulation of knowledge will also determine who gets slotted into the various categories that a surveillance society creates. Conclusion When the corporeal body is subsumed into a web of surveillance, it often raises questions about the deterministic nature of technology.
The question is a long-standing one in our modern consciousness. We are apprehensive about according technology too much power and yet it is implicated in the contemporary power relationships where it is suspended amidst human motive, agency and anxiety. The emergence of surveillance societies, the co-optation of bodies in surveillance schemas, as well as the construction of the body through data in everyday transactions, conveys both the vulnerabilities of the human condition as well as its complicity in maintaining the power arrangements in society. Bauman, in citing Jacques Ellul and Hannah Arendt, points out that we suffer a ‘moral lag’ in so far as technology and society are concerned, for often we ruminate on the consequences of our actions and motives only as afterthoughts without realising at this point of existence that the “actions we take are most commonly prompted by the resources (including technology) at our disposal” (91). References Abrams, Philip. Historical Sociology. Shepton Mallet, UK: Open Books, 1982. Altheide, David. “Consuming Terrorism.” Symbolic Interaction 27.3 (2004): 289-308. Arendt, Hannah. Eichmann in Jerusalem: A Report on the Banality of Evil. London: Faber & Faber, 1963. Bauman, Zygmunt. Liquid Fear. Cambridge, UK: Polity, 2006. Ball, Kristie. “Elements of Surveillance: A New Framework and Future Research Direction.” Information, Communication and Society 5.4 (2002): 573-90 ———. “Organization, Surveillance and the Body: Towards a Politics of Resistance.” Organization 12 (2005): 89-108. Dee, Mike. “The New Citizenship of the Risk and Surveillance Society – From a Citizenship of Hope to a Citizenship of Fear?” Paper Presented to the Social Change in the 21st Century Conference, Queensland University of Technology, Queensland, Australia, 22 Nov. 2002. 14 April 2007 http://eprints.qut.edu.au/archive/00005508/02/5508.pdf>. Deleuze, Gilles, and Felix Guattari. A Thousand Plateaus. Minneapolis: U of Minnesota P, 1987. Fopp, Rodney. “Increasing the Potential for Gaze, Surveillance and Normalization: The Transformation of an Australian Policy for People and Homeless.” Surveillance and Society 1.1 (2002): 48-65. Foucault, Michel. Discipline and Punish: The Birth of the Prison. London: Allen Lane, 1977. Giddens, Anthony. Modernity and Self-Identity. Self and Society in the Late Modern Age. Stanford: Stanford UP, 1991. Gandy, Oscar. The Panoptic Sort: A Political Economy of Personal Information. Boulder, CO: Westview, 1997. ———. “Data Mining and Surveillance in the Post 9/11 Environment.” The Intensification of Surveillance: Crime, Terrorism and War in the Information Age. Eds. Kristie Ball and Frank Webster. Sterling, VA: Pluto Press, 2003. Goffman, Erving. Relations in Public. Harmondsworth: Penguin, 1971. Graham, Stephen, and Simon Marvin. Splintering Urbanism: Networked Infrastructures, Technological Mobilities and the Urban Condition. New York: Routledge, 2001. Hier, Sean. “Probing Surveillance Assemblage: On the Dialectics of Surveillance Practices as Process of Social Control.” Surveillance and Society 1.3 (2003): 399-411. Hayles, Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics. Chicago: U of Chicago P, 1999. Hubbard, Phil. “Fear and Loathing at the Multiplex: Everyday Anxiety in the Post-Industrial City.” Capital & Class 80 (2003). Latour, Bruno. Science in Action. Cambridge, Mass: Harvard UP, 1987 Lyon, David. The Electronic Eye – The Rise of Surveillance Society. Oxford: Polity Press, 1994. ———. 
“Terrorism and Surveillance: Security, Freedom and Justice after September 11 2001.” Privacy Lecture Series, Queens University, 12 Nov 2001. 16 April 2007 <http://privacy.openflows.org/lyon_paper.html>. ———. “Surveillance Studies: Understanding Visibility, Mobility and the Phonetic Fix.” Surveillance and Society 1.1 (2002): 1-7. Metropolitan Police Authority (MPA). “Counter Terrorism: The London Debate.” Press Release. 21 June 2006. 18 April 2007 <http://www.mpa.gov.uk.access/issues/comeng/Terrorism.htm>. Pugliese, Joseph. “Asymmetries of Terror: Visual Regimes of Racial Profiling and the Shooting of Jean Charles de Menezes in the Context of the War in Iraq.” Borderlands 5.1 (2006). 30 May 2007 <http://www.borderlandsejournal.adelaide.edu.au/vol15no1_2006/pugliese.htm>. Marx, Gary. “A Tack in the Shoe: Neutralizing and Resisting the New Surveillance.” Journal of Social Issues 59.2 (2003). 18 April 2007 <http://web.mit.edu/gtmarx/www/tack.html>. Moores, Shaun. “Doubling of Place.” Mediaspace: Place Scale and Culture in a Media Age. Eds. Nick Couldry and Anna McCarthy. London: Routledge, 2004. Monahan, Teri, ed. Surveillance and Security: Technological Politics and Power in Everyday Life. London: Routledge, 2006. Norris, Clive, and Gary Armstrong. The Maximum Surveillance Society: The Rise of CCTV. Oxford: Berg, 1999. O’Harrow, Robert. No Place to Hide. New York: Free Press, 2005. Osuri, Goldie. “Media Necropower: Australian Media Reception and the Somatechnics of Mamdouh Habib.” Borderlands 5.1 (2006). 30 May 2007 <http://www.borderlandsejournal.adelaide.edu.au/vol5no1_2006/osuri_necropower.htm>. Rose, Nikolas. “Government and Control.” British Journal of Criminology 40 (2000): 321–399. Scannell, Paddy. Radio, Television and Modern Life. Oxford: Blackwell, 1996. Smith, Benjamin. “In What Ways, and for What Reasons, Do We Inscribe Our Bodies?” 15 Nov. 1998. 30 May 2007 <http://www.bmezine.com/ritual/981115/Whatways.html>. Stalder, Felix. “Privacy Is Not the Antidote to Surveillance.” Surveillance and Society 1.1 (2002): 120-124. Umiker-Sebeok, Jean. “Power and the Construction of Gendered Spaces.” Indiana University-Bloomington. 14 April 2007 <http://www.slis.indiana.edu/faculty/umikerse/papers/power.html>. William, Bogard. The Simulation of Surveillance: Hypercontrol in Telematic Societies. Cambridge: Cambridge UP, 1996. Wood, Kristie, and David M. Ball, eds. “A Report on the Surveillance Society.” Surveillance Studies Network, UK, Sep. 2006. 14 April 2007 <http://www.ico.gov.uk/upload/documents/library/data_protection/practical_application/surveillance_society_full_report_2006.pdf>. Citation reference for this article MLA Style Ibrahim, Yasmin. "Commodifying Terrorism: Body, Surveillance and the Everyday." M/C Journal 10.3 (2007). <http://journal.media-culture.org.au/0706/05-ibrahim.php>. APA Style Ibrahim, Y. (Jun. 2007) "Commodifying Terrorism: Body, Surveillance and the Everyday," M/C Journal, 10(3). Retrieved from <http://journal.media-culture.org.au/0706/05-ibrahim.php>.
APA, Harvard, Vancouver, ISO, and other styles
22

Jethani, Suneel, and Robbie Fordyce. "Darkness, Datafication, and Provenance as an Illuminating Methodology." M/C Journal 24, no. 2 (April 27, 2021). http://dx.doi.org/10.5204/mcj.2758.

Full text
Abstract:
Data are generated and employed for many ends, including governing societies, managing organisations, leveraging profit, and regulating places. In all these cases, data are key inputs into systems that paradoxically are implemented in the name of making societies more secure, safe, competitive, productive, efficient, transparent and accountable, yet do so through processes that monitor, discipline, repress, coerce, and exploit people. (Kitchin, 165) Introduction Provenance refers to the place of origin or earliest known history of a thing. It refers to the custodial history of objects. It is a term that is commonly used in the art-world but also has come into the language of other disciplines such as computer science. It has also been applied in reference to the transactional nature of objects in supply chains and circular economies. In an interview with Scotland’s Institute for Public Policy Research, Adam Greenfield suggests that provenance has a role to play in the “establishment of reliability” given that if a “transaction or artifact has a specified provenance, then that assertion can be tested and verified to the satisfaction of all parties” (Lawrence). Recent debates on the unrecognised effects of digital media have convincingly argued that data is fully embroiled within capitalism, but it is necessary to remember that data is more than just a transactable commodity. One challenge in bringing processes of datafication into critical light is how we understand what happens to data from its point of acquisition to the point where it becomes instrumental in the production of outcomes that are of ethical concern. All data gather their meaning through relationality; whether acting as a representation of an exterior world or representing relations between other data points. Data objectifies relations, and despite any higher-order complexities, at its core, data is involved in factualising a relation into a binary. Assumptions like these about data shape reasoning, decision-making and evidence-based practice in private, personal and economic contexts. If processes of datafication are to be better understood, then we need to seek out conceptual frameworks that are adequate to the way that data is used and understood by its users. Deborah Lupton suggests that often we give data “other vital capacities because they are about human life itself, have implications for human life opportunities and livelihoods, [and] can have recursive effects on human lives (shaping action and concepts of embodiment ... selfhood [and subjectivity]) and generate economic value”. But when data are afforded such capacities, the analysis of its politics also calls for us to “consider context” and “making the labour [of datafication] visible” (D’Ignazio and Klein). For Jenny L. Davis, getting beyond simply thinking about what data affords involves bringing to light how it continually and dynamically requests, demands, encourages, discourages, and refuses certain operations and interpretations. It is in this re-orientation of the question from what to how that “practical analytical tool[s]” (Davis) can be found. Davis writes: requests and demands are bids placed by technological objects, on user-subjects. Encourage, discourage and refuse are the ways technologies respond to bids user-subjects place upon them. Allow pertains equally to bids from technological objects and the object’s response to user-subjects.
(Davis) Building on Lupton, Davis, and D’Ignazio and Klein, we see three principles that we consider crucial for work on data, darkness and light: data is not simply a technological object that exists within sociotechnical systems without having undergone any priming or processing, so as a consequence the data collecting entity imposes standards and way of imagining data before it comes into contact with user-subjects; data is not neutral and does not possess qualities that make it equivalent to the things that it comes to represent; data is partial, situated, and contingent on technical processes, but the outcomes of its use afford it properties beyond those that are purely informational. This article builds from these principles and traces a framework for investigating the complications arising when data moves from one context to another. We draw from the “data provenance” as it is applied in the computing and informational sciences where it is used to query the location and accuracy of data in databases. In developing “data provenance”, we adapt provenance from an approach that solely focuses on technical infrastructures and material processes that move data from one place to another and turn to sociotechnical, institutional, and discursive forces that bring about data acquisition, sharing, interpretation, and re-use. As data passes through open, opaque, and darkened spaces within sociotechnical systems, we argue that provenance can shed light on gaps and overlaps in technical, legal, ethical, and ideological forms of data governance. Whether data becomes exclusive by moving from light to dark (as has happened with the removal of many pages and links from Facebook around the Australian news revenue-sharing bill), or is publicised by shifting from dark to light (such as the Australian government releasing investigative journalist Andie Fox’s welfare history to the press), or even recontextualised from one dark space to another (as with genetic data shifting from medical to legal contexts, or the theft of personal financial data), there is still a process of transmission here that we can assess and critique through provenance. These different modalities, which guide data acquisition, sharing, interpretation, and re-use, cascade and influence different elements and apparatuses within data-driven sociotechnical systems to different extents depending on context. Attempts to illuminate and make sense of these complex forces, we argue, exposes data-driven practices as inherently political in terms of whose interests they serve. Provenance in Darkness and in Light When processes of data capture, sharing, interpretation, and re-use are obscured, it impacts on the extent to which we might retrospectively examine cases where malpractice in responsible data custodianship and stewardship has occurred, because it makes it difficult to see how things have been rendered real and knowable, changed over time, had causality ascribed to them, and to what degree of confidence a decision has been made based on a given dataset. To borrow from this issue’s concerns, the paradigm of dark spaces covers a range of different kinds of valences on the idea of private, secret, or exclusive contexts. We can parallel it with the idea of ‘light’ spaces, which equally holds a range of different concepts about what is open, public, or accessible. 
For instance, in the use of social data garnered from online platforms, the practices of academic researchers and analysts working in the private sector often fall within a grey zone when it comes to consent and transparency. Here the binary notion of public and private is complicated by the passage of data from light to dark (and back to light). Writing in a different context, Michael Warner complicates the notion of publicness. He observes that the idea of something being public is in and of itself always sectioned off, divorced from being fully generalisable, and it is “just whatever people in a given context think it is” (11). Michael Hardt and Antonio Negri argue that publicness is already shadowed by an idea of state ownership, leaving us in a situation where public and private already both sit on the same side of the propertied/commons divide as if the “only alternative to the private is the public, that is, what is managed and regulated by states and other governmental authorities” (vii). The same can be said about the way data is conceived as a public good or common asset. These ideas of light and dark are useful categorisations for deliberately moving past the tensions that arise when trying to qualify different subspecies of privacy and openness. The problem with specific linguistic dyads of private vs. public, or open vs. closed, and so on, is that they are embedded within legal, moral, technical, economic, or rhetorical distinctions that already involve normative judgements on whether such categories are appropriate or valid. Data may be located in a dark space for legal reasons that fall under the legal domain of ‘private’ or it may be dark because it has been stolen. It may simply be inaccessible, encrypted away behind a lost password on a forgotten external drive. Equally, there are distinctions around lightness that can be glossed – the openness of Open Data (see: theodi.org) is of an entirely separate category to the AACS encryption key, which was illegally but enthusiastically shared across the internet in 2007 to the point where it is now accessible on Wikipedia. The language of light and dark spaces allows us to cut across these distinctions and discuss in deliberately loose terms the degree to which something is accessed, with any normative judgments reserved for the cases themselves. Data provenance, in this sense, can be used as a methodology to critique the way that data is recontextualised from light to dark, dark to light, and even within these distinctions. Data provenance critiques the way that data is presented as if it were “there for the taking”. This also suggests that when data is used for some or another secondary purpose – generally for value creation – some form of closure or darkening is to be expected. Data in the public domain is more than simply a specific informational thing: there is always context, and this contextual specificity, we argue, extends far beyond anything that can be captured in a metadata schema or a licensing model. Even the transfer of data from one open, public, or light context to another will evoke new degrees of openness and luminosity that should not be assumed to be straightforward. And with this a new set of relations between data-user-subjects and stewards emerges. 
The movement of data between public and private contexts by virtue of the growing amount of personal information that is generated through the traces left behind as people make use of increasingly digitised services going about their everyday lives means that data-motile processes are constantly occurring behind the scenes – in darkness – where it comes into the view, or possession, of third parties without obvious mechanisms of consent, disclosure, or justification. Given that there are “many hands” (D’Iganzio and Klein) involved in making data portable between light and dark spaces, equally there can be diversity in the approaches taken to generate critical literacies of these relations. There are two complexities that we argue are important for considering the ethics of data motility from light to dark, and this differs from the concerns that we might have when we think about other illuminating tactics such as open data publishing, freedom-of-information requests, or when data is anonymously leaked in the public interest. The first is that the terms of ethics must be communicable to individuals and groups whose data literacy may be low, effectively non-existent, or not oriented around the objective of upholding or generating data-luminosity as an element of a wider, more general form of responsible data stewardship. Historically, a productive approach to data literacy has been finding appropriate metaphors from adjacent fields that can help add depth – by way of analogy – to understanding data motility. Here we return to our earlier assertion that data is more than simply a transactable commodity. Consider the notion of “giving” and “taking” in the context of darkness and light. The analogy of giving and taking is deeply embedded into the notion of data acquisition and sharing by virtue of the etymology of the word data itself: in Latin, “things having been given”, whereby in French données, a natural gift, perhaps one that is given to those that attempt capture for the purposes of empiricism – representation in quantitative form is a quality that is given to phenomena being brought into the light. However, in the contemporary parlance of “analytics” data is “taken” in the form of recording, measuring, and tracking. Data is considered to be something valuable enough to give or take because of its capacity to stand in for real things. The empiricist’s preferred method is to take rather than to accept what is given (Kitchin, 2); the data-capitalist’s is to incentivise the act of giving or to take what is already given (or yet to be taken). Because data-motile processes are not simply passive forms of reading what is contained within a dataset, the materiality and subjectivity of data extraction and interpretation is something that should not be ignored. These processes represent the recontextualisation of data from one space to another and are expressed in the landmark case of Cambridge Analytica, where a private research company extracted data from Facebook and used it to engage in psychometric analysis of unknowing users. Data Capture Mechanism Characteristics and Approach to Data Stewardship Historical Information created, recorded, or gathered about people of things directly from the source or a delegate but accessed for secondary purposes. Observational Represents patterns and realities of everyday life, collected by subjects by their own choice and with some degree of discretion over the methods. 
Third parties access this data through reciprocal arrangement with the subject (e.g., in exchange for providing a digital service such as online shopping, banking, healthcare, or social networking). Purposeful Data gathered with a specific purpose in mind and collected with the objective to manipulate its analysis to achieve certain ends. Integrative Places less emphasis on specific data types but rather looks towards social and cultural factors that afford access to and facilitate the integration and linkage of disparate datasets Table 1: Mechanisms of Data Capture There are ethical challenges associated with data that has been sourced from pre-existing sets or that has been extracted from websites and online platforms through scraping data and then enriching it through cleaning, annotation, de-identification, aggregation, or linking to other data sources (tab. 1). As a way to address this challenge, our suggestion of “data provenance” can be defined as where a data point comes from, how it came into being, and how it became valuable for some or another purpose. In developing this idea, we borrow from both the computational and biological sciences (Buneman et al.) where provenance, as a form of qualitative inquiry into data-motile processes, centres around understanding the origin of a data point as part of a broader almost forensic analysis of quality and error-potential in datasets. Provenance is an evaluation of a priori computational inputs and outputs from the results of database queries and audits. Provenance can also be applied to other contexts where data passes through sociotechnical systems, such as behavioural analytics, targeted advertising, machine learning, and algorithmic decision-making. Conventionally, data provenance is based on understanding where data has come from and why it was collected. Both these questions are concerned with the evaluation of the nature of a data point within the wider context of a database that is itself situated within a larger sociotechnical system where the data is made available for use. In its conventional sense, provenance is a means of ensuring that a data point is maintained as a single source of truth (Buneman, 89), and by way of a reproducible mechanism which allows for its path through a set of technical processes, it affords the assessment of a how reliable a system’s output might be by sheer virtue of the ability for one to retrace the steps from point A to B. “Where” and “why” questions are illuminating because they offer an ends-and-means view of the relation between the origins and ultimate uses of a given data point or set. Provenance is interesting when studying data luminosity because means and ends have much to tell us about the origins and uses of data in ways that gesture towards a more accurate and structured research agenda for data ethics that takes the emphasis away from individual moral patients and reorients it towards practices that occur within information management environments. Provenance offers researchers seeking to study data-driven practices a similar heuristic to a journalist’s line of questioning who, what, when, where, why, and how? This last question of how is something that can be incorporated into conventional models of provenance that make it useful in data ethics. The question of how data comes into being extends questions of power, legality, literacy, permission-seeking, and harm in an entangled way and notes how these factors shape the nature of personal data as it moves between contexts. 
Forms of provenance accumulate from transaction to transaction, cascading along, as a dataset ‘picks up’ the types of provenance that have led to its creation. This may involve multiple forms of overlapping provenance – methodological and epistemological, legal and illegal – which modulate different elements and apparatuses. Provenance, we argue is an important methodological consideration for workers in the humanities and social sciences. Provenance provides a set of shared questions on which models of transparency, accountability, and trust may be established. It points us towards tactics that might help data-subjects understand privacy in a contextual manner (Nissenbaum) and even establish practices of obfuscation and “informational self-defence” against regimes of datafication (Brunton and Nissenbaum). Here provenance is not just a declaration of what means and ends of data capture, sharing, linkage, and analysis are. We sketch the outlines of a provenance model in table 2 below. Type Metaphorical frame Dark Light What? The epistemological structure of a database determines the accuracy of subsequent decisions. Data must be consistent. What data is asked of a person beyond what is strictly needed for service delivery. Data that is collected for a specific stated purpose with informed consent from the data-subject. How does the decision about what to collect disrupt existing polities and communities? What demands for conformity does the database make of its subjects? Where? The contents of a database is important for making informed decisions. Data must be represented. The parameters of inclusion/exclusion that create unjust risks or costs to people because of their inclusion or exclusion in a dataset. The parameters of inclusion or exclusion that afford individuals representation or acknowledgement by being included or excluded from a dataset. How are populations recruited into a dataset? What divides exist that systematically exclude individuals? Who? Who has access to data, and how privacy is framed is important for the security of data-subjects. Data access is political. Access to the data by parties not disclosed to the data-subject. Who has collected the data and who has or will access it? How is the data made available to those beyond the data subjects? How? Data is created with a purpose and is never neutral. Data is instrumental. How the data is used, to what ends, discursively, practically, instrumentally. Is it a private record, a source of value creation, the subject of extortion or blackmail? How the data was intended to be used at the time that it was collected. Why? Data is created by people who are shaped by ideological factors. Data has potential. The political rationality that shapes data governance with regard to technological innovation. The trade-offs that are made known to individuals when they contribute data into sociotechnical systems over which they have limited control. Table 2: Forms of Data Provenance Conclusion As an illuminating methodology, provenance offers a specific line of questioning practices that take information through darkness and light. 
The emphasis that it places on a narrative for data assets themselves (asking what when, who, how, and why) offers a mechanism for traceability and has potential for application across contexts and cases that allows us to see data malpractice as something that can be productively generalised and understood as a series of ideologically driven technical events with social and political consequences without being marred by perceptions of exceptionality of individual, localised cases of data harm or data violence. References Brunton, Finn, and Helen Nissenbaum. "Political and Ethical Perspectives on Data Obfuscation." Privacy, Due Process and the Computational Turn: The Philosophy of Law Meets the Philosophy of Technology. Eds. Mireille Hildebrandt and Katja de Vries. New York: Routledge, 2013. 171-195. Buneman, Peter, Sanjeev Khanna, and Wang-Chiew Tan. "Data Provenance: Some Basic Issues." International Conference on Foundations of Software Technology and Theoretical Computer Science. Berlin: Springer, 2000. Davis, Jenny L. How Artifacts Afford: The Power and Politics of Everyday Things. Cambridge: MIT Press, 2020. D'Ignazio, Catherine, and Lauren F. Klein. Data Feminism. Cambridge: MIT Press, 2020. Hardt, Michael, and Antonio Negri. Commonwealth. Cambridge: Harvard UP, 2009. Kitchin, Rob. "Big Data, New Epistemologies and Paradigm Shifts." Big Data & Society 1.1 (2014). Lawrence, Matthew. “Emerging Technology: An Interview with Adam Greenfield. ‘God Forbid That Anyone Stopped to Ask What Harm This Might Do to Us’. Institute for Public Policy Research, 13 Oct. 2017. <https://www.ippr.org/juncture-item/emerging-technology-an-interview-with-adam-greenfield-god-forbid-that-anyone-stopped-to-ask-what-harm-this-might-do-us>. Lupton, Deborah. "Vital Materialism and the Thing-Power of Lively Digital Data." Social Theory, Health and Education. Eds. Deana Leahy, Katie Fitzpatrick, and Jan Wright. London: Routledge, 2018. Nissenbaum, Helen F. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford: Stanford Law Books, 2020. Warner, Michael. "Publics and Counterpublics." Public Culture 14.1 (2002): 49-90.
APA, Harvard, Vancouver, ISO, and other styles
23

Watkins, Patti Lou. "Fat Studies 101: Learning to Have Your Cake and Eat It Too." M/C Journal 18, no. 3 (May 18, 2015). http://dx.doi.org/10.5204/mcj.968.

Full text
Abstract:
“I’m fat–and it’s okay! It doesn’t mean I’m stupid, or ugly, or lazy, or selfish. I’m fat!” so proclaims Joy Nash in her YouTube video, A Fat Rant. “Fat! It’s three little letters–what are you afraid of?!” This is the question I pose to my class on day one of Fat Studies. Sadly, many college students do fear fat, and negative attitudes toward fat people are quite prevalent in this population (Ambwani et al. 366). As I teach it, Fat Studies is cross-listed between Psychology and Gender Studies. However, most students who enrol have majors in Psychology or other behavioural health science fields in which weight bias is particularly pronounced (Watkins and Concepcion 159). Upon finding stronger bias among third- versus first-year Physical Education students, O’Brien, Hunter, and Banks (308) speculated that the weight-centric curriculum that typifies this field actively engenders anti-fat attitudes. Based on their exploration of textbook content, McHugh and Kasardo (621) contend that Psychology too is complicit in propagating weight bias by espousing weight-centric messages throughout the curriculum. Such messages include the concepts that higher body weight invariably leads to poor health, weight control is simply a matter of individual choice, and dieting is an effective means of losing weight and improving health (Tylka et al.). These weight-centric tenets are, however, highly contested. For instance, there exists a body of research so vast that it has its own name, the “obesity paradox” literature. This literature (McAuley and Blair 773) entails studies that show that “obese” persons with chronic disease have relatively better survival rates and that a substantial portion of “overweight” and “obese” individuals have levels of metabolic health similar to or better than “normal” weight individuals (e.g., Flegal et al. 71). Finally, the “obesity paradox” literature includes studies showing that cardiovascular fitness is a far better predictor of mortality than weight. In other words, individuals may be both fit and fat, or conversely, unfit and thin (Barry et al. 382). In addition, Tylka et al. review literature attesting to the complex causes of weight status that extend beyond individual behaviour, ranging from genetic predispositions to sociocultural factors beyond personal control. Lastly, reviews of research on dieting interventions show that these are overwhelmingly ineffective in producing lasting weight loss or actual improvements in health and may in fact lead to disordered eating and other unanticipated adverse consequences (e.g., Bacon and Aphramor; Mann et al. 220; Salas e79; Tylka et al.).The newfound, interdisciplinary field of scholarship known as Fat Studies aims to debunk weight-centric misconceptions by elucidating findings that counter these mainstream suppositions. Health At Every Size® (HAES), a weight-neutral approach to holistic well-being, is an important facet of Fat Studies. The HAES paradigm advocates intuitive eating and pleasurable physical activity for health rather than restrictive dieting and regimented exercise for weight loss. HAES further encourages body acceptance of self and others regardless of size. Empirical evidence shows that HAES-based interventions improve physical and psychological health without harmful side-effects or high dropout rates associated with weight loss interventions (Bacon and Aphramor; Clifford et al. “Impact of Non-Diet Approaches” 143). 
HAES, like the broader field of Fat Studies, seeks to eradicate weight-based discrimination, positioning weight bias as a social justice issue that intersects with oppression based on other areas of difference such as gender, race, and social class. Much like Queer Studies, Fat Studies seeks to reclaim the word, fat, thus stripping it of its pejorative connotations. As Nash asserts in her video, “Fat is a descriptive physical characteristic. It’s not an insult, or an obscenity, or a death sentence!” As an academic discipline, Fat Studies is expanding its visibility and reach. The Fat Studies Reader, the primary source of reading for my course, provides a comprehensive overview of the field (Rothblum and Solovay 1). This interdisciplinary anthology addresses fat history and activism, fat as social inequality, fat in healthcare, and fat in popular culture. Ward (937) reviews this and other recently-released fat-friendly texts. The field features its own journal, Fat Studies: An Interdisciplinary Journal of Body Weight and Society, which publishes original research, overview articles, and reviews of assorted media. Both the Popular Culture Association and National Women’s Studies Association have special interest groups devoted to Fat Studies, and the American Psychological Association’s Division on the Psychology of Women has recently formed a task force on sizism (Bergen and Carrizales 22). Furthermore, Fat Studies conferences have been held in Australia and New Zealand, and the third annual Weight Stigma Conference will occur in Iceland, September 2015. Although the latter conference is not necessarily limited to those who align themselves with Fat Studies, keynote speakers include Ragen Chastain, a well-known member of the fat acceptance movement largely via her blog, Dances with Fat. The theme of this year’s conference, “Institutionalised Weightism: How to Challenge Oppressive Systems,” is consistent with Fat Studies precepts: This year’s theme focuses on the larger social hierarchies that favour thinness and reject fatness within western culture and how these systems have dictated the framing of fatness within the media, medicine, academia and our own identities. What can be done to oppose systemised oppression? What can be learned from the fight for social justice and equality within other arenas? Can research and activism be united to challenge prevailing ideas about fat bodies? Concomitantly, Fat Studies courses have begun to appear on college campuses. Watkins, Farrell, and Doyle-Hugmeyer (180) identified and described four Fat Studies and two HAES courses that were being taught in the U.S. and abroad as of 2012. Since then, a Fat Studies course has been taught online at West Virginia University and another will soon be offered at Washington State University. Additionally, a new HAES class has been taught at Saint Mary’s College of California during the last two academic years. Cameron (“Toward a Fat Pedagogy” 28) describes ways in which nearly 30 instructors from five different countries have incorporated fat studies pedagogy into university courses across an array of academic areas. This growing trend is manifested in The Fat Pedagogy Reader (Russell and Cameron) due out later this year. In this article, I describe content and pedagogical strategies that I use in my Fat Studies course. I then share students’ qualitative reactions, drawing upon excerpts from written assignments. During the term reported here, the class was comprised of 17 undergraduate and 5 graduate students.
Undergraduate majors included 47% in Psychology, 24% in Women Studies, 24% in various other College of Liberal Arts fields, and 6% in the College of Public Health. Graduate majors included 40% in the College of Public Health and 60% in the College of Education. Following submission of final grades, students provided consent via email allowing written responses on assignments to be anonymously incorporated into research reports. Assignments drawn upon for this report include weekly reading reactions to specific journal articles in which students were to summarise the main points, identify and discuss a specific quote or passage that stood out to them, and consider and discuss applicability of the information in the article. This report also utilises responses to a final assignment in which students were to articulate take-home lessons from the course. Despite the catalogue description, many students enter Fat Studies with a misunderstanding of what the course entails. Some admitted that they thought the course was about reducing obesity and the presumed health risks associated with this alleged pathological condition (Watkins). Others understood, but were somewhat dubious, at least at the outset: “Before I began this class, I admit that I was skeptical of what Fat Studies meant.” Another student experienced “a severe cognitive dissonance” between the Fat Studies curriculum and that of a previous behavioural health class: My professor spent the entire quarter spouting off statistics, such as the next generation of children will be the first generation to have a lower life expectancy than their parents and the ever increasing obesity rates that are putting such a tax on our health care system, and I took her words to heart. I was scared for myself and for the populations I would soon be working with. I was worried that I was destined to a chronic disease and bothered that my BMI was two points above ‘normal.’ I believed everything my professor alluded to on the danger of obesity because it was things I had heard in the media and was led to believe all my life. Yet another related, “At first, I will be honest, it was hard for me to accept a lot of this information, but throughout the term every class changed my mind about my view of fat people.” A few students have voiced even greater initial resistance. During a past term, one student lamented that the material represented an attack on her intended behavioural health profession. Cameron (“Learning to Teach Everybody”) describes comparable reactions among students in her Critical Obesity course taught within a behavioural health science unit. Ward (937) attests that, even in Gender Studies, fat is the topic that creates the most controversy. Similarly, she describes students’ immense discomfort when asked to entertain perspectives that challenge deeply engrained ideas inculcated by our culture’s “obesity epidemic.” Discomfort, however, is not necessarily antithetical to learning. In prompting students to unlearn “the biomedically-informed truth of obesity, namely that fat people are unfit, unhealthy, and in need of ‘saving’ through expert interventions,” Moola et al. recommend equipping them with an “ethics of discomfort” (217). No easy task: “It requires courage to ask our students to forgo the security of prescriptive health messaging in favour of confusion and uncertainty” (221). 
I encourage students to entertain conflicting perspectives by assigning empirically-based articles emanating from peer-reviewed journals in their own disciplines that challenge mainstream discourses on obesity (e.g., Aphramor; Bombak e60; Tomiyama, Ahlstrom, and Mann 861). Students whose training is steeped in the scientific method seem to appreciate having quantitative data at their disposal to convince themselves–and their peers and professors–that widely held weight-centric beliefs and practices may not be valid. One student remarked, “Since I have taken this course, I feel like I am prepared to discuss the fallacy of the weight-health relationship,” citing specific articles that would aid in the effort. Likewise, Cameron’s (“Learning to Teach Everybody”) students reported a need to read research reports in order to begin questioning long-held beliefs.In addition, I assign readings that provide students with the opportunity to hear the voices of fat people themselves, a cornerstone of Fat Studies. Besides chapters in The Fat Studies Reader authored by scholars and activists who identify as fat, I assign qualitative articles (e.g., Lewis et al.) and narrative reports (e.g., Pause 42) in which fat people describe their experiences with weight and weight bias. Additionally, I provide positive images of fat people via films and websites (Clifford et al. HAES®; Watkins; Watkins and Doyle-Hugmeyer 177) in order to counteract the preponderance of negative, dehumanising portrayals in popular media (e.g., Ata and Thompson 41). In response, a student stated:One of the biggest things I took away from this term was the confidence I found in fat women through films and stories. They had more confidence than I have seen in any tiny girl and owned the body they were given.I introduce “normal” weight allies as well, most especially Linda Bacon whose treatise on thin privilege tends to set the stage for viewing weight bias as a form of oppression (Bacon). One student observed, “It was a relief to be able to read and talk about weight oppression in a classroom setting for once.” Another appreciated that “The class did a great job at analysing fat as oppression and not like a secondhand oppression as I have seen in my past classes.” Typically, fat students were already aware of weight-based privilege and oppression, often painfully so. Thinner students, however, were often astonished by this concept, several describing Bacon’s article as “eye-opening.” In reaction, many vowed to act as allies:This class has really opened my eyes and prepared me to be an ally to fat people. It will be difficult for some time while I try to get others to understand my point of view on fat people but I believe once there are enough allies, people’s minds will really start changing and it will benefit everyone for the better.Pedagogically, I choose to share my own experiences as they relate to course content and encourage students, at least in their written assignments, to do the same. Other instructors refrain from this practice for fear of reinforcing traditional discourses or eliciting detrimental reactions from students (Watkins, Farrell, and Doyle-Hugmeyer 191). Nevertheless, this tack seems to work well in my course, with many students opting to disclose their relevant circumstances during classroom discussions: Throughout the term I very much valued and appreciated when classmates would share their experiences. 
I love listening and hearing to others experiences and I think that is a great way to understand the material and learn from one another.It really helped to read different articles and hear classmates discuss and share stories that I was able to relate to. The idea of hearing people talk about issues that I thought I was the only one who dealt with was so refreshing and enlightening.The structure of this class allowed me to learn how this information is applicable to my life and made it deeper than just memorising information.Thus far, across three terms, no student has described iatrogenic effects from this process. In fact, most attribute positive transformations to the class. These include enhanced body acceptance of self and others: This class decreased my fat phobia towards others and gave me a better understanding about the intersectionality of one’s weight. For example, I now feel that I no longer view my family in a fat phobic way and I also feel responsible for educating my brother and helping him develop a strong self-esteem regardless of his size.I never thought this class would change my life, almost save my life. Through studies shown in class and real life people following their dreams, it made my mind completely change about how I view my body and myself.I can only hope that in the future, I will be more forgiving, tolerant, and above all accepting of myself, much less others. Regardless of a person’s shape and size, we are all beautiful, and while I’m just beginning to understand this, it can only get better from here.Students also reported becoming more savvy consumers of weight-centric media messages as well as realigning their eating and exercise behaviour in accordance with HAES: I find myself disgusted at the television now, especially with the amount of diet ads, fitness club ads, and exercise equipment ads all aimed at making a ‘better you.’ I now know that I would never be better off with a SlimFast shake, P90X, or a Total Gym. I would be better off eating when I’m hungry, working out because it is fun, and still eating Thin Mints when I want to. Prior to this class, I would work out rigorously, running seven miles a day. Now I realise why at times I dreaded to work out, it was simply a mathematical system to burn the energy that I had acquired earlier in the day. Instead what I realise I should do is something I enjoy, that way I will never get tired of whatever I am doing. While I do enjoy running, other activities would bring more joy while engaging in a healthy lifestyle like hiking or mountain biking.I will never go on another diet. I will stop choosing exercises I don’t love to do. I will not weigh myself every single day hoping for the number on the scale to change.A reduction in self-weighing was perhaps the most frequent behaviour change that students expressed. This is particularly valuable in that frequent self-weighing is associated with disordered eating and unhealthy weight control behaviours (Neumark-Sztainer et al. 811):I have realised that the number on the scale is simply a number on the scale. That number does not define who you are. I have stopped weighing myself every morning. I put the scale in the storage closet so I don’t have to look at it. I even encouraged my roommate to stop weighing herself too. What has been most beneficial for me to take away from this class is the notion that the number on the scale has so much less to do with fitness levels than most people understand. 
Coming from a numbers obsessed person like myself, this class has actually gotten me to leave the scales behind. I used to weigh myself every single day and my self-confidence reflected whether I was up or down in weight from the day before. It seems so silly to me now. From this class, I take away a new outlook on body diversity. I will evaluate who I am for what I do and not represent myself with a number. I’m going to have my cake this time, and actually eat it too!Finally, students described ways in which they might carry the concepts from Fat Studies into their future professions: I want to go to law school. This model is something I will work toward in the fight for social justice.As a teacher and teacher of teachers, I plan to incorporate discussions on size diversity and how this should be addressed within the field of adapted physical education.I do not know how I would have gone forward if I had never taken this class. I probably would have continued to use weight loss as an effective measure of success for both nutrition and physical activity interventions. I will never be able to think about the obesity prevention movement in the same way.Since I am working toward being a clinical psychologist, I don’t want to have a client who is pursuing weight loss and then blindly believe that they need to lose weight. I’d rather be of the mindset that every person is unique, and that there are other markers of health at every size.Jones and Hughes-Decatur (59) call for increased scholarship illustrating and evaluating critical body pedagogies so that teachers might provide students with tools to critique dominant discourses, helping them forge healthy relationships with their own bodies in the process. As such, this paper describes elements of a Fat Studies class that other instructors may choose to adopt. It additionally presents qualitative data suggesting that students came to think about fat and fat people in new and divergent ways. Qualitative responses also suggest that students developed better body image and more adaptive eating and exercise behaviours throughout the term. Although no students have yet described lasting adverse effects from the class, one stated that she would have preferred less of a focus on health and more of a focus on issues such as fat fashion. Indeed, some Fat Studies scholars (e.g., Lee) advocate separating discussions of weight bias from discussions of health status to avoid stigmatising fat people who do experience health problems. While concerns about fostering healthism within the fat acceptance movement are valid, as a behavioural health professional with an audience of students training in these fields, I have chosen to devote three weeks of our ten week term to this subject matter. Depending on their academic background, others who teach Fat Studies may choose to emphasise different aspects such as media representations or historical connotations of fat.Nevertheless, the preponderance of positive comments evidenced throughout students’ assignments may certainly be a function of social desirability. Although I explicitly invite critique, and in fact assign readings (e.g., Welsh 33) and present media that question HAES and Fat Studies concepts, students may still feel obliged to articulate acceptance of and transformations consistent with the principles of these movements. 
As a more objective assessment of student outcomes, I am currently conducting a quantitative evaluation, in which I remain blind to students’ identities, of this year’s Fat Studies course compared to other upper division/graduate Psychology courses, examining potential changes in weight bias, body image and dieting behaviour, adherence to appearance-related media messages, and obligatory exercise behaviour. I postulate results akin to those of Humphrey, Clifford, and Neyman Morris (143) who found reductions in weight bias, improved body image, and improved eating behaviour among college students as a function of their HAES course. As Fat Studies pedagogy proliferates, instructors are called upon to share their teaching strategies, document the effects, and communicate these results within and outside of academic spheres.ReferencesAmbwani, Suman, Katherine M. Thomas, Christopher J. Hopwood, Sara A. Moss, and Carlos M. Grilo. “Obesity Stigmatization as the Status Quo: Structural Considerations and Prevalence among Young Adults in the U.S.” Eating Behaviors 15.3 (2014): 366-370. Aphramor, Lucy. “Validity of Claims Made in Weight Management Research: A Narrative Review of Dietetic Articles.” Nutrition Journal 9 (2010): n. pag. 15 May 2015 ‹http://www.nutritionj.com/content/9/1/30›.Ata, Rheanna M., and J. Kevin Thompson. “Weight Bias in the Media: A Review of Recent Research.” Obesity Facts 3.1 (2010): 41-46.Bacon, Linda. “Reflections on Fat Acceptance: Lessons Learned from Thin Privilege.” 2009. 23 Apr. 2015 ‹http://www.lindabacon.org/Bacon_ThinPrivilege080109.pdf›.Bacon, Linda, and Lucy Aphramor. “Weight Science: Evaluating the Evidence for a Paradigm Shift.” Nutrition Journal 10 (2011). 23 Apr. 2015 ‹http://www.nutritionj.com/content/10/1/9›.Barry, Vaughn W., Meghan Baruth, Michael W. Beets, J. Larry Durstine, Jihong Liu, and Steven N. Blair. “Fitness vs. Fatness on All-Cause Mortality: A Meta-Analysis.” Progress in Cardiovascular Diseases 56.4 (2014): 382-390.Bergen, Martha, and Sonia Carrizales. “New Task Force Focused on Size.” The Feminist Psychologist 42.1 (2015): 22.Bombak, Andrea. “Obesity, Health at Every Size, and Public Health Policy.” American Journal of Public Health 104.2 (2014): e60-e67.Cameron, Erin. “Learning to Teach Everybody: Exploring the Emergence of an ‘Obesity” Pedagogy’.” The Fat Pedagogy Reader: Challenging Weight-Based Oppression in Education. Eds. Erin Cameron and Connie Russell. New York: Peter Lang Publishing, in press.Cameron, Erin. “Toward a Fat Pedagogy: A Study of Pedagogical Approaches Aimed at Challenging Obesity Discourses in Post-Secondary Education.” Fat Studies 4.1 (2015): 28-45.Chastain, Ragen. Dances with Fat. 15 May 2015 ‹https://danceswithfat.wordpress.com/blog/›.Clifford, Dawn, Amy Ozier, Joanna Bundros, Jeffrey Moore, Anna Kreiser, and Michele Neyman Morris. “Impact of Non-Diet Approaches on Attitudes, Behaviors, and Health Outcomes: A Systematic Review.” Journal of Nutrition Education and Behavior 47.2 (2015): 143-155.Clifford, Dawn, Patti Lou Watkins, and Rebecca Y. Concepcion. “HAES® University: Bringing a Weight Neutral Message to Campus.” Association for Size Diversity and Health, 2015. 23 Apr. 2015 ‹https://www.sizediversityandhealth.org/content.asp?id=258›.Fat Studies: An Interdisciplinary Journal of Body Weight and Society. 23 Apr. 2015 ‹http://www.tandfonline.com/toc/ufts20/current#.VShpqdhFDBC›.Flegal, Katherine M., Brian K. Kit, Heather Orpana, and Barry L. Graubard. 
“Association of All-Cause Mortality with Overweight and Obesity Using Standard Body Mass Index Categories: A Systematic Review and Meta-Analysis.” Journal of the American Medical Association 309.1 (2013): 71-82.Humphrey, Lauren, Dawn Clifford, and Michelle Neyman Morris. “Health At Every Size College Course Reduces Dieting Behaviors and Improves Intuitive Eating, Body Esteem, and Anti-Fat Attitudes.” Journal of Nutrition Education and Behavior, in press.Jones, Stephanie, and Hilary Hughes-Decatur. “Speaking of Bodies in Justice-Oriented Feminist Teacher Education.” Journal of Teacher Education 63.1 (2012): 51-61.Lee, Jenny. Embodying Stereotypes: Memoir, Fat and Health. Fat Studies: Reflective Intersections, July 2012, Wellington, NZ. Unpublished conference paper.Lewis, Sophie, Samantha L. Thomas, Jim Hyde, David Castle, R. Warwick Blood, and Paul A. Komesaroff. “’I Don't Eat a Hamburger and Large Chips Every Day!’ A Qualitative Study of the Impact of Public Health Messages about Obesity on Obese Adults.” BMC Public Health 10.309 (2010). 23 Apr 2015 ‹http://www.biomedcentral.com/1471-2458/10/309›.Mann, Traci, A. Janet Tomiyama, Erika Westling, Ann-Marie Lew, Barbara Samuels, and Jason Chatman. “Medicare’s Search for Effective Obesity Treatments: Diets Are Not the Answer.” American Psychologist 62.3 (2007): 220-233.McAuley, Paul A., and Steven N. Blair. “Obesity Paradoxes.” Journal of Sports Sciences 29.8 (2011): 773-782. McHugh, Maureen C., and Ashley E. Kasardo. “Anti-Fat Prejudice: The Role of Psychology in Explication, Education and Eradication.” Sex Roles 66.9-10 (2012): 617-627.Moola, Fiona J., Moss E. Norman, LeAnne Petherick, and Shaelyn Strachan. “Teaching across the Lines of Fault in Psychology and Sociology: Health, Obesity and Physical Activity in the Canadian Context.” Sociology of Sport Journal 31.2 (2014): 202-227.Nash, Joy. “A Fat Rant.” YouTube, 17 Mar. 2007. 23 Apr. 2015 ‹https://www.youtube.com/watch?v=yUTJQIBI1oA›.Neumark-Sztainer, Dianne, Patricia van den Berg, Peter J. Hannan, and Mary Story. “Self-Weighing in Adolescents: Helpful or Harmful? Longitudinal Associations with Body Weight Changes and Disordered Eating.” Journal of Adolescent Health 39.6 (2006): 811–818.O’Brien, K.S., J.A. Hunter, and M. Banks. “Implicit Anti-Fat Bias in Physical Educators: Physical Attributes, Ideology, and Socialization.” International Journal of Obesity 31.2 (2007): 308-314.Pause, Cat. “Live to Tell: Coming Out as Fat.” Somatechnics 2.1 (2012): 42-56.Rothblum, Esther, and Sondra Solovay, eds. The Fat Studies Reader. New York: New York University Press, 2009.Russell, Connie, and Erin Cameron, eds. The Fat Pedagogy Reader: Challenging Weight-Based Oppression in Education. New York: Peter Lang Publishing, in press. Salas, Ximena Ramos. “The Ineffectiveness and Unintended Consequences of the Public Health War on Obesity.” Canadian Journal of Public Health 106.2 (2015): e79-e81. Tomiyama, A. Janet, Britt Ahlstrom, and Traci Mann. “Long-Term Effects of Dieting: Is Weight Loss Related to Health?” Social and Personality Psychology Compass 7.12 (2013): 861-877.Tylka, Tracy L., Rachel A. Annunziato, Deb Burgard, Sigrun Daníelsdóttir, Ellen Shuman, Chad Davis, and Rachel M. Calogero. “The Weight-Inclusive versus Weight-Normative Approach to Health: Evaluating the Evidence for Prioritizing Well-Being over Weight Loss.” Journal of Obesity (2014). 23 Apr. 2015 ‹http://www.hindawi.com/journals/jobe/2014/983495/›.Ward, Anna E. 
“The Future of Fat.” American Quarterly 65.4 (2013): 937-947. Watkins, Patti Lou. “Inclusion of Fat Studies in a Difference, Power, and Discrimination Curriculum.” The Fat Pedagogy Reader: Challenging Weight-Based Oppression in Education. Eds. Erin Cameron and Connie Russell. New York: Peter Lang Publishing, in press. Watkins, Patti Lou, and Rebecca Y. Concepcion. “Teaching HAES to Health Care Students and Professionals.” Wellness Not Weight: Motivational Interviewing and a Non-Diet Approach. Ed. Ellen Glovsky. San Diego: Cognella Academic Publishing, 2014: 159-169. Watkins, Patti Lou, and Andrea Doyle-Hugmeyer. “Teaching about Eating Disorders from a Fat Studies Perspective.” Transformations 23.2 (2013): 147-158. Watkins, Patti Lou, Amy E. Farrell, and Andrea Doyle Hugmeyer. “Teaching Fat Studies: From Conception to Reception.” Fat Studies 1.2 (2012): 180-194. Welsh, Talia L. “Healthism and the Bodies of Women: Pleasure and Discipline in the War against Obesity.” Journal of Feminist Scholarship 1 (2011): 33-48. Weight Stigma Conference. 23 Apr. 2015 ‹http://stigmaconference.com/›.
APA, Harvard, Vancouver, ISO, and other styles
24

Wallace, Derek. "'Self' and the Problem of Consciousness." M/C Journal 5, no. 5 (October 1, 2002). http://dx.doi.org/10.5204/mcj.1989.

Full text
Abstract:
Whichever way you look at it, self is bound up with consciousness, so it seems useful to review some of the more significant existing conceptions of this relationship. A claim by Mikhail Bakhtin can serve as an anchoring point for this discussion. He firmly predicates the formation of self not just on the existence of an individual consciousness, but on what might be called a double or social (or dialogic) consciousness. Summarising his argument, Pam Morris writes: 'A single consciousness could not generate a sense of its self; only the awareness of another consciousness outside the self can produce that image.' She goes on to say that, 'Behind this notion is Bakhtin's very strong sense of the physical and spatial materiality of bodily being,' and quotes directly from Bakhtin's essay as follows: This other human being whom I am contemplating, I shall always see and know something that he, from his place outside and over against me, cannot see himself: parts of his body that are inaccessible to his own gaze (his head, his face and its expression), the world behind his back . . . are accessible to me but not to him. As we gaze at each other, two different worlds are reflected in the pupils of our eyes . . . to annihilate this difference completely, it would be necessary to merge into one, to become one and the same person. This ever--present excess of my seeing, knowing and possessing in relation to any other human being, is founded in the uniqueness and irreplaceability of my place in the world. (Bakhtin in Morris 6) Recent investigations in neuroscience and the philosophy of mind lay down a challenge to this social conception of the self. Notably, it is a challenge that does not involve the restoration of any variant of Cartesian rationalism; indeed, it arguably over--privileges rationalism's subjective or phenomenological opposite. 'Self' in this emerging view is a biologically generated but illusory construction, an effect of the operation of what are called 'neural correlates of consciousness' (NCC). Very briefly, an NCC refers to the distinct pattern of neurochemical activity, a 'neural representational system' -- to some extent observable by modern brain--imaging equipment – that corresponds to a particular configuration of sense--phenomena, or 'content of consciousness' (a visual image, a feeling, or indeed a sense of self). Because this science is still largely hypothetical, with many alternative terms and descriptions, it would be better in this limited space to focus on one particular account – one that is particularly well developed in the area of selfhood and one that resonates with other conceptions included in this discussion. Thomas Metzinger begins by postulating the existence within each person (or 'system' in his terms) of a 'self--model', a representation produced by neural activity -- what he calls a 'neural correlate of self--consciousness' -- that the individual takes to be the actual self, or what Metzinger calls the 'phenomenal self'. 'A self--model is important,' Metzinger says, 'in enabling a system to represent itself to itself as an agent' (293). The individual is able to maintain this illusion because 'the self--model is the only representational structure that is anchored in the brain by a continuous source of internally generated input' (297). 
In a manner partly reminiscent of Bakhtin, he continues: 'The body is always there, and although its relational properties in space and in movement constantly change, the body is the only coherent perceptual object that constantly generates input.' The reason why the individual is able to jump from the self--model to the phenomenal self in the first place is because: We are systems that are not able to recognise their subsymbolic self--model as a model. For this reason, we are permanently operating under the conditions of a 'naïve--realistic self--misunderstanding': We experience ourselves as being in direct and immediate epistemic contact with ourselves. What we have in the past simply called a 'self' is not a non--physical individual, but only the content of an ongoing dynamical process – the process of transparent self—modeling. (Metzinger 299) The question that nonetheless arises is why it should be concluded that this self--model emerges from subjective neural activity and not, say, from socialisation. Why should a self--model be needed in the first place? Metzinger's response is to say that there is good evidence 'for some kind of innate 'body prototype'' (298), and he refers to research that shows that even children born without limbs develop self--models which sometimes include limbs, or report phantom sensations in limbs that have never existed. To me, this still leaves open the possibility that such children are modelling their body image on strong identification with human others. But be that as it may, one of the things that remains unclear after this relatively rich account of contemporary or scientific phenomenology is the extent to which 'neural consciousness' is or can be supplemented by other kinds of consciousness, or indeed whether neural consciousness can be overridden by the 'self' acting on the basis of these other kinds of consciousness. The key stake in Metzinger's account is 'subjectivity'. The reason why the neural correlate of self--consciousness is so important to him is: 'Only if we find the neural and functional correlates of the phenomenal self will we be able to discover a more general theoretical framework into which all data can fit. Only then will we have a chance to understand what we are actually talking about when we say that phenomenal experience is a subjective phenomenon' (301). What other kinds of consciousness might there be? It is significant that, not only do NCC exponents have little to say about the interaction with other people, they rarely mention language, and they are unanimously and emphatically of the opinion that the thinking or processing that takes place in consciousness is not dependent on language, or indeed any signifying system that we know of (though conceivably, it occurs to me, the neural correlates may signify to, or 'call up', each other). And they show little 'consciousness' that a still influential body of opinion (informed latterly by post--structuralist thinking) has argued for the consciousness shaping effects of 'discourse' -- i.e. for socially and culturally generated patterns of language or other signification to order the processing of reality. We could usefully coin the term 'verbal correlates of consciousness' (VCC) to refer to these patterns of signification (words, proverbs, narratives, discourses). 
Again, however, the same sorts of questions apply, since few discourse theorists mention anything like neuroscience: To what extent is verbal consciousness supplemented by other forms of consciousness, including neural consciousness? These questions may never be fully answerable. However, it is interesting to work through the idea that NCC and VCC both exist and can be in some kind of relation even if the precise relationship is not measurable. This indeed is close to the case that Charles Shepherdson makes for psychoanalysis in attempting to retrieve it from the misunderstanding under which it suffers today: We are now familiar with debates between those who seek to demonstrate the biological foundations of consciousness and sexuality, and those who argue for the cultural construction of subjectivity, insisting that human life has no automatically natural form, but is always decisively shaped by contingent historical conditions. No theoretical alternative is more widely publicised than this, or more heavily invested today. And yet, this very debate, in which 'nature' and 'culture' are opposed to one another, amounts to a distortion of psychoanalysis, an interpretive framework that not only obscures its basic concepts, but erodes the very field of psychoanalysis as a theoretically distinct formation (2--3). There is not room here for an adequate account of Shepherdson's recuperation of psychoanalytic categories. A glimpse of the stakes involved is provided by Shepherdson's account, following Eugenie Lemoine--Luccione, of anorexia, which neither biomedical knowledge nor social constructionism can adequately explain. The further fact that anorexia is more common among women of the same family than in the general population, and among women rather than men, but in neither case exclusively so, thereby tending to rule out a genetic factor, allows Shepherdson to argue: [A]norexia can be understood in terms of the mother--daughter relation: it is thus a symbolic inheritance, a particular relation to the 'symbolic order', that is transmitted from one generation to another . . . we may add that this relation to the 'symbolic order' [which in psychoanalytic theory is not coextensive with language] is bound up with the symbolisation of sexual difference. One begins to see from this that the term 'sexual difference' is not used biologically, but also that it does not refer to general social representations of 'gender,' since it concerns a more particular formation of the 'subject' (12). An intriguing, and related, possibility, suggested by Foucault, is that NCC and VCC (or in Foucault's terms the 'visible' and the 'articulable'), operate independently of each other – that there is a 'disjunction' (Deleuze 64) or 'dislocation' (Shepherdson 166) between them that prevents any dialectical relation. Clearly, for Foucault, the lack of dialectical relation between the two modes does not mean that both are not at all times equally functional. But one can certainly speculate that, increasingly under postmodernity and media saturation, the verbal (i.e. the domain of signification in general) is influential. And if linguistic formations -- discourses, narratives, etc. -- can proliferate and feed on each other unconstrained by other aspects of reality, we get the sense of language 'running away with itself' and, at least for a time, becoming divorced from a more complete sense of reality. (This of course is basically the argument of Baudrillard.) 
The reverse may also be possible, in certain periods, although the idea that language could have no mediating effect at all on the production of reality (just inconsequential fluff on the surface of things) seems far--fetched in the wake of so much postmodern and media theory. However, the notion is consistent with the theories of hard--line materialists and genetic determinists. But we should at least consider the possibility that some sort of shaping interaction between NCC and VCC, without implicating the full conceptual apparatus of psychoanalysis, is continuously occurring. This possibility is, for me, best realised by Jacques Derrida when he writes of an irreducible interweaving of consciousness and language (the latter for Derrida being a cover term for any system of signification). This interweaving is such that the significatory superstructure 'reacts upon' the 'substratum of non--expressive acts and contents', and the name for this interweaving is 'text' (Mowitt 98). A further possibility is that provided by Pierre Bourdieu's notion of habitus -- the socially inherited schemes of perception and selection, imparted by language and example, which operate for the most part below the level of consciousness but are available to conscious reflection by any individual lucky enough to learn how to recognise that possibility. If the subjective representations of NCC exist, this habitus can be at best only partial; something denied by Bourdieu whose theory of individual agency is founded in what he has referred to as 'the relation between two states of the social' – i.e. 'between history objectified in things, in the form of institutions, and history incarnate in the body, in the form of that system of durable dispositions I call habitus' (190). At the same time, much of Bourdieu's thinking about the habitus seems as though it could be consistent with the kind of predictable representations that might be produced by NCC. For example, there are the simple oppositions that structure much perception in Bourdieu's account. These range from the obvious phenomenological ones (dark/light; bright/dull; male/female; hard/soft, etc.) through to the more abstract, often analogical or metaphorical ones, such as those produced by teachers when assessing their students (bright/dull again; elegant/clumsy, etc.). It seems possible that NCC could provide the mechanism or realisation for the representation, storage, and reactivation of impressions constituting a social model--self. However, an entirely different possibility remains to be considered – which perhaps Bourdieu is also getting at – involving a radical rejection of both NCC and VCC. Any correlational or representational theory of the relationship between a self and his/her environment -- which, according to Charles Altieri, includes the anti--logocentrism of Derrida -- assumes that the primary focus for any consciousness is the mapping and remapping of this relationship rather than the actions and purposes of the agent in question. Referring to the later philosophy of Wittgenstein, Altieri argues: 'Conciousness is essentially not a way of relating to objects but of relating to actions we learn to perform . . . We do not usually think about objects, but about the specific form of activity which involves us with these objects at this time' (233). Clearly, there is not yet any certainty in the arguments provided by neuroscience that neural activity performs a representational role. 
Is it not, then, possible that this activity, rather than being a 'correlate' of entities, is an accompaniment to, a registration of, action that the rest of the body is performing? In this view, self is an enactment, an expression (including but not restricted to language), and what self--consciousness is conscious of is this activity of the self, not the self as entity. In a way that again returns us towards Bakhtin, Altieri writes: 'From an analytical perspective, it seems likely that our normal ways of acting in the world provide all the criteria we need for a sense of identity. As Sidney Shoemaker has shown, the most important source of the sense of our identity is the way we use the spatio--temporal location of our body to make physical distinctions between here and there, in front and behind, and so on' (234). Reasonably consistent with the Wittgensteinian view -- in its focus on self--activity -- is that contemporary theorisation of the self that compares in influence with that posed by neuroscience. This is the self avowedly constructed by networked computer technology, as described by Mark Poster: [W]hat has occurred in the advanced industrial societies with increasing rapidity . . . is the dissemination of technologies of symbolisation, or language machines, a process that may be described as the electronic textualisation of daily life, and the concomitant transformations of agency, transformations of the constitution of individuals as fixed identities (autonomous, self--regulating, independent) into subjects that are multiple, diffuse, fragmentary. The old (modern) agent worked with machines on natural materials to form commodities, lived near other workers and kin in urban communities, walked to work or traveled by public transport, and read newspapers but engaged as a communicator mostly in face--to--face relations. The new (postmodern) agent works mostly on symbols using computers, lives in isolation from other workers and kin, travels to work by car, and receives news and entertainment from television. . . . Individuals who have this experience do not stand outside the world of objects, observing, exercising rational faculties and maintaining a stable character. The individuals constituted by the new modes of information are immersed and dispersed in textualised practices where grounds are less important than moves. (44--45) Interestingly, Metzinger's theorisation of the model--self lends itself to the self--mutability -- though not the diffusion -- favoured by postmodernists like Poster. [I]t is . . . well conceivable that a system generates a number of different self--models which are functionally incompatible, and therefore modularised. They nevertheless could be internally coherent, each endowed with its own characteristic phenomenal content and behavioral profile. . . this does not have to be a pathological situation. Operating under different self--models in different situational contexts may be biologically as well as socially adaptive. Don't we all to some extent use multiple personalities to cope efficiently with different parts of our lives? (295--6) Poster's proposition is consistent with that of many in the humanities and social sciences today, influenced variously by postmodernism and social constructionism. 
What I believe remains at issue about his account is that it exchanges one form of externally constituted self ('fixed identity') for another (that produced by the 'modes of information'), and therefore remains locked in a logic of deterministic constitution. (There is a parallel here with Altieri's point about Derrida's inability to escape representation.) Furthermore, theorists like Poster may be too quickly generalising from the experience of adults in 'textualised environments'. Until such time as human beings are born directly into virtual reality environments, each will, for a formative period of time, experience the world in the way described by Bakhtin – through 'a unified perception of bodily and personal being . . . characterised . . . as a loving gift mutually exchanged between self and other across the borderzone of their two consciousnesses' (cited in Morris 6). I suggest it is very unlikely that this emergent sense of being can ever be completely erased even when people subsequently encounter each other in electronic networked environments. It is clearly not the role of a brief survey like this to attempt any resolution of these matters. Indeed, my review has made all the more apparent how far from being settled the question of consciousness, and by extension the question of selfhood, remains. Even the classical notion of the homunculus (the 'little inner man' or the 'ghost in the machine') has been put back into play with Francis Crick and Christof Koch's (2000) neurobiological conception of the 'unconscious homunculus'. The balance of contemporary evidence and argument suggests that the best thing to do right now is to keep the questions open against any form of reductionism – whether social or biological. One way to do this is to explore the notions of self and consciousness as implicated in ongoing processes of complex co--adaptation between biology and culture -- or their individual level equivalents, brain and mind (Taylor Ch. 7). References Altieri, C. "Wittgenstein on Consciousness and Language: a Challenge to Derridean Literary Theory." Wittgenstein, Theory and the Arts. Ed. Richard Allen and Malcolm Turvey. New York: Routledge, 2001. Bourdieu, P. In Other Words: Essays Towards a Reflexive Sociology. Trans. Matthew Adamson. Stanford: Stanford University Press, 1990. Crick, F. and Koch, C. "The Unconscious Homunculus." Neural Correlates of Consciousness: Empirical and Conceptual Questions. Ed. Thomas Metzinger. Cambridge, Mass.: MIT Press, 2000. Deleuze, G. Foucault. Trans. Sean Hand. Minneapolis: University of Minnesota Press, 1988. Metzinger, T. "The Subjectivity of Subjective Experience: A Representationalist Analysis of the First-Person Perspective." Neural Correlates of Consciousness: Empirical and Conceptual Questions. Ed. Thomas Metzinger. Cambridge, Mass.: MIT Press, 2000. Morris, P. (ed.). The Bakhtin Reader: Selected Writings of Bakhtin, Medvedev, Voloshinov. London: Edward Arnold, 1994. Mowitt, J. Text: The Genealogy of an Interdisciplinary Object. Durham: Duke University Press, 1992. Poster, M. Cultural History and Modernity: Disciplinary Readings and Challenges. New York: Columbia University Press, 1997. Shepherdson, C. Vital Signs: Nature, Culture, Psychoanalysis. New York: Routledge, 2000. Taylor, M. C. The Moment of Complexity: Emerging Network Culture. Chicago: University of Chicago Press, 2001. Citation reference for this article Substitute your date of access for Dn Month Year etc... MLA Style Wallace, Derek. 
"'Self' and the Problem of Consciousness" M/C: A Journal of Media and Culture 5.5 (2002). [your date of access] < http://www.media-culture.org.au/mc/0210/Wallace.html &gt. Chicago Style Wallace, Derek, "'Self' and the Problem of Consciousness" M/C: A Journal of Media and Culture 5, no. 5 (2002), < http://www.media-culture.org.au/mc/0210/Wallace.html &gt ([your date of access]). APA Style Wallace, Derek. (2002) 'Self' and the Problem of Consciousness. M/C: A Journal of Media and Culture 5(5). < http://www.media-culture.org.au/mc/0210/Wallace.html &gt ([your date of access]).
APA, Harvard, Vancouver, ISO, and other styles
