Journal articles on the topic 'Court media interface'

Consult the top 40 journal articles for your research on the topic 'Court media interface.'


1

Mehmood, Arshad, Tayyaba Bashir, Khan Fida Hussain Khan, and Shamim Ali. "Power Struggle Between Supreme Court and the Government: Ideological Role of Pakistani Print Media in Representation of Swiss Letter Issue." International Journal of English Linguistics 9, no. 4 (July 3, 2019): 163. http://dx.doi.org/10.5539/ijel.v9n4p163.

Abstract:
Newspaper headlines constitute an essential part of media discourse, which is an important field of research in Discourse and Communication Studies. In particular, certain features of news headlines, and their role in directing readers' attention, have made the interface between the linguistic analysis of newspaper headlines and the opinion-building of the readership an important object of study. This study explores the ideological role of print media in the representation of the Swiss Letter Issue, which resulted in the Supreme Court's removal of an elected prime minister of Pakistan while the next prime minister of the same political party faced the same challenge. Three widely distributed English newspapers (The News International, DAWN, and The Nation) were selected using a purposive sampling technique. The designated time range, 1 July 2012 to 31 December 2012, was a very significant pre-election period in Pakistan. A total of 319 headlines covering the issue were found in the selected newspapers, and the data were then sampled through a simple random sampling technique. The data were analysed using Fairclough's three-dimensional model of critical discourse analysis, together with simple statistical analysis. The findings indicate that the print media of Pakistan used manipulative strategies in the construction of headlines on the Swiss Letter Issue and represented the issue in a biased manner.
2

van Binsbergen, Wim. "Kazanga – Ethnicité en Afrique Entre État et Tradition." Afrika Focus 9, no. 1-2 (February 2, 1993): 17–41. http://dx.doi.org/10.1163/2031356x-0090102003.

Abstract:
Kazanga – Ethnicity in Africa Between State and Tradition. The production of cultural forms at the interface between a rural-based tradition and the state is a familiar aspect of ethnicity in contemporary Africa. This paper seeks to identify some of the characteristics of this process, whose products are too often misunderstood, and cherished, as ‘authentic’ forms of ‘tradition’. Highlighting the role of ethnic brokers, of the modern mass media, and of a model of commoditified ‘performance’ as an aspect of contemporary electronic mass culture, the argument explores the production of expressive culture in the context of the Kazanga cultural association and its Kazanga annual festival among the Nkoya people of central western Zambia since the early 1980s, against the background of Nkoya ethnicity and Nkoya expressive and court culture since the 19th century.
3

Keskinen, Mikko. "Book and Radio Play Silences: Medial Pauses and Reticence in ‘Murke's Collected Silences’ by Heinrich Böll." CounterText 5, no. 3 (December 2019): 352–70. http://dx.doi.org/10.3366/count.2019.0170.

Abstract:
This article analyses silence at the interface between print and audio media by reading and listening to Heinrich Böll's short story ‘Murke's Collected Silences’ (‘Doktor Murkes gesammeltes Schweigen’) in its book (1958) and three German-language radio play versions (1965; 1986; 1989). Reference is also made to Benjamin Gwilliam's sound art piece (2007) based on the 1986 adaptation. The Böll story thematises silence and media in various ways, and has definite countertextual aspects, in the sense of technology, textuality, and materiality of language. In the printed story, silence is either verbally named or typographically indicated, whereas the radio plays present or perform it. The comparison of the three silence-related scenes in the Murke radio plays shows considerable variation in the length and manner of pauses. The article considers the differences in receiving silence through print and audio media, and concludes that ‘Murke’ demonstrates, in both formats, that the medium is an integral part of the ‘message’, even the silent one.
4

Nakamura, Norikazu, Hiroshi Chiba, and Shoichi Miyahara. "P-HDI-11 Feasibility Study of Ultra-Low Particle Count Media Overcoat Deposited by FCA Method for 2 Tbpsi HDDs(Head/Disk Interface and Tribology,Technical Program of Poster Session)." Proceedings of JSME-IIP/ASME-ISPS Joint Conference on Micromechatronics for Information and Precision Equipment : IIP/ISPS joint MIPE 2009 (2009): 369–70. http://dx.doi.org/10.1299/jsmemipe.2009.369.

5

MN, Angga Septian, and Dian Megasari. "Analisis Sistem Human Interface (HMI) pada Kompetensi Programmable Logic Controller (PLC)." Jurnal Informatika Universitas Pamulang 5, no. 3 (September 30, 2020): 328. http://dx.doi.org/10.32493/informatika.v5i3.6858.

Abstract:
This analysis aims to determine the difference between Human Machine Interface (HMI)-based learning media and conventional learning media in influencing student competence in operating a Programmable Logic Controller (PLC). The method used was a true experimental design with a posttest-only control group. The experimental group was given learning treatment assisted by HMI Omron with the CX-Designer software, while the control group was given treatment with conventional learning media. The treatment effect was analyzed using a difference test (Mann-Whitney). The results showed that the effect of HMI-based learning media on PLC competence is as follows: (1) 61% of competence in the cognitive domain is in the very good category, 50% in the affective domain is in the very good category, and 50% in the psychomotor domain is in the very good category; and (2) there are competency differences between HMI-based learning media and conventional learning. This is evidenced by computed significance values of 0.000 in the cognitive domain, 0.000 in the affective domain, and 0.001 in the psychomotor domain, each smaller than the study's significance level of 0.05, after the treatment was given.
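For readers unfamiliar with the test named in this abstract, here is a minimal sketch of a Mann-Whitney comparison of two independent groups in Python with SciPy; the score arrays are illustrative placeholders, not the study's data.

```python
from scipy.stats import mannwhitneyu

# Hypothetical posttest scores (placeholders, not the study's data)
hmi_group = [85, 90, 78, 92, 88, 81, 95, 79]
conventional_group = [70, 74, 68, 77, 72, 65, 80, 71]

# Two-sided test of whether the two groups differ in location
u_stat, p_value = mannwhitneyu(hmi_group, conventional_group, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")  # p < 0.05 would indicate a group difference
```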
6

Feroz Khan, Gohar, and Sokha Vong. "Virality over YouTube: an empirical analysis." Internet Research 24, no. 5 (September 30, 2014): 629–47. http://dx.doi.org/10.1108/intr-05-2013-0085.

Abstract:
Purpose – The purpose of this paper is to seek reasons for some videos going viral over YouTube (a type of social media platform). Design/methodology/approach – Using the YouTube APIs (Application Programming Interface) and the Webometric Analyst tool, the authors collected data on about 100 all-time-most-viewed YouTube videos and information about the users associated with the videos. The authors constructed and tested an empirical model to understand the relationship among users’ social and non-social capital (e.g. User Age, Gender, View Count, Subscriber, Join Date, Total Videos Posted), video characteristics (Post Date, Duration, and Video Category), external network capital (in-links and hit counts), and Virality (Likes, Dislikes, Favorite Count, View Count, and Comment Count). Partial least squares and Webometric analysis were used to explore the association among the constructs. Findings – Among other findings, the results showed that the popularity of the videos was not only a function of the YouTube system per se, but that network dynamics (e.g. in-links and hit counts) and offline social capital (e.g. fan base and fame) play crucial roles in the viral phenomenon, particularly for view count. Originality/value – The authors for the first time constructed and tested an empirical model to find out the determinants of the viral phenomenon over YouTube.
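As an illustration of the kind of data collection this abstract describes, here is a minimal sketch using today's YouTube Data API v3 to pull the public statistics such a model draws on; the API key and video ID are placeholders, and the original study used the 2013-era API.

```python
from googleapiclient.discovery import build  # pip install google-api-python-client

API_KEY = "YOUR_API_KEY"  # placeholder
youtube = build("youtube", "v3", developerKey=API_KEY)

# Fetch public statistics for one video (the ID is a placeholder)
resp = youtube.videos().list(part="snippet,statistics", id="VIDEO_ID").execute()
item = resp["items"][0]
stats = item["statistics"]
# Note: dislike counts are no longer exposed by the current API
print(item["snippet"]["title"],
      stats.get("viewCount"), stats.get("likeCount"), stats.get("commentCount"))
```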
7

Ligon, John, Woonyoung Choi, Gady Cojocaru, Wei Fu, Emily Hsiue, Teniola Oke, Carol Morris, et al. "506 The tumor immune microenvironment of metastatic osteosarcoma is marked by lymphocyte exclusion and impacts patient progression-free survival." Journal for ImmunoTherapy of Cancer 8, Suppl 3 (November 2020): A541—A542. http://dx.doi.org/10.1136/jitc-2020-sitc2020.0506.

Abstract:
Background: Patients with relapsed metastatic osteosarcoma have no effective treatments available to them [1], and immunotherapy thus far has not succeeded in improving outcomes [2–5]. We aim to understand the immune architecture of the tumor microenvironment (TME) of osteosarcoma, with the goal of harnessing the immune system as a major therapeutic strategy for the treatment of patients with osteosarcoma.

Methods: 66 osteosarcoma tissue specimens were stained and analyzed by immunohistochemistry. Tumor-infiltrating lymphocytes (TILs) from 25 specimens were profiled by functional multiparameter flow cytometry (MFC). Distinct regions from 16 pulmonary metastases (PMs) were microdissected, and RNA was extracted to perform comparative transcriptomic studies. Clinical follow-up (median 24 months) was available from resection.

Results: Digital image analysis of immunohistochemistry demonstrated significantly higher infiltrating immune cells in the PMs compared to primary bone tumors, concentrated at the tumor-normal lung ‘PM interface’ region, and elevated expression of multiple immune checkpoint molecules at the PM interface (figure 1). MFC confirmed the increased expression of the immune checkpoint molecules programmed cell death 1 (PD-1, p<0.01) and lymphocyte activation gene 3 (LAG-3, p<0.01), as well as the activation marker IFN-γ (p<0.05) in CD8+ TILs. Gene expression profiling provided further evidence for the presence of TILs with expression of activation markers and inhibitory immune checkpoint molecules at the PM interface compared to the PM interior (figure 2). A strong M2 macrophage signature was present in both regions. Further analysis revealed that genes related to neutrophil and myeloid cell chemotaxis and known to be associated with polymorphonuclear myeloid-derived suppressor cells were highly expressed at the PM interface, along with genes for multiple subsets of dendritic cells (figure 3). Expression of PD-L1, LAG-3, and CSF1R at the PM interface was associated with worse progression-free survival (PFS), while gene sets associated with productive T cell immune response were associated with improved PFS (figure 4).

Abstract 506 Figure 1. Immunohistochemistry of osteosarcoma pulmonary metastases. A. H&E with demarcation of the tumor-normal lung interface (center green line) and the area quantified as the ‘PM interface’ (outer green lines). Pulmonary metastases demonstrate a higher concentration of immune cells (CD3 p<0.001, CD8 p<0.001, CD163 p<0.01) and PD-1 (p<0.001)/PD-L1 (p<0.05) at the PM interface. B. H&E with demarcation of the PM interface as above. Pulmonary metastases demonstrate increased staining of TIM-3 (p<0.01), LAG-3 (p<0.01), and IDO1 (p<0.0001) at the PM interface (no significant concentration of CSF1R at the PM interface).

Abstract 506 Figure 2. Activated/exhausted lymphocyte signatures at the PM interface. A. Heatmap displaying significant genes that contribute to the leading edge of the core enrichment subset via Gene Set Enrichment Analysis (GSEA), demonstrating higher expression of immune regulatory molecules at the PM interface compared to the PM interior. Expression levels were converted into heatmaps, and colors quantitatively correspond to fold changes. FDR = GSEA false-discovery rate q-value. B. Heatmap illustrating coefficients of xCell analysis shows higher expression of markers of cytotoxicity and activation, as well as multiple checkpoint molecules, at the PM interface, with evidence that they are being contributed chiefly by T cells. Intensity represents the xCell coefficient, which corresponds to the amount that a particular region (PM interior or PM interface) or cell population (T cells, B cells, or myeloid cells) contributes to the expression of a specific gene.

Abstract 506 Figure 3. Genes related to dendritic cells and MDSCs at the PM interface. A. By GSEA, genes associated with multiple subclasses of antigen-presenting dendritic cells are significantly upregulated at the PM interface (cDC1 = conventional type 1 dendritic cell; cDC2 = conventional type 2 dendritic cell; pDC = plasmacytoid dendritic cell; moDC = monocyte-derived dendritic cell). FDR = GSEA false-discovery rate q-value. B. Heatmap shows heightened expression of cytokine, chemokine, and endothelin transcripts associated with the development, recruitment, and maintenance of PMNs and granulocytic MDSCs at the PM interface compared to the PM interior.

Abstract 506 Figure 4. Markers of the immune TME at the PM interface correlate with PFS. A. Hazard ratios for immunohistochemistry markers at the PM interface as they relate to PFS. For absolute count biomarkers (CD3, CD8, Foxp3, PD-1, CD163, and LAG-3) the unit is per 100 cells, and for percentage biomarkers (PD-L1, CSF1R, TIM-3, and IDO1) the unit is per 1%. B. Hazard ratios for gene sets at the PM interface as they relate to PFS. NS = p>0.05, *p<0.05, **p<0.01, ***p<0.001, ****p<0.0001.

Conclusions: In contrast to primary bone osteosarcoma ‘immune deserts,’ osteosarcoma PMs represent an ‘immune-excluded’ TME where immune cells are present but are halted at the PM interface. TILs can produce effector cytokines, suggesting their capability of activation and recognition of tumor antigens. Our findings suggest cooperative immunosuppressive mechanisms in osteosarcoma PMs that prevent TILs from penetrating into the PM interior, including immune checkpoint molecule expression and the presence of immunosuppressive myeloid cells. We identify cellular and molecular signatures that are associated with the PFS of patients, which could potentially be manipulated for successful immunotherapy.

Ethics Approval: This study was approved by Johns Hopkins University’s Ethics Board, approval number FWA00005752.

References:
1. Mirabello L, Troisi RJ, Savage SA. Osteosarcoma incidence and survival rates from 1973 to 2004: Data from the surveillance, epidemiology, and end results program. Cancer 2009;115(7):1531–43.
2. Tawbi HA, Burgess M, Bolejack V, Van Tine BA, Schuetze SM, Hu J, et al. Pembrolizumab in advanced soft-tissue sarcoma and bone sarcoma (SARC028): A multicentre, two-cohort, single-arm, open-label, phase 2 trial. Lancet Oncol 2017;18(11):1493–501.
3. Davis KL, Fox E, Merchant MS, Reid JM, Kudgus RA, Liu X, et al. Nivolumab in children and young adults with relapsed or refractory solid tumours or lymphoma (ADVL1412): A multicentre, open-label, single-arm, phase 1–2 trial. Lancet Oncol 2020;21(4):541–50.
4. D’Angelo SP, Mahoney MR, Van Tine BA, Atkins J, Milhem MM, Jahagirdar BN, et al. Nivolumab with or without ipilimumab treatment for metastatic sarcoma (Alliance A091401): Two open-label, non-comparative, randomised, phase 2 trials. Lancet Oncol 2018;19(3):416–26.
5. Paoluzzi L, Cacavio A, Ghesani M, Karambelkar A, Rapkiewicz A, Weber J, et al. Response to anti-PD1 therapy with nivolumab in metastatic sarcomas. Clin Sarcoma Res 2016;6:24.
8

Tang, Wenyu, Hang Lin, Yifan Chen, Jingjing Feng, and Huihua Hu. "Mechanical Characteristics and Acoustic Emission Characteristics of Mortar-Rock Binary Medium." Buildings 12, no. 5 (May 17, 2022): 665. http://dx.doi.org/10.3390/buildings12050665.

Abstract:
The stability of the interface between mortar and rock is very important in engineering construction. In this paper, an all-digital acoustic emission (AE) system is used to monitor direct shear tests of the mortar-rock binary medium interface with different sawtooth angles under different normal stress states. The stress-displacement information and the AE signal during the whole shearing process are extracted, and the coupling relationship between stress and AE characteristic parameters is discussed. The quantitative relationship between sawtooth angle and shear strength of the binary medium is established, and three AE characteristic parameters that can be used to predict structural instability are proposed. The research shows that the shear strength of the mortar-rock binary medium increases with increasing normal stress and sawtooth angle; this relationship is obtained by least-squares fitting. The shear stress-displacement curve is divided into five stages according to the change of the deformation law. Analysis of the AE characteristic parameters shows that increasing the sawtooth angle increases the AE count and the AE cumulative count. Based on the analysis of the RA-AF characteristic parameters, the evolution of shear cracks and tensile cracks over the whole shearing process was obtained. During binary medium shearing, the AE peak frequency lies in the range of 120–340 kHz. Three acoustic emission parameters that can predict the macroscopic damage of the binary medium are obtained: the AE b value, the ratio of shear crack signals, and the number of signals with a peak frequency of 220 kHz to 320 kHz.
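The abstract names the AE b value as one damage precursor but does not give the estimator; the sketch below uses a common maximum-likelihood (Aki) formulation adapted to AE amplitudes in decibels. This is an assumption about the general technique, not a transcription of the authors' method.

```python
import numpy as np

def ae_b_value(amplitudes_db, completeness_db):
    """Maximum-likelihood (Aki) b-value, treating AE amplitude (dB) / 20 as magnitude."""
    a = np.asarray(amplitudes_db, dtype=float)
    m = a[a >= completeness_db] / 20.0          # convert dB amplitudes to magnitudes
    m_min = completeness_db / 20.0
    return np.log10(np.e) / (m.mean() - m_min)  # b = log10(e) / (mean(M) - Mmin)
```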
9

Tosello, María Elena, and María Georgina Bredanini. "A personal space in the Web. Bases, processes and evaluation of a collaborative digital design experience for significant learning." International Journal of Architectural Computing 15, no. 3 (September 2017): 230–45. http://dx.doi.org/10.1177/1478077117735216.

Abstract:
We live constantly networked, performing multiple activities in virtual spaces which are intertwined with physical space, shaping an augmented and symbiotic chronotope. Considering that personal space is an area surrounding individuals that provides a framework for developing activities, wouldn’t it be necessary to count on a virtual personal space? This article presents the bases, processes, and results of a didactic experience whose purpose was to imagine and design a personal space on the Web, representing its properties and characteristics through a transmedia narrative unfolded across diverse languages and media. Three cases are presented, selected because they propose different strategies to approach the problem. In order to perform a comparative analysis of the results, categories were defined based on the triadic structure of Peirce’s Theory of Signs; these in turn were divided into sub-categories that incorporate the Principles of Design and Evaluation of Interface-Spaces.
10

Lee, Jong-Chan, Brian J. Lee, Changhee Park, Hyunjoo Song, Chan-Young Ock, Hyojae Sung, Sungjin Woo, et al. "Efficacy improvement in searching MEDLINE database using a novel PubMed visual analytic system: EEEvis." PLOS ONE 18, no. 2 (February 9, 2023): e0281422. http://dx.doi.org/10.1371/journal.pone.0281422.

Abstract:
PubMed is the most extensively used database and search engine in the biomedical and healthcare fields. However, users can experience difficulty acquiring their target papers when facing massive numbers of search results, especially in unfamiliar fields. We therefore developed a novel user interface for PubMed and conducted a study in three steps: step A, a preliminary user survey with 76 medical experts regarding the current usability of PubMed for biomedical literature search tasks; step B, implementation of EEEvis, a novel interactive visual analytic system for the search task; and step C, a randomized user study comparing PubMed and EEEvis. First, we conducted a Google survey of 76 medical experts regarding the unmet needs of PubMed and the user requirements for a novel search interface. Based on the data of this preliminary survey, we implemented a novel interactive visual analytic system for biomedical literature search. EEEvis provides enhanced literature data analysis functions, including (1) an overview of bibliographic features including publication date, citation count, and impact factors, (2) an overview of the co-authorship network, and (3) interactive sorting, filtering, and highlighting. In the randomized user study of 24 medical experts, the search speed of EEEvis was not inferior to PubMed in the time to reach the first article (median difference 3 sec, 95% CI -2.1 to 8.5, P = 0.535) nor in the search completion time (median difference 8 sec, 95% CI -4.7 to 19.1, P = 0.771). However, 22 participants (91.7%) responded that they would be willing to use EEEvis as their first choice for a biomedical literature search task, and 21 participants (87.5%) named the bibliographic sorting and filtering functionalities of EEEvis as a major advantage. EEEvis could be a supplementary interface for PubMed that can enhance the user experience in the search for biomedical literature.
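The article does not publish EEEvis's code; as context for what such an interface retrieves, here is a minimal sketch of programmatic MEDLINE access through NCBI's public E-utilities (the query string is a placeholder).

```python
import requests

BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

# Step 1: search PubMed for matching article IDs (the query is a placeholder)
search = requests.get(f"{BASE}/esearch.fcgi", params={
    "db": "pubmed", "term": "court media interface", "retmax": 20, "retmode": "json",
}).json()
pmids = search["esearchresult"]["idlist"]

# Step 2: fetch summaries (titles, journals, dates) for those IDs
summary = requests.get(f"{BASE}/esummary.fcgi", params={
    "db": "pubmed", "id": ",".join(pmids), "retmode": "json",
}).json()
for pmid in pmids:
    print(pmid, summary["result"][pmid]["title"])
```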
11

Millar, Richard C. "Integrated Instrumentation and Sensor Systems Enabling Condition-Based Maintenance of Aerospace Equipment." International Journal of Aerospace Engineering 2012 (2012): 1–6. http://dx.doi.org/10.1155/2012/804747.

Abstract:
The objective of the work reported herein was to use a systems engineering approach to guide the development of integrated instrumentation/sensor systems (IISS) incorporating communications, interconnections, and signal acquisition. These require enhanced suitability and effectiveness for diagnostics and health management of aerospace equipment governed by the principles of condition-based maintenance (CBM). It is concluded that the systems engineering approach to IISS definition provided clear benefits in identifying overall system requirements and an architectural framework for categorizing and evaluating alternative architectures, relative to a bottom-up focus on sensor technology that is blind to system-level user needs. The CBM IISS imperatives identified include tolerance of the bulk of aerospace equipment operational environments, low intrusiveness, rapid reconfiguration, and affordable life cycle costs. The functional features identified include interrogation of the variety of sensor types and interfaces common in aerospace equipment applications over multiplexed communication media, with flexibility to allow rapid system reconfiguration to adapt to evolving sensor needs. This implies standardized interfaces at the sensor location (preferably to open standards) and a reduced wire/connector pin count in harnesses (or their elimination through the use of wireless communications).
12

Elkaim, Lior M., Farbod Niazi, Jordan J. Levett, Rakan Bokhari, Carolina Gorodetsky, Sara Breitbart, Fahad Alotaibi, et al. "Deep brain stimulation in children and youth: perspectives of patients and caregivers gleaned through Twitter." Neurosurgical Focus 53, no. 4 (October 2022): E11. http://dx.doi.org/10.3171/2022.7.focus22276.

Abstract:
OBJECTIVE This study aims to glean patient and caregiver perspectives surrounding deep brain stimulation (DBS) in children and youth through an analysis of patterns of social media usage. METHODS The authors performed a comprehensive search of the Twitter Application Programming Interface (API) database for all tweets about DBS use in children and youth, with no date restriction. Data pertaining to each tweet were extracted for analysis. Results were analyzed using qualitative and quantitative methodologies. These included thematic analysis of tweets, accounts, and descriptive statistics. Sentiment analysis of extracted tweets was also performed. A multivariable regression model was used to identify predictors of higher engagement metrics (likes, retweets, and quotes). RESULTS A comprehensive search of the Twitter database yielded 877 tweets from 816 unique accounts meeting study inclusion criteria. Most tweets were from patients or caregivers, researchers, or news media outlets. The most common themes among analyzed tweets were research discussing novel findings (45.2%) or personal experiences of patients or caregivers (27.4%). Sentiment analysis showed that 54.5% of tweets were positive, 35.1% were neutral, and 10.4% were negative. The presence of pictures or videos increased the tweet engagement count by an average of 10.5 (95% CI 7.3–13.6). Tweets about personal patient experiences (β = 6, 95% CI 0.95–12) and tweets tagging other accounts (β = 3.2, 95% CI 0.63–5.8) were also significantly associated with higher engagement metrics. CONCLUSIONS The current study is the first to assess patient and caregiver perspectives surrounding pediatric DBS through a comprehensive analysis of social media usage. Given the nascent field, social media presents an opportunity to share experiences and promote patient and healthcare professional education surrounding pediatric DBS.
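As an illustration of the retrieval step this abstract describes, here is a minimal sketch against the Twitter (now X) API v2 search endpoint; the bearer token, query, and access tier are placeholders and assumptions, and the sentiment-analysis step is left to any off-the-shelf classifier.

```python
import requests

BEARER_TOKEN = "YOUR_BEARER_TOKEN"  # placeholder

# Full-archive search requires elevated/academic access; "recent" search is the open tier.
resp = requests.get(
    "https://api.twitter.com/2/tweets/search/all",
    headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
    params={
        "query": '"deep brain stimulation" (child OR children OR pediatric) -is:retweet',
        "tweet.fields": "created_at,public_metrics",
        "max_results": 100,
    },
)
for tweet in resp.json().get("data", []):
    m = tweet["public_metrics"]  # like_count, retweet_count, quote_count, reply_count
    print(tweet["created_at"], m["like_count"], m["retweet_count"], tweet["text"][:80])
```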
13

Koontz, Bridget F., Erica Levine, Frances McSherry, Tykeytra Dale, Martin Streicher, Junzo P. Chino, Christopher Ryan Kelsey, et al. "Text messaging and activity tracker motivation program to increase physical activity in cancer survivors." Journal of Clinical Oncology 36, no. 7_suppl (March 1, 2018): 92. http://dx.doi.org/10.1200/jco.2018.36.7_suppl.92.

Abstract:
92 Background: Cancer survivors have high rates of sedentary behavior, leading to obesity and cardiovascular disease. Physical activity improves quality of life (QOL) and reduces morbidity and mortality. However, cancer survivors commonly cite motivation as a barrier to increasing physical activity. We hypothesized that a motivational text-messaging feedback system linked to a Fitbit Flex activity tracker would increase the activity level of survivors and those undergoing cancer treatment. Methods: 29 participants were enrolled in an IRB-approved single-institution study. Eligibility allowed any cancer/stage, ≤2 days of exercise per week, life expectancy of 12+ months, and smartphone access. After baseline fitness/QOL testing, participants were provided a Fitbit Flex activity tracker. A text-messaging program automatically uploaded data from the tracker via an application programming interface and provided personalized text-message feedback to the subject’s smartphone daily for 3 months. The primary endpoint was change in step count from baseline to 3 months, with additional endpoints of change in 6-minute walk/QOL measures at 3 months and continued exercise/use of the tracker at 6 months. Results: To date, 24 have completed the 3-month program. Both academic and community sites participated, including areas with limited internet access. Most participants were female (71%) and white (63%). Eight cancer types and all stages were represented. Three participants withdrew: one because of a lost tracker, one because of cancer death, and one who was “disappointed” with tracker function. Median daily steps were 3773 at baseline (IQR 2928) and 4365 at 3 months (IQR 4864). 42% had at least a 20% increase in median step count at 3 months. Improvement was noted in 45% of survivors and 38% of active treatment participants. Participants frequently used research nurses for guidance on use of the wearable tracker (e.g. syncing, charging, features). Conclusions: An activity tracker with personalized daily feedback via text message successfully motivates cancer patients to increase daily activity. Patients are interested in health technology but require technical support and coaching to maintain use. Clinical trial information: NCT02627079.
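A minimal sketch of the kind of daily pull-and-message step such a program performs, using the Fitbit Web API's step time series; the OAuth2 access token, step goal, and message wording are placeholders, not the study's implementation.

```python
import requests

ACCESS_TOKEN = "YOUR_OAUTH2_TOKEN"  # placeholder; the Fitbit Web API uses OAuth 2.0
STEP_GOAL = 5000                    # illustrative goal, not the study's threshold

# Today's step count for the authorized user ("-")
resp = requests.get(
    "https://api.fitbit.com/1/user/-/activities/steps/date/today/1d.json",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
steps = int(resp.json()["activities-steps"][0]["value"])

# Compose a motivational message for an SMS gateway (sending is out of scope here)
if steps >= STEP_GOAL:
    message = f"Great job! You've logged {steps} steps today."
else:
    message = f"You're at {steps} steps. A short walk could get you to {STEP_GOAL}!"
print(message)
```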
14

Ramasubramanian, Melur K., Donald A. Shiffler, and Amit Jayachandran. "A Computational Fluid Dynamics Modeling and Experimental Study of the Mixing Process for Dispersion of Synthetic Fibers in Wet-Lay Forming." March 2010 9, no. 3 (April 1, 2010): 6–13. http://dx.doi.org/10.32964/tj9.3.6.

Abstract:
In this paper, we present results from a computational fluid dynamics (CFD) model for the mixing process used to disperse synthetic fibers in the wet-lay process. We used the CFD software FLUENT, together with the MixSim user interface, to accurately model the impeller geometry. A multiple reference frame (MRF) model and the standard k-ε turbulence model were used to model the problem. After obtaining a converged solution for the mixing tank with water, a discrete phase model was constructed by injecting spherical particles into the flow. A mixing tank with baffles and a centrally located impeller was used in experiments. PET fibers (1.5 denier; 6.35 mm, 12.7 mm, and 38.7 mm) at a concentration of 0.01% were mixed in water for the study. In regions behind the baffles, where the model predicted a higher concentration of particles, experimental results showed a 34% higher concentration relative to the region in the high-turbulence zone near the center. Instantaneous sheets were formed by rapidly dipping and removing a flat wire mesh strainer into the tank at two different locations to assess the state of dispersion in the tank. The sheets were transferred onto blotting paper and examined under a microscope to count defects. Results show that the number of rope defects was 43% higher in sheets drawn from the region behind the baffles than in the sheets drawn from regions near the center of the tank. Changing baffles from a rectangular to a triangular cross section decreased the number of rope defects but increased the number of log defects in the sample sheets at the same location. The CFD model can be used to optimize mixing tank design for wet-lay fiber dispersion. The model provides further insight into the mixing process by predicting the effect of changes in design parameters on dispersion quality.
15

Marmotti, Antongiulio, Castoldi Filippo, Rossi Roberto, Alessia Tron, Francesco Giacalone, Stefano Marenco, Cristina Realmuto, et al. "Pre-Operative Bone Marrow-Derived Cell Mobilization by G-CSF Enhances Osseointegration of Bone Substitute In Patients Undergoing Surgery with High Tibial Valgus Osteotomy." Blood 116, no. 21 (November 19, 2010): 4773. http://dx.doi.org/10.1182/blood.v116.21.4773.4773.

Abstract:
Introduction. Bone substitutes are widely used to improve bone repair in orthopaedic surgical procedures. Osseointegration is a slow process that takes place both at the bone-implant interface and inside the tridimensional structure. The process might benefit from the addition of bone marrow-derived cells (BMC). In order to exploit this possible effect, a study protocol was designed including preoperative BMC mobilization induced by granulocyte colony-stimulating factor (G-CSF). The aim of the study was to verify the feasibility, safety, and efficacy of BMC mobilization in patients undergoing high tibial valgus osteotomy (HTVO). Patients and Methods. Overall, 24 patients undergoing medial open wedge HTVO to treat genu varum were enrolled in a prospective phase II trial. The osteotomy gap was filled by hydroxyapatite and tricalcium phosphate bone graft substitute. Two consecutive cohorts of 12 patients were assigned to receive (GROUP A) or not receive (GROUP B) a daily dose of 10 μg/kg of G-CSF for 3 consecutive days, with an additional dose 4 hours before surgery. BMC mobilization was monitored by white blood cell (WBC) count and flow cytometry analysis of circulating CD34+ cells. All patients underwent a clinical (Lysholm Score and SF-36) and X-ray evaluation preoperatively and at 1, 3, and 6 months after surgery. Anteroposterior standard radiographs were analysed to compare the bone structure of the osteotomy areas. The percentage of integration at the interface between host bone and bone substitute (“host bone-substitute interface”) was estimated by 2 blinded observers. A computed tomography (CT) evaluation of the host bone-substitute interface was performed at 2 months. The osseointegration at the host bone-substitute interface was estimated through a semiquantitative score by 2 blinded observers and through a measure of bone density. Results. Patients of Groups A and B were well balanced in terms of age and clinical presentation. All patients of both groups completed the study. The most common adverse events among patients assigned to G-CSF were mild to moderate bone pain and muscle discomfort, well controlled by oral analgesics. There were no severe adverse events in either Group A or B; all patients are presently alive and well. Mobilization of CD34+ cells occurred in all patients receiving G-CSF: mean preoperative WBC and CD34+ values were 39.09 × 10³/μl (21.57–51.11) and 131.58/μl (29.1–404) in Group A, and 6.77 × 10³/μl (2.8–12.06) and 7.67/μl (5.4–12) in Group B, respectively. At the post-surgery clinical evaluation, patients of Group A experienced pain and a slight impairment in overall performance at the 1-month assessment, whereas they displayed a slight increase in overall performance at 3 and 6 months compared to Group B. Semiquantitative radiographic evaluation revealed a higher rate of bone substitute osseointegration in Group A than in Group B at 1, 3, and 6 months post-surgery. Semiquantitative CT evaluation at 2 months also showed overall improved osseointegration at the host bone-substitute interface in Group A patients. Bone density was measured at the host bone-substitute interface by the Hounsfield score: Group A patients scored lower values at the upper interface compared to Group B, consistent with more advanced bone remodelling. The differences between Groups A and B on assessment of the host bone-substitute interface reached statistical significance (p<0.05).
Bone mineral density at the host bone-substitute interface, as measured with DEXA (dual-energy X-ray absorptiometry), was lower, although without statistical significance, in Group A than in Group B, again suggesting a more advanced stage of bone remodelling in the treated group compared to controls. Conclusions. G-CSF administration given to induce pre-operative mobilization of bone marrow-derived cells: (i) is feasible and safe in patients undergoing orthopaedic surgery; (ii) allows the peripheral blood circulation of high numbers of CD34+ cells; (iii) may hasten bone graft substitute integration, as suggested by clinical, radiographic, and CT evaluations. The enhanced osseointegration might be the result of a direct activity of G-CSF, of a cellular effect mediated by hematopoietic or endothelial progenitors mobilized by G-CSF, or of a combination of these factors. Disclosures: No relevant conflicts of interest to declare.
16

Yoon, Edward, Rona Singer Weinberg, Bruce Sachais, Joel A. Brochstein, and Patricia A. Shi. "Collection and Processing Considerations in Haploidentical Stem Cell Transplantation Utilizing a 2-Step Methodology." Blood 128, no. 22 (December 2, 2016): 2178. http://dx.doi.org/10.1182/blood.v128.22.2178.2178.

Abstract:
Background: Haploidentical transplantation is an active area of research because only about one-third of patients have an HLA-matched sibling, and searching for an HLA-matched unrelated donor can delay an urgent need for alloBMT. However, nearly all patients have HLA-haploidentical related donors. The incidence of graft rejection, infection, and GVHD has been minimized by improving conditioning regimens and the use of cyclophosphamide-based T-cell tolerization. There are unique collection and processing considerations associated with such transplants. We describe the unique collection considerations for the Thomas Jefferson University 2 Step regimen (TJU2) (Grosso D, Blood 2011), where the donor lymphocyte infusion (DLI) precedes the CD34+ selected hematopoietic progenitor cell (HPC) transplant. The DLI target dose in this study is fixed at 2 x 10e8 CD3+/recipient kg. The HPC collection is CD34+ cell selected. The post-selection target cell doses/recipient kg are 2-10 x 10e6 CD34+ cells and < 5 x 10e4 CD3+ cells. The points to consider in determining the donor blood volume to be processed during apheresis collection are: 1. Although 0.14 x 10e8 CD3+/L processed/recipient kg is a median CD3+ dose collected with G-CSF mobilization (Korbling, JCA 2001), where the absolute lymphocyte count increases ~2-fold above its baseline (Holig, Blood 2009; Anderlini, Transfusion 1996), the median dose for a non-mobilized collection is 0.07 x 10e8 CD3+/L processed/recipient kg (Korbling, JCA 2001); therefore large-volume DLI collection may need to be performed. 2. The collection facility's lymphocyte or CD3+ collection efficiency; the donor pre-collection peripheral blood CD3+ count is typically not available. 3. The collection facility's CD34+ cell collection efficiency; this can be calculated because the donor's peripheral blood CD34+ cell count is typically available. 4. Higher CD34+ cell doses appear to be beneficial in haploidentical transplant (Reisner, Blood 2011); therefore we designed the HPC collection volume to obtain a post-CD34+ cell selection target dose of 7 x 10e6/recipient kg. 5. The processing facility's CD34+ cell selection recovery, which typically varies from 30-100%. Methods: All procedures except one DLI (performed with the Spectra Optia CMNC) were performed using the Cobe Spectra MNC procedure (Terumo Corp). The following algorithms were used to estimate the donor blood volume ideally to be processed to ensure that target doses were reached:
DLI: [(recipient kg x CD3 target/kg) / (0.6 x ALC)] / (median lymphocyte CE)
where 0.6 is the lower limit of a normal percentage of CD3+ lymphocytes; ALC = donor absolute lymphocyte concentration on the day of collection; CE = collection efficiency (facility-specific median = 0.58).
HPC: [(recipient kg x post-selection CD34 target/kg) / PB CD34+] / (median CD34 CE) / (minimum CD34 selection recovery)
where post-selection CD34 target/kg = 7 x 10e6/kg; PB CD34+ = donor peripheral blood CD34+ concentration on the day of collection; CE = collection efficiency (facility-specific median = 0.45); facility-specific minimal CD34+ selection recovery = 0.31.
CD34+ cell-enriched products were obtained by incubation of mobilized HPC, Apheresis products with the Miltenyi CD34+ reagent and subsequent positive selection on the CliniMACS Plus (Miltenyi Biotec GmbH) according to the manufacturer's specifications.
Results: Our results in 6 donor-recipient pairs were the following (Tables 1 and 2, Figure 1). Discussion: The variability in the predictive value is due to variabilities in lymphocyte CE (range 0.38-0.94), CD34+ CE (range 0.36-0.67), and CD34+ selection recovery (range 0.31-0.81). Nevertheless, the correlation coefficients are high, and as the ranges of CEs and CD34+ selection recovery are optimized and narrowed, the predictive value of these formulas will increase. Of note, all collections except one were performed with the Cobe Spectra MNC procedure. With the transition to the Optia CMNC procedure, where the collection interface is controlled by the device, the CEs are expected to become more consistent. Disclosures: No relevant conflicts of interest to declare.
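The two volume formulas quoted above translate directly into code; here is a minimal sketch, assuming ALC and peripheral blood CD34+ concentrations are supplied per microliter and converted to per liter (the unit convention is an assumption).

```python
def dli_volume_liters(recipient_kg, alc_per_ul,
                      cd3_target_per_kg=2e8, lymphocyte_ce=0.58):
    """Donor blood volume (L) to process for the DLI, per the abstract's formula."""
    alc_per_l = alc_per_ul * 1e6          # cells/uL -> cells/L (assumed unit convention)
    return (recipient_kg * cd3_target_per_kg) / (0.6 * alc_per_l) / lymphocyte_ce

def hpc_volume_liters(recipient_kg, pb_cd34_per_ul,
                      cd34_target_per_kg=7e6, cd34_ce=0.45, min_selection_recovery=0.31):
    """Donor blood volume (L) to process for the HPC collection, per the abstract's formula."""
    pb_cd34_per_l = pb_cd34_per_ul * 1e6  # cells/uL -> cells/L
    return ((recipient_kg * cd34_target_per_kg) / pb_cd34_per_l
            / cd34_ce / min_selection_recovery)

# Illustrative numbers only: a 70 kg recipient, donor ALC 2000/uL, PB CD34+ 60/uL
print(round(dli_volume_liters(70, 2000), 1), "L for the DLI")
print(round(hpc_volume_liters(70, 60), 1), "L for the HPC collection")
```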
17

Saggese, Gerardo, and Antonio Giuseppe Maria Strollo. "Low-Power Energy-Based Spike Detector ASIC for Implantable Multichannel BMIs." Electronics 11, no. 18 (September 16, 2022): 2943. http://dx.doi.org/10.3390/electronics11182943.

Abstract:
Advances in microtechnology have enabled an exponential increase in the number of neurons that can be simultaneously recorded. To meet high-channel count and implantability demands, emerging applications require new methods for local real-time processing to reduce the data to transmit. Nonlinear energy operators are widely used to distinguish neural spikes from background noise featuring a good tradeoff between hardware resources and accuracy. However, they require an additional smoothing filter, which affects both area occupation and power dissipation. In this paper, we investigate a spike detector, based on a series of two nonlinear energy operators, and a simple and adaptive threshold, based on a three-point median operator. We show that our proposal provides good accuracy compared to other energy-based detectors on a synthetic dataset at different noise levels. Based on the proposed technique, a 1024-channel neural signal processor was designed in a 28 nm TSMC CMOS process by using latch-based static random-access memory (SRAM), demonstrating a total power consumption of 1.4 μW/ch and a silicon area occupation of 230 μm2/ch. These features, together with a comparison with the state of the art, demonstrate that our proposal constitutes an alternative for the development of next-generation multichannel neural interfaces.
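The detector described, a cascade of two nonlinear energy operators (NEOs) with a median-based adaptive threshold, can be sketched in a few lines of Python; the threshold scaling constant and the exact use of the three-point median are simplifying assumptions, not the ASIC's fixed-point implementation.

```python
import numpy as np

def neo(x):
    """Nonlinear (Teager) energy operator: psi[n] = x[n]^2 - x[n-1] * x[n+1]."""
    y = np.zeros_like(x)
    y[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return y

def detect_spikes(signal, k=8.0):
    """Flag samples whose cascaded NEO energy exceeds an adaptive threshold."""
    energy = neo(neo(signal))                  # two NEO stages in series, as in the paper
    threshold = k * np.median(np.abs(energy))  # simple median-based threshold; k is assumed
    return np.flatnonzero(energy > threshold)

# Toy usage: Gaussian noise with one injected spike around sample 500
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 1000)
x[500:503] += np.array([6.0, -8.0, 5.0])
print(detect_spikes(x))
```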
18

Zhang, Wanxia, Bo Liu, and Sang-Bing Tsai. "Analysis and Research on Digital Reading Platform of Multimedia Library by Big Data Computing in Internet Era." Wireless Communications and Mobile Computing 2022 (January 10, 2022): 1–10. http://dx.doi.org/10.1155/2022/5939138.

Abstract:
Digital reading promotion is a service through which libraries provide readers with digital resources, service functions, and shared user experiences across various digital reading platforms; it meets the reading interests and needs of more readers and is a focus of current library work. In the era of new media, the characteristics of digital reading are subtly changing readers’ needs for the reading environment, reading content, and reading style. Libraries should keep pace with the development of the times and provide readers with diversified, intelligent, and targeted digital reading platforms. The digital reading platform should continuously improve its service functions, broaden its service scope and dissemination channels, and finally realize diversified, engaging, and intelligent digital reading services. This paper takes the digital reading platforms of libraries in the region as its research theme. The province is divided into three regions according to the geographical map: the southeastern region, the central region, and the northwestern region. The digital reading platforms of 14 prefecture-level public libraries and 58 libraries of higher education institutions across these regions were accessed. First, we checked the construction of digital resources within the library websites; second, we counted the digital reading platform functions that were open; and finally, we checked the openness of the digital reading platforms themselves. The research found problems of unbalanced distribution of digital reading resources among regional libraries; unattractive reader-interface design and inadequate reading service functions; lack of continuous reader guidance and training; insufficient publicity and promotion; low staff efficiency in responding to consultations; and low platform openness with weak awareness of sharing. Finally, the problems found in the research are summarized, and solution measures for the regional digital reading platform are proposed. Libraries in the digital era should give priority to systems that can manage all library resources comprehensively and effectively, adapt to more flexible library workflows, and enable libraries to provide better services to users.
19

Capra, Marcelo, Fangfang Lv, Wei Li, Tongyu Lin, Eduardo Yañez, Jifeng Feng, Aryan Hamed, et al. "Abstract A32: Copanlisib plus rituximab vs placebo plus rituximab in patients with follicular lymphoma: 1-year follow-up of the Phase III, randomized CHRONOS-3 trial." Blood Cancer Discovery 3, no. 5_Supplement (September 6, 2022): A32. http://dx.doi.org/10.1158/2643-3249.lymphoma22-a32.

Abstract:
Introduction: In the Phase III CHRONOS-3 trial investigating the efficacy and safety of the PI3K inhibitor copanlisib in combination with rituximab (C+R), superior efficacy in progression-free survival (PFS) was demonstrated vs placebo plus rituximab (P+R) in patients (pts) with relapsed indolent non-Hodgkin lymphoma (iNHL; hazard ratio [HR] 0.52), including in the subset with follicular lymphoma (FL; HR 0.58) (data cut-off Aug 31, 2020) (Matasar et al. Lancet Oncol 2021). Here we report an updated 1-year follow-up analysis for the subset of pts with FL (data cut-off Aug 6, 2021). Methods: Eligible pts with relapsed iNHL who were progression- and treatment-free for ≥12 months (mo) after the last R-containing regimen, or for >6 mo if unwilling/unfit to receive chemotherapy, were randomized 2:1 to C+R or P+R. Treatment continued until progression or unacceptable toxicity. C 60 mg flat dose was administered i.v. on days 1, 8, and 15 (28-day cycle) and R 375 mg/m2 i.v. on days 1, 8, 15, and 22 of cycle 1 and on day 1 of cycles 3, 5, 7, and 9. The primary endpoint was centrally assessed PFS. Secondary efficacy endpoints included overall survival (OS), objective response rate (ORR), duration of response (DoR), complete response rate (CRR), and treatment-emergent adverse events (TEAEs). As part of the biomarker analysis, PTEN expression level was evaluated using immunohistochemistry. Results: The FL subset comprised 275 pts (184 C+R and 91 P+R). C+R reduced the risk of progression or death by 43% vs P+R; median PFS was 23.2 mo (95% confidence interval [CI] 19.2, 33.1) for C+R and 16.6 mo (11.0, 19.6) for P+R (HR 0.57 [95% CI 0.41, 0.80]). ORRs were 85% vs 56% for C+R vs P+R; CRRs were 38% vs 21%. Median DoR was 25.7 mo (17.2, 31.5) for C+R vs 18.2 mo (12.3, 26.3) for P+R (HR 0.75 [95% CI 0.49, 1.14]); for pts achieving a complete response, median DoR was not reached for C+R vs 24.7 mo (10.8, 29.7) for P+R (HR 0.41 [0.19, 0.87]), with DoR rates at 36 months of 63% vs 23%. Median OS was not estimable, and no significant benefit was seen for C+R over P+R (HR 0.95 [95% CI 0.52, 1.74]). Hyperglycemia (76%), hypertension (54%), and decreased neutrophil count (38%) were the most common TEAEs in the C+R arm, consistent with the study population. Grade 5 events occurred in 5 pts (3%) in the C+R arm; 1 (1%) was deemed treatment-related. Of 119 pts with FL who were evaluable for PTEN analysis, 41 (34%) were positive. Percent PTEN positivity was not correlated with best overall response (p=0.44) or responsiveness (p=0.08). Median PFS was improved in PTEN-negative pts regardless of treatment. PTEN-positive pts had improved PFS with C+R vs P+R (p=0.005). Conclusion: Pts with FL from the CHRONOS-3 trial continued to show superior efficacy with C+R vs P+R and an acceptable safety profile 1 year after the primary disclosure, supporting the long-term use of C+R in pts with relapsed FL. Exploratory analysis suggests pts with PTEN-positive disease may experience significant benefit with C+R. Citation Format: Marcelo Capra, Fangfang Lv, Wei Li, Tongyu Lin, Eduardo Yañez, Jifeng Feng, Aryan Hamed, Xiaoling Li, Toshiki Uchida, Ming-Chung Wang, Koji Izutsu, Antonio Salar, Katya Sapunarova, Guray Saydam, Jan Zaucha, Lidia Mongay Soler, Shalini Chaturvedi, Anjun Cao, Barrett H Childs, Pier Luigi Zinzani, Matthew J Matasar. Copanlisib plus rituximab vs placebo plus rituximab in patients with follicular lymphoma: 1-year follow-up of the Phase III, randomized CHRONOS-3 trial [abstract].
In: Proceedings of the Third AACR International Meeting: Advances in Malignant Lymphoma: Maximizing the Basic-Translational Interface for Clinical Application; 2022 Jun 23-26; Boston, MA. Philadelphia (PA): AACR; Blood Cancer Discov 2022;3(5_Suppl):Abstract nr A32.
20

Konoplev, S., S. M. Kornblau, E. Schlette, H. Lu, D. A. Thomas, N. Zhang, Y. H. Qiu, et al. "Overexpression of HIF1 Alpha Predicts Worse Overall and Event-Free Survival in Patients with Philadelphia Chromosome-Negative Precursor B-Lymphoblastic Leukemia." Blood 112, no. 11 (November 16, 2008): 2508. http://dx.doi.org/10.1182/blood.v112.11.2508.2508.

Abstract:
Background: Low oxygen levels are a defining characteristic of solid tumors, but the role of hypoxia in leukemogenesis remains unclear. Recent reports indicate that the endosteum at the murine bone-bone marrow (BM) interface is hypoxic, and data in a rat model demonstrate that leukemic cells infiltrating bone marrow were markedly hypoxic compared with cells in the BM of healthy rats. Hypoxia triggers a complex cellular and systemic adaptation mediated primarily through transcription by hypoxia-inducible factors (HIFs), including HIF-1α. Although hypoxia is the best-characterized mechanism of HIF activation in tumors, HIF activity can also be induced in tumor cells through activation of the PI3K/Akt signaling pathway. In this study, we assessed AKT and HIF-1α expression in newly diagnosed precursor B-cell acute lymphoblastic leukemia (pre-B ALL) and correlated the results with overall and progression-free patient survival. Methods: We analyzed the expression of phosphorylated AKT (pAKT) and HIF-1α in leukemic cells by immunohistochemical methods using archival fixed, paraffin-embedded BM biopsy specimens of newly diagnosed pre-B ALL and antibodies specific for pAKT (Cell Signaling Technology, Beverly, MA) and HIF-1α (BD Biosciences, San Jose, CA). The initial observations were confirmed by a reverse phase protein array (RPPA) data set generated from protein lysates prepared from fresh blood and BM aspirate samples from patients with newly diagnosed pre-B ALL. Results: There were 26 men and 27 women with a median age of 39 years (range, 17–77). The median follow-up was 17 months (range, 1–71). The median WBC was 5.7 × 10⁹/L (range, 0.8–369 × 10⁹/L); the median percentage of blasts in bone marrow was 88% (range, 21–97%). Conventional cytogenetic studies detected a normal karyotype in 13 patients and an abnormal karyotype in 37 patients, including the Philadelphia chromosome (Ph) in 15 patients; no analyzable metaphases were recovered in 5 patients. Fluorescence in situ hybridization for BCR/ABL rearrangement was performed in all patients and was positive in all 15 patients with Ph and in 1 patient with normal conventional cytogenetics. 50 patients received HYPER-CVAD therapy and 3 patients received augmented BFM therapy. 49 (92%) patients achieved complete remission with a median time to response of 3 weeks (range, 2–8 weeks); 12 of them relapsed. 17 patients died, including 6 patients in complete remission. Three-year overall survival was 56% (CI, 46–66%). HIF-1α expression was detected in 37 (70%) patients, including 10 patients with Ph-positive ALL. HIF-1α expression was associated with expression of pAKT (R=0.4479, p<0.01) and was not associated with age, sex, WBC, percentage of blasts in blood or BM, platelet count, serum levels of albumin, beta-2-microglobulin, bilirubin, creatinine, LDH, or presence of the Ph. HIF-1α expression was associated with worse overall survival for the entire patient population (p=0.023). The negative prognostic impact of HIF-1α expression remained when only Ph-negative patients were analyzed (p=0.004) (Fig 1). Patients with HIF-1α expression appear to have worse progression-free survival, but the difference did not reach statistical significance (p=0.39). These results were confirmed by the RPPA data set generated from 104 patients with newly diagnosed Ph-negative ALL, in which HIF-1α overexpression was strongly associated with worse overall (p=0.026) and event-free (p=0.0178) survival. No association of HIF-1α overexpression with other clinical or laboratory parameters was detected.
Conclusions: This is the first report demonstrating that HIF-1α expression is associated with worse overall and event-free survival in Ph-negative pre-B ALL. These findings suggest that inhibition of AKT signaling or blockade of HIF-1α-mediated pro-survival signaling events may improve clinical outcomes in pre-B ALL. Analysis of the key pro-survival signaling pathways activated by hypoxia and HIF-1α is ongoing (Frolova et al., ASH 2008).
21

Benavent, D., F. J. Núñez-Benjumea, L. Fernández-Luque, V. Navarro-Compán, M. Sanz, E. Calvo Aranda, L. Lojo, A. Balsa, and C. Plasencia. "POS0374 MONITORING CHRONIC INFLAMMATORY MUSCULOSKELETAL DISEASES WITH A PRECISION DIGITAL COMPANION PLATFORM(TM)–RESULTS OF THE DIGIREUMA FEASIBILITY STUDY." Annals of the Rheumatic Diseases 81, Suppl 1 (May 23, 2022): 441–42. http://dx.doi.org/10.1136/annrheumdis-2022-eular.1423.

Abstract:
Background: Patients with rheumatic and musculoskeletal diseases (RMDs) require a tailored follow-up that is limited by the capacity of healthcare professionals. Innovative tools need to be implemented effectively in the clinical care of patients with RMDs.
Objectives: To test the feasibility of a Precision Digital Companion Platform™ for real-time monitoring of disease outcomes in patients with rheumatoid arthritis (RA) and spondyloarthritis (SpA).
Methods: Digireuma was a prospective study including patients with RA and SpA, using the digital Precision Digital Companion Platform, Adhera for Rheumatology (ISRCTN11896540). During a follow-up of 3 months, patients were asked to report disease-specific electronic patient-reported outcomes (ePROs) on a regular basis in the mobile solution. Two rheumatologists monitored these ePROs, and patients were contacted for online or face-to-face interventions when deemed necessary by clinicians (Figure 1). Assessment measures included patient global assessment (PGA) of disease activity, tender joint count (TJC), swollen joint count (SJC), Health Assessment Questionnaire (HAQ), and pain visual analogue scale (VAS) for patients with RA; and VAS, PGA, TJC, SJC, Bath Ankylosing Spondylitis Disease Activity Index (BASDAI), Bath Ankylosing Spondylitis Functional Index (BASFI), and ASAS Health Index (ASAS-HI) for patients with SpA. In addition, flares, changes in medication, and recent infections were queried. Usability of the digital solution was measured by the Net Promoter Score (NPS).
Figure 1. Digital monitoring in the study powered by Adhera for Rheumatology. Screenshots at top depict the mobile interface (left) and the clinical web application (right).
Results: Forty-six patients were recruited, of whom 22 had RA and 24 SpA. Mean age was 48 ± 12 and 42 ± 9 years in the RA and SpA groups, respectively. 18/22 (82%) patients with RA and 9/24 (38%) with SpA were female. Among the included patients, 41 (89%) completed the onboarding (18/22 (82%) RA, 23/24 (96%) SpA) and 37 (80%) submitted at least one entry. In the RA group who completed the onboarding (n=18) there were a total of 4019 interactions (2178 questionnaire items, 648 accesses to educational units, 105 quizzes, 1088 rated messages), while patients with SpA (n=23) had a total of 3160 interactions (1637 questionnaire items, 684 accesses to educational units, 77 quizzes, 762 rated messages). ePRO completion rates for RA and SpA patients who entered any data during follow-up are shown in Table 1. Patients with RA completed a median of 9.5 ePROs during follow-up, whereas patients with SpA completed a median of 3. Regarding alerts, 15 patients generated a total of 26 alerts, of which 24 were flares (10 RA, 14 SpA) and 2 were problems with the medication (1 RA, 1 SpA). 18 (69%) of the alerts were managed remotely, 5 (19%) required a face-to-face intervention, and in 3 (12%) the patients did not respond before the consultation. Regarding usability and patient satisfaction, 14 patients provided feedback. According to the NPS, 9/14 were considered promoters, 4/14 passives, and 1/14 a detractor. The overall rating of these 14 patients for the app was 4.3 out of 5 stars.

Table 1. Onboarded patient engagement with regard to ePROs
Rheumatoid arthritis (n=18)     PGA            TJC          SJC          VAS         HAQ         Total
  ePROs completed               1.5 (0.25, 3)  2 (0.25, 3)  2 (0.25, 3)  2 (0, 3)    2 (1, 3)    9.5 (4.3, 15.8)
  Patients with ≥1 entry        13 (72.2)      13 (72.2)    13 (72.2)    12 (66.7)   16 (88.9)   16 (88.9)
Spondyloarthritis (n=23)        PGA            TJC          SJC          BASDAI      ASAS-HI     Total
  ePROs completed               1 (0, 3)       1 (0, 3)     1 (0, 3)     1 (0, 2)    1 (0, 2)    3 (1, 12)
  Patients with ≥1 entry        16 (69.5)      16 (69.5)    16 (69.5)    14 (60.8)   14 (60.8)   21 (91.3)
Follow-up period was 3 months. Results are expressed as median (Q1, Q3) and n (%).

Conclusion: This study shows that the use of a digital health solution is feasible in clinical practice. Based on these preliminary results, the next step will be to further implement the Precision Digital Companion Platform, Adhera for Rheumatology, in a multicentric setting to analyze its added value for monitoring patients.
Acknowledgements: This study was funded by an unrestricted grant from Abbvie.
Disclosure of Interests: Diego Benavent Speakers bureau: Janssen, Roche, Grant/research support from: Novartis, Abbvie; Francisco J. Núñez-Benjumea Employee of: Adhera Health Inc; Luis Fernández-Luque Employee of: Adhera Health Inc; Victoria Navarro-Compán Speakers bureau: AbbVie, Eli Lilly, Janssen, MSD, Novartis, Pfizer, UCB Pharma, Consultant of: AbbVie, Eli Lilly, MSD, Novartis, Pfizer, UCB Pharma, Grant/research support from: AbbVie and Novartis; María Sanz: None declared; Enrique Calvo Aranda Speakers bureau: Abbvie; Leticia Lojo: None declared; Alejandro Balsa Speakers bureau: Pfizer, Abbvie, Lilly, Galapagos, BMS, Sandoz, Nordic Pharma, Gebro, Roche, Sanofi, UCB, Consultant of: Pfizer, Abbvie, Lilly, Galapagos, BMS, Nordic Pharma, Sanofi, UCB, Grant/research support from: Pfizer, Abbvie, BMS, Nordic Pharma, Gebro, Roche, UCB; Chamaida Plasencia Speakers bureau: Pfizer, Abbvie, Lilly, Sandoz, Sanofi, Biogen, Roche, Novartis, Grant/research support from: Pfizer and Abbvie.
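The Net Promoter Score reported above follows the standard definition (percent promoters minus percent detractors); here is a minimal sketch using the study's counts, with the usual 9–10 = promoter, 7–8 = passive, 0–6 = detractor bucketing assumed.

```python
def net_promoter_score(promoters, passives, detractors):
    """NPS = %promoters - %detractors, on a -100..100 scale."""
    total = promoters + passives + detractors
    return 100.0 * (promoters - detractors) / total

# The study's feedback sample: 9 promoters, 4 passives, 1 detractor
print(round(net_promoter_score(9, 4, 1)))  # ~57
```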
APA, Harvard, Vancouver, ISO, and other styles
22

Gill, Gurlal S., Balbir B. Singh, Navneet K. Dhand, Rabinder S. Aulakh, Michael P. Ward, and Victoria J. Brookes. "Stray Dogs and Public Health: Population Estimation in Punjab, India." Veterinary Sciences 9, no. 2 (February 10, 2022): 75. http://dx.doi.org/10.3390/vetsci9020075.

Full text
Abstract:
The overpopulation of stray dogs is a serious public health and animal welfare concern in India. Neglected zoonotic diseases such as rabies and echinococcosis are transmitted at the stray dog–human interface, particularly in low to middle-income countries. The current study was designed to estimate the stray dog populations in Punjab to enhance the implementation of animal birth and disease (for example, rabies vaccination) control programs. This is the first systematic estimation of the stray dog population using a recommended method (mark–re-sight) in Punjab, India. The study was conducted from August 2016 to November 2017 in selected villages or wards in Punjab. For rural areas, 22 sub-districts were randomly selected (one in each district), then one village from each of the 22 selected sub-districts was chosen by convenience sampling. For urban areas, 3 towns (less than 100,000 human population) and 2 large cities (more than or equal to 100,000 human population) were randomly selected, followed by convenience selection of two wards from each of the 5 selected towns/cities. To estimate the dog population size, we used a modified mark–re-sight procedure and analysed counts using two methods: the Lincoln–Petersen formula with Chapman’s correction, and an application of Good–Turing theory (the SuperDuplicates method). Estimates per km² and per 1000 adult humans were compared between localities (villages vs. towns), dog sex (male vs. female) and age group (young vs. adult) using linear mixed models with district as a random effect. The predicted mean (95% CI) count of dogs per village or ward was extrapolated to estimate the number of stray dogs in Punjab based on (a) the number of villages and wards in the state; (b) the adult human population of the state; and (c) the built-up area of the state. Median stray dog populations per village and per ward using the Lincoln–Petersen formula with Chapman’s correction were estimated to be 33 and 65 dogs, respectively. Higher estimates of 61 per village and 112 per ward are reported using the SuperDuplicates method. The number of males was significantly higher than the number of females, and the number of adult dogs was about three times the number of young dogs. Based on different methods, estimates of the mean stray dog population in the state of Punjab ranged from 519,000 to 1,569,000. The current study revealed that there are a substantial number of stray dogs in Punjab and that a high number reside in rural (versus urban) areas. The estimated stray dog numbers pose a potential public health hazard in Punjab; this impact requires assessment. The estimates will help develop a dog population and rabies control program into which information about the logistics required, as well as the costs of implementing such programmes in Punjab, can be incorporated.
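The Lincoln–Petersen estimator with Chapman’s correction named above is a standard closed-population formula; the sketch below uses made-up survey counts, not the study’s data:

```python
def chapman_estimate(marked, sighted, resighted):
    """Lincoln-Petersen population estimate with Chapman's correction.

    marked:    dogs marked (counted) in the first survey
    sighted:   dogs counted in the re-sight survey
    resighted: marked dogs seen again in the re-sight survey
    Returns the population estimate and its approximate standard error.
    """
    n_hat = (marked + 1) * (sighted + 1) / (resighted + 1) - 1
    var = ((marked + 1) * (sighted + 1) * (marked - resighted) * (sighted - resighted)
           / ((resighted + 1) ** 2 * (resighted + 2)))
    return n_hat, var ** 0.5

# Hypothetical village survey: 20 dogs marked, 25 sighted later, 14 of them marked.
n_hat, se = chapman_estimate(20, 25, 14)
print(f"Estimated stray dogs: {n_hat:.0f} (SE {se:.1f})")  # ~35 (SE 3.2)
```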
APA, Harvard, Vancouver, ISO, and other styles
23

Van Der Sluis, Inge M., Paola De Lorenzo, Rishi Sury Kotecha, Andishe Attarbaschi, Gabriele Escherich, Karsten Nysom, Jan Stary, et al. "A Phase 2 Study to Test the Feasibility, Safety and Efficacy of the Addition of Blinatumomab to the Interfant06 Backbone in Infants with Newly Diagnosed KMT2A-Rearranged Acute Lymphoblastic Leukemia. A Collaborative Study of the Interfant Network." Blood 138, Supplement 1 (November 5, 2021): 361. http://dx.doi.org/10.1182/blood-2021-144843.

Full text
Abstract:
Background: Infant acute lymphoblastic leukemia (ALL) is a rare disease with dismal outcome. While outcomes for older children have improved, with event-free survival (EFS) currently above 85%, newly diagnosed infants (<1 year of age) with KMT2A-rearranged ALL have a 1-year EFS of 54.8% (SE 2.3) and a 3-year EFS of 39.6% (SE 2.3) (48% and 23% for medium risk (MR) and high risk (HR) patients, respectively). Ninety percent of all relapses occur during treatment, 66% within one year of diagnosis. Survival after relapse is only 20%. Intensifying chemotherapy with the Interfant06 protocol has not improved the outcome for infant ALL over the last two decades (Pieters et al., JCO 2019), hence there is an urgent need to improve upfront treatment. We studied the safety and efficacy of blinatumomab, a bispecific T-cell engager antibody targeting CD19, in infants with newly diagnosed KMT2A-r ALL.

Methods: We conducted a prospective, single-arm, international, multicenter, phase 2 study. Newly diagnosed patients <1 year of age with KMT2A-r ALL treated according to the Interfant06 protocol and with an M1/M2 marrow at the end of induction (EOI) were eligible to receive one course of blinatumomab (15 µg/m²/day, 28-day continuous infusion) after induction (Figure 1). Minimal residual disease (MRD) was measured at EOI (TP2), during blinatumomab (TPblina1, day 15, and TPblina2, day 29), before MARMA (TP4) and OCTADAD/hematopoietic stem cell transplant (HSCT) (TP5), and at the start of maintenance (TP6) using MLL and/or Ig/TCR polymerase chain reaction. HR KMT2A-r infant ALL was defined as age <6 months at diagnosis AND white blood cell count ≥300×10⁹/L and/or poor prednisone response. All other KMT2A-r patients were classified as MR. MR patients with MRD levels >0.05% before OCTADAD and all HR patients in complete remission were eligible for HSCT. (Serious) Adverse Events ((S)AEs) were collected from the start of blinatumomab until the next treatment block. Outcome data were compared to historical controls.

Results: Twenty-eight patients were enrolled. Baseline characteristics are shown in Table 1. The median follow-up was 11 months (range 1.5–33 months). All patients received the full course of blinatumomab without treatment interruptions. Seven SAEs were reported during blinatumomab (3 fever, grade 1, and 4 infections, grade 3–4). None of the patients experienced neurological (S)AEs. In total, 70 AEs were reported; the most frequent grade >3 adverse events were febrile neutropenia (n=2), anemia (n=5), and elevated GGT (n=2). MRD negative complete response occurred in 54% (n=15/28) at TPblina1, as well as at TPblina2 (after 2 and 4 weeks of blinatumomab, Table 2), which tended to be higher compared to the end of consolidation in Interfant06 (40%, p=0.16). In total, 89% (25/28) of patients were MRD negative or not quantifiable (<0.05%) at TPblina2. None of the MR patients had an indication for HSCT based on high MRD before OCTADAD, compared to 20% in Interfant06; however, one patient was transplanted per investigator's discretion. All MR patients who continued chemotherapy became MRD negative during further treatment. MRD negative complete response at the end of blinatumomab was more frequently found in MR compared to HR patients (68% vs 22%, p=0.0418) and in patients with low MRD at EOI (<0.05%) compared to patients with high MRD at EOI (76% vs 18%, p=0.0056). The 1-year EFS was 96.2% (SE 3.8). One death in first complete remission (CR1) occurred just before HSCT, which was not blinatumomab related (tracheal bleeding due to a tracheal cannula). One MR patient with high MRD at EOI had a combined CD19-positive relapse in the bone marrow and CNS at the end of maintenance, and is in continuous CR2 after HSCT.

Conclusion: This is the first trial to use blinatumomab in infants with newly diagnosed KMT2A-r ALL. Blinatumomab added to the Interfant06 backbone was very well tolerated, and has promising efficacy in terms of a high rate of complete MRD response and short-term EFS. Longer follow-up is awaited, but the low relapse rate after blinatumomab is remarkable, given that in historical controls relapses occur frequently and early, during therapy. Given these findings, blinatumomab will be implemented for all infants with newly diagnosed KMT2A-r ALL in the next Interfant21 protocol.

Figure 1.

Disclosures: Nysom: Y-mAbs: Consultancy, Membership on an entity's Board of Directors or advisory committees, Other: teaching; EUSA Pharma: Membership on an entity's Board of Directors or advisory committees; Bayer: Membership on an entity's Board of Directors or advisory committees, Other: teaching. Biondi: Amgen: Honoraria; Incyte: Consultancy, Other: Advisory Board; Bluebird: Other: Advisory Board; Novartis: Honoraria; Colmmune: Honoraria. OffLabel Disclosure: Investigational use of blinatumomab.
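A side note on the subgroup comparison reported above: the 68% vs 22% MR-versus-HR difference with p=0.0418 is consistent with a two-sided Fisher's exact test on responder counts of 13/19 (MR) and 2/9 (HR); those cell counts are back-calculated from the reported percentages and total of 28, not stated in the abstract. A quick check in Python:

```python
from scipy.stats import fisher_exact

# MRD-negative complete response at end of blinatumomab, MR vs HR risk group.
# Cell counts back-calculated from the reported 68% (13/19) and 22% (2/9).
table = [[13, 6],   # MR: responders, non-responders
         [2, 7]]    # HR: responders, non-responders
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.1f}, p = {p_value:.4f}")  # p ≈ 0.0418
```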
APA, Harvard, Vancouver, ISO, and other styles
24

Van Binsbergen, Wim. "Kazanga. Ethnicity in Africa between State and Tradition." Afrika Focus 9, no. 1-2 (March 5, 1993). http://dx.doi.org/10.21825/af.v9i1-2.5778.

Full text
Abstract:
The production of cultural forms at the interface between a rural-based tradition and the state is a familiar aspect of ethnicity in contemporary Africa. This paper seeks to identify some of the characteristics of this process, whose products are too often misunderstood, and cherished, as 'authentic' forms of 'tradition'. Highlighting the role of ethnic brokers, of the modern mass media, and of a model of commoditified 'performance' as an aspect of contemporary electronic mass culture, the argument explores the production of expressive culture in the context of the Kazanga cultural association and its Kazanga annual festival among the Nkoya people of central western Zambia since the early 1980s, against the background of Nkoya ethnicity and Nkoya expressive and court culture since the 19th century. KEY WORDS: associations, brokers, commoditification, dance, ethnicity, festivals, music, Nkoya, state, Zambia
APA, Harvard, Vancouver, ISO, and other styles
25

McCosker, Anthony, and Timothy Graham. "Data Publics: Urban Protest, Analytics and the Courts." M/C Journal 21, no. 3 (August 15, 2018). http://dx.doi.org/10.5204/mcj.1427.

Full text
Abstract:
This article reflects on part of a three-year battle over the redevelopment of an iconic Melbourne music venue, the Palace-Metro Nightclub (the Palace), involving the tactical use of Facebook Page data at trial. We were invited by the Save the Palace group, Melbourne City Council and the National Trust of Australia to provide Facebook Page data analysis as evidence of the social value of the venue at an appeals trial heard at the Victorian Civil and Administrative Tribunal (VCAT) in 2016. We take a reflexive ethnographic approach here to explore the data production, collection and analysis processes as these represent and constitute a “data public”. Although the developers won the appeal and were able to re-develop the site, the court accepted the validity of social media data as evidence of the building’s social value (Jinshan Investment Group Pty Ltd v Melbourne CC [2016] VCAT 626, 117; see also Victorian Planning Reports). Through the case, we elaborate on the concept of data publics by considering the “affordising” (Pollock) processes at play when extracting, analysing and visualising social media data. Affordising refers to the designed, deliberate and incidental effects of datafication and highlights the need to attend to the capacities for data collection and processing as they produce particular analytical outcomes. These processes foreground the compositional character of data publics, and the unevenness of data literacies (McCosker “Data Literacies”; Gray et al.) as a factor of the interpersonal and institutional capacity to read and mobilise data for social outcomes. We begin by reconsidering the often-assumed connection between social media data and their publics. Taking onboard theoretical accounts of publics as problem-oriented (Dewey) and dynamically constituted (Kelty), we conceptualise data publics through the key elements of a) consequentiality, b) sufficient connection over time, c) affective or emotional qualities of connection and interaction with the events. We note that while social data analytics may be a powerful tool for public protest, it equally affords use against public interests and introduces risks in relation to a lack of transparency, access or adequate data literacy.

Urban Protest and Data Publics

There are many examples globally of the use of social media to engage publics in battles over urban development or similar issues (e.g. Fredericks and Foth). Some have asked how social media might be better used by neighborhood organisations to mobilise protest and save historic buildings, cultural landmarks or urban sites (Johnson and Halegoua). And we can only note here the wealth of research literature on social movements, protest and social media. To emphasise Gerbaudo’s point, drawing on Mattoni, we “need to account for how exactly the use of these media reshapes the ‘repertoire of communication’ of contemporary movements and affects the experience of participants” (2). For us, this also means better understanding the role that social data plays in both aiding and reshaping urban protest or arming third sector groups with evidence useful in social institutions such as the courts. New modes of digital engagement enable forms of distributed digital citizenship, which Meikle sees as the creative political relationships that form through exercising rights and responsibilities.
Associated with these practices is the transition from sanctioned, simple discursive forms of social protest in petitions, to new indicators of social engagement in more nuanced social media data and the more interactive forms of online petition platforms like change.org or GetUp (Halpin et al.). These technical forms code publics in specific ways that have implications for contemporary protest action. That is, they provide the operational systems and instructions that shape social actions and relationships for protest purposes (McCosker and Milne). All protest and social movements are underwritten by explicit or implicit concepts of participatory publics as these are shaped, enhanced, or threatened by communication technologies. But participatory protest publics are uneven, and as Kelty asks: “What about all the people who are neither protesters nor Twitter users? In the broadest possible sense this ‘General Public’ cannot be said to exist as an actual entity, but only as a kind of virtual entity” (27). Kelty is pointing to the porous boundary between a general public and an organised public, or formal enterprise, as a reminder that we cannot take for granted representations of a public, or the public as a given, in relation to Like or follower data for instance. If carefully gauged, the concept of data publics can be useful. To start with, the notions of publics and publicness are notoriously slippery. Baym and boyd explore the differences between these two terms, and the way social media reconfigures what “public” is. Does a Comment or a Like on a Facebook Page connect an individual sufficiently to an issues-public? As far back as the 1930s, John Dewey was seeking a pragmatic approach to similar questions regarding human association and the pluralistic space of “the public”. For Dewey, “the machine age has so enormously expanded, multiplied, intensified and complicated the scope of the indirect consequences [of human association] that the resultant public cannot identify itself” (157). To what extent, then, can we use data to constitute a public in relation to social protest in the age of data analytics? There are numerous well formulated approaches to studying publics in relation to social media and social networks. Social network analysis (SNA) determines publics, or communities, through links, ties and clustering, by measuring and mapping those connections and to an extent assuming that they constitute some form of sociality. Networked publics (Ito, 6) are understood as an outcome of social media platforms and practices in the use of new digital media authoring and distribution tools or platforms and the particular actions, relationships or modes of communication they afford, to use James Gibson’s sense of that term. “Publics can be reactors, (re)makers and (re)distributors, engaging in shared culture and knowledge through discourse and social exchange as well as through acts of media reception” (Ito 6). Hashtags, for example, facilitate connectivity and visibility and aid in the formation and “coordination of ad hoc issue publics” (Bruns and Burgess 3). Gray et al., following Ruppert, argue that “data publics are constituted by dynamic, heterogeneous arrangements of actors mobilised around data infrastructures, sometimes figuring as part of them, sometimes emerging as their effect”.
The individuals of data publics are neither subjugated by the logics and metrics of digital platforms and data structures, nor simply sovereign agents empowered by the expressive potential of aggregated data (Gray et al.). Data publics are more than just aggregates of individual data points or connections. They are inherently unstable, dynamic (despite static analysis and visualisations), or vibrant, and ephemeral. We emphasise three key elements of active data publics. First, to be more than an aggregate of individual items, a data public needs to be consequential (in Dewey’s sense of issues or problem-oriented). Second, sufficient connection is visible over time. Third, affective or emotional activity is apparent in relation to events that lend coherence to the public and its prevailing sentiment. To these, we add critical attention to the affordising processes – or the deliberate and incidental effects of datafication and analysis, in the capacities for data collection and processing in order to produce particular analytical outcomes, and the data literacies these require. We return to the latter after elaborating on the Save the Palace case.

Visualising Publics: Highlighting Engagement and Intensity

The Palace theatre was built in 1912 and served as a venue for theatre, cinema, live performance, musical acts and as a nightclub. In 2014 the Heritage Council decided not to include the Palace on Victoria’s heritage register and hence opened the door for developers, but Melbourne City Council and the National Trust of Australia opposed the redevelopment on the grounds of the building’s social significance as a music venue. Similarly, the Save the Palace group saw the proposed redevelopment as affecting the capacity of Melbourne CBD to host medium size live performances, and therefore impacting deeply on the social fabric of the local music scene. The Save the Palace group, chaired by Rebecca Leslie and Michael Raymond, maintained a 36,000+ strong Facebook Page and mobilised local members through regular public street protests, and participated in court proceedings in 2015 and February 2016 with Melbourne City Council and National Trust Australia. Joining the protesters in the lead up to the 2016 appeals trial, we aimed to use social media engagement data to measure, analyse and present evidence of the extent and intensity of a sustained protest public. The evidence we submitted had to satisfy VCAT’s need to establish the social value of the building and the significance of its redevelopment, and to explain: a) how social media works; b) the meaning of the number of Facebook Likes on the Save The Palace Page and the timing of those Likes, highlighting how the reach and Likes pick up at significant events; and c) whether or not a representative sample of Comments is supportive of the group and the Palace Theatre (McCosker “Statement”). As noted in the case (Jinshan, 117), where courts have traditionally relied on one simple measure for contemporary social value – the petition – our aim was to make use of the richer measures available through social media data, to better represent sustained engagement with the issues over time. Visualising a protest public in this way raises two significant problems for a workable concept of data publics. The first involves the “affordising” (Pollock) work of both the platform and our data analysis.
This concerns the role played by data access and platform affordances for data capture, along with methodological choices made to best realise or draw out the affordances of the data for our purposes. The second concerns the issue of digital and data literacies in both the social acts that help to constitute a data public in the first place, and the capacity to read and write public data to represent those activities meaningfully. That is, Facebook and our analysis constitute a data public in certain ways that include potentially opaque decisions or processes. And citizens (protesters or casual Facebook commenters alike) along with social institutions (like the courts) have certain uneven capacity to effectively produce or read public protest-oriented data. The risk here, which we return to in the final section, lies in the potential for misrepresentation of publics through data, exclusions of access and ownership of data, and the uneven digital literacies at each stage of data production, analysis and sensemaking. Facebook captures data about individuals in intricate detail. Its data capture strategies are geared toward targeting for the purposes of marketing, although only a small subset of the data is publicly available through the Facebook Application Programming Interface (API), which is a kind of data “gateway”. The visible page data tells only part of the story. The total Page Likes in February 2016 was 36,828, representing a sizeable number of followers, mainly located in Melbourne but including 45 countries in total and 38 different languages. We extracted a data set of 268,211 engagements with the Page between February 2013 and August 2015. This included 45,393 post Likes and 9,139 Comments. Our strategy was to demarcate a structurally defined “community” (in the SNA sense of that term as delineating clusters of people, activities and links within a broader network), by visualising the interactions of Facebook users with Posts over time, and then examine elements of intensity of engagement. In other words, we “affordised” the network data using SNA techniques to most clearly convey the social value of the networked public. We used a combination of API access and Facebook’s native Insights data and analytics to extract use-data from that Page between June 2013 and December 2015. Analysis of a two-mode or bipartite network consisting of users and Posts was conducted using vosonSML, a package in the R programming language created at Australian National University (Graham and Ackland), and visualised with Gephi software. In this network, the nodes (or vertices) represent Facebook users and Facebook Posts submitted on the Page, and ties (or edges) between nodes represent whether a user has commented on and/or liked a post. For example, a user U might have liked Post A and commented on Post B. Additionally, a weight value is assigned for the Comments ties, indicating how many times a user commented on a particular post (note that users can only like Posts once). We took these actions as demonstrating sufficient connection over time in relation to an issue of common concern.

Figure 1: Network visualisation of activity on the Save the Palace Facebook Page, June 2013 to December 2015. The colour of the nodes denotes which ‘community’ cluster they belong to (computed via the Infomap algorithm), nodes are sized by out-degree (number of Likes/Comments made by users to Posts), and the graph layout is computed via the Force Atlas 2 algorithm.
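As a rough companion to Figure 1, the two-mode construction described above can be sketched in a few lines of Python/networkx. The authors worked in R with vosonSML; the engagement records below are invented, and greedy modularity stands in for Infomap (which requires a separate package):

```python
import networkx as nx
from networkx.algorithms import community

# Hypothetical (user, post, action) engagement records standing in for the
# Facebook Page data described above (the real data came via the API).
engagements = [
    ("user_a", "post_1", "like"), ("user_a", "post_2", "comment"),
    ("user_b", "post_1", "comment"), ("user_b", "post_1", "comment"),
    ("user_c", "post_2", "like"), ("user_c", "post_3", "comment"),
]

# Two-mode (bipartite) network: users and Posts as nodes, Likes/Comments as
# ties; repeat interactions on the same Post increase the edge weight.
G = nx.Graph()
for user, post, action in engagements:
    G.add_node(user, bipartite="user")
    G.add_node(post, bipartite="post")
    if G.has_edge(user, post):
        G.edges[user, post]["weight"] += 1
    else:
        G.add_edge(user, post, weight=1)

# Per-user activity (number of Likes/Comments made), the quantity used to
# size user nodes in Figure 1.
users = [n for n, d in G.nodes(data=True) if d["bipartite"] == "user"]
print({u: G.degree(u, weight="weight") for u in users})

# Illustrative community clusters (the article itself used Infomap).
print(community.greedy_modularity_communities(G, weight="weight"))
```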
Community detection was performed on the network using the Infomap algorithm (Rosvall and Bergstrom), which is suited to large-scale weighted and directed networks (Henman et al.). This analysis reveals two large and two smaller clusters or groups represented by colour differences (Fig. 1). Broadly, this suggests the presence of several clusters amongst a sustained network engaging with the page over the three years. Beyond this, a range of other colours denoting smaller clusters indicates a diversity of activity and actors co-participating in the network as part of a broader community. The positioning of nodes within the network is not random – the visualisation is generated by the Force Atlas 2 algorithm (Jacomy et al.) that spatially sorts the nodes through processes of attraction and repulsion according to the observed patterns of connectivity. As we would expect, the two-dimensional spatial arrangement of nodes conforms to the community clustering, helping us to visualise the network in the form of a networked public, and build a narrative interpretation of “what is going on” in this online social space. Social value for VCAT was loosely defined as a sense of connection, sentiment and attachment to the venue. While we could illustrate the extent of the active connections of those engaging with the Page, the network map does not in itself reveal much about the sentiment, or the emotional attachment to the Save the Palace cause. This kind of affect can be understood as “the energy that drives, neutralizes, or entraps networked publics” (Papacharissi 7), and its measure presents a particular challenge, but also interest, for understanding a data public. It is often measured through sentiment analysis of content, but we targeted reach and engagement events – particular moments that indicated intense interaction with the Page and associated events.

Figure 2: Save the Palace Facebook Page: Organic post reach November—December 2014

The affective connection and orientation could be demonstrated through two dimensions of post “reach”: average reach across the lifespan of the Page, and specific “reach-events”. Average reach illustrates the sustained engagement with the Page over time. Average un-paid reach for Posts with links (primarily news and legal updates) was 12,015 or 33% of the total follower base – a figure well above the standard for Community Page reach at that time. Reach-events indicated particular points of intensity and illustrate the Page’s ability to resonate publicly. Figure 2 points to one such event in November 2015, when news circulated that the developers were defying stop-work orders and demolishing parts of The Palace. The 100k reach indicated intense and widespread activity – Likes, Shares, Comments – in a short timeframe. We examined Comment activity in relation to specific reach events to qualify this reach event and illustrate the sense of outrage directed toward the developers, and expressions of solidarity toward those attempting to stop the redevelopment.

Affordising Data Publics and the Transformative Work of Analytics

Each stage of deriving evidence of social value through Page data, from building public visibility and online activity to analysis and presentation at VCAT, was affected by the affordising work of the protesters involved (particularly the Page Admins), civil society groups, platform features and data structures and our choices in analysis and presentation.
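One small, concrete instance of such analytic choices: whether a moment registers as a “reach-event” like the one in Figure 2 depends on an analyst-chosen threshold. A sketch with invented daily reach figures (not the Page’s actual numbers):

```python
import statistics

# Invented daily organic reach figures; the final value mimics a spike like
# the November 2015 demolition news event described above.
daily_reach = [11200, 12500, 10900, 13100, 12200, 11800, 101000]

baseline = daily_reach[:-1]
mean, sd = statistics.mean(baseline), statistics.stdev(baseline)

# The multiplier is itself an affordising choice: raising it to 3 or 4
# manufactures fewer "events" from the same underlying activity.
THRESHOLD = 2
events = [(day, r) for day, r in enumerate(daily_reach) if r > mean + THRESHOLD * sd]
print(events)  # [(6, 101000)]
```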
The notion of affordising is useful here because, as Pollock defines the term, it draws attention to the transformative work of metrics, analytics, platform features and other devices that re-package social activity through modes of datafication and analysis. The Save the Palace group mobilised in a particular way so as to channel their activities, make them visible and archival, to capture the resonant effects of their public protest through a platform that would best make that public visible to itself. The growth of the interest in the Facebook Page feeds back on itself reflexively as more people encounter it and participate. Contrary to critiques of “clicktivism”, these acts combine digital-material events and activities that were to become consequential for the public protest – such as the engagement activities around the November 2015 event described in Figure 2. In addition, presenting the research in court introduced particular hurdles, in finding “the meaningful data” appropriate to the needs of the case, “visualizing social data for social purposes”, and the need to be “evocative as well as accurate” (Donath, 16). The visualisation and presentation of the data needed to afford a valid and meaningful expression of the social significance of the Palace. Which layout algorithm to use? What scale do we want to use? Which community detection algorithm and colour scheme for nodes? These choices involve challenges regarding legibility of visualisations of public data (McCosker and Wilken; Kennedy et al.). The transformative actions at play in these tactics of public data analysis can inform other instances of data-driven protest or social participation, but also leave room for misuse. The interests of developers, for example, could equally be served by monitoring protesters’ actions through the same data, or by targeting disagreement or ambiguity in the data. Similarly, moves by Facebook to restrict access to Page data will disproportionately affect those without the means to pay for access. These tactics call for further work in ethical principles of open data, standardisation and data literacies for the courts and those who would benefit from use of their own public data in this way.

Conclusions

We have argued through the case of the Save the Palace protest that in order to make use of public social media data to define a data public, multiple levels of data literacy, access and affordising are required. Rather than assuming that public data simply constitutes a data public, we have emphasised: a) the consequentiality of the movement; b) sufficient connection over time; and c) affective or emotional qualities of connection and interaction with public events. This includes the activities of the core members of the Save the Palace protest group, and the tens of thousands who engaged in some way with the Page. It also involves Facebook’s data affordances as these allow for the extraction of public data, alongside our choices in analysis and visualisation, and the court’s capacity and openness to accept all of this as indicative of the social value (connections, sentiment, attachment) it sought for the case. The Senior Member and Member presiding over the case had little knowledge of Facebook or other social media platforms, did not use them, and hence themselves had limited capacity to recognise the social and cultural nuances of activities that took place through the Facebook Page.
This does not exclude the use of the data but made it more difficult to present a picture of the relevance and consequence of the data for understanding the social value evident in the contested building. While the court’s acceptance of the analysis as evidence is a significant starting point, further work is required to ensure openness, standardisation and ethical treatment of public data within public institutions like the courts.

References

Baym, N.K., and d. boyd. “Socially Mediated Publicness: An Introduction.” Journal of Broadcasting & Electronic Media 56.3 (2012): 320-329.
Bruns, A., and J. Burgess. “The Use of Twitter Hashtags in the Formation of Ad Hoc Publics.” 6th European Consortium for Political Research General Conference, University of Iceland, Reykjavík, 25-27 August 2011. 1 Aug. 2018 <http://eprints.qut.edu.au/46515/>.
Dewey, J. The Public and Its Problems: An Essay in Political Inquiry. Athens, Ohio: Swallow P, 2016 [1927].
Donath, J. The Social Machine: Designs for Living Online. Cambridge: MIT P, 2014.
Fredericks, J., and M. Foth. “Augmenting Public Participation: Enhancing Planning Outcomes through the Use of Social Media and Web 2.0.” Australian Planner 50.3 (2013): 244-256.
Gerbaudo, P. Tweets and the Streets: Social Media and Contemporary Activism. New York: Pluto P, 2012.
Gibson, J.J. The Ecological Approach to Visual Perception. Boston: Houghton Mifflin Harcourt, 1979.
Graham, T., and R. Ackland. “SocialMediaLab: Tools for Collecting Social Media Data and Generating Networks for Analysis.” CRAN (The Comprehensive R Archive Network). 2018. 1 Aug. 2018 <https://cran.r-project.org/web/packages/SocialMediaLab/SocialMediaLab.pdf>.
Gray, J., C. Gerlitz, and L. Bounegru. “Data Infrastructure Literacy.” Big Data & Society 5.2 (2018). 1 Aug. 2018 <https://doi.org/10.1177/2053951718786316>.
Halpin, T., A. Vromen, M. Vaughan, and M. Raissi. “Online Petitioning and Politics: The Development of Change.org in Australia.” Australian Journal of Political Science (2018). 1 Aug. 2018 <https://doi.org/10.1080/10361146.2018.1499010>.
Henman, P., R. Ackland, and T. Graham. “Community Structure in e-Government Hyperlink Networks.” Proceedings of the 14th European Conference on e-Government (ECEG ’14), 12-13 June 2014, Brasov, Romania.
Ito, M. “Introduction.” Networked Publics. Ed. K. Varnelis. Cambridge, MA: MIT P, 2008. 1-14.
Jacomy, M., T. Venturini, S. Heymann, and M. Bastian. “ForceAtlas2, a Continuous Graph Layout Algorithm for Handy Network Visualization Designed for the Gephi Software.” PLoS ONE 9.6 (2014): e98679. 1 Aug. 2018 <https://doi.org/10.1371/journal.pone.0098679>.
Jinshan Investment Group Pty Ltd v Melbourne CC [2016] VCAT 626, 117. 2016. 1 Aug. 2018 <https://bit.ly/2JGRnde>.
Johnson, B., and G. Halegoua. “Can Social Media Save a Neighbourhood Organization?” Planning, Practice & Research 30.3 (2015): 248-269.
Kennedy, H., R.L. Hill, G. Aiello, and W. Allen. “The Work That Visualisation Conventions Do.” Information, Communication & Society 19.6 (2016): 715-735.
Mattoni, A. Media Practices and Protest Politics: How Precarious Workers Mobilise. Burlington, VT: Ashgate, 2012.
McCosker, A. “Data Literacies for the Postdemographic Social Media Self.” First Monday 22.10 (2017). 1 Aug. 2018 <http://firstmonday.org/ojs/index.php/fm/article/view/7307/6550>.
McCosker, A. “Statement of Evidence: Palace Theatre Facebook Page Analysis.” Submitted to the Victorian Civil and Administrative Tribunal, 7 Dec. 2015. 1 Aug. 2018 <https://www.academia.edu/37130238/Evidence_Statement_Save_the_Palace_Facebook_Page_Analysis_VCAT_2015_>.
McCosker, A., and E. Milne. “Coding Labour.” Cultural Studies Review 20.1 (2014): 4-29.
McCosker, A., and R. Wilken. “Rethinking ‘Big Data’ as Visual Knowledge: The Sublime and the Diagrammatic in Data Visualisation.” Visual Studies 29.2 (2014): 155-164.
Meikle, G. Social Media: Communication, Sharing and Visibility. New York: Routledge, 2016.
Papacharissi, Z. Affective Publics: Sentiment, Technology, and Politics. Oxford: Oxford UP, 2015.
Pollock, N. “Ranking Devices: The Socio-Materiality of Ratings.” Materiality and Organizing: Social Interaction in a Technological World. Eds. P.M. Leonardi, B.A. Nardi, and J. Kallinikos. Oxford: Oxford UP, 2012. 91-114.
Rosvall, M., and C.T. Bergstrom. “Maps of Random Walks on Complex Networks Reveal Community Structure.” Proceedings of the National Academy of Sciences of the United States of America 105.4 (2008): 1118-1123.
Ruppert, E. “Doing the Transparent State: Open Government Data as Performance Indicators.” A World of Indicators: The Making of Governmental Knowledge through Quantification. Eds. R. Rottenburg, S.E. Merry, S.J. Park, et al. Cambridge: Cambridge UP, 2015. 1–18.
Smith, N., and T. Graham. “Mapping the Anti-Vaccination Movement on Facebook.” Information, Communication & Society (2017). 1 Aug. 2018 <https://doi.org/10.1080/1369118X.2017.1418406>.
Victorian Planning Reports. “Editorial Comment.” VCAT 3.16 (2016). 1 Aug. 2018 <https://www.vprs.com.au/394-past-editorials/vcat/1595-vcat-volume-3-no-16>.
APA, Harvard, Vancouver, ISO, and other styles
26

Leaver, Tama. "The Social Media Contradiction: Data Mining and Digital Death." M/C Journal 16, no. 2 (March 8, 2013). http://dx.doi.org/10.5204/mcj.625.

Full text
Abstract:
Introduction

Many social media tools and services are free to use. This fact often leads users to the mistaken presumption that the associated data generated whilst utilising these tools and services is without value. Users often focus on the social and presumed ephemeral nature of communication – imagining something that happens but then has no further record or value, akin to a telephone call – while corporations behind these tools tend to focus on the media side, the lasting value of these traces which can be combined, mined and analysed for new insight and revenue generation. This paper seeks to explore this social media contradiction in two ways. Firstly, a cursory examination of Google and Facebook will demonstrate how data mining and analysis are core practices for these corporate giants, central to their functioning, development and expansion. Yet the public rhetoric of these companies is not about the exchange of personal information for services, but rather the more utopian notions of organising the world’s information, or bringing everyone together through sharing. The second section of this paper examines some of the core ramifications of death in terms of social media, asking what happens when a user suddenly exists only as recorded media fragments, at least in digital terms. Death, at first glance, renders users (or post-users) without agency or, implicitly, value to companies which data-mine ongoing social practices. Yet the emergence of digital legacy management highlights the value of the data generated using social media, a value which persists even after death. The question of a digital estate thus illustrates the cumulative value of social media as media, even on an individual level. The ways Facebook and Google approach digital death are examined, demonstrating policies which enshrine the agency and rights of living users, but become far less coherent posthumously. Finally, along with digital legacy management, I will examine the potential for posthumous digital legacies which may, in some macabre ways, actually reanimate some aspects of a deceased user’s presence, such as the Lives On service which touts the slogan “when your heart stops beating, you'll keep tweeting”. Cumulatively, mapping digital legacy management by large online corporations, and the affordances of more focussed services dealing with digital death, illustrates the value of data generated by social media users, and the continued importance of the data even beyond the grave.

Google

While Google is universally synonymous with search, and is the world’s dominant search engine, it is less widely understood that one of the core elements keeping Google’s search results relevant is a complex operation mining user data. Different tools in Google’s array of services mine data in different ways (Zimmer, “Gaze”). Gmail, for example, uses algorithms to analyse an individual’s email in order to display the most relevant related advertising. This form of data mining is comparatively well known, with most Gmail users knowingly and willingly accepting more personalised advertising in order to use Google’s email service. However, the majority of people using Google’s search engine are unaware that search, too, is increasingly driven by the tracking, analysis and refining of results on the basis of user activity (Zimmer, “Externalities”).
As Alexander Halavais (160–180) quite rightly argues, recent focus on the idea of social search – the deeper integration of social network information in gauging search results – is oxymoronic; all search, at least for Google, is driven by deep analysis of personal and aggregated social data. Indeed, the success of Google’s mining of user data has led to concerns that often invisible processes of customisation and personalisation will mean that the supposedly independent or objective algorithms producing Google’s search results will actually yield a different result for every person. As Siva Vaidhyanathan laments: “as users in a diverse array of countries train Google’s algorithms to respond to specialized queries with localised results, each place in the world will have a different list of what is important, true, or ‘relevant’ in response to any query” (138). Personalisation and customisation are not inherently problematic, and frequently do enhance the relevance of search results, but the main objection raised by critics is not Google’s data mining, but the lack of transparency in the way data are recorded, stored and utilised. Eli Pariser, for example, laments the development of a ubiquitous “filter bubble” wherein all search results are personalised and subjective but are hidden behind the rhetoric of computer-driven algorithmic objectivity (Pariser). While data mining informs and drives many of Google’s tools and services, the cumulative value of these captured fragments of information is best demonstrated by the new service Google Now. Google Now is a mobile app which delivers an ongoing stream of search results but without the need for user input. Google Now extrapolates the rhythms of a person’s life, their interests and their routines in order to algorithmically determine what information will be needed next, and automatically displays it on a user’s mobile device. Clearly Google Now is an extremely valuable and clever tool, and the more information a user shares, the better the ongoing customised results will be, demonstrating the direct exchange value of personal data: total personalisation requires total transparency. Each individual user will need to judge whether they wish to share with Google the considerable amount of personal information needed to make Google Now work. The pressing ethical question that remains is whether Google will ensure that users are sufficiently aware of the amount of data and personal privacy they are exchanging in order to utilise such a service.

Facebook

Facebook began as a closed network, open only to students at American universities, but has transformed over time to a much wider and more open network, with over a billion registered users. Facebook has continually reinvented their interface, protocols and design, often altering both privacy policies and users’ experience of privacy, and often meeting significant and vocal resistance in the process (boyd). The data mining performed by social networking service Facebook is also extensive, although primarily aimed at refining the way that targeted advertising appears on the platform. In 2007 Facebook partnered with various retail loyalty services and combined these records with Facebook’s user data. This information was used to power Facebook’s Beacon service, which added details of users’ retail history to their Facebook news feed (for example, “Tama just purchased a HTC One”).
The impact of all of these seemingly unrelated purchases turning up in many people’s feeds suddenly revealed the complex surveillance, data mining and sharing of these data that was taking place (Doyle and Fraser). However, as Beacon was turned on, without consultation, for all Facebook users, there was a sizable backlash that meant that Facebook had to initially switch the service to opt-in, and then discontinue it altogether. While Beacon has been long since erased, it is notable that in early 2013 Facebook announced that they have strengthened partnerships with data mining and profiling companies, including Datalogix, Epsilon, Acxiom, and BlueKai, which harness customer information from a range of loyalty cards, to further refine the targeting ability offered to advertisers using Facebook (Hof). Facebook’s data mining, surveillance and integration across companies is thus still going on, but no longer directly visible to Facebook users, except in terms of the targeted advertisements which appear on the service. Facebook is also a platform, providing a scaffolding and gateway to many other tools and services. In order to use social games such as Zynga’s Farmville, Facebook users agree to allow Zynga to access their profile information, and use Facebook to authenticate their identity. Zynga has been unashamedly at the forefront of user analytics and data mining, attempting to algorithmically determine the best way to make virtual goods within their games attractive enough for users to pay for them with real money. Indeed, during a conference presentation, Zynga Vice President Ken Rudin stated outright that Zynga is “an analytics company masquerading as a games company” (Rudin). I would contend that this masquerade succeeds, as few Farmville players are likely to consider how their every choice and activity is being algorithmically scrutinised in order to determine what virtual goods they might actually buy. As an instance of what is widely being called ‘big data’, the data mining operations of Facebook, Zynga and similar services lead to a range of ethical questions (boyd and Crawford). While users may have ostensibly agreed to this data mining after clicking on Facebook’s Terms of Use agreement, the fact that almost no one reads these agreements when signing up for a service is the Internet’s worst kept secret. Similarly, the extension of these terms when Facebook operates as a platform for other applications is a far from transparent process. While examining the recording of user data leads to questions of privacy and surveillance, it is important to note that many users are often aware of the exchange to which they have agreed. Anders Albrechtslund deploys the term ‘social surveillance’ to usefully emphasise the knowing, playful and at times subversive approach some users take to the surveillance and data mining practices of online service providers. Similarly, E.J. Westlake notes that performances of self online are often not only knowing but deliberately false or misleading with the aim of exploiting the ways online activities are tracked. However, even users well aware of Facebook’s data mining on the site itself may be less informed about the social networking company’s mining of offsite activity. The introduction of ‘like’ buttons on many other Websites extends Facebook’s reach considerably.
The various social plugins and ‘like’ buttons expand both active recording of user activity (where the like button is actually clicked) and passive data mining (since a cookie is installed or updated regardless of whether a button is actually pressed) (Gerlitz and Helmond). Indeed, because cookies – tiny packets of data exchanged and updated invisibly in browsers – assign each user a unique identifier, Facebook can either combine these data with an existing user’s profile or create profiles about non-users. If that person ever joins Facebook, their account is connected with the existing, data-mined record of their Web activities (Roosendaal). As with Google, the significant issue here is not users knowingly sharing their data with Facebook, but the often complete lack of transparency in terms of the ways Facebook extracts and mines user data, both on Facebook itself and increasingly across applications using Facebook as a platform and across the Web through social plugins.

Google after Death

While data mining is clearly a core element in the operation of Facebook and Google, the ability to scrutinise the activities of users depends on those users being active; when someone dies, the question of the value and ownership of their digital assets becomes complicated, as does the way companies manage posthumous user information. For Google, the Gmail account of a deceased person becomes inactive; the stored email still takes up space on Google’s servers, but with no one using the account, no advertising is displayed and thus Google can earn no revenue from the account. However, the process of accessing the Gmail account of a deceased relative is an incredibly laborious one. In order to even begin the process, Google asks that someone physically mails a series of documents including a photocopy of a government-issued ID, the death certificate of the deceased person, evidence of an email the requester received from the deceased, along with other personal information. After Google have received and verified this information, they state that they might proceed to a second stage where further documents are required. Moreover, if at any stage Google decide that they cannot proceed in releasing a deceased relative’s Gmail account, they will not reveal their rationale. As their support documentation states: “because of our concerns for user privacy, if we determine that we cannot provide the Gmail content, we will not be able to share further details about the account or discuss our decision” (Google, “Accessing”). Thus, Google appears to enshrine the rights and privacy of individual users, even posthumously; the ownership or transfer of individual digital assets after death is neither a given, nor enshrined in Google’s policies. Yet, ironically, the economic value of that email to Google is likely zero, but the value of the email history of a loved one or business partner may be of substantial financial and emotional value, probably more so than when that person was alive. For those left behind, the value of email accounts as media, as a lasting record of social communication, is heightened. The question of how Google manages posthumous user data has been further complicated by the company’s March 2012 rationalisation of over seventy separate privacy policies for various tools and services they operate under the umbrella of a single privacy policy accessed using a single unified Google account.
While this move was ostensibly to make privacy more understandable and transparent at Google, it had other impacts. For example, one of the side effects of a singular privacy policy and single Google identity is that deleting one of a recently deceased person’s services may inadvertently delete them all. Given that Google’s services include Gmail, YouTube and Picasa, this means that deleting an email account inadvertently erases all of the Google-hosted videos and photographs that individual posted during their lifetime. As Google warns, for example: “if you delete the Google Account to which your YouTube account is linked, you will delete both the Google Account AND your YouTube account, including all videos and account data” (Google, “What Happens”). A relative having gained access to a deceased person’s Gmail might sensibly delete the email account once the desired information is exported. However, it seems less likely that this executor would realise that in doing so all of the private and public videos that person had posted on YouTube would also permanently disappear. While material possessions can be carefully dispersed to specific individuals following the instructions in someone’s will, such affordances are not yet available for Google users. While it is entirely understandable that the ramifications of policy changes are aimed at living users, as more and more online users pass away, the question of their digital assets becomes increasingly important. Google, for example, might allow a deceased person’s executor to elect which of their Google services should be kept online (perhaps their YouTube videos), which traces can be exported (perhaps their email), and which services can be deleted. At present, the lack of fine-grained controls over a user’s digital estate at Google makes this almost impossible. While it violates Google’s policies to transfer ownership of an account to another person, if someone does leave their passwords behind, this provides their loved ones with the best options in managing their digital legacy with Google. When someone dies and their online legacy is a collection of media fragments, the value of those media is far more apparent to the loved ones left behind than to the companies housing those media.

Facebook Memorialisation

In response to users complaining that Facebook was suggesting they reconnect with deceased friends who had left Facebook profiles behind, in 2009 the company instituted an official policy of turning the Facebook profiles of departed users into memorial pages (Kelly). Technically, loved ones can choose between memorialisation and erasing an account altogether, but memorialisation is the default. This entails setting the account so that no one can log into it, and that no new friends (connections) can be made. Existing friends can access the page in line with the user’s final privacy settings, meaning that most friends will be able to post on the memorialised profile to remember that person in various ways (Facebook). Memorialised profiles (now Timelines, after Facebook’s redesign) thus become potential mourning spaces for existing connections. Since memorialised pages cannot make new connections, public memorial pages are increasingly popular on Facebook, frequently set up after a high-profile death, often involving young people, accidents or murder.
Recent studies suggest that both of these Facebook spaces are allowing new online forms of mourning to emerge (Marwick and Ellison; Carroll and Landry; Kern, Forman, and Gil-Egui), although public pages have the downside of potentially inappropriate commentary and outright trolling (Phillips). Given Facebook has over a billion registered users, estimates already suggest that the platform houses 30 million profiles of deceased people, and this number will, of course, continue to grow (Kaleem). For Facebook, while posthumous users do not generate data themselves, the fact that they were part of a network means that their connections may interact with a memorialised account, or memorial page, and this activity, like all Facebook activities, allows the platform to display advertising and further track user interactions. However, at present Facebook’s options – to memorialise or delete accounts of deceased people – are fairly blunt. Once Facebook is aware that a user has died, no one is allowed to edit that person’s Facebook account or Timeline, so Facebook literally offers an all (memorialisation) or nothing (deletion) option. Given that Facebook is essentially a platform for performing identities, it seems a little short-sighted that executors cannot clean up or otherwise edit the final, lasting profile of a deceased Facebook user. As social networking services and social media become more ingrained in contemporary mourning practices, it may be that Facebook will allow more fine-grained control, positioning a digital executor also as a posthumous curator, making the final decision about what does and does not get kept in the memorialisation process. Since Facebook is continually mining user activity, the popularity of mourning as an activity on Facebook will likely mean that more attention is paid to the question of digital legacies. While the user themselves can no longer be social, the social practices of mourning, and the recording of a user as a media entity, highlight the fact that social media can be about interactions which in significant ways include deceased users.

Digital Legacy Services

While the largest online corporations have fairly blunt tools for addressing digital death, there are a number of new tools and niche services emerging in this area which are attempting to offer nuanced control over digital legacies. Legacy Locker, for example, offers to store the passwords to all of a user’s online services and accounts, from Facebook to Paypal, and to store important documents and other digital material. Users designate beneficiaries who will receive this information after the account holder passes away, and this is confirmed by preselected “verifiers” who can attest to the account holder’s death. Death Switch similarly provides the ability to store and send information to users after the account holder dies, but tests whether someone is alive by sending verification emails; if the account holder fails to respond to several prompts, Death Switch determines that they have died, or are incapacitated, and executes the user’s final instructions. Perpetu goes a step further and offers the same tools as Legacy Locker but also automates existing options from social media services, allowing users to specify, for example, that their Facebook, Twitter or Gmail data should be downloaded and this archive should be sent to a designated recipient when the Perpetu user dies.
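Death Switch’s liveness check is, in effect, a dead man’s switch. A minimal sketch of that logic (class name, intervals and flow are illustrative assumptions, not the company’s actual implementation):

```python
from datetime import datetime, timedelta

class DeadMansSwitch:
    """Toy sketch of a Death Switch-style liveness check."""

    def __init__(self, prompts_allowed=3, prompt_interval_days=30):
        self.prompts_allowed = prompts_allowed
        self.prompt_interval = timedelta(days=prompt_interval_days)
        self.last_response = datetime.now()
        self.missed = 0

    def prompt(self, now=None):
        """Called on a schedule; a real service would email the user."""
        now = now or datetime.now()
        if now - self.last_response >= self.prompt_interval:
            self.missed += 1

    def respond(self, now=None):
        """The account holder answers a verification email."""
        self.last_response = now or datetime.now()
        self.missed = 0

    def presumed_dead(self):
        """After several unanswered prompts, final instructions execute."""
        return self.missed >= self.prompts_allowed
```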
These tools attempt to provide a more complex array of choices in terms of managing a user’s digital legacy, providing similar choices to those currently available when addressing material possessions in a formal will. At a broader level, the growing demand for these services attests to the ongoing value of online accounts and social media traces after a user’s death. Bequeathing passwords may not strictly follow the Terms of Use of the online services in question, but it is extremely hard to track or intervene when a user has the legitimate password, even if used by someone else. More to the point, this finely-grained legacy management allows far more flexibility in the utilisation and curation of digital assets posthumously. In the process of signing up for one of these services, or digital legacy management more broadly, the ongoing value and longevity of social media traces becomes more obvious to both the user planning their estate and those who ultimately have to manage it.

The Social Media Afterlife

The value of social media beyond the grave is also evident in the range of services which allow users to communicate in some fashion after they have passed away. Dead Social, for example, allows users to schedule posthumous social media activity, including the posting of tweets, sending of email, Facebook messages, or the release of online photos and videos. The service relies on a trusted executor confirming someone’s death, and after that releases these final messages effectively from beyond the grave. If I Die is a similar service, which also has an integrated Facebook application which ensures a user’s final message is directly displayed on their Timeline. In a bizarre promotional campaign around a service called If I Die First, the company is promising that the first user of the service to pass away will have their posthumous message delivered to a huge online audience, via popular blogs and mainstream press coverage. While this is not likely to appeal to everyone, the notion of a popular posthumous performance of self further complicates that question of what social media can mean after death. Illustrating the value of social media legacies in a quite different but equally powerful way, the Lives On service purports to algorithmically learn how a person uses Twitter while they are alive, and then continue to tweet in their name after death. Internet critic Evgeny Morozov argues that Lives On is part of a Silicon Valley ideology of ‘solutionism’ which casts every facet of society as a problem in need of a digital solution (Morozov). In this instance, Lives On provides some semblance of a solution to the problem of death. While far from defeating death, the very fact that it might be possible to produce any meaningful approximation of a living person’s social media after they die is powerful testimony to the value of data mining and the importance of recognising that value. While Lives On is an experimental service in its infancy, it is worth wondering what sort of posthumous approximation might be built using the robust data profiles held by Facebook or Google. If Google Now can extrapolate what a user wants to see without any additional input, how hard would it be to retool this service to post what a user would have wanted after their death? Could there, in effect, be a Google After(life)?
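Lives On has not documented its method, but “learning how a person uses Twitter” can be caricatured as a word-level Markov chain over a tweet archive; everything below (archive, seed word, function names) is a toy illustration:

```python
import random
from collections import defaultdict

def train(tweets):
    """Build word-bigram transition lists from a tweet archive."""
    chain = defaultdict(list)
    for tweet in tweets:
        words = tweet.split()
        for a, b in zip(words, words[1:]):
            chain[a].append(b)
    return chain

def posthumous_tweet(chain, seed, length=8):
    """Generate a tweet-like string by walking the chain from a seed word."""
    words = [seed]
    for _ in range(length - 1):
        options = chain.get(words[-1])
        if not options:
            break
        words.append(random.choice(options))
    return " ".join(words)

archive = [
    "save the palace save the music",
    "the music lives on in the palace",
]
print(posthumous_tweet(train(archive), "the"))
```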
Conclusion

Users of social media services have differing levels of awareness regarding the exchange they are agreeing to when signing up for services provided by Google or Facebook, and often value the social affordances without necessarily considering the ongoing media they are creating. Online corporations, by contrast, recognise and harness the informatic traces users generate through complex data mining and analysis. However, the death of a social media user provides a moment of rupture which highlights the significant value of the media traces a user leaves behind. More to the point, the value of these media becomes most evident to those left behind precisely because that individual can no longer be social. While beginning to address the issue of posthumous user data, Google and Facebook both have very blunt tools; Google might offer executors access while Facebook provides the option of locking a deceased user’s account as a memorial or removing it altogether. Neither of these responses does justice to the value that these media traces hold for the living, but emerging digital legacy management tools are increasingly providing a richer set of options for digital executors. While the differences between material and digital assets provoke an array of legal, spiritual and moral issues, digital traces nevertheless clearly hold significant and demonstrable value. For social media users, the death of someone they know is often the moment where the media side of social media – its lasting, infinitely replicable nature – becomes more important, more visible, and casts the value of the social media accounts of the living in a new light. For the larger online corporations and service providers, the inevitable increase in deceased users will likely provoke more fine-grained controls and responses to the question of digital legacies and posthumous profiles. It is likely, too, that the increase in online social practices of mourning will open new spaces and arenas for those same corporate giants to analyse and data-mine.

References

Albrechtslund, Anders. “Online Social Networking as Participatory Surveillance.” First Monday 13.3 (2008). 21 Apr. 2013 ‹http://firstmonday.org/article/view/2142/1949›.
boyd, danah. “Facebook’s Privacy Trainwreck: Exposure, Invasion, and Social Convergence.” Convergence 14.1 (2008): 13–20.
boyd, danah, and Kate Crawford. “Critical Questions for Big Data.” Information, Communication & Society 15.5 (2012): 662–679.
Carroll, Brian, and Katie Landry. “Logging On and Letting Out: Using Online Social Networks to Grieve and to Mourn.” Bulletin of Science, Technology & Society 30.5 (2010): 341–349.
Doyle, Warwick, and Matthew Fraser. “Facebook, Surveillance and Power.” Facebook and Philosophy: What’s on Your Mind? Ed. D.E. Wittkower. Chicago, IL: Open Court, 2010. 215–230.
Facebook. “Deactivating, Deleting & Memorializing Accounts.” Facebook Help Center. 2013. 7 Mar. 2013 ‹http://www.facebook.com/help/359046244166395/›.
Gerlitz, Carolin, and Anne Helmond. “The Like Economy: Social Buttons and the Data-Intensive Web.” New Media & Society (2013).
Google. “Accessing a Deceased Person’s Mail.” 25 Jan. 2013. 21 Apr. 2013 ‹https://support.google.com/mail/answer/14300?hl=en›.
Google. “What Happens to YouTube If I Delete My Google Account or Google+?” 8 Jan. 2013. 21 Apr. 2013 ‹http://support.google.com/youtube/bin/answer.py?hl=en&answer=69961&rd=1›.
Halavais, Alexander. Search Engine Society. Polity, 2008.
Hof, Robert. “Facebook Makes It Easier to Target Ads Based on Your Shopping History.” Forbes 27 Feb. 2013. 1 Mar. 2013 ‹http://www.forbes.com/sites/roberthof/2013/02/27/facebook-makes-it-easier-to-target-ads-based-on-your-shopping-history/›.
Kaleem, Jaweed. “Death on Facebook Now Common as ‘Dead Profiles’ Create Vast Virtual Cemetery.” Huffington Post 7 Dec. 2012. 7 Mar. 2013 ‹http://www.huffingtonpost.com/2012/12/07/death-facebook-dead-profiles_n_2245397.html›.
Kelly, Max. “Memories of Friends Departed Endure on Facebook.” The Facebook Blog 27 Oct. 2009. 7 Mar. 2013 ‹http://www.facebook.com/blog/blog.php?post=163091042130›.
Kern, Rebecca, Abbe E. Forman, and Gisela Gil-Egui. “R.I.P.: Remain in Perpetuity. Facebook Memorial Pages.” Telematics and Informatics 30.1 (2012): 2–10.
Marwick, Alice, and Nicole B. Ellison. “‘There Isn’t Wifi in Heaven!’ Negotiating Visibility on Facebook Memorial Pages.” Journal of Broadcasting & Electronic Media 56.3 (2012): 378–400.
Morozov, Evgeny. “The Perils of Perfection.” The New York Times 2 Mar. 2013. 4 Mar. 2013 ‹http://www.nytimes.com/2013/03/03/opinion/sunday/the-perils-of-perfection.html?pagewanted=all&_r=0›.
Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You. London: Viking, 2011.
Phillips, Whitney. “LOLing at Tragedy: Facebook Trolls, Memorial Pages and Resistance to Grief Online.” First Monday 16.12 (2011). 21 Apr. 2013 ‹http://firstmonday.org/ojs/index.php/fm/article/view/3168›.
Roosendaal, Arnold. “We Are All Connected to Facebook … by Facebook!” European Data Protection: In Good Health? Ed. Serge Gutwirth et al. Dordrecht: Springer, 2012. 3–19.
Rudin, Ken. “Actionable Analytics at Zynga: Leveraging Big Data to Make Online Games More Fun and Social.” San Diego, CA, 2010.
Vaidhyanathan, Siva. The Googlization of Everything. 1st ed. Berkeley: University of California Press, 2011.
Westlake, E.J. “Friend Me If You Facebook: Generation Y and Performative Surveillance.” TDR: The Drama Review 52.4 (2008): 21–40.
Zimmer, Michael. “The Externalities of Search 2.0: The Emerging Privacy Threats When the Drive for the Perfect Search Engine Meets Web 2.0.” First Monday 13.3 (2008). 21 Apr. 2013 ‹http://firstmonday.org/ojs/index.php/fm/article/view/2136/1944›.
Zimmer, Michael. “The Gaze of the Perfect Search Engine: Google as an Infrastructure of Dataveillance.” Web Search. Eds. Amanda Spink & Michael Zimmer. Berlin: Springer, 2008. 77–99.
APA, Harvard, Vancouver, ISO, and other styles
27

Bibangambah, Prossy, Linda C. Hemphill, Moses Acan, Alexander C. Tsai, Ruth N. Sentongo, June-Ho Kim, Isabelle T. Yang, Mark J. Siedner, and Samson Okello. "Prevalence and correlates of carotid plaque in a mixed HIV-serostatus cohort in Uganda." BMC Cardiovascular Disorders 21, no. 1 (December 2021). http://dx.doi.org/10.1186/s12872-021-02416-5.

Full text
Abstract:
Background: The extent to which the risk of atherosclerotic cardiovascular disease (ACVD) is increased among people living with HIV (PLWH) in sub-Saharan Africa remains unknown.
Setting: Cross-sectional analysis nested within the Ugandan Noncommunicable Diseases and Aging Cohort, including PLWH in rural Uganda over 40 years of age who had taken antiretroviral therapy (ART) for at least 3 years, and a population-based control group of HIV-uninfected, age- and sex-matched persons.
Methods: We conducted carotid ultrasonography and collected ACVD risk factor data. Our outcome of interest was carotid plaque, defined as > 1.5 mm thickness from the intima-lumen interface to the media-adventitia interface. We fit multivariable logistic regression models to estimate correlates of carotid plaque, including HIV-specific and traditional cardiovascular risk factors.
Results: We enrolled 155 (50.2%) PLWH and 154 (49.8%) HIV-uninfected comparators, with a mean age of 51.4 years. Among PLWH, the median CD4 count was 433 cells/mm3 and 97.4% were virologically suppressed. Carotid plaque prevalence was higher among PLWH (8.4% vs 3.3%). HIV infection (aOR 3.90; 95% CI 1.12–13.60) and current smoking (aOR 6.60; 95% CI 1.22–35.80) were associated with higher odds of carotid plaque, whereas moderate-intensity (aOR 0.13; 95% CI 0.01–1.55) and vigorous-intensity physical activity (aOR 0.34; 95% CI 0.07–1.52) were associated with decreased odds of carotid plaque.
Conclusion: In rural Uganda, PLWH have a higher prevalence of carotid plaque than age- and sex-matched HIV-uninfected comparators. Future work should explore how biomedical and lifestyle modifications might reduce atherosclerotic burden among PLWH in the region.
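For readers unfamiliar with how adjusted odds ratios (aOR) and confidence intervals like those reported above are typically produced, the following is a minimal sketch of the modelling step using simulated data. The study’s actual dataset, covariate list, and code are not public, so nothing here reproduces its analysis:

```python
# Sketch of a multivariable logistic regression reported as adjusted odds
# ratios (aOR) with 95% CIs. The data are simulated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "hiv_positive": rng.integers(0, 2, n),
    "current_smoker": rng.integers(0, 2, n),
    "age": rng.normal(52, 6, n),
})
# Simulate an outcome in which HIV and smoking raise the odds of plaque.
log_odds = -8 + 1.4 * df["hiv_positive"] + 1.9 * df["current_smoker"] + 0.08 * df["age"]
df["plaque"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

X = sm.add_constant(df[["hiv_positive", "current_smoker", "age"]])
fit = sm.Logit(df["plaque"], X).fit(disp=0)

# Exponentiated coefficients are the aORs; exponentiated bounds give 95% CIs.
ci = np.exp(fit.conf_int().to_numpy())
table = pd.DataFrame({"aOR": np.exp(fit.params), "2.5%": ci[:, 0], "97.5%": ci[:, 1]})
print(table.drop(index="const"))
```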
APA, Harvard, Vancouver, ISO, and other styles
28

Park, Albert, and Mike Conway. "Towards Tracking Opium Related Discussions in Social Media." Online Journal of Public Health Informatics 9, no. 1 (May 2, 2017). http://dx.doi.org/10.5210/ojphi.v9i1.7652.

Full text
Abstract:
Objective: We aim to develop an automated method to track opium related discussions made in the social media platform called Reddit. As a first step towards this goal, we use a keyword-based approach to track how often Reddit members discuss opium related issues.
Introduction: In recent years, the use of social media has increased at an unprecedented rate. For example, the popular social media platform Reddit (http://www.reddit.com) had 83 billion page views from over 88,000 active sub-communities (subreddits) in 2015. Members of Reddit made over 73 million individual posts and over 725 million associated comments in the same year [1]. We use Reddit to track opium related discussions because Reddit allows for throwaway and unidentifiable accounts that are suitable for stigmatized discussions that may not be appropriate for identifiable accounts. Reddit members exchange conversation via a forum-like platform, and members who have achieved a certain status within the community are able to create new topically focused groups called subreddits.
Methods: First, we use a dataset archived by one of Reddit's members, who used Reddit's official Application Programming Interface (API) to collect the data (https://www.reddit.com/r/datasets/comments/3bxlg7/i_have_every_publicly_available_reddit_comment/). The dataset is comprised of 239,772 (including both active and inactive) subreddits, 13,213,173 unique user IDs, 114,320,798 posts, and 1,659,361,605 associated comments made from Oct 2007 to May 2015. Second, we identify 10 terms that are associated with opium. The terms are ‘opium’, ‘opioid’, ‘morphine’, ‘opiate’, ‘hydrocodone’, ‘oxycodone’, ‘fentanyl’, ‘oxy’, ‘heroin’, ‘methadone’. Third, we preprocess the entire dataset, which includes structuring the data into monthly time frames, converting text to lower case, and stemming keywords and text. Fourth, we employed a dictionary approach to count and extract timestamps, user IDs, posts, and comments containing opium related terms. Fifth, we normalized the frequency count by dividing it by the overall number of the respective variable for that period.
Results: According to our dataset, Reddit members discuss opium related topics in social media. The normalized frequency count of posters shows that less than one percent of members, on average, talk about opium related topics (Figure 1). Although the community as a whole does not frequently talk about opium related issues, this still amounts to more than 10,000 members in 2015 (Figure 2). Moreover, members of Reddit created a number of subreddits, such as ‘oxycontin’, ‘opioid’, ‘heroin’, ‘oxycodon’, that explicitly focus on opioids.
Conclusions: We present preliminary findings on developing an automated method to track opium related discussions in Reddit. Our initial results suggest, on the basis of our analysis of Reddit, that members of the Reddit community discuss opium related issues in social media, although the discussions are contributed by a small fraction of the members. We provide several interesting directions for future work to better track opium related discussions in Reddit. First, the automated method needs to be further developed to employ more sophisticated methods like knowledge-based and corpus-based approaches to better extract opium related discussions. Second, the automated method needs to be thoroughly evaluated, measuring the precision, recall, accuracy, and F1-score of the system. Third, given how many members use social media to discuss these issues, it will be helpful to investigate the specifics of their discussions.
[Figure 1: Line graphs of normalized frequency counts for posters, comments, and posts that contained opium related terms.]
[Figure 2: Line graphs of raw frequency counts for posters, comments, and posts that contained opium related terms.]
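The counting step the authors describe is straightforward to sketch. The version below uses plain substring matching as a rough stand-in for their stemming step (so it will overmatch, e.g. ‘oxy’ inside unrelated words), and the record field names are assumed rather than taken from the archived dataset:

```python
# Sketch of the dictionary-based counting described above: bin Reddit posts
# by month, count those containing an opium-related term, and normalise by
# each month's total posts. The 'created_utc'/'body' field names are assumed.
from collections import Counter
from datetime import datetime

TERMS = ["opium", "opioid", "morphine", "opiate", "hydrocodone",
         "oxycodone", "fentanyl", "oxy", "heroin", "methadone"]

def month_key(timestamp: int) -> str:
    return datetime.utcfromtimestamp(timestamp).strftime("%Y-%m")

def track(posts) -> dict[str, float]:
    """posts: iterable of dicts with 'created_utc' and 'body' keys."""
    totals, matches = Counter(), Counter()
    for post in posts:
        month = month_key(post["created_utc"])
        totals[month] += 1
        text = post["body"].lower()
        if any(term in text for term in TERMS):
            matches[month] += 1
    # Normalised frequency: the share of that month's posts with a match.
    return {month: matches[month] / totals[month] for month in totals}

sample = [
    {"created_utc": 1430438400, "body": "Anyone tried quitting oxycodone?"},
    {"created_utc": 1430524800, "body": "Great game last night!"},
]
print(track(sample))  # {'2015-05': 0.5}
```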
APA, Harvard, Vancouver, ISO, and other styles
29

Stewart, Barbara E., and Robert EA Stewart. "The biology behind the counts: tooth development related to age estimation in beluga (Delphinapterus leucas)." NAMMCO Scientific Publications 8 (November 26, 2014). http://dx.doi.org/10.7557/3.3195.

Full text
Abstract:
The widely accepted method of determining the age of beluga is to count dentine growth layer groups (GLGs) in median, longitudinal sections of a tooth. To provide meaningful age estimates, it is essential to understand how these growth layers form and to consider the developmental factors that can confound their enumeration. Here we provide information on, and illustrate, the developmental biology of beluga teeth as it relates to interpreting GLGs. Key factors are: evaluating the presence and occlusal wear of fetal dentine; interpreting early-formed diagnostic features such as the neonatal line; assessing the last-formed growth layer adjacent to the pulp cavity; identifying the presence of nodes at the dentine-cementum interface to assist in counting GLGs; and recognizing pulp stones and accessory lines in the dentine which may hinder the age estimation process.
APA, Harvard, Vancouver, ISO, and other styles
30

Bahoush, Gholamreza, and Pourya Salajegheh. "The Outcome of Infants with Acute Lymphoblastic Leukemia Treated with Interfant-99 Protocol." Journal of Pharmaceutical Research International, January 30, 2021, 1–7. http://dx.doi.org/10.9734/jpri/2020/v32i4531085.

Full text
Abstract:
Objective: To evaluate the outcome of infants with acute lymphoblastic leukemia (ALL) treated with the Interfant-99 protocol.
Materials and Methods: In this retrospective analytical study, all newly diagnosed infants with ALL who were treated with the Interfant-99 protocol from 2004 to 2014 in Ali-Asghar Children's Hospital in Tehran were included. Demographic data including age at diagnosis, sex, initial WBC, Hb and platelet count, flow cytometric diagnosis, cytogenetic findings, follow-up duration, and outcome were extracted from patients' medical records. All the above data were analysed with SPSS 23 software.
Results: Eleven infants with ALL (5 girls and 6 boys) were included in the study. The mean and median ages at diagnosis of all enrolled patients were 7.20 months (std. deviation = 1.78; range = 3.57–9.37) and 7.90 months, respectively. Five of the 11 patients had t(4;11), and all of them had pro-B ALL. The mean initial WBC in patients with this translocation was higher than in the others (193,400 vs. 49,166), and this difference was statistically significant (P = 0.004) despite the small number of patients under study. None of the patients had CNS involvement or a mediastinal mass at diagnosis. Three patients relapsed, two of whom had isolated CNS relapse. Of the three, one recovered completely as chemotherapy continued, another suffered a bone marrow relapse and eventually died, and the third also suffered a bone marrow relapse and died about 10 months after relapse. The median follow-up of all patients was 53.83 months. The estimated 5-yr overall survival of patients was 68.60% ± 15.10, and their estimated 5-yr event-free survival was 21.20% ± 45.70. Infection was the most common complication during treatment and was manageable.
Conclusion: The Interfant-99 protocol appeared to improve the outcome of infants with ALL, even those with t(4;11), with manageable complications. However, its implementation in developing countries is difficult due to the small number of rooms suitable for intensive chemotherapy and the drug doses that need to be modified. It is worth noting that proving this requires a comprehensive prospective study with an appropriate sample size.
APA, Harvard, Vancouver, ISO, and other styles
31

Parray, Umer Yousuf, Aasif Mohammad Khan, Aasif Ahmad Mir, and Shahid Maqbool Mir. "Unveiling the present status of open access repositories: a comparative analysis of India and China." Library Management, January 10, 2023. http://dx.doi.org/10.1108/lm-09-2022-0084.

Full text
Abstract:
Purpose: An open access repository is an essential element of an organization's strategy for enhancing the visibility and accessibility of its intellectual output to a global audience. Owing to its importance, the study aims to explore the current status of open access repositories in India and China by analyzing the different characteristic features of the repositories.
Design/methodology/approach: The data for the study were collected from OpenDOAR, which is labeled as a quality-assured repository directory across the globe. The country-wise contribution of Asian repositories was extracted from OpenDOAR using the various filtration options available in the directory. Further, the URL of every Indian and Chinese repository was manually accessed to gather the following metadata: repository type, software usage, repository interface language, year of development, subject coverage, content coverage, and the utilization of Web 2.0 tools by repositories.
Findings: The findings of the study highlight that, among the Asian countries, India ranks 4th while China ranks 5th in terms of repository count. The study depicts that India has shown more promising growth than China. However, both countries have mainly focused on institutional repositories, while disciplinary, aggregated, and governmental repositories are very few in number; building such repositories is therefore the need of the hour. DSpace as the preferred software and English as the dominant interface language occupy the prominent places in the repositories of both countries. Moreover, the repositories of both countries have embraced Web 2.0 tools like RSS 1.0, RSS 2.0 and Atom, with little presence of social media tools.
Research limitations/implications: The study has limitations, and results should be interpreted with caution. The comparison between the two countries is based on only one data source, i.e. OpenDOAR. Future studies could draw on various repository directories as data sources to give a clearer picture of the comparison.
Originality/value: The study can be beneficial to the policymakers and administrators of these two regions, as it provides a vivid picture of the different characteristic features of their repositories so that they can formulate better policies to foster green open access.
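Once this kind of repository metadata has been gathered, the country-wise tallies that studies like this report reduce to simple grouping operations. A minimal sketch with invented rows and column names (OpenDOAR’s actual export format differs):

```python
# Sketch of tallying repository characteristics after metadata collection.
# Rows and column names are invented; OpenDOAR's real export differs.
import pandas as pd

repos = pd.DataFrame([
    {"country": "India", "software": "DSpace",  "type": "institutional"},
    {"country": "India", "software": "EPrints", "type": "institutional"},
    {"country": "China", "software": "DSpace",  "type": "disciplinary"},
    {"country": "China", "software": "DSpace",  "type": "institutional"},
])

# Repository count per country, and the software breakdown per country.
print(repos.groupby("country").size().rename("repository_count"))
print(repos.groupby(["country", "software"]).size().unstack(fill_value=0))
```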
APA, Harvard, Vancouver, ISO, and other styles
32

Conti, Olivia. "Disciplining the Vernacular: Fair Use, YouTube, and Remixer Agency." M/C Journal 16, no. 4 (August 11, 2013). http://dx.doi.org/10.5204/mcj.685.

Full text
Abstract:
Introduction

The research from which this piece derives explores political remix video (PRV), a genre in which remixers critique dominant discourses and power structures through guerrilla remixing of copyrighted footage (“What Is Political Remix Video?”). Specifically, I examined the works of political video remixer Elisa Kreisinger, whose queer remixes of shows such as Sex and the City and Mad Men received considerable attention from 2010 to the present. As a rhetoric scholar, I am attracted not only to the ways that remix functions discursively but also to the ways in which remixers are constrained in their ability to argue, and to what recourse they have in these situations of legal and technological constraint. Ultimately, many of these struggles play out on YouTube. This is unsurprising: many studies of YouTube and other user-generated content (UGC) platforms focus on the fact that commercial sites cannot constitute utopian, democratic, or free environments (Hilderbrand; Hess; Van Dijck). However, I find that, contrary to popular belief, YouTube’s commercial interests are not the primary factor limiting remixer agency. Rather, United States copyright law as enacted on YouTube has the most potential to inhibit remixers. This has led many remixers to become advocates for fair use, the provision in the Copyright Act of 1976 that allows for limited use of copyrighted content. With this in mind, I decided to delve more deeply into the framing of fair use by remixers and other advocates such as the Electronic Frontier Foundation (EFF) and the Center for Social Media. In studying discourses of fair use as they play out in the remix community, I find that the framing of fair use bears a striking similarity to what rhetoric scholars have termed vernacular discourse—a discourse emanating from a small segment of the larger civic community (Ono and Sloop 23). The vernacular is often framed as that which integrates the institutional or mainstream while simultaneously asserting its difference through appropriation and subversion. A video qualifies as fair use if it juxtaposes source material in a new way for the purposes of critique. In turn, a vernacular text asserts its “vernacularity” by taking up parts of pre-existing dominant institutional discourses in a way that resonates with a smaller community. My argument is that this tension between institutional and vernacular gives political remix video a multivalent argument—one that presents itself both in the text of the video itself and in the video’s status as a fair use of copyrighted material. Just as fair use represents the assertion of creator agency against unfair copyright law, vernacular discourse represents the assertion of a localised community within a world dominated by institutional discourses. In this way, remixers engage rights holders and other institutions in a pleasurable game of cat and mouse, a struggle to expose the boundaries of draconian copyright law.

YouTube’s Commercial Interests

YouTube’s commercial interests operate at a level potentially invisible to the casual user. While users provide YouTube with content, they also provide the site with data—both metadata culled from their navigations of the site (page views, IP addresses) and member-provided data (such as real name and e-mail address). YouTube mines this data for a number of purposes—anything from interface optimisation to targeted advertising via Google’s AdSense.
Users also perform a certain degree of labour to keep the site running smoothly, such as reporting videos that violate the Terms of Service, giving videos the thumbs up or thumbs down, and reporting spam comments. As such, users involved in YouTube’s participatory culture are also necessarily involved in the site’s commercial interests. While there are legitimate concerns regarding the privacy of personal information, especially after Google introduced policies in 2012 to facilitate a greater flow of information across all of their subsidiaries, it does not seem that this has diminished YouTube’s popularity (“Google: Privacy Policy”).

Despite this, some make the argument that users provide the true benefit of UGC platforms like YouTube, yet reap few rewards, creating an exploitative dynamic (Van Dijck, 46). Two assumptions seem to underpin this argument: the first is that users do not desire to help these platforms prosper, the second is that users expect to profit from their efforts on the website. In response to these arguments, it’s worth calling attention to scholars who have used alternative economic models to account for user-platform coexistence. This is something that Henry Jenkins addresses in his recent book Spreadable Media, largely by focusing on assigning alternate sorts of value to user and fan labour—either the cultural worth of the gift, or the satisfaction of a job well done common to pre-industrial craftsmanship (61). However, there are still questions of how to account for participatory spaces in which labours of love coexist with massively profitable products. In service of this point, Jenkins calls up Lessig, who posits that many online networks operate as hybrid economies, which combine commercial and sharing economies. In a commercial economy, profit is the primary consideration, while a sharing economy is composed of participants who are there because they enjoy doing the work without any expectation of compensation (176). The strict separation between the two economies is, in Lessig’s estimation, essential to the hybrid economy’s success. While it would be difficult to incorporate these two economies together once each had been established, platforms like YouTube have always operated under the hybrid principle. YouTube’s users provide the site with its true value (through their uploading of content, provision of metadata, and use of the site), yet users do not come to YouTube with these tasks in mind—they come to YouTube because it provides an easy-to-use platform by which to share amateur creativity, and a community with whom to interact. Additionally, YouTube serves as the primary venue where remixers can achieve visibility and viral status—something Elisa Kreisinger acknowledged in our interviews (2012). However, users who are not concerned with broad visibility as much as with speaking to particular viewers may leave YouTube if they feel that the venue does not suit their content. Some feminist fan vidders, for instance, have withdrawn from YouTube due to what they perceived as a community who didn’t understand their work (Kreisinger, 2012). Additionally, Kreisinger ended up garnering many more views of her Queer Men remix on Vimeo due simply to the fact that the remix’s initial upload was blocked via YouTube’s Content ID feature. By the time Kreisinger had argued her case with YouTube, the Vimeo link had become the first stop for those viewing and sharing the remix, which received 72,000 views to date (“Queer Men”).
Fair Use, Copyright, and Content ID

This instance points to the challenge that remixers face when dealing with copyright on YouTube, a site whose processes are not designed to accommodate fair use. Specifically, Title II, Section 512 of the DMCA (the Digital Millennium Copyright Act, passed in 1998) states that certain websites may qualify as “safe harbours” for copyright infringement if users upload the majority of the content to the site, or if the site is an information location service. These sites are insulated from copyright liability as long as they cooperate to some extent with rights holders. A common objection to Section 512 is that it requires media rights holders to police safe harbours in search of infringing content, rather than placing the onus on the platform provider (Meyers 939). In order to cooperate with Section 512 and rights holders, YouTube initiated the Content ID system in 2007. This system offers rights holders the ability to find and manage their content on the site by creating archives of footage against which user uploads are checked, allowing rights holders to automatically block, track, or monetise uses of their content (it is also worth noting that rights holders can make these responses country-specific) (“How Content ID Works”). At the current time, YouTube has over 15 million reference files against which it checks uploads (“Statistics – YouTube”). Thus, it’s fairly common for uploaded work to get flagged as a violation, especially when that work is a remix of popular institutional footage. If an upload is flagged by the Content ID system, the user can dispute the match, at which point the rights holder has the opportunity to either allow the video through, or to issue a DMCA takedown notice. They can also sue at any point during this process (“A Guide to YouTube Removals”). Content ID matches are relatively easy to dispute and do not generally require legal intervention. However, disputing these automatic takedowns requires users to be aware of their rights to fair use, and requires rights holders to acknowledge a fair use (“YouTube Removals”). This is only compounded by the fact that fair use is not a clearly defined right, but rather a vague provision relying on a balance between four factors: the purpose of the use, character of the work, the amount used, and the effect on the market value of the original (“US Copyright Office–Fair Use”). As Aufderheide and Jaszi observed in 2008, the rejection of videos for Content ID matches combined with the vagaries of fair use has a chilling effect on user-generated content.

Rights Holders versus Remixers

Rights holders’ objections to Section 512 illustrate the ruling power dynamic in current intellectual property disputes: power rests with institutional rights-holding bodies (the RIAA, the MPAA) who assert their dominance over DMCA safe harbours such as YouTube (who must cooperate to stay in business) who, in turn, exert power over remixers (the lowest on the food chain, so to speak). Beyond the observed chilling effect of Content ID, remix on YouTube is shot through with discursive struggle between these rights-holding bodies and remixers attempting to express themselves and reach new communities. However, this has led political video remixers to become especially vocal when arguing for their uses of content. For instance, in the spring of 2009, Elisa Kreisinger curated a show entitled “REMOVED: The Politics of Remix Culture” in which blocked remixes screened alongside the remixers’ correspondence with YouTube.
Kreisinger writes that each of these exchanges illustrates the dynamic between rights holders and remixers: “Your video is no longer available because FOX [or another rights-holding body] has chosen to block it” (“Remixed/Removed”). Additionally, as Jenkins notes, even Content ID on YouTube is only made available to the largest rights holders—smaller companies must still go through an official DMCA takedown process to report infringement (Spreadable 51). In sum, though recent technological developments may give the appearance of democratising access to content, when it comes to policing UGC, technology has made it easier for the largest rights holders to stifle the creation of content.

Additionally, it has been established that rights holders do occasionally use takedowns abusively, and recent court cases—specifically Lenz v. Universal Music Corp.—have established the need for rights holders to assess fair use in order to make a “good faith” assertion that users intend to infringe copyright prior to issuing a takedown notice. However, as Joseph M. Miller notes, the ruling fails to rebalance the burdens and incentives between rights holders and users (1723). This means that while rights holders are supposed to take fair use into account prior to issuing takedowns, there is no process in place that either effectively punishes rights holders who abuse copyright, or allows users to defend themselves without the possibility of massive financial loss (1726). As such, the system currently in place does not disallow or discourage features like Content ID, though cases like Lenz v. Universal indicate a push towards rebalancing the burden of determining fair use. In an effort to turn the tables, many have begun arguing for users’ rights and attempting to parse fair use for the layperson. The Electronic Frontier Foundation (EFF), for instance, has espoused an “environmental rhetoric” of fair use, casting intellectual property as a resource for users (Postigo 1020). Additionally, it has created practical guidelines for UGC creators dealing with DMCA takedowns and Content ID matches on YouTube. The Center for Social Media has also produced a number of fair use guides tailored to different use cases, one of which targeted online video producers. All of these efforts have a common goal: to educate content creators about the fair use of copyrighted content, and then to assert their use as fair in opposition to large rights-holding institutions (though they caution users against unfair uses of content or making risky legal moves that could lead to lawsuits). In relation to remix specifically, this means that remixers must differentiate themselves from institutional, commercial content producers, standing up both for the argument contained in their remix and for their fair use of copyrighted content.

In their “Code of Best Practices for Fair Use in Online Video”, the Center for Social Media note that an online video qualifies as a fair use if (among other things) it critiques copyrighted material and if it “recombines elements to make a new work that depends for its meaning on (often unlikely) relationships between the elements” (8). These two qualities are also two of the defining qualities of political remix video. For instance, they write that work meets the second criterion if it creates “new meaning by juxtaposition”, noting that in these cases “the recombinant new work has a cultural identity of its own and addresses an audience different from those for which its components were intended” (9).
Remixes that use elements of familiar sources in unlikely combinations, such as those made by Elisa Kreisinger, generally seek to reach an audience who are familiar with the source content but who also object to it. Sex and the City, for instance, while it initially seemed willing to take on previously “taboo” topics in its exploration of dating in Manhattan, ended with each of the heterosexual characters paired with an opposite-sex partner, and forays from this heteronormative narrative were contained either within one-off episodes or within tokenised gay characters. For this reason, Kreisinger noted that the intended audience for Queer Carrie was the queer and feminist viewers of Sex and the City who felt that the show was overly normative and exclusionary (Kreisinger, Art:21). As a result, the target audience of these remixes is different from the target audience of the source material—though the full nuance of the argument is best understood by those familiar with the source. Thus, the remix affirms the segment of the viewing community who saw only tokenised representations of their identity in the source text, and in so doing offers a critique of the original’s heteronormative focus.

Fair Use and the Vernacular

Vernacular discourse, as broadly defined by Kent A. Ono and John M. Sloop, refers to discourses that “emerge from discussions between members of self-identified smaller communities within the larger civic community.” It operates partially through appropriating dominant discourses in ways better suited to the vernacular community, through practices of pastiche and cultural syncretism (23). In an effort to better describe the intricacies of this type of discourse, Robert Glenn Howard theorised a hybrid “dialectical vernacular” that oscillates between institutional and vernacular discourse. This hybridity arises from the fact that the institutional and the vernacular are fundamentally inseparable, the vernacular establishing its meaning by asserting itself against the institutional (Howard, Toward 331). When put into use online, this notion of a “dialectical vernacular” is particularly interesting as it refers not only to the content of vernacular messages but also to their means of production. Howard notes that discourse embodying the dialectical vernacular is by nature secondary to institutional discourse, and that the institutional must be clearly “structurally prior” (Howard, Vernacular 499). With this in mind, it is unsurprising that political remix video—which asserts its secondary nature by calling upon pre-existing copyrighted content while simultaneously reaching out to smaller segments of the civic community—would qualify as a vernacular discourse.

The notion of an institutional source’s structural prevalence also echoes throughout work on remix, both in practical guides such as the Center for Social Media’s “Best Practices” and in more theoretical takes on remix, like Eduardo Navas’ essay “Turbulence: Remixes + Bonus Beats,” in which he writes that: In brief, the remix when extended as a cultural practice is a second mix of something pre-existent; the material that is mixed for a second time must be recognized, otherwise it could be misunderstood as something new, and it would become plagiarism […] Without a history, the remix cannot be Remix. An elegant theoretical concept, this becomes muddier when considered in light of copyright law.
If the history of remix is what gives it its meaning—the source text from which it is derived—then it is this same history that makes a fair use remix vulnerable to DMCA takedowns and other forms of discipline on YouTube. However, as per the criteria outlined by the Center for Social Media, it is also from this ironic juxtaposition of institutional sources that the remix object establishes its meaning, and thus its vernacularity. In this sense, the force of a political remix video’s argument is in many ways dependent on its status as an object in peril: vulnerable to the force of a law that has not yet swung in its favor, yet subversive nonetheless.

With this in mind, YouTube and other UGC platforms represent a fraught layer of mediation between institutional and vernacular. As a site for the sharing of amateur video, YouTube has the potential to affirm small communities as users share similar videos, follow one particular channel together, or comment on videos posted by people in their networks. However, YouTube’s interface (rife with advertisements, constantly reminding users of its affiliation with Google) and cooperation with rights holders establish it as an institutional space. As such, remixes on the site are already imbued with the characteristic hybridity of the dialectical vernacular. This is especially true when the remixers (as in the case of PRV) have made the conscious choice to advocate for fair use at the same time that they distribute remixes dealing with other themes and resonating with other communities.

Conclusion

Political remix video sits at a fruitful juncture with regard to copyright as well as vernacularity. Like almost all remix, it makes its meaning through juxtaposing sources in a unique way, calling upon viewers to think about familiar texts in a new light. This creation invokes a new audience—a quality that makes it both vernacular and a fair use of content. Given that PRV is defined by the “guerrilla” use of copyrighted footage, it has the potential to stand as a political statement outside of the thematic content of the remix simply due to the nature of its composition. This gives PRV tremendous potential for multivalent argument, as a video can simultaneously represent a marginalised community while advocating for copyright reform. This is only reinforced by the fact that many political video remixers have become vocal in advocating for fair use, asserting the strength of their community and their common goal.

In addition to this argumentative richness, PRV’s relation to fair use and vernacularity exposes the complexity of the remix form: it continually oscillates between institutional affiliations and smaller vernacular communities. However, the hybridity of these remixes produces tension, much of which manifests on YouTube, where videos are easily responded to and challenged by both institutional and vernacular authorities. In addition, a tension exists in the remix text itself between the source and the new, remixed message. Further research should attend to these areas of tension, while also exploring the tenacity of the remix community and their ability to advocate for themselves while circumventing copyright law.

References

“About Political Remix Video.” Political Remix Video. 15 Feb. 2012 ‹http://www.politicalremixvideo.com/what-is-political-remix/›.
Aufderheide, Patricia, and Peter Jaszi. Reclaiming Fair Use: How to Put Balance Back in Copyright. Chicago: U of Chicago P, 2008. Kindle.
“Code of Best Practices for Fair Use in Online Video.” The Center for Social Media, 2008.
Van Dijck, José. “Users like You? Theorizing Agency in User-Generated Content.” Media, Culture & Society 31 (2009): 41–58.
“A Guide to YouTube Removals.” The Electronic Frontier Foundation. 15 June 2013 ‹https://www.eff.org/issues/intellectual-property/guide-to-YouTube-removals›.
Hilderbrand, Lucas. “YouTube: Where Cultural Memory and Copyright Converge.” Film Quarterly 61.1 (2007): 48–57.
Howard, Robert Glenn. “The Vernacular Web of Participatory Media.” Critical Studies in Media Communication 25.5 (2008): 490–513.
Howard, Robert Glenn. “Toward a Theory of the World Wide Web Vernacular: The Case for Pet Cloning.” Journal of Folklore Research 42.3 (2005): 323–60.
“How Content ID Works.” YouTube. 21 June 2013 ‹https://support.google.com/youtube/answer/2797370?hl=en›.
Jenkins, Henry, Sam Ford, and Joshua Green. Spreadable Media: Creating Value and Meaning in a Networked Culture. New York: New York UP, 2013.
Jenkins, Henry. Convergence Culture: Where Old and New Media Collide. New York: New York UP, 2006.
Kreisinger, Elisa. Interview with Nick Briz. Art:21. Art:21, 30 June 2011. 21 June 2013.
Kreisinger, Elisa. “Queer Video Remix and LGBTQ Online Communities.” Transformative Works and Cultures 9 (2012). 19 June 2013 ‹http://journal.transformativeworks.org/index.php/twc/article/view/395/264›.
Kreisinger, Elisa. Pop Culture Pirate. ‹http://www.popculturepirate.com/›.
Lessig, Lawrence. Remix: Making Art and Commerce Thrive in the Hybrid Economy. New York: Penguin Books, 2008. PDF.
Meyers, B.G. “Filtering Systems or Fair Use? A Comparative Analysis of Proposed Regulations for User-Generated Content.” Cardozo Arts & Entertainment Law Journal 26.3: 935–56.
Miller, Joseph M. “Fair Use through the Lenz of § 512(c) of the DMCA: A Preemptive Defense to a Premature Remedy?” Iowa Law Review 95 (2009–2010): 1697–1729.
Navas, Eduardo. “Turbulence: Remixes + Bonus Beats.” New Media Fix 1 Feb. 2007. 10 June 2013 ‹http://newmediafix.net/Turbulence07/Navas_EN.html›.
Ono, Kent A., and John M. Sloop. Shifting Borders: Rhetoric, Immigration and California’s Proposition 187. Philadelphia: Temple UP, 2002.
“Privacy Policy – Policies & Principles.” Google. 19 June 2013 ‹http://www.google.com/policies/privacy/›.
Postigo, Hector. “Capturing Fair Use for the YouTube Generation: The Digital Rights Movement, the Electronic Frontier Foundation, and the User-Centered Framing of Fair Use.” Information, Communication & Society 11.7 (2008): 1008–27.
“Statistics – YouTube.” YouTube. 21 June 2013 ‹http://www.youtube.com/yt/press/statistics.html›.
“US Copyright Office: Fair Use.” U.S. Copyright Office. 19 June 2013 ‹http://www.copyright.gov/fls/fl102.html›.
“YouTube Help.” YouTube FAQ. 19 June 2013 ‹http://support.google.com/youtube/?hl=en&topic=2676339&rd=2›.
APA, Harvard, Vancouver, ISO, and other styles
33

Youngquist, Scott T., Jonathan Poulson, Daniel Ashwood, Srilakshmi Sangaraju, Hill Stoecklein, Greg Scott, Jeff Clawson, and Christopher H. Olola. "Abstract 292: Accuracy of a Dispatch-Directed Breathing Verification Diagnostic Tool as Assessed by Data From Electronic Patient Care Records and Emergency Dispatch Systems." Circulation 140, Suppl_2 (November 19, 2019). http://dx.doi.org/10.1161/circ.140.suppl_2.292.

Full text
Abstract:
Introduction: Identification of sudden out-of-hospital cardiac arrest (OHCA) and delivery of bystander, emergency medical dispatcher (EMD)-directed pre-arrival instructions are key elements in the chain of survival. In order to provide instructions, dispatchers must formulate a clinical impression (rather than a definitive diagnosis) after an abbreviated history and dispatcher-directed examination of the scene. Thus, fast and accurate identification tools are imperative in the treatment of OHCA.
Hypothesis: A dispatch-directed breathing verification diagnostic tool (BVDxT) accurately identifies inadequate breathing from OHCA.
Methods: This was a retrospective study using EMD data matched to electronic patient care records (ePCRs) from the Salt Lake City Fire Department. The BVDxT, an embedded interface within the dispatch system, allows dispatchers to count breaths as reported by callers and record the rate of 4 consecutive patient breaths. OHCA was defined as non-trauma cases that had a paramedic primary impression of cardiac arrest, respiratory arrest, or overdose, or a Glasgow Coma Score of 3.
Results: A total of 45,007 emergency dispatch cases were matched with the paramedic impressions of 38,258 ePCRs. Of the matched cases, 2,660 included some use of the BVDxT; 1,365 (51.3%) had used the BVDxT to completion (i.e. the breathing rate was recorded during the call) and, of those, 1,248 (91.4%) had complete on-scene paramedic primary or secondary impressions to match to the BVDxT outcomes. Median time using the tool was 28 seconds (IQR: 21–39). Overall, the BVDxT identified 68.6% (n = 856/1,248) of callers as reporting disordered breathing, and paramedics recorded 16.4% (205/1,248) cases of OHCA. The BVDxT demonstrated 70.7% (145/205) sensitivity and 31.8% specificity for OHCA.
Conclusions: Preliminary evidence suggests the BVDxT does well in predicting abnormal breathing conditions. Although, by design, emergency dispatch tools tend to have high sensitivity, the low specificity of the tool may over-triage the number of abnormal breathing issues in the field, which may result in a substantial proportion of possible false positives. Changes in patient condition between call receipt and EMS arrival may also impact these findings.
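As a quick check, the reported sensitivity and specificity can be re-derived from the four counts the abstract gives (1,248 matched cases, 856 flagged as disordered breathing, 205 OHCA, 145 true positives); the false-positive and true-negative cells below follow arithmetically from those figures:

```python
# Re-deriving the reported 70.7% sensitivity and 31.8% specificity from the
# published counts. FP and TN are implied by the four stated numbers.
total, flagged, ohca, true_pos = 1248, 856, 205, 145

false_neg = ohca - true_pos             # OHCA the tool missed: 60
false_pos = flagged - true_pos          # flagged but not OHCA: 711
true_neg = (total - ohca) - false_pos   # neither flagged nor OHCA: 332

sensitivity = true_pos / (true_pos + false_neg)
specificity = true_neg / (true_neg + false_pos)
print(f"sensitivity = {sensitivity:.1%}")  # 70.7%
print(f"specificity = {specificity:.1%}")  # 31.8%
```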
APA, Harvard, Vancouver, ISO, and other styles
34

Ellison, Elizabeth. "The #AustralianBeachspace Project: Examining Opportunities for Research Dissemination Using Instagram." M/C Journal 20, no. 4 (August 16, 2017). http://dx.doi.org/10.5204/mcj.1251.

Full text
Abstract:
Introduction

In late 2016, I undertook a short-term, three-month project to share some of my research through my Instagram account using the categorising hashtag #AustralianBeachspace. Much of this work emerged from my PhD thesis, which is being published in journal articles but has yet to be published in any accessible or overarching way. I wanted to experiment with the process of using a visual social media tool for research dissemination. I felt that Instagram’s ability to combine text and image allowed for an aesthetically interesting way to curate this particular research project. My research is concerned with representations of the Australian beach, and thus the visual, image-based focus of Instagram seemed ideal. In this article, I briefly examine some of the existing research around academic practices of research dissemination, social media use, and the emerging research around Instagram itself. I then examine my own experience of using Instagram as a tool for curated, aesthetically driven research dissemination, and reflect on whether this use of Instagram is effective for representing and disseminating research.

Research Dissemination

Researchers, especially those backed by public funding, are always bound by the necessity of sharing their findings and transferring the knowledge gained during the research process. Research metrics are linked to workload allocations and promotion pathways for university researchers, providing clear motivation to maintain an active research presence. For most academics, the traditional research dissemination strategies involve academic publications: peer-reviewed scholarly books and journal articles. For academics working within a higher education policy climate that centres on measuring impact and engagement, peer-reviewed publications remain the gold standard. There are indicators, however, that research dissemination strategies may need to include methods for targeting non-academic outputs. Gunn and Mintrom (21), in their recent research, “anticipate that governments will increasingly question the value of publicly funded research and seek to evaluate research impact”. And this process, they argue, is not without challenges. Education Minister Simon Birmingham supports their claim by suggesting the Turnbull Government is looking for more meaningful ways of evaluating the value of higher education research outcomes, “rather than only allocating funding to researchers who spend their time trying to get published in journals” (para 5). It therefore makes sense that academics are investigating ways of using social media to broaden their research dissemination, despite the fact that social media metrics do not yet count towards traditional citations within the university sector.

Research Dissemination via Social Media

There has been an established practice of researchers using social media, especially blogging (Kirkup) and Twitter, as ways of sharing information about their current projects, their findings, their most recent publications, or to connect with colleagues. Gruzd, Staves, and Wilk (2348) investigated social media use by academics, suggesting “scholars are turning to social media tools professionally because they are more convenient for making new connections with peers, collaboration, and research dissemination”.
It is possible to see social media functioning as a new way of representing research—playing an important role in the shaping and developing of ideas, sharing those ideas, and functioning as a dissemination tool after the research has concluded. To provide context for the use of social media in research, this section briefly covers blogging and Twitter, two methods considered somewhat separated from university frameworks, and also professional platforms, such as Academia.edu and The Conversation.

Perhaps the tool that has the most history in providing another avenue for academics to share their work is academic blogging. Blogging is considered an avenue that allows for discussion of topics prior to publication (Bukvova, 4; Powell, Jacob, and Chapman, 273), and often uses a more conversational tone than academic publishing. It provides an opportunity to share research in long form to an open, online audience. Academic blogs have also become significant parts of online academic communities, such as the highly successful blog The Thesis Whisperer, targeted at research students. However, many researchers in this space note the stigma attached to blogging (and other forms of social media) as useless or trivial; for instance, in Gruzd, Staves, and Wilk’s survey of academic users of social media, an overwhelming majority of respondents suggested that institutions do not recognise these activities (2343). Because blogging is not counted in publication metrics, it is possible to dismiss this type of activity as unnecessary.

Twitter has garnered attention within the academic context because of its proliferation in conference engagement and the linking citation practices of scholars (Mahrt, Weller, and Peters, 401–406). Twitter’s platform lends itself to being a place to share citations of recently published material and a way of connecting with academic peers in an informal, yet meaningful way. Veletsianos has undertaken an analysis of academic Twitter practices, and there is a rise in popularity of “Tweetable Abstracts” (Else), or the practice of refining academic abstracts into a shareable Tweet format. According to Powell, Jacob, and Chapman (272), new media (including both Twitter and the academic blog) offer opportunities to engage with an increasingly Internet-literate society in a way that is perhaps more meaningful and certainly more accessible than traditional academic journals. Like blogging, the use of Twitter within the active research phase and pre-publication means the platform can both represent and disseminate new ideas and research findings.

Both academic blogs and Twitter are widely accessible and can be read by Internet users beyond academia. It appears likely, however, that many blogs and academic Twitter profiles are still accessed and consumed primarily by academic audiences. This is more obvious in the increasingly popular academic-specific social media platforms such as ResearchGate or Academia.edu. These websites are providing more targeted, niche communication and sharing channels for scholars working in higher education globally, and their use appears to be regularly encouraged by institutions. These sites attempt to mediate between open access and copyright in academic publishing, encouraging users to upload full-text documents of their publications as a means of generating more attention and citations (Academia.edu cites Niyazov et al.’s study, which suggests articles posted to the site had improved citation counts).
ResearchGate and Academia.edu function primarily as article repositories, albeit with added social networking opportunities that differentiate them from more traditional university repositories.

In comparison, the success of the online platform The Conversation, with its tagline “Academic rigour, journalistic flair”, shows the growing enthusiasm for, and importance of, engaging with more public-facing outlets to share forms of academic writing. Many researchers are using The Conversation as a way of sharing their research findings through more accessible, shorter articles designed for the general public; these articles regularly link to the traditional academic publications as well.

Research dissemination, and how the uptake of online social networks is changing individual and institution-wide practices, is a continually expanding area of research. It is apparent that while The Conversation has been widely accepted and utilised as a tool of research dissemination, there is still some uncertainty about using social media for representing or disseminating findings and ideas because of the lack of impact metrics. This is perhaps even more notable in regards to Instagram, a platform that has received comparatively little discussion in academic research more broadly.

Instagram as Social Media

Instagram is a photo sharing application that launched in 2010 and has seen significant uptake by users in that time, reaching 700 million monthly active users as of April 2017 (Instagram “700 Million”). Recent additions to the service, such as the “Snapchat clone” Instagram Stories, appear to have helped boost growth (Constine, para 4). Instagram, then, is a major player in the social media user market, and the emergence of academic research into the platform reflects this. Early investigations include Manikonda, Hu and Kambhampati’s analysis of the social networks, demographics, and activities of users, in which they identified some clear differences in usage compared to Flickr (another photo-sharing network) and Twitter (5). Hochman and Manovich and Hochman and Schwartz examined what information visualisations generated from Instagram images can reveal about the “visual rhythms” of geographical locations such as New York City.

To provide context for the use of Instagram as a way of disseminating research through a more curated, visual approach, this section will examine professional uses of Instagram, the role of Influencers, and some of the functionalities of the platform.

Instagram is now a platform that caters for both personal and professional accounts. The user interface allows for a streamlined and easily navigable process from taking a photo, adding filters or effects, and sharing the photo instantly. The platform has developed to include web-based access to complement the mobile application, and has also introduced Instagram Business accounts, which provide “real-time metrics”, “insights into your followers”, and the ability to “add information about your company” (Instagram “Instagram Business”). This also comes with the option to pay for advertisements. Despite its name, many users of Instagram, especially those with profiles that are professional or business orientated, do not only produce instant content. While the features of Instagram, such as geotagging, timestamping, and the ability to use the camera from within the app, lend themselves to users capturing their everyday experience in the moment, more and more content is becoming carefully curated.
As such, some accounts are blurring the line between personal and professional, becoming what Crystal Abidin calls Influencers, identifying the practice as when microcelebrities are able to use the “textual and visual narration of their personal, everyday lives” to generate paid advertorials (86). One effect of this, as Abidin investigates in the context of Singapore and the #OOTD (Outfit of the Day) hashtag, is the way “everyday Instagram users are beginning to model themselves after Influencers” and therefore generate advertising content “that is not only encouraged by Influencers and brands but also publicly utilised without remuneration” (87). Instagram, then, can be a very powerful platform for businesses to reach wide audiences, and the flexibility of caption length and visual content provides a type of viral curation practice, as in the case of the #OOTD hashtag following.

Considering the focus of my #AustralianBeachspace project on Australian beaches, many of the Instagram accounts and hashtags I encountered and engaged with were tourism related. Although this will be discussed in more detail below, it is worth noting that individual Influencers exist in these fields as well and often provide advertorial content for companies like accommodation chains or related products. One example is user @katgaskin, an Influencer who takes photos, features in photos, and provides “organic” adverts for products and services (see image).

[Figure 1: An example from user @katgaskin’s Instagram profile that includes a mention of a product. Image sourced from @katgaskin, uploaded 2 June 2017.]

@katgaskin’s profile’s aesthetic identity is, as such, linked with the ocean and the beach. Although her physical location regularly changes (her profile includes images from, for example, Nicaragua, Australia, and the United States), the thematic link is geographical. And research suggests the visual focus of Instagram lends itself to place-based content. As Hochman and Manovich state: While Instagram eliminates static timestamps, its interface strongly emphasizes physical place and users’ locations. The application gives a user the option to publicly share a photo’s location in two ways. Users can tag a photo to a specific venue, and then view all other photos that were taken and tagged there. If users do not choose to tag a photo to a venue, they can publically share their photos’ location information on a personal ‘photo-map’, displaying all photos on a zoomable word map. (para 14)

This means that the use of place in the app is anchored to the visual content, not the uploader’s location. While it is possible to consider that Instagram’s intention was to anchor the content and the uploader’s location together (as in the study conducted by Weilenmann, Hillman, and Jungselius that explored how Instagram was used in the museum), this is no longer always the case. In this way, Instagram is also providing a platform for more serious photographers to share their images after they have processed and edited them, and to connect the image with the image content rather than the uploader’s position. This place-based focus also shares origins in tourism photography practices.
For instance, Kibby’s analysis of the use of Instagram as a method for capturing the “tourist gaze” in Monument Valley notes that users mostly wanted to capture the “iconic” elements of the site (most of which were landscape formations made notable through representations in popular culture).

Another area of research into Instagram use is hashtag practice (see, for example, Ferrara, Interdonato, and Tagarelli). Highfield and Leaver have generated a methodology for mapping hashtags and analysing the information this can reveal about user practices. Many Instagram accounts use hashtags to provide temporal or place-based information, some specific (such as #sunrise or #newyorkcity) and some more generic (such as #weekend or #beach). Of particular relevance here is the role hashtags play in generating higher levels of user engagement. It is also worth noting the role of “algorithmic personalization” introduced by Instagram earlier in 2017 and the lukewarm user response, as identified by Mahnke Skrubbeltrang, Grunnet, and Tarp’s analysis, suggesting “users are concerned with algorithms dominating their experience, resulting in highly commercialised experience” (section 7).

Another key aspect of Instagram’s functionality is linked to the aesthetic of the visual content: photographic filters. Now a mainstay of other platforms such as Facebook and Twitter, Instagram popularised the use of filters by providing easily accessible options directly within the app interface. Other apps, such as VSCO, now allow for more detailed editing of images that can then be imported into Instagram; however, the pre-set filters have proven popular with large numbers of users. A 2014 study by Araújo, Corrêa, da Silva et al. found that 76% of analysed images had been processed in some way.

By considering the professional uses of Instagram and the functionality of the app (geotagging, hashtagging, and filters), it is possible to summarise Instagram as a social media platform that, although initially perhaps intended to capture the everyday visual experiences of amateur photographers using their smartphones, has adapted to become a network for sharing images that can serve both personal and professional purposes. It has a focus on place, with its geotagging capacity and hashtag practices, and can include captions of flexible length.

The #AustralianBeachspace Project

In October 2016, I began a social media project called #AustralianBeachspace that was designed to showcase content from my PhD thesis and ongoing work into representations of Australian beaches in popular culture (a collection of the project posts only, as opposed to the ongoing Instagram profile, can be found here). The project was envisaged as a three-month project; single posts (including an image and caption) were planned and uploaded six times a week (every day except Sundays); a sketch of this posting calendar appears below. Although I have occasionally continued to use the hashtag since the project’s completion (on 24 Dec. 2016), the frequency and planned nature of the posts since then have significantly changed. What has not changed is the strong thematic through line of my posts, all of which continue to rely heavily on beach imagery.
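For readers interested in the mechanics, the posting calendar mentioned above can be expressed as a short script. The start date is an assumption (the text gives only “October 2016”); the completion date of 24 December 2016 is stated above:

```python
from datetime import date, timedelta

START = date(2016, 10, 10)  # assumed start date; only the month is given above
END = date(2016, 12, 24)    # project completion date stated in the text

def posting_days(start, end):
    """Yield every date from start to end inclusive, skipping Sundays."""
    day = start
    while day <= end:
        if day.weekday() != 6:  # weekday() == 6 is Sunday
            yield day
        day += timedelta(days=1)

schedule = list(posting_days(START, END))
print(len(schedule), "planned posts, six per week")
```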
This strong thematic focus is distinct from other academic social media use, which is often more focused on the everyday activity of academia.

Instagram was my social media choice for this project for two main reasons: I had no existing professional Instagram profile (unlike Twitter) and thus could curate a complete project in isolation, and the subject of my PhD thesis was representations of Australian beaches in literature and film. As such, my research was appropriate for, and in fact was augmented by, visual depiction. It is also worth noting the tendency, reported by myself and others (Huntsman; Booth), of academics not considering the beach an area worthy of focus. This resonates with Bech Albrechtslund and Albrechtslund’s argument that “social media practices associated with leisure and playfulness” are still meaningful and worthy of examination.

Up until this point, my research outputs had been purely textual. I therefore needed to generate a significant number of visual elements to complement the vast amount of textual content already created. I used my PhD thesis to provide the thematic structure (I have detailed this process in more depth here), and then used the online tool Trello to plan, organise, and arrange the intended posts (image and caption). The project includes images taken by myself, my partner, and other images with no copyright limitations attached, as sourced through photo-sharing sites like Unsplash.com.

The images were all selected because of their visual representation of an Australian beach and the alignment of the image with the themes of the project. For instance, one theme focused on the under-represented negative aspects of the beach. One image used in this theme was a photo of the Bondi Beach ocean pool, empty at night. I carefully curated the images and arranged them according to the thematic schedule (as can be seen below) and then wrote the accompanying textual captions.

Figure 2: A sample of the schedule used for the posting of curated images and captions.

While there were some changes to the schedule throughout (for instance, my attendance at the 2016 Sculpture by the Sea exhibition prompted me to create a sixth theme), the process of content curation and creation remained the same.

Visual curation of the images was a particularly important aspect of the project, and I did use an external photo processing application to create an aesthetic across the collection. As Kibby notes, “photography is intrinsically linked with tourism” (para. 9), and although not inherently a tourism project, #AustralianBeachspace certainly engaged with touristic tropes by focusing on Australian beaches, an iconic part of Australian national and cultural identity (Ellison 2017; Ellison and Hawkes 2016; Fiske, Hodge, and Turner 1987). However, while beaches are perhaps instinctively touristic in their focus on natural landscapes, this project was attempting to illustrate more complexity in this space (which mirrors an intention of my PhD thesis). As such, some images were chosen because of their “ordinariness” or their subversion of the iconic beach images (see below).

Figures 3 and 4: Two images that capture some less iconic images of Australian beaches: one that shows an authentic, ordinary summer’s day and another that shows an empty beach during winter.

I relied on captions to provide the textual information about the image. I also included details about the photographer where possible, and linked all the images with the hashtag #AustralianBeachspace.
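To make the hashtag practice concrete, the following minimal sketch (hypothetical captions, Python standard library only) tallies hashtag frequencies across a set of planned captions, the kind of bookkeeping that keeps a project hashtag consistent across posts:

```python
import re
from collections import Counter

# Hypothetical captions; in practice these would come from the planned
# posts organised in Trello.
captions = [
    "An empty ocean pool at night. #AustralianBeachspace #beach #bondibeach",
    "Winter shoreline, no swimmers in sight. #AustralianBeachspace #seeaustralia",
    "An ordinary summer's day. #AustralianBeachspace #beach #sunshinecoast",
]

hashtag = re.compile(r"#\w+")
counts = Counter(tag.lower() for caption in captions
                 for tag in hashtag.findall(caption))

# Confirm the project hashtag appears on every post, then list the
# most frequent content- and place-based tags.
assert counts["#australianbeachspace"] == len(captions)
for tag, n in counts.most_common(5):
    print(tag, n)
```

A tally of this kind is a crude stand-in for the richer hashtag-mapping methodologies cited above (Highfield and Leaver), but it captures the basic unit of analysis: tag frequency across a body of posts.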
The textual content, much of which emerged from ongoing and extensive research into the topic, was somewhat easier to collate. However, it required careful reworking and editing to suit the desired audience and to work in conjunction with the image. I kept captions to roughly the length of a paragraph, each concerned with a single point. This process forced me to distil ideas and concepts into short chunks of writing, which is distinct from other forms of academic output. This textual content was designed to be accessible beyond an academic audience, but still used a relatively formal voice (especially in comparison to more personal users of the platform).

I provided additional hashtags in a first comment, which were intended to generate some engagement. Notably, these hashtags were content related (such as #beach and #surf); they were not targeting academic hashtags. At the time of writing, my follower count is 70. The most liked (or “favourited”) photo from the project received 50 likes, and the highest comment count on a single post was six (a figure reached by a number of posts). Some photos published since the end of the project have received higher numbers of likes and comments. This certainly does not suggest enormous impact from this project. Hashtags utilised in this project were adopted from popular and related hashtags identified using the analytics tool Websta.me, as well as hashtags used in similarly styled profiles, such as #seeaustralia, #thisisqueensland, #visitNSW, #bondibeach, and #sunshinecoast. Notably, many of the hashtags were place-based. The engagement of this project with users beyond academia was apparent: followers and comments on the posts came more regularly from professional photographers, tourism bodies, or location-based businesses. In fact, because of the content- or place-based hashtagging practices I employed, it was difficult to attract an academic audience at all. However, although the project was intended as an experiment with public-facing research dissemination, I did not actively adopt a stringent engagement strategy and have not kept daily metrics to track engagement. This is a limitation of the study and undoubtedly allows scope for further research.

Conclusion

Instagram is a platform that does not have clear pathways for reaching academic audiences in targeted ways. At this stage, little research has emerged that investigates Instagram use among academics, although it is possible to presume there are similarities with blogging or Twitter (for example, conference posting and making connections with colleagues).

However, the functionality of Instagram does lend itself to creating and curating aesthetically interesting ways of disseminating, and in fact representing, research. Ideas and findings must be depicted as images and captions, and the curatorial process of marrying visual images to complement or support textual information can make for more accessible and palatable content. Perhaps most importantly, the content is freely accessible and not locked behind paywalls or expensive academic publications. It can also be easily archived and shared.

The #AustralianBeachspace project is small-scale and not indicative of widespread academic practice. However, examining the process of creating the project and the role Instagram may play in potentially reaching a more diverse, public audience for academic research suggests scope for further investigation.
Although not playing an integral role in publication metrics and traditional measures of research impact, the current changing climate of higher education policy provides motivations to continue exploring non-traditional methods for disseminating research findings and tracking research engagement and impact.

Instagram functions as a useful platform for sharing research data through a curated collection of images and captions. Rather than being a space for instant updates on the everyday life of the academic, it can also function in a more aesthetically interesting and dynamic way to share research findings and possibly generate wider, public-facing engagement for topics less likely to emerge from behind the confines of academic journal publications.

References

Abidin, Crystal. “Visibility Labour: Engaging with Influencers’ Fashion Brands and #Ootd Advertorial Campaigns on Instagram.” Media International Australia 161.1 (2016): 86–100. <http://journals.sagepub.com/doi/abs/10.1177/1329878X16665177>.
Araújo, Camila Souza, Luiz Paulo Damilton Corrêa, Ana Paula Couto da Silva, et al. “It Is Not Just a Picture: Revealing Some User Practices in Instagram.” Proceedings of the 9th Latin American Web Congress, Ouro Preto, Brazil, 22–24 October 2014. <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7000167>.
Bech Albrechtslund, Anne-Mette, and Anders Albrechtslund. “Social Media as Leisure Culture.” First Monday 19.4 (2014). <http://firstmonday.org/ojs/index.php/fm/article/view/4877/3867>.
Birmingham, Simon. “2017 Pilot to Test Impact, Business Engagement of Researchers.” Media Release. Australian Government: Australian Research Council. 21 Nov. 2016. <http://www.arc.gov.au/news-media/media-releases/2017-pilot-test-impact-business-engagement-researchers>.
Booth, Douglas. Australian Beach Cultures: The History of Sun, Sand, and Surf. London: F. Cass, 2001.
Bukvova, Helena. “Taking New Routes: Blogs, Web Sites, and Scientific Publishing.” ScieCom Info 7.2 (2011). 20 May 2017 <http://journals.lub.lu.se/index.php/sciecominfo/article/view/5148>.
Constine, Josh. “Instagram’s Growth Speeds Up as It Hits 700 Million Users.” Techcrunch, 26 Apr. 2017. 1 June 2017 <https://techcrunch.com/2017/04/26/instagram-700-million-users/>.
drlizellison. “Dr Liz Ellison.” Instagram.com, 2017. 8 June 2017 <http://www.instagram.com/drlizellison>.
Ellison, Elizabeth. “The Australian Beachspace: Flagging the Spaces of Australian Beach Texts.” PhD thesis. Brisbane: Queensland U of Technology, 2013. <https://eprints.qut.edu.au/63468/>.
Ellison, Elizabeth. “The Gritty Urban: The Australian Beach as City Periphery in Cinema.” Filmburia: Screening the Suburbs. Eds. David Forrest, Graeme Harper and Jonathan Rayner. UK: Palgrave Macmillan, 2017. 79–94.
Ellison, Elizabeth, and Lesley Hawkes. “Australian Beachspace: The Plurality of an Iconic Site.” Borderlands e-Journal: New Spaces in the Humanities 15.1 (2016). 4 June 2017 <http://www.borderlands.net.au/vol15no1_2016/ellisonhawkes_beachspace.pdf>.
Else, Holly. “Tell Us about Your Paper—and Make It Short and Tweet.” Times Higher Education, 9 July 2015. 1 June 2017 <https://www.timeshighereducation.com/opinion/tell-us-about-your-paper-and-make-it-short-and-tweet>.
Ferrara, Emilio, Roberto Interdonato, and Andrea Tagarelli. “Online Popularity and Topical Interests through the Lens of Instagram.” Proceedings of the 25th ACM Conference on Hypertext and Social Media, Santiago, Chile, 1–4 Sep. 2014. <http://dx.doi.org/10.1145/2631775.2631808>.
Gruzd, Anatoliy, Kathleen Staves, and Amanda Wilk. “Connected Scholars: Examining the Role of Social Media in Research Practices of Faculty Using the UTAUT Model.” Computers in Human Behavior 28.6 (2012): 2340–50.
Gunn, Andrew, and Michael Mintrom. “Evaluating the Non-Academic Impact of Academic Research: Design Considerations.” Journal of Higher Education Policy and Management 39.1 (2017): 20–30. <http://dx.doi.org/10.1080/1360080X.2016.1254429>.
Highfield, Tim, and Tama Leaver. “A Methodology for Mapping Instagram Hashtags.” First Monday 20.1 (2015). 18 Oct. 2016 <http://firstmonday.org/ojs/index.php/fm/article/view/5563/4195>.
Hochman, Nadav, and Lev Manovich. “Zooming into an Instagram City: Reading the Local through Social Media.” First Monday 18.7 (2013). <http://firstmonday.org/ojs/index.php/fm/article/view/4711/3698>.
Hochman, Nadav, and Raz Schwartz. “Visualizing Instagram: Tracing Cultural Visual Rhythms.” Proceedings of the Workshop on Social Media Visualization (SocMedVis) in Conjunction with the Sixth International AAAI Conference on Weblogs and Social Media (ICWSM–12), 2012. 6–9. 2 June 2017 <http://razschwartz.net/wp-content/uploads/2012/01/Instagram_ICWSM12.pdf>.
Huntsman, Leone. Sand in Our Souls: The Beach in Australian History. Carlton South, Victoria: Melbourne U Press, 2001.
Instagram. “700 Million.” Instagram Blog, 26 Apr. 2017. 6 June 2017 <http://blog.instagram.com/post/160011713372/170426-700million>.
Instagram. “Instagram Business.” 6 June 2017 <https://business.instagram.com/>.
katgaskin. “Salty Pineapple.” Instagram.com, 2017. 2 June 2017 <https://www.instagram.com/katgaskin/>.
katgaskin. “Salty Hair with a Pineapple Towel…” Instagram.com, 2 June 2017. 6 June 2017 <https://www.instagram.com/p/BU0zSWUF0cm/?taken-by=katgaskin>.
Kibby, Marjorie Diane. “Monument Valley, Instagram, and the Closed Circle of Representation.” M/C Journal 19.5 (2016). 20 Apr. 2017 <http://journal.media-culture.org.au/index.php/mcjournal/article/view/1152>.
Kirkup, Gill. “Academic Blogging: Academic Practice and Academic Identity.” London Review of Education 8.1 (2010): 75–84.
liz_ellison. “#AustralianBeachspace.” Storify.com. 8 June 2017 <https://storify.com/liz_ellison/australianbeachspace>.
Mahnke Skrubbeltrang, Martina, Josefine Grunnet, and Nicolar Traasdahl Tarp. “#RIPINSTAGRAM: Examining User’s Counter-Narratives Opposing the Introduction of Algorithmic Personalization on Instagram.” First Monday 22.4 (2017). <http://firstmonday.org/ojs/index.php/fm/article/view/7574/6095>.
Mahrt, Merja, Katrin Weller, and Isabella Peters. “Twitter in Scholarly Communication.” Twitter and Society. Eds. Katrin Weller, Axel Bruns, Jean Burgess, Merja Mahrt, and Cornelius Puschmann. New York: Peter Lang, 2014. 399–410. <https://eprints.qut.edu.au/66321/1/Twitter_and_Society_(2014).pdf#page=438>.
Manikonda, Lydia, Yuheng Hu, and Subbarao Kambhampati. “Analyzing User Activities, Demographics, Social Network Structure and User-Generated Content on Instagram.” ArXiv (2014). 1 June 2017 <https://arxiv.org/abs/1410.8099>.
Niyazov, Yuri, Carl Vogel, Richard Price, et al. “Open Access Meets Discoverability: Citations to Articles Posted to Academia.edu.” PLoS One 11.2 (2016): e0148257. <https://doi.org/10.1371/journal.pone.0148257>.
Powell, Douglas A., Casey J. Jacob, and Benjamin J. Chapman. “Using Blogs and New Media in Academic Practice: Potential Roles in Research, Teaching, Learning, and Extension.” Innovative Higher Education 37.4 (2012): 271–82. <http://dx.doi.org/10.1007/s10755-011-9207-7>.
Veletsianos, George. “Higher Education Scholars’ Participation and Practices on Twitter.” Journal of Computer Assisted Learning 28.4 (2012): 336–49. <http://dx.doi.org/10.1111/j.1365-2729.2011.00449.x>.
Weilenmann, Alexandra, Thomas Hillman, and Beata Jungselius. “Instagram at the Museum: Communicating the Museum Experience through Social Photo Sharing.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Paris: ACM Press, 2013. 1843–52. <http://dx.doi.org/10.1145/2470654.2466243>.
APA, Harvard, Vancouver, ISO, and other styles
35

O'Meara, Radha, and Alex Bevan. "Transmedia Theory’s Author Discourse and Its Limitations." M/C Journal 21, no. 1 (March 14, 2018). http://dx.doi.org/10.5204/mcj.1366.

Full text
Abstract:
As a scholarly discourse, transmedia storytelling relies heavily on conservative constructions of authorship that laud corporate architects and patriarchs such as George Lucas and J.J. Abrams as exemplars of “the creator.” This piece argues that transmedia theory works to construct patriarchal ideals of individual authorship to the detriment of alternative conceptions of transmediality, storyworlds, and authorship. The genesis for this piece was our struggle to find a transmedia storyworld that we were both familiar with that also qualifies as “legitimate” transmedia in the eyes of our prospective scholarly readers. After trying to wrangle our various interests, fandoms, and areas of expertise into harmony, we realized we were exerting more effort in this process of validating stories as transmedia than in actually examining how stories spread across various platforms, how they make meanings, and what kinds of pleasures they offer audiences. Authorship is a definitive criterion of transmedia storytelling theory; it is also an academic red herring.

We were initially interested in investigating the possible overdeterminations between the healthcare industry and Breaking Bad (2008-2013). The series revolves around a high school chemistry teacher who launches a successful meth empire as a way to pay for the cancer treatments that a dysfunctional US healthcare industry refuses to fund. We wondered if the success of the series and the timely debates on healthcare raised in its reception prompted any PR response from, or discussion among, US health insurers. However, our concern was that this dynamic among medical and media industries would not qualify as transmedia because these exchanges were not authored by Vince Gilligan or any of the credited creators of Breaking Bad. Yet why shouldn’t such interfaces between the “real world” and media fiction count as part of the transmedia story that is Breaking Bad? Most stories are, in some shape or form, transmedia stories at this stage, and transmedia theory acknowledges there is a long history to this kind of practice (Freeman). Let’s dispense with restrictive definitions of transmediality and turn attention to how storytelling behaves in a digital era, that is, the processes of creating, disseminating and amending stories across many different media, the meanings and forms such media and communications produce, and the pleasures they offer audiences. Can we think about how health insurance companies responded to Breaking Bad in terms of transmedia storytelling?

Defining Transmedia Storytelling via Authorship

The scholarly concern with defining transmedia storytelling via a strong focus on authorship has traced slight distinctions between seriality, franchising, adaptation and transmedia storytelling (Jenkins, “Transmedia Storytelling”; Johnson, “Media Franchising”). However, the theoretical discourse on authorship itself and these discussions of the tensions between forms are underwritten by a gendered bias. Indeed, the very concept of transmediality may be a gendered backlash against the rising prominence of seriality as a historically feminised mode of storytelling, associated with television and serial novels.

Even with the move towards traditionally lowbrow, feminised forms of trans-serial narrative, the majority of academic and popular criticism of transmedia storytelling reproduces and reinstates narratives of male-centred, individual authorship that are historically descended from theorizations of the auteur.
Auteur theory, which is still considered a legitimate analytical framework today, emerged in postwar theorizations of Hollywood film by French critics, most prominently in the journal Cahiers du Cinéma, and at the nascence of film theory as a field (Cook). Auteur theory surfaced as a way to conceptualise aesthetic variation and value within the Fordist model of the Hollywood studio system (Cook). Directors were identified as the ultimate author or “creative source” if a film sufficiently fitted a paradigm of consistent “vision” across their oeuvre, and they were thus seen as artists challenging the commercialism of the studio system (Cook). In this way, classical auteur theory draws a dichotomy between art and authorship on one side and commerce and corporations on the other, strongly valorising the former for its existence within an industrial context dominated by the latter. In recent decades, auteurist notions have spread from film scholarship to pervade popular discourses of media authorship. Even though transmedia production inherently disrupts notions of authorship by diffusing the act of creation over many different media platforms and texts, much of the scholarship disproportionately chooses to vex over authorship in a manner reminiscent of classical auteur theory.

In scholarly terms, a chief distinction between serial storytelling and transmedia storytelling lies in how authorship is constructed in relation to the text: serial storytelling has long been understood as relying on distributed authorship (Hilmes), but transmedia storytelling reveres the individual mastermind, or the master architect who plans and disseminates the storyworld across platforms. Henry Jenkins’ definition of transmedia storytelling is multifaceted and includes “the systematic dispersal of multiple textual elements across many channels, which reflects the synergies of media conglomeration, based on complex story-worlds, and coordinated authorial design of integrated elements” (Jenkins, “Transmedia Storytelling”). Jenkins is perhaps the most pivotal figure in developing transmedia studies in the humanities to date and a key reference point for most scholars working in this subfield.

A key limitation of Jenkins’ definition of transmedia storytelling is its emphasis on authorship, which persists in wider scholarship on transmedia storytelling. Jenkins focuses on the nature of authorship as a key characteristic of transmedia productions that distinguishes them from other kinds of intertextual and serial stories:

Because transmedia storytelling requires a high degree of coordination across the different media sectors, it has so far worked best either in independent projects where the same artist shapes the story across all of the media involved or in projects where strong collaboration (or co-creation) is encouraged across the different divisions of the same company. (Jenkins, “Transmedia Storytelling”)

Since the texts under discussion are commonly large in their scale, budget, and the number of people employed, it is reductive to credit particular individuals for this work and implicitly dismiss the authorial contributions of many others. Elaborating on the foundation set by Jenkins, Matthew Freeman uses Foucauldian concepts to describe two “author-functions” focused on the role of an author in defining the transmedia text itself and in marketing it (Freeman 36-38). Scott, Evans, Hills, and Hadas similarly view authorial branding as a symbolic industrial strategy significant to transmedia storytelling.
Interestingly, M.J. Clarke identifies the ways transmedia television texts invite audiences to imagine a central mastermind, but also thwart and defer this impulse. Ultimately, Freeman argues that identifiable and consistent authorship is a defining characteristic of transmedia storytelling (Freeman 37), and Suzanne Scott argues that transmedia storytelling has “intensified the author’s function” from previous eras (47).

Industry definitions of transmediality similarly position authorship as central to transmedia storytelling, and Jenkins’ definition has also been widely mobilised in industry discussions (Jenkins, “Transmedia” 202). This is unsurprising, because defining authorial roles has significant monetary value in terms of remuneration and copyright. In speaking to the Producers Guild of America, Jeff Gomez enumerated eight defining characteristics of transmedia production, the very first of which is, “Content is originated by one or a very few visionaries” (PGA Blog). Gomez’s talk was part of an industry-driven bid to have “Transmedia Producer” recognised by the trade associations as a legitimate and significant role; Gomez was successful and is now recognised as a transmedia producer. Nevertheless, his talk of “visionaries” not only situates authorship as central to transmedia production, but constructs authorship in very conservative, almost hagiographical terms. Indeed, Leora Hadas analyses the function of Joss Whedon’s authorship of Marvel’s Agents of S.H.I.E.L.D. (2013-) as a branding mechanism and argues that authors are becoming increasingly visible brands associated with transmedia stories.

Such a discourse of authorship constructs individual figures as artists and masterminds, in an idealised manner that has been strongly critiqued in the wake of poststructuralism. It even recalls tired scholarly endeavours of divining authorial intention. Unsurprisingly, the figures valorised for their transmedia authorship are predominantly men; the scholarly emphasis on authorship thus reinforces the biases of media industries. Further, it idolises these figures at the expense of unacknowledged and under-celebrated female writers, directors and producers, as well as those creative workers labouring “below the line” in areas like production design, art direction, and special effects. Far from critiquing the biases of industry, academic discourse legitimises and lauds them.

We hope that scholarship on transmedia storytelling might instead work to open up discourses of creation, production, authorship, and collaboration. For a story to qualify as transmedia, is it even necessary to have an identifiable author? Transmedia texts and storyworlds can be genuinely collaborative or authorless creations, in which the harmony of various creators’ intentions may be unnecessary or even undesirable. Further, industry and academics alike often overlook examples of transmedia storytelling that might be considered “lowbrow.” For example, transmedia definitions should include Antonella the Uncensored Reviewer, a relatively small-scale, forty-something, plus-size YouTube channel producer whose persona is dispersed across multiple formats, including beauty product reviews, letter writing, and interactive sex advice live casts. What happens when we blur the categories of author, celebrity, brand, and narrative in scholarship? We argue that these roles are substantially blurred in media industries in which authors like J.J.
Abrams share the limelight with their stars as well as their corporate affiliations, and all “brands” are sutured to the storyworld text. These various actors all shape and are shaped by the narrative worlds they produce in an author-storyworld nexus, in which authorship includes all people working to produce the storyworld as well as the corporation funding it. Authorship never exists inside the limits of a single, male mind. Rather, it is a field of relations among various players and stakeholders. While there is value in delineating between these roles for purposes of analysis and scholarly discussion, we should acknowledge that in the media industry, the roles of various stakeholders are increasingly porous.

The current academic discourse of transmedia storytelling reconstructs old social biases and hierarchies in contexts where they might be most vulnerable to breakdown. Scott argues that,

despite their potential to demystify and democratize authorship between producers and consumers, transmedia stories tend to reinforce boundaries between ‘official’ and ‘unauthorized’ forms of narrative expansion through the construction of a single author/textual authority figure. (44)

Significantly, we suggest that it is the theorisation of transmedia storytelling that reinforces (or in fact constructs anew) an idealised author figure.

The gendered dimension of the scholarly distinction between serialised (or trans-serial) and transmedial storytelling builds on a long history in the arts and the academy alike. In fact, an important precursor of transmedia narratives is the serialized novel of the Victorian era. The literature of Charlotte Brontë, George Eliot and Harriet Beecher Stowe was published in serial form and was among the most widely read of the Victorian era in Western culture (Easley; Flint 21; Hilmes). Yet these novels are rarely given proportional credit in what is popularly taught as the Western literary canon. The serial storytelling endemic to television as a medium has similarly been historically dismissed and marginalized as lowbrow and feminine (at least until the recent emergence of notions of the industrial role of the “showrunner” and the critical concept of “quality television”). Joanne Morreale outlines how trans-serial television examples, like The Dick Van Dyke Show, which spread their storyworlds across a number of different television programs, offer important precursors to today’s transmedia franchises (Morreale). In television’s nascent years, the anthology plays of the 1940s and 50s, which were discrete, unconnected hour-length stories, were heralded as cutting-edge, artistic and highbrow, while serial narrative forms like the soap opera were denigrated (Boddy 80-92). Crucially, these anthology plays were largely created by and aimed at males, whereas soap operas were often created by and targeted to female audiences. The gendered terms in which various genres and modes of storytelling are discussed have implications for the value assigned to them in criticism, scholarship and culture more broadly (Hilmes; Kuhn; Johnson, “Devaluing”).
Transmedia theory, as a scholarly discourse, betrays gendered leanings similar to those of early television criticism, in valorising forms of transmedia narration that favour a single, male-bodied, and all-powerful author or corporation, such as George Lucas, Jim Henson or Marvel Comics.

George Lucas is often depicted in scholarly and popular discourses as a headstrong transmedia auteur, as in the South Park episode “The China Problem” (2008).

A Circle of Men: Fans, Creators, Stories and Theorists

Interestingly, scholarly discourse on transmedia even betrays these gendered biases when exploring the engagement and activity of audiences in relation to transmedia texts. Despite the definitional emphasis on authorship, fan cultures have been a substantial topic of investigation in scholarly studies of transmedia storytelling, with many scholars elevating fans to the status of author, exploring the apparent blurring of these boundaries, and recasting the terms of these relationships (Scott; Dena; Pearson; Stein). Most notably, substantial scholarly attention has traced how transmedia texts cultivate a masculinized, “nerdy” fan culture that identifies with the male-bodied, all-powerful author or corporation (Brooker, Star Wars, Using; Jenkins, Convergence). Whether idealising the role of the creators or audiences, transmedia theory reinforces gendered hierarchies.

Star Wars (1977-) is a pivotal corporate transmedia franchise that significantly shaped the convergent trajectory of media industries in the 20th century. As such, it is also an anchor point for transmedia scholarship, much of which lauds and legitimates the creative work of fans. However, in focusing so heavily on the macho power struggle between George Lucas and Star Wars fans for authorial control over the storyworld, scholarship unwittingly reinstates Lucas’s status as sole creator rather than treating Star Wars’ authorship as inherently diffuse and porous.

Recent fan activity surrounding the animated adult science-fiction sitcom Rick and Morty (2013-) further demonstrates the macho culture of transmedia fandom in practice and its fascination with male authors. The animated series follows the intergalactic misadventures of a scientific genius and his grandson. Inspired by a seemingly inconsequential joke on the show, some of its fans began to fetishize a particular, limited-edition fast food sauce. When McDonald’s, the actual owner of that sauce, cashed in by promoting the return of its Szechuan Sauce, a macho culture within the show’s fandom reached its zenith in the form of hostile behaviour at McDonald’s restaurants and online (Alexander and Kuchera). Rick and Morty fandom also built a misogynist reputation for its angry responses to the show’s efforts to hire a writers’ room that gave equal representation to women. Rick and Morty trolls doggedly harassed a few of the show’s female writers through 2017 and went so far as to post their private information online (Barsanti). Such gender politics of fan cultures have been the subject of much scholarly attention (Johnson, “Fan-tagonism”), not least in the many conversations hosted on Jenkins’ blog.
Gendered performances and readings of fan activity are instrumental in defining and legitimating some texts as transmedia and some creators as masterminds, not only within fandoms but also in the scholarly discourse.

When McDonald’s promoted the return of their Szechuan Sauce, in response to its mention in the storyworld of the animated sci-fi sitcom Rick and Morty, they contributed to transmedia storytelling.

Both Rick and Morty and Star Wars are examples of how masculinist fan cultures, stubborn allegiances to male authorship, and definitions of transmedia converge both in academia and popular culture. While Rick and Morty is, in reality, partly female-authored, much of its media image is still anchored to its two male “creators,” Justin Roiland and Dan Harmon. Particularly in the context of #MeToo feminism, one wonders how much female authorship has been elided from existing storyworlds and, furthermore, what alternative examples of transmedia narration are exempt from current definitions of transmediality.

The individual creator is a social construction of scholarship and popular discourse. This imaginary creator bears little relation to the conditions of creation and production of transmedia storyworlds, which are almost always team-written and collectively authored. Further, the focus on writing itself elides the significant contributions of many creators, such as those in production design (Bevan). Beyond that, what creative credit do focus groups deserve in shaping transmedia stories and their multi-layered, multi-platformed reaches? Is authorship, or even credit, really the concept we, as scholars, want to invest in when studying these forms of narration and mediation?

At more symbolic levels, the seemingly exhaustless popular and scholarly appetite for male-bodied authorship persists within storyworlds themselves. The transmedia examples popularly and academically heralded as “seminal” centre on patrimony, patrilineage, and inheritance (i.e. Star Wars [1977-] and The Lord of the Rings [1937-]). Of course, Harry Potter (2001-2009) is an outlier, as the celebrification of J.K. Rowling provides a strong example of credited female authorship. However, this example plays out many of the same issues, albeit with the franchise attached to a woman, in that it precludes many of the other creative minds who have helped shape Harry Potter’s world. How many more billions of dollars need we invest in men writing about the mysteries of how other men spread their genetic material across fictional universes? Moreover, transmedia studies remains dominated by academic men geeking out about how fan men geek out about how male creators write about mostly male characters in stories about … men. There are other stories waiting to be told and studied through the practices and theories of transmedia. These stories might be gender-inclusive and collective in ways that challenge traditional notions of authorship, control, rights, origin, and property.

Obsession with male authorship, control, rights, origin, paternity and property is recognisable in scholarship on transmedia storytelling, and also symbolically in many of the most heralded examples of transmedia storytelling, such as the Star Wars saga.

Prompting Broader Discussion

This piece urges the development of broader understandings of transmedia storytelling. A range of media scholarship has already begun this work.
Jonathan Gray’s book on paratexts offers an important pathway for such scholarship by legitimating ancillary texts, like posters and trailers, that uniquely straddle promotional and feature content platforms (Gray). A wave of scholars productively explores transmedia storytelling with a focus on storyworlds (Scolari; Harvey), often through the lens of narratology (Ryan; Ryan and Thon). Scolari, Bertetti, and Freeman have drawn together a media archaeological approach and a focus on transmedia characters in an innovative way. We hope to see greater proliferation of focuses and perspectives for the study of transmedia storytelling, including investigations that connect fictional and non-fictional worlds and stories, and a more inclusive variety of life experiences.

Conversely, new scholarship on media authorship provides fresh directions, models, methods, and concepts for examining the complexity and messiness of this topic. A growing body of scholarship on the functions of media branding is also productive for reconceptualising notions of authorship in transmedia storytelling (Bourdaa; Dehry Kurtz and Bourdaa). Most notably, A Companion to Media Authorship, edited by Gray and Derek Johnson, productively interrogates relationships between creative processes, collaborative practices, production cultures, industrial structures, legal frameworks, and theoretical approaches around media authorship. Its case studies begin the work of reimagining the role of authorship in transmedia, and pave the way for further developments (Burnett; Gordon; Hilmes; Stein). In particular, Matt Hills’s case study of how “counter-authorship” was negotiated on Torchwood (2006-2011) opens up new ways of thinking about multiple authorship and the variety of experiences, contributions, credits, and relationships this encompasses. Johnson’s Media Franchising addresses authorship in a complex way through a focus on social interactions, without making it a defining feature of the form; it would be significant to see a similar scholarly treatment of transmedia. At the very least, scholarly attention might turn its focus away from the very patriarchal activity of discussing definitions among a coterie and, instead, study the process of spreadability of male-centred transmedia storyworlds (Jenkins, Ford, and Green). Given that transmedia is not historically unique to the digital age, scholars might instead study how spreadability changes with the emergence of digitality and convergence, rather than pontificating on definitions of adaptation versus transmedia and cinema versus media.

We urge transmedia scholars to distance their work from the malignant gender politics endemic to the media industries, and particularly global Hollywood. The confluence of gendered agendas in both academia and media industries works to reinforce patriarchal hierarchies. The humanities should offer independent analysis and critique of how media industries and products function, and should highlight opportunities for conceiving of, creating, and treating such media practices and texts in new ways. As such, it is problematic that discourses on transmedia commonly neglect the distinction between what defines transmediality and what constitutes good examples of transmedia. This blurs the boundaries between description and prescription, taxonomy and hierarchy, analysis and evaluation, and definition and taste.
Such discourses blinker us to what we might consider to be transmedia, but also to what examples of “good” transmedia storytelling might look like.

Transmedia theory focuses disproportionately on authorship. This restricts a comprehensive understanding of transmedia storytelling, limits the lenses we bring to it, obstructs the ways we evaluate transmedia stories, and impedes how we imagine the possibilities for both media and storytelling. Stories have always been transmedial. What changes with the inception of transmedia theory is that men can claim credit for the stories and for all the work that many people do across various sectors and industries. It is questionable whether authorship is important to transmedia, in which creation is most often collective, loosely planned (at best) and diffused across many people, skill sets, and sectors. While Jenkins’s work has been pivotal in the development of transmedia theory, this is a ripe moment for the diversification of theoretical paradigms for understanding stories in the digital era.

References

Alexander, Julia, and Ben Kuchera. “How a Rick and Morty Joke Led to a McDonald’s Szechuan Sauce Controversy.” Polygon 4 Apr. 2017. <https://www.polygon.com/2017/10/12/16464374/rick-and-morty-mcdonalds-szechuan-sauce>.
Aristotle. Aristotle’s Poetics. New York: Hill and Wang, 1961.
Barsanti, Sami. “Dan Harmon Is Pissed at Rick and Morty Fans Harassing Female Writers.” The AV Club 21 Sep. 2017. <https://www.avclub.com/dan-harmon-is-pissed-at-rick-and-morty-fans-for-harassi-1818628816>.
Bevan, Alex. “Nostalgia for Pre-Digital Media in Mad Men.” Television & New Media 14.6 (2013): 546-559.
Boddy, William. Fifties Television: The Industry and Its Critics. Chicago: U of Illinois P, 1993.
Bourdaa, Mélanie. “This Is Not Marketing. This Is HBO: Branding HBO with Transmedia Storytelling.” Networking Knowledge: Journal of the MeCCSA Postgraduate Network 7.1 (2014). <http://www.ojs.meccsa.org.uk/index.php/netknow/article/view/328>.
Brooker, Will. Star Wars. London: BFI Classics, 2009.
———. Using the Force: Creativity, Community and Star Wars Fans. New York: Bloomsbury, 2003.
Burnett, Colin. “Hidden Hands at Work: Authorship, the Intentional Flux and the Dynamics of Collaboration.” In A Companion to Media Authorship, eds. Jonathan Gray and Derek Johnson, 112-133. Oxford: Wiley, 2013.
Clarke, M.J. Transmedia Television: New Trends in Network Serial Production. New York: Bloomsbury, 2012.
Cook, Pam. “Authorship and Cinema.” In The Cinema Book, 2nd ed., ed. Pam Cook, 235-314. London: BFI, 1999.
Dehry Kurtz, B.W.L., and Mélanie Bourdaa (eds). The Rise of Transtexts: Challenges and Opportunities. New York: Taylor and Francis, 2016.
Dena, Christy. Transmedia Practice: Theorising the Practice of Expressing a Fictional World across Distinct Media and Environments. PhD thesis, University of Sydney, 2009.
Easley, Alexis. First Person Anonymous. New York: Routledge, 2016.
Evans, Elizabeth. Transmedia Television: Audiences, New Media and Daily Life. New York: Taylor and Francis, 2011.
Flint, Kate. “The Victorian Novel and Its Readers.” In The Cambridge Companion to the Victorian Novel, ed. Deirdre David, 13-35. Cambridge: Cambridge UP, 2012.
Freeman, Matthew. Historicising Transmedia Storytelling: Early Twentieth Century Storyworlds. New York: Taylor and Francis, 2016.
Gordon, Ian. “Comics, Creators and Copyright: On the Ownership of Serial Narratives by Multiple Authors.” In A Companion to Media Authorship, eds. Jonathan Gray and Derek Johnson, 221-236. Oxford: Wiley, 2013.
Gray, Jonathan. Show Sold Separately: Promos, Spoilers and Other Media Texts. New York: New York UP, 2010.
Gray, Jonathan, and Derek Johnson (eds.). A Companion to Media Authorship. Chichester: Wiley, 2013.
Hadas, Leora. “Authorship and Authenticity in the Transmedia Brand: The Case of Marvel’s Agents of S.H.I.E.L.D.” Networking Knowledge: Journal of the MeCCSA Postgraduate Network 7.1 (2014). <http://www.ojs.meccsa.org.uk/index.php/netknow/article/view/332>.
Harvey, Colin. Fantastic Transmedia: Narrative, Play and Memory across Fantasy Storyworlds. London: Palgrave, 2015.
Hills, Matt. “From Chris Chibnall to Fox: Torchwood’s Marginalised Authors and Counter-Discourses of TV Authorship.” In A Companion to Media Authorship, eds. Jonathan Gray and Derek Johnson, 200-220. Oxford: Wiley, 2013.
Hilmes, Michelle. “Never Ending Story: Authorship, Seriality and the Radio Writers Guild.” In A Companion to Media Authorship, eds. Jonathan Gray and Derek Johnson, 181-199. Oxford: Wiley, 2013.
Jenkins, Henry. “Transmedia 202: Further Reflections.” Confessions of an Aca-Fan, 31 July 2011. <http://henryjenkins.org/blog/2011/08/defining_transmedia_further_re.html>.
———. “Transmedia Storytelling 101.” Confessions of an Aca-Fan, 21 Mar. 2007. <http://henryjenkins.org/blog/2007/03/transmedia_storytelling_101.html>.
———. Convergence Culture: Where Old and New Media Collide. New York: New York UP, 2006.
———, Sam Ford, and Joshua Green. Spreadable Media: Creating Value and Meaning in a Networked Culture. New York: New York UP, 2013.
Johnson, Derek. Media Franchising: Creative License and Collaboration in the Culture Industries. New York: New York UP, 2013.
———. “Fan-tagonism: Factions, Institutions, and Constitutive Hegemonies of Fandom.” In Fandom: Identities and Communities in a Mediated World, eds. Jonathan Gray, Cornell Sandvoss, and C. Lee Harrington, 285-300. New York: New York UP, 2007.
———. “Devaluing and Revaluing Seriality: The Gendered Discourses of Media Franchising.” Media, Culture & Society 33.7 (2011): 1077-1093.
Kuhn, Annette. “Women’s Genres: Melodrama, Soap Opera and Theory.” In Feminist Television Criticism: A Reader, eds. Charlotte Brunsdon and Lynn Spigel, 225-234. 2nd ed. Maidenhead: Open UP, 2008.
Morreale, Joanne. The Dick Van Dyke Show. Detroit, MI: Wayne State UP, 2015.
Pearson, Roberta. “Fandom in the Digital Era.” Popular Communication 8.1 (2010): 84-95. DOI: 10.1080/15405700903502346.
Producers Guild of America, The. “Defining Characteristics of Trans-Media Production.” PGA NMC Blog, 2 Oct. 2007. <http://pganmc.blogspot.com.au/2007/10/pga-member-jeff-gomez-left-assembled.html>.
Rodham Clinton, Hillary. What Happened. New York: Simon & Schuster, 2017.
Ryan, Marie-Laure. “Transmedial Storytelling and Transfictionality.” Poetics Today 34.3 (2013): 361-388. DOI: 10.1215/03335372-2325250.
———, and Jan-Noël Thon (eds.). Storyworlds across Media: Toward a Media-Conscious Narratology. Lincoln: U of Nebraska P, 2014.
Scolari, Carlos A. “Transmedia Storytelling: Implicit Consumers, Narrative Worlds, and Branding in Contemporary Media Production.” International Journal of Communication 3 (2009): 586-606.
———, Paolo Bertetti, and Matthew Freeman. Transmedia Archaeology: Storytelling in the Borderlines of Science Fiction. London: Palgrave, 2014.
Scott, Suzanne. “Who’s Steering the Mothership?: The Role of the Fanboy Auteur in Transmedia Storytelling.” In The Participatory Cultures Handbook, eds. Aaron Delwiche and Jennifer Jacobs Henderson, 43-52. London: Routledge, 2013.
Stein, Louisa Ellen. “#Bowdown to Your New God: Misha Collins and Decentered Authorship in the Digital Age.” In A Companion to Media Authorship, eds. Jonathan Gray and Derek Johnson, 403-425. Oxford: Wiley, 2013.
APA, Harvard, Vancouver, ISO, and other styles
36

Livingstone, Randall M. "Let’s Leave the Bias to the Mainstream Media: A Wikipedia Community Fighting for Information Neutrality." M/C Journal 13, no. 6 (November 23, 2010). http://dx.doi.org/10.5204/mcj.315.

Full text
Abstract:
Although I'm a rich white guy, I'm also a feminist anti-racism activist who fights for the rights of the poor and oppressed. (Carl Kenner)

Systemic bias is a scourge to the pillar of neutrality. (Cerejota)

Count me in. Let's leave the bias to the mainstream media. (Orcar967)

Because this is so important. (CuttingEdge)

These are a handful of comments posted by online editors who have banded together in a virtual coalition to combat Western bias on the world's largest digital encyclopedia, Wikipedia. This collective action by Wikipedians both acknowledges the inherent inequalities of a user-controlled information project like Wikipedia and highlights the potential for progressive change within that same project. These community members are taking the responsibility of social change into their own hands (or, more aptly, their own keyboards).

In recent years much research has emerged on Wikipedia from varying fields, ranging from computer science, to business and information systems, to the social sciences. While critical at times of Wikipedia's growth, governance, and influence, most of this work observes with optimism that barriers to improvement are not firmly structural, but rather socially constructed, leaving open the possibility of important and lasting change for the better.

This paper considers one such collective effort: WikiProject: Countering Systemic Bias (WP:CSB). Close to 350 editors have signed on to the project, which began in 2004 and itself emerged from a similar project named CROSSBOW, or the "Committee Regarding Overcoming Serious Systemic Bias on Wikipedia." As a WikiProject, the term used for a loose group of editors who collaborate around a particular topic, these editors work within the Wikipedia site and collectively create a social network that is unified around one central aim (representing the un- and underrepresented) and yet bound by no particular unified set of interests. The first stage of a multi-method study, this paper looks at a snapshot of WP:CSB's activity from both content analysis and social network perspectives to discover "who", geographically, this coalition of the unrepresented is inserting into the digital annals of Wikipedia.

Wikipedia and Wikipedians

Developed in 2001 by Internet entrepreneur Jimmy Wales and academic Larry Sanger, Wikipedia is an online collaborative encyclopedia hosting articles in nearly 250 languages (Cohen). The English-language Wikipedia contains over 3.2 million articles, each of which is created, edited, and updated solely by users (Wikipedia "Welcome"). At the time of this study, Alexa, a website tracking organisation, ranked Wikipedia as the sixth most accessed site on the Internet. Unlike the five sites ahead of it, though (Google, Facebook, Yahoo, YouTube [owned by Google], and live.com [owned by Microsoft]), all of which are multibillion-dollar businesses that deal more with information aggregation than information production, Wikipedia is a non-profit that operates on less than $500,000 a year and staffs only a dozen paid employees (Lih). Wikipedia is financed and supported by the WikiMedia Foundation, a charitable umbrella organisation with an annual budget of $4.6 million, mainly funded by donations (Middleton).

Wikipedia editors and contributors have the option of creating a user profile and participating via a username, or they may participate anonymously, with only an IP address representing their actions.
Despite the option for total anonymity, many Wikipedians have chosen to visibly engage in this online community (Ayers, Matthews, and Yates; Bruns; Lih), and researchers across disciplines are studying the motivations of these new online collectives (Kane, Majchrzak, Johnson, and Chenisern; Oreg and Nov). The motivations of open source software contributors, such as UNIX programmers and programming groups, have been shown to be complex and tied to both extrinsic and intrinsic rewards, including online reputation, self-satisfaction and enjoyment, and obligation to a greater common good (Hertel, Niedner, and Herrmann; Osterloh and Rota). Investigation into why Wikipedians edit has indicated multiple motivations as well, with community engagement, task enjoyment, and information sharing among the most significant (Schroer and Hertel). Additionally, Wikipedians seem to be taking up the cause of generativity (a concern for the ongoing health and openness of the Internet's infrastructures) that Jonathan Zittrain notably called for in The Future of the Internet and How to Stop It.

Governance and Control

Although the technical infrastructure of Wikipedia is built to support and perhaps encourage an equal distribution of power on the site, Wikipedia is not a land of "anything goes." The popular press has covered recent efforts by the site to reduce vandalism through a layer of editorial review (Cohen), a tightening of control cited as a possible reason for the recent dip in the number of active editors (Edwards). A number of regulations are already in place that prevent the open editing of certain articles and pages, such as the site's disclaimers and pages that have suffered large amounts of vandalism. Editing wars can also cause temporary restrictions to editing, and Ayers, Matthews, and Yates point out that these wars can happen anywhere, even on Burt Reynolds's page.

Academic studies have begun to explore the governance and control that has developed in the Wikipedia community, generally highlighting how order is maintained not through particular actors, but through established procedures and norms. Konieczny tested whether Wikipedia's evolution can be defined by Michels' Iron Law of Oligarchy, which predicts that the everyday operations of any organisation cannot be run by a mass of members, and that ultimately control falls into the hands of the few. Through exploring a particular WikiProject on information validation, he concludes:

There are few indicators of an oligarchy having power on Wikipedia, and few trends of a change in this situation. The high level of empowerment of individual Wikipedia editors with regard to policy making, the ease of communication, and the high dedication to ideals of contributors succeed in making Wikipedia an atypical organization, quite resilient to the Iron Law. (189)

Butler, Joyce, and Pike support this assertion, though they emphasise that instead of oligarchy, control becomes encapsulated in a wide variety of structures, policies, and procedures that guide involvement with the site. A virtual "bureaucracy" emerges, but one that should not be viewed with the negative connotation often associated with the term.

Other work considers control on Wikipedia through the framework of commons governance, where "peer production depends on individual action that is self-selected and decentralized rather than hierarchically assigned. Individuals make their own choices with regard to resources managed as a commons" (Viegas, Wattenberg and McKeon).
The need for quality standards and quality control largely dictates this commons governance, though interviews with Wikipedians at various levels of responsibility revealed that policies and procedures are only as good as those who maintain them. Forte, Larco, and Bruckman argue that "the Wikipedia community has remained healthy in large part due to the continued presence of 'old-timers' who carry a set of social norms and organizational ideals with them into every WikiProject, committee, and local process in which they take part" (71). Thus governance on Wikipedia is a strong representation of a democratic ideal, where actors and policies are closely tied in their evolution.

Transparency, Content, and Bias

The issue of transparency has proved to be a double-edged sword for Wikipedia and Wikipedians. The goal of a collective body of knowledge created by all, the "expert" and the "amateur" alike, can only be upheld if equal access to page creation and development is allotted to everyone, including those who prefer anonymity. And yet this very option for anonymity, or even worse, false identities, has been a sore subject for some in the Wikipedia community as well as a source of concern for some scholars (Santana and Wood). The case of a 24-year-old college dropout who represented himself as a theology scholar holding multiple Ph.D.s and edited over 16,000 articles brought these issues into the public spotlight in 2007 (Doran; Elsworth). Wikipedia itself has set up standards for content that include expectations of a neutral point of view, verifiability of information, and the publishing of no original research, but Santana and Wood argue that self-policing of these policies is not adequate:

The principle of managerial discretion requires that every actor act from a sense of duty to exercise moral autonomy and choice in responsible ways. When Wikipedia's editors and administrators remain anonymous, this criterion is simply not met. It is assumed that everyone is behaving responsibly within the Wikipedia system, but there are no monitoring or control mechanisms to make sure that this is so, and there is ample evidence that it is not so. (141)

At the theoretical level, some downplay these concerns of transparency and autonomy as logistical issues in lieu of the potential for information systems to support rational discourse and emancipatory forms of communication (Hansen, Berente, and Lyytinen), but others worry that the questionable "realities" created on Wikipedia will become truths once circulated to all areas of the Web (Langlois and Elmer).

With the number of articles on the English-language version of Wikipedia reaching well into the millions, the task of mapping and assessing content has become a tremendous endeavour, one mostly taken on by information systems experts. Kittur, Chi, and Suh have used Wikipedia's existing hierarchical categorisation structure to map change in the site's content over the past few years. Their work revealed that in early 2008 "Culture and the arts" was the most dominant category of content on Wikipedia, representing nearly 30% of total content. People (15%) and geographical locations (14%) represent the next largest categories, while the natural and physical sciences showed the greatest increase in volume between 2006 and 2008 (up 213%, with "Culture and the arts" close behind at 210%).
This data may indicate that contributing to Wikipedia, and thus spreading knowledge, is growing amongst the academic community while maintaining its importance to the greater popular culture-minded community. Further work by Kittur and Kraut has explored the collaborative process of content creation, finding that too many editors on a particular page can reduce the quality of content, even when a project is well coordinated.

Bias in Wikipedia content is a generally acknowledged and somewhat conflicted subject (Giles; Johnson; McHenry). The Wikipedia community has created numerous articles and pages within the site to define and discuss the problem. Citing a survey conducted by the University of Würzburg, Germany, the “Wikipedia:Systemic bias” page describes the average Wikipedian as:

Male
Technically inclined
Formally educated
An English speaker
White
Aged 15-49
From a majority Christian country
From a developed nation
From the Northern Hemisphere
Likely a white-collar worker or student

Bias in content is thought to be perpetuated by this demographic of contributor, and the “founder effect,” a concept from genetics linking the original contributors to this same demographic, has been used to explain the origins of certain biases. Wikipedia’s “About” page discusses the issue as well, in the context of the open platform’s strengths and weaknesses:

in practice editing will be performed by a certain demographic (younger rather than older, male rather than female, rich enough to afford a computer rather than poor, etc.) and may, therefore, show some bias. Some topics may not be covered well, while others may be covered in great depth. No educated arguments against this inherent bias have been advanced.

Royal and Kapila’s study of Wikipedia content tested some of these assertions, finding identifiable bias in both their purposive and random sampling. They conclude that bias favoring larger countries is positively correlated with the size of the country’s Internet population, and corporations with larger revenues work in much the same way, garnering more coverage on the site. The researchers remind us that Wikipedia is “more a socially produced document than a value-free information source” (Royal and Kapila).

WikiProject: Countering Systemic Bias

As a coalition of current Wikipedia editors, the WikiProject: Countering Systemic Bias (WP:CSB) attempts to counter trends in content production and points of view deemed harmful to the democratic ideals of a value-neutral, open online encyclopedia. WP:CSB’s mission is not one of policing the site, but rather deepening it:

Generally, this project concentrates upon remedying omissions (entire topics, or particular sub-topics in extant articles) rather than on either (1) protesting inappropriate inclusions, or (2) trying to remedy issues of how material is presented. Thus, the first question is "What haven't we covered yet?", rather than "how should we change the existing coverage?" (Wikipedia, “Countering”)

The project lays out a number of content areas lacking adequate representation, geographically highlighting the dearth in coverage of Africa, Latin America, Asia, and parts of Eastern Europe. WP:CSB also includes a “members” page that editors can sign to show their support, along with space to voice their opinions on the problem of bias on Wikipedia (the quotations at the beginning of this paper are taken from this “members” page).
At the time of this study, 329 editors had self-selected and self-identified as members of WP:CSB, and this group constitutes the population sample for the current study. To explore the extent to which WP:CSB addressed these self-identified areas for improvement, each editor’s last 50 edits were coded for their primary geographical country of interest, as well as the conceptual category of the page itself (“P” for person/people, “L” for location, “I” for idea/concept, “T” for object/thing, or “NA” for indeterminate). For example, edits to the Wikipedia page for a single person like Tony Abbott (Australian federal opposition leader) were coded “Australia, P”, while an edit for a group of people like the Manchester United football team would be coded “England, P”. Coding was based on information obtained from the header paragraphs of each article’s Wikipedia page. After coding was completed, corresponding information on each country’s associated continent was added to the dataset, based on the United Nations Statistics Division listing.

A total of 15,616 edits were coded for the study. Nearly 32% (n = 4,962) of these edits were on articles for persons or people (see Table A for complete coding results). From within this sub-sample of edits, a majority of the people (68.67%) represented are associated with North America and Europe (Figure A). If we break these statistics down further, nearly half of WP:CSB’s edits concerning people were associated with the United States (36.11%) and England (10.16%), with India (3.65%) and Australia (3.35%) following at a distance. These figures make sense for the English-language Wikipedia; over 95% of the population in the three Westernised countries speaks English, and while India is still often regarded as a developing nation, its colonial British roots and the emergence of a market economy with large, technology-driven cities are logical explanations for its representation here (and some estimates make India the largest English-speaking nation by population on the globe today).

Table A: Coding Results

Total edits: 15,616
(I) Ideas: 2,881 (18.45%)
(L) Location: 2,240 (14.34%)
(T) Thing: 5,200 (33.30%)
(P) People: 4,962 (31.78%)
NA: 333 (2.13%)

People by continent:
Africa: 315 (6.35%)
Asia: 827 (16.67%)
Australia: 175 (3.53%)
Europe: 1,411 (28.44%)
North America: 1,996 (40.23%)
South America: 128 (2.58%)
NA: 110 (2.22%)

The areas of the globe of main concern to WP:CSB proved to be much less represented by the coalition itself. Asia, far and away the most populous continent with more than 60% of the globe’s people (GeoHive), was represented in only 16.67% of edits. Africa (6.35%) and South America (2.58%) were equally underrepresented compared to both their real-world populations (15% and 9% of the globe’s population respectively) and the aforementioned dominance of the advanced Westernised areas. However, while these percentages may seem low, in aggregate they do meet the quota set on the WP:CSB Project Page calling for one out of every twenty edits to be “a subject that is systematically biased against the pages of your natural interests.” By this standard, the coalition is indeed making headway in adding content that strategically counterbalances the natural biases of Wikipedia’s average editor.

Figure A

Social network analysis allows us to visualise multifaceted data in order to identify relationships between actors and content (Vega-Redondo; Watts).
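Before turning to the network visualisations described below, it may help to see how such a bimodal dataset can be assembled. The following sketch is not part of the original study (which used UCINET); it uses Python with the networkx library, and the edit tuples are hypothetical stand-ins that borrow editor and country names mentioned in the article:

```python
# Hypothetical coded-edit tuples: (editor, country of interest, category).
# The edits themselves are invented for illustration only.
from collections import Counter

import networkx as nx

coded_edits = [
    ("Wizzy", "South Africa", "P"),
    ("Wizzy", "England", "L"),
    ("Warofdreams", "U.S.", "P"),
    ("Warofdreams", "Uzbekistan", "I"),
    ("Gallador", "Gabon", "L"),
    ("Gerrit", "Laos", "T"),
]

# Tallies by conceptual category, as in Table A.
print(Counter(category for _, _, category in coded_edits))

# Bimodal (bipartite) network: editors and countries are both nodes;
# an edge records that the editor made an edit coded to that country.
G = nx.Graph()
for editor, country, _ in coded_edits:
    G.add_node(editor, kind="editor")
    G.add_node(country, kind="country")
    G.add_edge(editor, country)

# Degree picks out central countries; editors whose neighbours include
# otherwise poorly connected countries act as core-periphery bridges.
countries = [n for n, data in G.nodes(data=True) if data["kind"] == "country"]
print(sorted(countries, key=G.degree, reverse=True))
```

On real data, the same degree-based reading is what distinguishes the central U.S./England nodes from peripheral nations in the figures that follow.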
Similar to Davis’s well-known sociological study of Southern American socialites in the 1930s (Scott), our Wikipedia coalition can be conceptualised as individual actors united by common interests, and a network of relations can be constructed with software such as UCINET. A mapping algorithm that considers both the relationship between all sets of actors and each actor to the overall collective structure produces an image of our network. This initial network is bimodal, as both our Wikipedia editors and their edits (again, coded for country of interest) are displayed as nodes (Figure B). Edge-lines between nodes represent a relationship, and here that relationship is the act of editing a Wikipedia article. We see from our network that the “U.S.” and “England” hold central positions in the network, with a mass of editors crowding around them. A perimeter of nations is then held in place by their ties to editors through the U.S. and England, with a second layer of editors and poorly represented nations (Gabon, Laos, Uzbekistan, etc.) around the boundaries of the network.

Figure B

We are reminded from this visualisation both of the centrality of the two Western powers even among WP:CSB editors, and of the peripheral nature of most other nations in the world. But we also learn which editors in the project are contributing most to underrepresented areas, and which are less “tied” to the Western core. Here we see “Wizzy” and “Warofdreams” among the second layer of editors who act as a bridge between the core and the periphery; these are editors with interests in both the Western and marginalised nations. Located along the outer edge, “Gallador” and “Gerrit” have no direct ties to the U.S. or England, concentrating all of their edits on less represented areas of the globe. Identifying editors at these key positions in the network will help with future research, informing interview questions that will investigate their interests further, but more significantly, probing motives for participation and action within the coalition.

Additionally, we can break the network down further to discover editors who appear to have similar interests in underrepresented areas. Figure C strips down the network to only editors and edits dealing with Africa and South America, the least represented continents. From this we can easily find three types of editors again: those who have singular interests in particular nations (the outermost layer of editors), those who have interests in a particular region (the second layer moving inward), and those who have interests in both of these underrepresented regions (the center layer in the figure). This last group of editors may prove to be the most crucial to understand, as they are carrying the full load of WP:CSB’s mission.

Figure C

The End of Geography, or the Reclamation?

In The Internet Galaxy, Manuel Castells writes that “the Internet Age has been hailed as the end of geography,” a bold suggestion, but one that has gained traction over the last 15 years as the excitement for the possibilities offered by information communication technologies has often overshadowed structural barriers to participation like the Digital Divide (207). Castells goes on to amend the “end of geography” thesis by showing how global information flows and regional Internet access rates, while creating a new “map” of the world in many ways, are still closely tied to power structures in the analog world. The Internet Age “redefines distance but does not cancel geography” (207).
The work of WikiProject: Countering Systemic Bias emphasises the importance of place and representation in the information environment that continues to be constructed in the online world. This study looked at only a small portion of this coalition’s efforts (~16,000 edits)—a snapshot of their labor frozen in time—which itself is only a minute portion of the information being dispatched through Wikipedia on a daily basis (~125,000 edits). Further analysis of WP:CSB’s work over time, as well as qualitative research into the identities, interests and motivations of this collective, is needed to understand more fully how information bias is understood and challenged in the Internet galaxy. The data here indicate that, for at least a growing few, this is a fight worth fighting.

References

Alexa. “Top Sites.” Alexa.com, n.d. 10 Mar. 2010 ‹http://www.alexa.com/topsites›.
Ayers, Phoebe, Charles Matthews, and Ben Yates. How Wikipedia Works: And How You Can Be a Part of It. San Francisco, CA: No Starch, 2008.
Bruns, Axel. Blogs, Wikipedia, Second Life, and Beyond: From Production to Produsage. New York: Peter Lang, 2008.
Butler, Brian, Elisabeth Joyce, and Jacqueline Pike. Don’t Look Now, But We’ve Created a Bureaucracy: The Nature and Roles of Policies and Rules in Wikipedia. Paper presented at the 2008 CHI Annual Conference, Florence.
Castells, Manuel. The Internet Galaxy: Reflections on the Internet, Business, and Society. Oxford: Oxford UP, 2001.
Cohen, Noam. “Wikipedia.” New York Times, n.d. 12 Mar. 2010 ‹http://www.nytimes.com/info/wikipedia/›.
Doran, James. “Wikipedia Chief Promises Change after ‘Expert’ Exposed as Fraud.” The Times, 6 Mar. 2007 ‹http://technology.timesonline.co.uk/tol/news/tech_and_web/article1480012.ece›.
Edwards, Lin. “Report Claims Wikipedia Losing Editors in Droves.” Physorg.com, 30 Nov. 2009. 12 Feb. 2010 ‹http://www.physorg.com/news178787309.html›.
Elsworth, Catherine. “Fake Wikipedia Prof Altered 20,000 Entries.” London Telegraph, 6 Mar. 2007 ‹http://www.telegraph.co.uk/news/1544737/Fake-Wikipedia-prof-altered-20000-entries.html›.
Forte, Andrea, Vanessa Larco, and Amy Bruckman. “Decentralization in Wikipedia Governance.” Journal of Management Information Systems 26 (2009): 49-72.
Giles, Jim. “Internet Encyclopedias Go Head to Head.” Nature 438 (2005): 900-901.
Hansen, Sean, Nicholas Berente, and Kalle Lyytinen. “Wikipedia, Critical Social Theory, and the Possibility of Rational Discourse.” The Information Society 25 (2009): 38-59.
Hertel, Guido, Sven Niedner, and Stefanie Herrmann. “Motivation of Software Developers in Open Source Projects: An Internet-Based Survey of Contributors to the Linux Kernel.” Research Policy 32 (2003): 1159-1177.
Johnson, Bobbie. “Rightwing Website Challenges ‘Liberal Bias’ of Wikipedia.” The Guardian, 1 Mar. 2007. 8 Mar. 2010 ‹http://www.guardian.co.uk/technology/2007/mar/01/wikipedia.news›.
Kane, Gerald C., Ann Majchrzak, Jeremiah Johnson, and Lily Chenisern. A Longitudinal Model of Perspective Making and Perspective Taking within Fluid Online Collectives. Paper presented at the 2009 International Conference on Information Systems, Phoenix, AZ, 2009.
Kittur, Aniket, Ed H. Chi, and Bongwon Suh. What’s in Wikipedia? Mapping Topics and Conflict Using Socially Annotated Category Structure. Paper presented at the 2009 CHI Annual Conference, Boston, MA.
———, and Robert E. Kraut. Harnessing the Wisdom of Crowds in Wikipedia: Quality through Collaboration. Paper presented at the 2008 Association for Computing Machinery’s Computer Supported Cooperative Work Annual Conference, San Diego, CA.
Konieczny, Piotr. “Governance, Organization, and Democracy on the Internet: The Iron Law and the Evolution of Wikipedia.” Sociological Forum 24 (2009): 162-191.
———. “Wikipedia: Community or Social Movement?” Interface: A Journal for and about Social Movements 1 (2009): 212-232.
Langlois, Ganaele, and Greg Elmer. “Wikipedia Leeches? The Promotion of Traffic through a Collaborative Web Format.” New Media & Society 11 (2009): 773-794.
Lih, Andrew. The Wikipedia Revolution. New York, NY: Hyperion, 2009.
McHenry, Robert. “The Real Bias in Wikipedia: A Response to David Shariatmadari.” OpenDemocracy.com 2006. 8 Mar. 2010 ‹http://www.opendemocracy.net/media-edemocracy/wikipedia_bias_3621.jsp›.
Middleton, Chris. “The World of Wikinomics.” Computer Weekly, 20 Jan. 2009: 22-26.
Oreg, Shaul, and Oded Nov. “Exploring Motivations for Contributing to Open Source Initiatives: The Roles of Contribution, Context and Personal Values.” Computers in Human Behavior 24 (2008): 2055-2073.
Osterloh, Margit, and Sandra Rota. “Trust and Community in Open Source Software Production.” Analyse & Kritik 26 (2004): 279-301.
Royal, Cindy, and Deepina Kapila. “What’s on Wikipedia, and What’s Not…?: Assessing Completeness of Information.” Social Science Computer Review 27 (2008): 138-148.
Santana, Adele, and Donna J. Wood. “Transparency and Social Responsibility Issues for Wikipedia.” Ethics of Information Technology 11 (2009): 133-144.
Schroer, Joachim, and Guido Hertel. “Voluntary Engagement in an Open Web-Based Encyclopedia: Wikipedians and Why They Do It.” Media Psychology 12 (2009): 96-120.
Scott, John. Social Network Analysis. London: Sage, 1991.
Vega-Redondo, Fernando. Complex Social Networks. Cambridge: Cambridge UP, 2007.
Viegas, Fernanda B., Martin Wattenberg, and Matthew M. McKeon. “The Hidden Order of Wikipedia.” Online Communities and Social Computing (2007): 445-454.
Watts, Duncan. Six Degrees: The Science of a Connected Age. New York, NY: W. W. Norton & Company, 2003.
Wikipedia. “About.” n.d. 8 Mar. 2010 ‹http://en.wikipedia.org/wiki/Wikipedia:About›.
———. “Welcome to Wikipedia.” n.d. 8 Mar. 2010 ‹http://en.wikipedia.org/wiki/Main_Page›.
———. “Wikiproject: Countering Systemic Bias.” n.d. 12 Feb. 2010 ‹http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Countering_systemic_bias#Members›.
Zittrain, Jonathan. The Future of the Internet and How to Stop It. New Haven, CT: Yale UP, 2008.
APA, Harvard, Vancouver, ISO, and other styles
37

Hinner, Kajetan. "Statistics of Major IRC Networks." M/C Journal 3, no. 4 (August 1, 2000). http://dx.doi.org/10.5204/mcj.1867.

Full text
Abstract:
Internet Relay Chat (IRC) is a text-based computer-mediated communication (CMC) service in which people can meet and chat in real time. Most chat occurs in channels named for a specific topic, such as #usa or #linux. A user can take part in several channels when connected to an IRC network. For a long time the only major IRC network available was EFnet, founded in 1990. Over the 1990s three other major IRC networks developed, Undernet (1993), DALnet (1994) and IRCnet (which split from EFnet in June 1996). Several causes led to the separate development of IRC networks: fast growth of user numbers, poor scalability of the IRC protocol, and content disagreements, like allowing or prohibiting 'bot programs. Today we are experiencing the development of regional IRC networks, such as BrasNet for Brazilian users, and increasing regionalisation of the global networks -- IRCnet users are generally European, EFnet users generally from the Americas and Australia. All persons connecting to an IRC network at one time create that IRC network's user space. People are constantly signing on and off each network. The total number of users who have ever been to a specific IRC network could be called its 'social space', and an IRC network's social space is by far larger than its user space at any one time. Although there has been research on IRC almost from its beginning (it was developed in 1988, and the first research was made available in late 1991 (Reid)), resources on quantitative development are rare. To rectify this situation, a quantitative data logging 'bot program -- Socip -- was created and set to run on various IRC networks. Socip has been running for almost two years on several IRC networks, giving Internet researchers empirical data on the quantitative development of IRC.

Methodology

Any approach to gathering quantitative data on IRC needs to fulfil the following tasks:
Store the number of users that are on an IRC network at a given time, e.g. every five minutes;
Store the number of channels; and,
Store the number of servers.

It is possible to get this information using the '/lusers' command on an IRC-II client, entered by hand. This approach yields results as in Table 1.

Table 1: Number of IRC users on January 31st, 1995
Date      Time   Users  Invisible  Servers  Channels
31.01.95  10:57  2737   2026       93       1637

During the first months of 1995, it was even possible to get all user information using the '/who **' command. However, on current major IRC networks with greater than 50000 users this method is denied by the IRC server program, which terminates the connection because the client is too slow to accept that amount of data. Added to this problem is the fact that collecting these data manually is an exhausting and repetitive task, better suited to automation. Three approaches to automation were attempted in the development process.

The 'Eggdrop' approach

The 'Eggdrop' 'bot is one of the best-known IRC 'bot programs. Once programmed, 'bots can act autonomously on an IRC network, and Eggdrop was considered particularly convenient because customised modules could be easily installed. However, testing showed that the Eggdrop 'bot was unsuitable for two reasons. The first was technical: for reasons undetermined, all Eggdrop modules created extensive CPU usage, making it impossible to run several Eggdrops simultaneously to research a number of IRC networks. The second reason had to do with the statistics to be obtained.
The objective was to get a snapshot of current IRC users and IRC channel use every five minutes, written into an ASCII file. It was impossible to extend Eggdrop's possibilities in such a way that it would periodically submit the '/lusers' command and write the received data into a file. For these reasons, and some security concerns, the Eggdrop approach was abandoned.

The ircII script approach

ircII was a UNIX IRC client with its own scripting language, making it possible to write command files which periodically submit the '/lusers' command to any chosen IRC server and log the command's output. Four different scripts were used to monitor IRCnet, EFnet, DALnet and Undernet from January to October 1998. These scripts were named Socius_D, Socius_E, Socius_I and Socius_U (depending on the network). Every hour each script stored the number of users and channels in a logfile (examinable using another script written in the Perl language). There were some drawbacks to the ircII script approach. While the need for a terminal to run on could be avoided using the 'screen' package -- making it possible to start ircII, run the scripts, detach, and log off again -- it was impossible to restart ircII and the scripts using an automatic task-scheduler. Thus periodic manual checks were required to find out if the scripts were still running and to restart them if needed (e.g. if the server connection was lost). These checks showed that at least one script would not be running after 10 hours. Additional disadvantages were the lengthy log files and the necessity of providing a second program to extract the log file data and write it into a second file from which meaningful graphs could be created. The failure of the Eggdrop and ircII scripting approaches led to the solution still in use today.

Perl script-only approach

Perl is a powerful script language for handling file-oriented data when speed is not extremely important. Its version 5 flavour supports expansion through a wide range of modules, including the Net::IRC package. The object-oriented Perl interface enables Perl scripts to connect to an IRC server and use the basic IRC commands. The Socip.pl program includes all server definitions needed to create connections. Socip is currently monitoring ten major IRC networks, including DALnet, EFnet, IRCnet, the Microsoft Network, Talkcity, Undernet and Galaxynet. When run, the "Social science IRC program" selects a nickname from its list corresponding to the network -- for EFnet, the first nickname used is Socip_E1. It then functions somewhat like a 'bot. Using that nickname, Socip tries to create an IRC connection to a server of the given network. If there is no failure, handlers are set up which take care of proper reactions to IRC server messages (such as ping-pong, message output and reply). Socip then joins the channel #hose (the name has no special meaning), a maintenance channel with the additional effect that real persons meet the 'bot and try to interact with it every now and then. Those interactions are logged too. Sitting in that channel, the script sleeps periodically and checks if a certain time span has passed (the default is five minutes). After that, the '/lusers' command's output is stored in a data file for each IRC network and the IRC network's RRD (round robin database) file is updated. This database, which is organised chronologically, offers great detail for recent events and more condensed information for older events. User and channel information younger than 10 days is stored in five-minute detail.
If older than two years, the same information is automatically averaged and stored in per-day resolution. In case of network problems, Socip acts as necessary. For example, it recognises a connection termination and tries to reconnect after pausing, using the next nickname on the list. This prevents nickname collision problems. If the IRC server does not respond to '/lusers' commands three times in a row, the next server on the list is accessed. Special (crontab-invoked) scripts take care of restarting Socip when necessary, as in termination of the script because of network problems, IRC operator kill or power failure. After a reboot all scripts are automatically restarted. All monitoring is done on a Linux machine (Pentium 120, 32 MB, Debian Linux 2.1) which is up all the time. Processor load is not extensive, and this machine also acts as the Sociology Department's WWW server.

Graphs creation

Graphs can be created from the data in Socip's RRD files. This task is done using the MRTG (multi router traffic grapher) program by Tobias Oetiker. A script updates all IRC graphs four times a day. Usage of each IRC network is visualised through five graphs: daily, weekly and monthly users and channels, accompanied by two graphs showing all known data on users/channels and servers. All this information is continuously published on the World Wide Web at http://www.hinner.com/ircstat.

Figures

The following samples demonstrate what information can be produced by Socip. As already mentioned, graphs of all monitored networks are updated four times a day, with five graphs for each IRC network. Figure 1 shows the rise of EFnet users from about 40000 in November 1998 to 65000 in July 2000. Sampled data oscillate around an average value, which results from the different time zones of users.

Fig. 1: EFnet - Users and Channels since November 1998

Figure 2 illustrates the decrease of interconnected EFnet servers over the years. Each server is now handling more and more users. Reasons for taking IRC servers off the net are security concerns (attacks on the server by malicious persons), new payment schemes, and maintenance and cost effort.

Fig. 2: EFnet - Servers since November 1998

A nice example of a heavily changing weekly graph is Figure 3, which shows peaks shortly before 6pm CEST and almost no users shortly after midnight.

Fig. 3: Galaxynet: Weekly Graph (July, 15th-22nd, 2000)

The daily graph portrays usage variations with even more detail. Figure 4 is taken from Undernet user and channel data. The vertical gap in the graph indicates missing data, caused by a net split or other network problems.

Fig. 4: Undernet: Daily Graph: July, 22nd, 2000

The final example (Figure 5) shows a weekly graph of the Webchat (http://www.webchat.org) network. It can be seen that every day the user count varies from 5000 to nearly 20000, and that channel numbers fluctuate in concert, from 2500 to 5000.

Fig. 5: Webchat: Monthly graph, Week 24-29, 2000

Not every IRC user is connected all the time to an IRC network. This figure may have increased lately with more and more flat rates and cheap Internet access offers, but in general most users will sign off the network after some time. This is why IRC is a very dynamic society, with its membership constantly in flux.
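To make the collection workflow concrete, here is a minimal sketch of a Socip-style polling loop. It is written in Python with the standard socket library rather than the original Perl/Net::IRC, so it is an analogue, not the author's code; the server name, nickname and file name are placeholders. It relies on the standard numeric replies 251 (RPL_LUSERCLIENT) and 254 (RPL_LUSERCHANNELS) that IRC servers send in answer to LUSERS:

```python
# Hedged Python analogue of Socip's five-minute polling loop.
# A real deployment would also answer server PINGs and implement the
# reconnect/nickname-rotation logic described above.
import re
import socket
import time

SERVER, PORT, NICK = "irc.example.net", 6667, "Socip_X1"  # hypothetical

def poll_lusers(sock):
    """Send LUSERS and scrape totals from the standard numeric replies."""
    sock.sendall(b"LUSERS\r\n")
    users = channels = None
    buf = ""
    while users is None or channels is None:
        buf += sock.recv(4096).decode("latin-1", "replace")
        for line in buf.splitlines():
            if " 251 " in line:  # ":server 251 nick :There are N users ..."
                digits = re.findall(r"\d+", line.split(":", 2)[-1])
                users = int(digits[0]) if digits else users
            elif " 254 " in line:  # ":server 254 nick N :channels formed"
                channels = int(line.split(" 254 ")[1].split()[1])
    return users, channels

sock = socket.create_connection((SERVER, PORT))
sock.sendall(f"NICK {NICK}\r\nUSER socip 0 * :stats logger\r\n".encode())
time.sleep(10)  # crude: wait for registration before issuing commands

while True:
    users, channels = poll_lusers(sock)
    # Socip updates an RRD at this point (five-minute step, roughly ten
    # days of full detail, per-day averages beyond that); a flat log
    # file keeps this sketch self-contained.
    with open("ircstats.log", "a") as log:
        log.write(f"{int(time.time())} {users} {channels}\n")
    time.sleep(300)  # the article's default five-minute interval
```

The round-robin storage is the design choice that keeps the dataset bounded: recent samples stay at full resolution while older ones are consolidated into averages, exactly the trade-off the article describes.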
Maximum user counts only give the highest number of members who were simultaneously online at some point, and one could only guess at the number of total users of the network -- that is, including those who are using that IRC service but are not signed on at that time. To answer these questions, more thorough investigation is necessary. Then inflows and outflows might be more readily estimated. Table 2 shows the all-time maximum user counts of seven IRC networks, compared to the average numbers of IRC users of the four major IRC networks during the third quarter of 1998 (based on available data).

Table 2: Maximum user counts of selected IRC networks
             DALnet  EFnet  GalaxyNet  IRCnet  MS Chat  Undernet  Webchat
Max. 2000    64276   64309  15253      65340   17392    60210     19793
3rd Q. 1998  21000   37000  n/a        24500   n/a      24000     n/a

Compared with the 200-300 users in 1991 and the 7000 IRC chatters in 1994, the recent growth is certainly extraordinary: it adds up to a total of 306573 users across all monitored networks. It can be expected that the 500000 IRC user threshold will be passed some time during the year 2001. As a final remark, it should be said that obviously Web-based chat systems will be more and more common in the future. These chat services do not use standard IRC protocols, and will be very hard to monitor. Given that these systems are already quite popular, the actual number of chat users in the world could have already passed the half-million landmark.

References

Reid, Elizabeth. "Electropolis: Communications and Community on Internet Relay Chat." Unpublished Honours Dissertation. U of Melbourne, 1991.

The Socip program can be obtained at no cost from http://www.hinner.com. Most IRC networks can be accessed with the original Net::Irc Perl extension, but for some special cases (e.g. Talkcity) an extended version is needed, which can also be found there.

Citation reference for this article
MLA style: Kajetan Hinner. "Statistics of Major IRC Networks: Methods and Summary of User Count." M/C: A Journal of Media and Culture 3.4 (2000). [your date of access] ‹http://www.api-network.com/mc/0008/count.php›.
Chicago style: Kajetan Hinner, "Statistics of Major IRC Networks: Methods and Summary of User Count," M/C: A Journal of Media and Culture 3, no. 4 (2000), ‹http://www.api-network.com/mc/0008/count.php› ([your date of access]).
APA style: Kajetan Hinner. (2000) Statistics of major IRC networks: methods and summary of user count. M/C: A Journal of Media and Culture 3(4). ‹http://www.api-network.com/mc/0008/count.php› ([your date of access]).
APA, Harvard, Vancouver, ISO, and other styles
38

Ruch, Adam, and Steve Collins. "Zoning Laws: Facebook and Google+." M/C Journal 14, no. 5 (October 18, 2011). http://dx.doi.org/10.5204/mcj.411.

Full text
Abstract:
As the single most successful social-networking Website to date, Facebook has caused a shift in both the practice and perception of online socialisation, and in its relationship to the offline world. While not the first online social networking service, Facebook's user base dwarfs those of its nearest competitors. Mark Zuckerberg's creation boasts more than 750 million users (Facebook). The currently ailing MySpace claimed a ceiling of 100 million users in 2006 (Cashmore). Further, the accuracy of this number has been contested due to a high proportion of fake or inactive accounts. Facebook, by contrast, claims 50% of its user base logs in at least once a day (Facebook). The popular and mainstream uptake of Facebook has shifted social use of the Internet from various and fragmented niche groups towards a common hub or portal around which much everyday Internet use is centred. The implications are many, but this paper will focus on the progress of what Mimi Marinucci terms the "Facebook effect" (70) and the evolution of lists as a filtering mechanism representing one's social zones within Facebook. This is in part inspired by the launch of Google's new social networking service Google+, which includes "circles" as a fundamental design feature for sorting contacts. Circles are an acknowledgement of the shortcomings of the single, unified friends list that defines the Facebook experience. These lists and circles are both manifestations of the same essential concept: our social lives are, in fact, divided into various zones defined neither by an online/offline dichotomy, nor by fantasy role-play, deviant sexual practices, or other marginal or minority interests. What the lists and circles demonstrate is that even very common, mainstream people occupy different roles in everyday life, and that to be effective social tools, social networking sites must grant users control over their various identities and over who knows what about them. Even so, the very nature of computer-based social tools leads to problematic definitions of identities and relationships using discrete terms, in contrast to more fluid, performative constructions of an individual and their relations to others.

Building the Monolith

In 1995, Sherry Turkle wrote that "the Internet has become a significant social laboratory for experimenting with the constructions and reconstructions of self that characterize postmodern life" (180). Turkle describes the various deliberate acts of persona creation possible online, in contrast to earlier constraints placed upon the "cycling through different identities" (179). In the past, Turkle argues, "lifelong involvement with families and communities kept such cycling through under fairly stringent control" (180). In effect, Turkle was documenting the proliferation of identity games early adopters of Internet technologies played through various means. Much of what Turkle focused on were MUDs (Multi-User Dungeons) and MOOs (MUD Object Oriented), explicit play-spaces that encouraged identity-play of various kinds. Her contemporary Howard Rheingold focused on what may be described as the more "true to life" communities of the WELL (Whole Earth 'Lectronic Link) (1–38). In particular, Rheingold explored a community established around the shared experience of parenting, especially of young children. While that community was not explicitly built on the notion of role-play, the parental identity was an important quality of community members.
Unlike contemporary social media networks, these early communities were built on discrete platforms. MUDs, MOOs, Bulletin Board Systems, UseNet Groups and other early Internet communication platforms were generally hosted independently of one another, and in some cases even had to be dialled into separately via modem (such as the WELL). The Internet was a truly disparate entity in 1995. The discreteness of each community supported the cordoning off of individual roles or identities between them. Thus, an individual could quite easily be "Pete", a member of the parental WELL group, and "Gorak the Destroyer", a role-player on a fantasy MUD, without the two roles ever being associated with each other. As Turkle points out, even within each MUD ample opportunity existed to play multiple characters (183–192). With only a screen name and associated description to identify an individual within the MUD environment, nothing technical existed to connect one player's multiple identities, even within the same community.

As the Internet has matured, however, the tendency has been shifting towards monolithic hubs, a notion of collecting all of "the Internet" together. From a purely technical and operational perspective, this has led to the emergence of the ISP (Internet service provider). Users can make a connection to one point, and then be connected to everything "on the Net", instead of individually dialling into servers and services one at a time as was the case in the early 1980s with companies such as Prodigy, the Source, CompuServe, and America Online (AOL). The early information service providers were largely walled gardens. A CompuServe user could only access information on the CompuServe network. Eventually the Internet became the network of choice and services migrated to it. Standards such as HTTP for Web page delivery and SMTP for email became established and dominate the Internet today. Technically, this has made the Internet much easier to use. The services that have developed on this more rationalised and unified platform have also tended toward monolithic, centralised architectures, despite the Internet's apparent fundamental lack of a hierarchy.

As the Internet replaced the closed networks, the wider Web of HTTP pages, forums, mailing lists and other forms of Internet communication and community thrived. Perhaps they required slightly more technological savvy than the carefully designed experience of walled-garden ISPs such as AOL, but these fora and IRC (Internet Relay Chat) rooms still provided the discrete environments within which to role-play. An individual could hold dozens of login names to as many different communities. These various niches could be simply hobby sites and forums where a user would deploy their identity as model train enthusiast, musician, or pet owner. They could also be explicitly about role-play, continuing the tradition of MUDs and MOOs into the new millennium. Pseudo- and polynymity were still very much part of the Internet experience. Even into the early parts of the so-called Web 2.0 explosion of more interactive Websites which allowed for easier dialogue between site owner and viewer, a given identity would be very much tied to a single site, blog or even individual comments. There was no "single sign-on" to link my thread from a music forum to the comments I made on a videogame blog to my aquarium photos at an image gallery site. Today, Facebook and Google, among others, seek to change all that.
The Facebook Effect

Working from a psychological background, Turkle explored the multiplicity of online identities as a valuable learning, even therapeutic, experience. She assessed the experiences of individuals who were coming to terms with aspects of their own personalities, from simple shyness to exploring their sexuality. In "You Can't Front on Facebook," Mimi Marinucci summarises an analysis of online behaviour by another psychologist, John Suler (67–70). Suler observed an "online disinhibition effect" characterised by users' tendency to express themselves more openly online than offline (321). Awareness of this effect was drawn (no pun intended) into popular culture by cartoonist Mike Krahulik's protagonist John Gabriel. Although Krahulik's summation is straight to the point, Suler offers a more considered explanation. There are six general reasons for the online disinhibition effect: being anonymous, being invisible, the communications being out of sync, the strange sensation that a virtual interlocutor is all in the mind of the user, the general sense that the online world simply is not real, and the minimisation of status and authority (321–325). Of the six, the notion of anonymity is the most problematic, as briefly explored above in the case of AOL. The role of pseudonymity has been explored in more detail in Ruch, and will be considered with regard to Facebook and Google+ below.

The Facebook effect, Marinucci argues, mitigates all six of these issues. Though Marinucci explains the mitigation of each factor individually, her final conclusion is the most compelling reason: "Facebook often facilitates what is best described as an integration of identities, and this integration of identities in turn functions as something of an inhibiting factor" (73). Ruch identifies this phenomenon as the "aggregation of identities" (219). Similarly, Brady Robards observes that "social network sites such as MySpace and Facebook collapse the entire array of social relationships into just one category, that of 'Friend'" (20). Unlike earlier community sites, Ruch notes, "Facebook rejects both the mythical anonymity of the Internet, but also the actual pseudo- or polynonymous potential of the technologies" (219). Essentially, Facebook works to bring the offline social world online, along with all the conventional baggage that accompanies the individual's real-world social life.

Facebook, and now Google+, present a hard, dichotomous approach to online identity: anonymous and authentic. Their socially networked individual is the "real" one, using a person's given name, and bringing all (or as many as the sites can capture) of their contacts from the offline world into the online one, regardless of context. The Facebook experience is one of "friending" everyone one has any social contact with into one homogeneous group. Not only is Facebook avoiding the multiple online identities that interested Turkle, but it is disregarding any multiplicity of identity anywhere, including any online/offline split. David Kirkpatrick reports that Mark Zuckerberg's rejection of this construction of identity is explained by his belief that "You have one identity … having two identities for yourself is an example of a lack of integrity" (199). Arguably, Zuckerberg's calls for accountability through identity continue a perennial concern for anonymity online fuelled by "on the Internet no one knows you're a dog" style moral panics.
Over two decades ago, Lindsy Van Gelder recounted the now infamous case of "Joan and Alex" (533) and Julian Dibbell recounted "a rape in cyberspace" (11). More recent anxieties concern the hacking escapades of Anonymous and LulzSec. Zuckerberg's approach has been criticised by Christopher Poole, the founder of 4chan—a bastion of Internet anonymity. During his keynote presentation at South by SouthWest 2011, Poole argued that Zuckerberg "equates anonymity with a lack of authenticity, almost a cowardice." Yet in spite of these objections, Facebook has mainstream appeal. From a social constructivist perspective, this approach to identity would be satisfying the (perceived?) need for a mainstream, context-free, general social space online to cater for the hundreds of millions of people who now use the Internet. There is no specific, pre-defined reason to join Facebook in the way there is a particular reason to join a heavy metal music message board. Facebook is catering to the need to bring "real" social life online generally, with "real" in this case meaning "offline and pre-existing." Very real risks of missing "real life" social events (engagements, new babies, party invitations, etc.) that were shared primarily via Facebook became salient to large groups of individuals not consciously concerned with some particular facet of identity performance.

The commercial imperatives towards a monolithic Internet and identity are obvious. Given that both Facebook and Google+ are in the business of facilitating the sale of advertising, their core business value is the demographic information they can sell to various companies for targeted advertising. Knowing a user's individual identity and tastes is extremely important to those in the business of selling consumers what they currently want as well as predicting their future desires. The problem with this is the dawning realisation that, even for the average person, role-playing is part of everyday life. We simply aren't the same person in all contexts. None of the roles we play need to be particularly scandalous for this to be true, but we have different comfort zones with people that are fuelled by context. Suler proposes and Marinucci confirms that inhibition may be just as much part of our authentic self as the uninhibited expression experienced in more anonymous circumstances. Further, different contexts will inform what we inhibit and what we express. It is not as though there is a simple binary between two different groups and two different personal characteristics to oscillate between. The inhibited persona one occupies at one's grandmother's home is different from the inhibited self one plays at a job interview or in a heated discussion with faculty members at a university. The first calls for politeness, the second for professionalism, the third for scholarly decorum—yet they all restrain the individual in different ways.

The Importance of Control over Circles

Google+ is Google's latest foray into the social networking arena. Its previous ventures, Orkut and Google Buzz, did not fare well; both were variously marred by legal issues concerning privacy, security, SPAM and hate groups. Buzz in particular fell afoul of associating Google accounts with users' real life identities, and (as noted earlier) all the baggage that comes with it. "One user blogged about how Buzz automatically added her abusive ex-boyfriend as a follower and exposed her communications with a current partner to him.
Other bloggers commented that repressive governments in countries such as China or Iran could use Buzz to expose dissidents" (Novak). Google+ takes a different approach to its predecessors and to its main rival, Facebook. Facebook allows for the organisation of "friends" into lists, and individuals can span more than one list. This is an exercise analogous to what Erving Goffman refers to as "audience segregation" (139). According to the site's own statistics, the average Facebook user has 130 friends, so we anticipate it would be time-consuming to organise one's friends according to real life social contexts. Yet without such organisation, Facebook overlooks the social structures and concomitant behaviours inherent in everyday life. Even broad groups offer little assistance. For example, an academic's "Work People" list may include the Head of Department as well as numerous other lecturers with whom a workspace is shared. There are things one might share with immediate colleagues that should not be shared with the Head of Department. As Goffman states, "when audience segregation fails and an outsider happens upon a performance that was not meant for him, difficult problems in impression management arise" (139). By homogenising "friends" and social contexts, users are either inhibited or run the risk of some future awkward encounters.

Google+ utilises "circles" as its method for organising contacts. The graphical user interface is intuitive, facilitated by an easy drag-and-drop function. Use of "circles" already exists in the vocabulary used to describe our social structures; "list", by contrast, reduces the subject matter to simple data. The utility of Facebook's friends lists is hindered by usability issues—an unintuitive and convoluted process that was added to Facebook well after its launch, perhaps a reaction to privacy concerns rather than a genuine attempt to emulate social organisation. For a cogent breakdown of these technical and design problems see Augusto Sellhorn. Organising friends into lists is a function offered by Facebook, but Google+ takes a different approach: organising friends in circles is a central feature; the whole experience is centred around attempting to mirror the social relations of real life. Google's promotional video explains the centrality of emulating "real life relationships" (Google). Effectively, Facebook and Google+ have adopted two different systemic approaches to dealing with the same issue. Facebook places the burden of organising a homogeneous mass of "friends" into lists on the user, as an afterthought of connecting with another user. In contrast, Google+ builds organisation into the act of connecting.

Whilst Google+'s approach is more intuitive and designed to facilitate social networking that more accurately reflects how real life social relationships are structured, it suffers from forcing a direct correlation between an account and the account holder. That is, use of Google+ mandates bringing the offline online. Google+ operates a real names policy, and on the weekend of 23 July 2011 suspended a number of accounts for violation of Google's Community Standards. A suspension notice posted by Violet Blue reads: "After reviewing your profile, we determined the name you provided violates our Community Standards." Open Source technologist Kirrily Robert polled 119 Google+ users about their experiences with the real names policy.
The results posted to her blog reveal that users desire pseudonymity, many for reasons of privacy and/or safety rather than from the lack of integrity supposed by Zuckerberg. boyd argues that Google's real names policy is an abuse of power and poses danger to those users employing "nicks" for reasons including being in government employment or being the victim of stalking, rape or domestic abuse. A comprehensive list of those at risk has been posted to the Geek Feminism Wiki (ironically, the Wiki utilises "Connect", Facebook's attempt at a single sign-on solution for the Web that connects users' movements with their Facebook profile). Facebook has a culture of real names stemming from its early adopters drawn from trusted communities, and this culture became a norm for that service (boyd). But as boyd also points out, "[r]eal names are by no means universal on Facebook." Google+ demands real names, a demand justified by the rhetoric of designing a social networking system that is more like real life. "Real", in this case, is represented by one's given name—irrespective of the authenticity of one's pseudonym or the complications and dangers of using one's given name.

Conclusion

There is a multiplicity of issues concerning social networks and identities, privacy and safety. This paper has outlined the challenges involved in moving real life to the online environment and the contests in trying to designate zones of social context. Where some earlier research into the social Internet has had a positive (even utopian) feel, the contemporary Internet is increasingly influenced by powerful and competing corporations. As a result, the experience of the Internet is not necessarily as flexible as Turkle or Rheingold might have envisioned. Rather than conducting identity experimentation or exercising multiple personae, we are increasingly obligated to perform identity as it is defined by monolithic service providers such as Facebook and Google+. This is not purely an indictment of Facebook's or Google's corporate drive, though they are obviously implicated, but has as much to do with the new social practice of "being online." So, while there are myriad benefits to participating in this new social context, as Poole noted, the "cost of failure is really high when you're contributing as yourself." Areas for further exploration include the implications of Facebook positioning itself as a general-purpose user authentication tool whereby users can log into a wide array of Websites using their Facebook credentials. If Google were to take a similar action, the implications would be even more convoluted, given the range of other services Google offers, from GMail to the Google Checkout payment service. While the monolithic centralisation of these services will have obvious benefits, there will be many more subtle problems which must be addressed.

References

Blue, Violet. "Google Plus Deleting Accounts en Masse: No Clear Answers." zdnet.com (2011). 10 Aug. 2011 ‹http://www.zdnet.com/blog/violetblue/google-plus-deleting-accounts-en-masse-no-clear-answers/56›.
boyd, danah. "Real Names Policies Are an Abuse of Power." zephoria.org (2011). 10 Aug. 2011 ‹http://www.zephoria.org/thoughts/archives/2011/08/04/real-names.html›.
Cashmore, Pete. "MySpace Hits 100 Million Accounts." mashable.com (2006). 10 Aug. 2011 ‹http://mashable.com/2006/08/09/myspace-hits-100-million-accounts›.
Dibbell, Julian. My Tiny Life: Crime and Passion in a Virtual World. New York: Henry Holt & Company, 1998.
Facebook. "Fact Sheet." Facebook (2011). 10 Aug. 2011 ‹http://www.facebook.com/press/info.php?statistic›.
Geek Feminism Wiki. "Who Is Harmed by a Real Names Policy?" 2011. 10 Aug. 2011 ‹http://geekfeminism.wikia.com/wiki/Who_is_harmed_by_a_%22Real_Names%22_policy›.
Goffman, Erving. The Presentation of Self in Everyday Life. London: Penguin, 1959.
Google. "The Google+ Project: Explore Circles." Youtube.com (2011). 10 Aug. 2011 ‹http://www.youtube.com/watch?v=ocPeAdpe_A8›.
Kirkpatrick, David. The Facebook Effect. New York: Simon & Schuster, 2010.
Marinucci, Mimi. "You Can't Front on Facebook." Facebook and Philosophy. Ed. Dylan Wittkower. Chicago & La Salle, Illinois: Open Court, 2010. 65–74.
Novak, Peter. "Privacy Commissioner Reviewing Google Buzz." CBC News: Technology and Science (2010). 10 Aug. 2011 ‹http://www.cbc.ca/news/technology/story/2010/02/16/google-buzz-privacy.html›.
Poole, Christopher. Keynote presentation. South by SouthWest. Austin, Texas, 2011.
Rheingold, Howard. The Virtual Community: Homesteading on the Electronic Frontier. New York: Harper Perennial, 1993.
Robards, Brady. "Negotiating Identity and Integrity on Social Network Sites for Educators." International Journal for Educational Integrity 6.2 (2010): 19–23.
Robert, Kirrily. "Preliminary Results of My Survey of Suspended Google Accounts." 2011. 10 Aug. 2011 ‹http://infotrope.net/2011/07/25/preliminary-results-of-my-survey-of-suspended-google-accounts/›.
Ruch, Adam. "The Decline of Pseudonymity." Posthumanity. Eds. Adam Ruch and Ewan Kirkland. Oxford: Inter-Disciplinary.net Press, 2010. 211–220.
Sellhorn, Augusto. "Facebook Friend Lists Suck When Compared to Google+ Circles." sellmic.com (2011). 10 Aug. 2011 ‹http://sellmic.com/blog/2011/07/01/facebook-friend-lists-suck-when-compared-to-googleplus-circles›.
Suler, John. "The Online Disinhibition Effect." CyberPsychology and Behavior 7 (2004): 321–326.
Turkle, Sherry. Life on the Screen: Identity in the Age of the Internet. New York: Simon & Schuster, 1995.
Van Gelder, Lindsy. "The Strange Case of the Electronic Lover." Computerization and Controversy: Value Conflicts and Social Choices. Ed. Rob Kling. New York: Academic Press, 1996. 533–46.
APA, Harvard, Vancouver, ISO, and other styles
39

Forsyth, Ellen. "Collecting Community Stories: Local Studies Collections and What They Can Tell You About the Community." M/C Journal 22, no. 3 (June 19, 2019). http://dx.doi.org/10.5204/mcj.1523.

Full text
Abstract:
Introduction

This article investigates how local studies collections in public libraries can help people explore the experiences of regional Australia. Some of this discovery can be done online but, as not all local studies material has been catalogued, is online, or is available in a digital format, some of this exploration will need to happen onsite at public libraries throughout Australia. This exploration could be combined with other investigations into regional areas. What are local studies collections in public libraries? These collections are defined as being

inclusive of local history and so the local collection should support studies that look at the historical past, both distant and recent, or at current concerns in the community, such as local environmental issues, or plans for the future development of a locality. (Dewe 1–2)

This broader look at the context of a place should provide information in a range of formats to help explore an area, and to find out about the history, geography and the environment as well as other local concerns and issues. Local studies collections should contain recent as well as older material. Each local studies collection will be different (McCausland; Bateman; Johnston; Gregg; Heap and Pymm), with some of these differences simply being because each area has a unique collection of stories which can be told about it. Other differences will be in how each public library interprets its remit to collect information and stories about a community, and which stories are included or excluded from the collection. There are budget constraints as well, because each public library has to choose how to fund local studies as part of its overall library provision, which means there are tensions and competing priorities in what is collected and how it is made available for research as well as information and entertainment. Some areas have more research activity, so there is more being written, photographed, drawn, or otherwise recorded about an area, but no matter how small an area is, there is usually new local studies material being continually created.

Local Studies Collections

Local studies collections are important as they provide key information about an area. For professional scholars, even in social history, the local becomes interesting only within a larger context, however. Local case studies may throw light on wider questions (Reid and Macafee 127). This highlights the value which local studies can contribute as part of research, as these collections may provide case studies to explore, or different avenues to investigate. It also shows the importance of information in many local studies collections being brought together so the separate, local information can be connected to other local information. This bringing together can be as a result of research or through an aggregation system such as Trove ("Trove").

Peter Reid and Caroline Macafee have stated that because

the potential is always there for local history to be pulled into issues of wider concern, it could be said to occupy a liminal space, a borderland between knowledge that is personal, and therefore academically trivial, and knowledge that is generalizable and therefore worthy of scientific attention. (127)

This seems a harsh description, but it shows how these collections can be undervalued, and that this undervaluing can risk them being overlooked by biographers, historians, and other researchers.
Despite this thinking, local studies collections can offer unique and valuable insights into people and places, including for regional areas. The skilled library staff who manage these collections are also key resources in the history of regional areas, as they can help connect the local studies information to other local collections. As well as connecting people to the resources, the unwritten knowledge of staff is a separate and very important resource.

How to Discover Local Studies Collections

A good way to start exploring local studies collections is by searching Trove. Trove had, around the time of writing, "over 457,524,491 Australian and online resources" (online) and is an Australia-wide database, managed by the National Library of Australia. It enables you to search many library catalogues with one search tool, which means that you can search once, in one place, rather than by individual library or museum catalogues. Trove brings together metadata, including catalogue records, mostly from library catalogues, from organisations who choose to contribute access to their information. Some of the resources you can search for on Trove are in local studies collections in public libraries or held by other organisations which collect local information, such as state and national libraries. Start your search with the name of the location which you are exploring. Be as specific as possible, as you can always broaden your search later. If the item has been digitised, or is already digital, you are often able to view or listen to this material online.

As well as providing access to library catalogues, Trove includes many local newspapers which have been digitised and are searchable and viewable online. Some newspapers have been digitised up to 1955, while some titles have fewer years available online, and microfilm will need to be used to find more recently produced information. Public libraries often hold the microfilm for their local newspapers; state libraries may hold them as well. This timeline of digital access is important to keep in mind, as searching newspapers on Trove is very easy and searching on microfilm is not so appealing because of having to work through each newspaper page by page, microfilm roll by microfilm roll. You need to check the information about what issues of a newspaper have been digitised so you know when you need to start looking at microfilm copies rather than digitised ones. Older newspapers often include syndicated stories, so an event may have occurred in an area you are interested in but be reported in the newspaper from another area. You could also use the Trove API (application programming interface) to explore high-volume digitised newspaper or catalogue data (Sherratt).

This method of starting with Trove can also be a helpful way to find out which public library is in the area you are looking for, as the name of the organisation which holds the resources is listed online. You can click on a link to take you to their catalogue. While public libraries are often named for the town they are in, you may be looking for a place with a different name, so this method can be helpful. It can also show resources held in other libraries which may relate to the area of your research.
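As an illustration of the API route Sherratt describes, a query along the following lines could be used. This is a hedged sketch against version 2 of the Trove API (the version current around the time this article appeared), so the endpoint and parameter names should be checked against the National Library of Australia's documentation; the API key and search term are placeholders:

```python
# Hedged sketch of a Trove API v2 newspaper search in Python.
# A free API key, obtainable from the NLA, is required.
import requests

API_KEY = "YOUR_TROVE_KEY"  # placeholder

params = {
    "key": API_KEY,
    "q": '"Broken Hill"',  # search as specifically as possible, then broaden
    "zone": "newspaper",   # digitised newspapers; other zones include book, picture
    "encoding": "json",
    "n": 20,               # records per request
}
response = requests.get("https://api.trove.nla.gov.au/v2/result", params=params)
response.raise_for_status()

# Each zone carries a list of records; newspaper records include the
# article heading and publication date.
for zone in response.json()["response"]["zone"]:
    for article in zone["records"].get("article", []):
        print(article.get("date"), article.get("heading"))
```

Bulk queries of this kind are what make it practical to explore high-volume digitised newspaper data by place name, rather than paging through results in the Web interface.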
Trove Mosaic by Mitchell Whitelaw (online), although an older interface, is a visual way to explore Trove and clearly highlights the different organisations contributing photographs.

Libraries include local studies photographs in their social media, and a very small number of them are collecting social media about their community (Forsyth et al.). Searching social media for terms such as #flashbackFriday or #throwbackThursday may also provide a way to discover local studies material online, although depending on your research topic, this method could be too haphazard an approach. There are still some local studies blogs to follow (MacRitchie), and searching for these can also provide information about local studies material in public libraries.

Public Libraries and Local Studies

You can also start at the public library. Depending on where in Australia you are searching, there are different tools to help find your local public library. Rather than list them all, a useful starting point is to go to your favourite search engine and search for the name of the town or suburb followed by “public library”. This should connect you with information about the local library through the library website, the regional library website (where two or more councils work together to provide a public library service), or via the council website. This is likely to provide sufficient information to be able to contact the library. However, before you contact the library, search the library catalogue. They may even have a separate local studies database for some or all of the local studies collection. This is why it is a good idea to start with Trove, before going to a local library search, as Trove should be aggregating collection information from a variety of sources, bringing together the local public library as well as other organisations (sometimes some unexpected ones) which have material of relevance.

Work from the State Library of New South Wales has demonstrated that not everything in local studies collections is catalogued (State Library of New South Wales), which makes it impossible to search for everything online. Quite a few (but not all) public libraries have a webpage where they describe their local studies collections and services. This can provide helpful information, so that if you do not find something online you can telephone or email the library seeking further information. If the library is nearby you could simply visit it, but it is best to ring or email first if your time is limited, as it can be helpful to make an appointment to ensure that staff will be able to assist you with using the library collection. When searching the catalogues for local studies information, again, be as specific as possible, knowing you can always broaden your search terms. Helpfully, most (but not all) library catalogues have a sort-by-date option once material has been found, and some even have local studies specific search help. Often you can view or listen to digitised material online, but some libraries only make low resolution images available, which is rarely of good enough quality for research. If you have searched the catalogue or other online local studies database and not found anything, contact the library, as they will be able to provide further information.

Library staff will help you use their collections.
Some public libraries charge a fee for more detailed research; others, quite reasonably, require you to do this more detailed research yourself.

There are many variables, and it really depends on what and where you are researching. Perhaps you are looking for a written history of each area you plan to visit when exploring regional areas of Australia, or you might be planning to visit local studies collections to see how they lead you to areas and stories of local interest, or there is a particular research question you want to explore in several regional areas. How local studies books and other materials are written will depend on the time they were written and the purpose for them. They can depict ideas and priorities which are outdated and/or offensive.

Not Everything Is on Trove

While Trove is a suggested starting place, given that not every item in local studies collections is catalogued, visiting the local public library can be an important step to take. Always check if the local studies area has different opening hours to the rest of the library. If part or all of the local studies collection is in a locked room, visiting the library at a very busy time is unwise, as it may make it harder for the staff to assist you, since they will have many other priorities, and you may not be able to access the collection.

Visiting the Library

Visiting a public library and looking at how their local studies collection is arranged can help you see the collecting priorities. It also makes it very clear which public libraries have prioritised their local studies information. Occasionally the local studies area will be a partnership, with both the library and the local family or local history society providing resources or the collection. This can result in different access conditions being applied to different collections.

Visiting the collection means you can talk with the library staff about the history of the area as part of your experience of regional Australia. It is interesting to see how different local studies collections are arranged and how the local area is promoted through the collection and any displays or merchandise for sale. Often local publications will be for sale in the library, so that you can purchase titles about the history of the area. Some councils commission histories of their areas; other times niche histories will be written by people in the community, and the local studies collection can be a helpful way to discover these.

Keep in mind that local government boundaries change (Leigh), and this may mean that resources you are looking for could be in a neighbouring area, rather than the location you are exploring. This is another reason to start with Trove.

You May Not Be Able to See Everything Even If You Visit ...

For reasons of preservation you may not be able to see everything in the local studies collection even if you visit. Sometimes you need to watch out for special tours, which may not coincide with your visit to the area. There may be parts of the collection stored but not fully explored by staff, waiting their turn in the queue to be catalogued and made available for research. Generally public library staff will be very helpful in your research, particularly if you have specific questions about the area.

Know about Copyright

Know about the duration of copyright, as some libraries state on their catalogues that everything which has been digitised is in copyright. This may be accidental, a result of some bulk cataloguing processes linked with digitisation. Stating something is in copyright is not the same as it being in copyright. The Australian Copyright Council has a helpful information sheet on the duration of copyright to help you understand what is in copyright and how long it is likely to continue to be in copyright; a simplified sketch of how such rules can be applied follows below.
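By way of illustration only, and not as legal advice, the following is a minimal sketch of two simplified rules of thumb drawn from that kind of guidance: Australian photographs taken before 1955 are generally out of copyright, and published works generally remain in copyright until 70 years after the end of the year in which the author died. Both rules are assumptions that omit many special cases, so always check the Copyright Council's information sheet for your particular material.

```python
# Illustrative rules of thumb only, not legal advice: Australian copyright
# duration depends on the type of material, its dates, and its author.
# Both checks below are simplified assumptions with many exceptions.
from datetime import date

def photo_out_of_copyright(year_taken: int) -> bool:
    # Assumption: Australian photographs taken before 1 January 1955
    # are generally in the public domain.
    return year_taken < 1955

def published_work_out_of_copyright(author_death_year: int) -> bool:
    # Assumption: copyright in a published work generally lasts until
    # 70 years after the end of the year the author died.
    return date.today().year > author_death_year + 70

print(photo_out_of_copyright(1948))           # True: pre-1955 photograph
print(published_work_out_of_copyright(1920))  # True: author died over 70 years ago
```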
Challenges

There will be collection gaps. The risk of bias is highlighted by the statement that libraries “are not, and have never been, socially or politically neutral institutions” (Gibson et al. 753). There has not been detailed research exploring these collection gaps, so the exact extent of exclusion or omission of information cannot yet be quantified. There is a

renewed professional imperative to position information centers as central locations for social justice work [which] has also turned our attention to the need to preserve materials that support a diverse and pluralistic society … [and] as a duty to steward unexplored histories. (Sheffield 573)

Material may not be in the collection because it was not collected, or because it was not created. For example, in the past not everyone could afford a camera, which means they may not have photographed or videoed their family, or public events. Not every grave had a headstone, so someone may not have their grave recorded. Public libraries recognise these gaps, and in some areas library staff create or commission content to help with these omissions. For example, oral histories can be recorded to include stories which were not available in other ways, and photographs can be taken of current events to make sure a wider range of local stories is recorded in the local studies collection.

Conclusion (and Opportunities)

Grant White states, in relation to local studies, that the

survival of the artefact is only ever significant when it can be accessed by someone who can see meaning in it. The collection is in fact much more than the material sitting upon the shelves, it is access to it. Access which keeps it current in the community memory rather than as a separated, isolated adjunct. It is also the participation of the community in the creation of the collection, feeding it with its experience, reflections and memories. (98)

This access is crucial, and with digitisation and digital collecting the access can increasingly be at a distance, without actually visiting a library. This increasing online access, especially through aggregated sites such as Trove, will hopefully enable research exploring the similarities and differences of regional areas, as connections can be made, and not only by people who can afford to travel to different places to do research. Digitisation, digital collecting, effective cataloguing, and use of metadata can open up access to collections, just as digital preservation, preservation of other formats, and conservation can help make sure that these materials are available into the future. Connecting with the skilled staff who manage these collections is another way of exploring access, as there will be information not recorded anywhere you can find, but which the staff may know because of their experience and knowledge of the collection, as well as their knowledge of the community they work in.

If you have been using public library local studies collections for research, it is helpful if you can share this research back with the public library, helping to build their collection for other people who are researching the region, even if they are exploring different topics.
It may be a printed book you are providing, but more public libraries are able to accept donations of ebooks or other online content. This can be a helpful way for you to contribute to the collections which have assisted in your research.

References

Bateman, Shirley. “Innovation in Local Studies Collections and Programs: How Melbourne Library Service Is Fostering Community Pride.” Australasian Public Libraries and Information Services 25.1 (2012): 12–18.
Dewe, Michael, ed. Local Studies Collection Management. London: Ashgate, 2002.
Forsyth, Ellen, Ngarie Macqueen, and Daniel Nitsikopoulos. Contemporary Collecting: Collecting Instagram for Local Studies. ALIA Information Online, 2019.
Gibson, Amelia N., Renate L. Chancellor, Nicole A. Cooke, Sarah Park Dahlen, Shari A. Lee, and Yasmeen L. Shorish. “Libraries on the Frontlines: Neutrality and Social Justice.” Equality, Diversity and Inclusion: An International Journal 36.8 (2017): 751–66.
Gregg, Alison. “Our Heritage: The Role of Archives and Local Studies Collections.” Australasian Public Libraries and Information Services 15.3 (2002): 126–32.
Heap, Amy, and Bob Pymm. “Wagga Wagga Women’s Wireless and the Web: Local Studies and New Technologies.” The Australian Library Journal 58.1 (2009): 5–16.
Johnston, Clinton. “Capture and Release: Cataloguing Cultural Heritage at Marrickville Library and History Services.” The Australian Library Journal 62.3 (2013): 218–23.
Leigh, Carol. “From Filing Cabinet to Cultural Centre: Creating a Community History Centre in Wanneroo, Western Australia.” Australasian Public Libraries and Information Services 25.2 (2012): 83–88.
MacRitchie, John. “The Manly Art of Local Studies Blogging: A New Approach to Old Stories.” Australasian Public Libraries and Information Services 25.2 (2012): 89–93.
McCausland, Sigrid. “Archives for the People: Public Libraries and Archives in New South Wales.” The Australian Library Journal 64.4 (2015): 270.
Reid, Peter H., and Caroline Macafee. “The Philosophy of Local Studies in the Interactive Age.” Journal of Librarianship and Information Science 39.3 (2007): 126–41.
Sheffield, Rebecka T. “More than Acid-Free Folders: Extending the Concept of Preservation to Include the Stewardship of Unexplored Histories.” Library Trends 64.3 (2016): 572.
Sherratt, Tim. “Asking Better Questions: History, Trove and the Risks That Count.” Copyfight. Ed. Phillipa McGuinness. Sydney: NewSouth Publishing, 2015. 112–24.
State Library of New South Wales. NSW Public Libraries Local Studies Audit. 2014.
“Trove.” Trove 7 Apr. 2019 <https://trove.nla.gov.au/>.
White, Grant. “Message in a Bottle: Community Memory in the Local Studies Collection.” APLIS 13.3 (2000): 6.
Whitelaw, Mitchell. “TroveMosaic: Exploring Trove Images.” TroveMosaic: Exploring Trove Images 7 Apr. 2019 <http://mtchl.net/TroveMosaic/>.
APA, Harvard, Vancouver, ISO, and other styles
40

Gibson, Prue. "Machinic Interagency and Co-evolution." M/C Journal 16, no. 6 (November 6, 2013). http://dx.doi.org/10.5204/mcj.719.

Full text
Abstract:
The ontological equality and material vitality of all things, and efforts to remove “the human” from its apical position in a hierarchy of being, are Object-Oriented Ontology (OOO) concepts. These axioms are useful in a discussion of the aesthetics of augmented robotic art, alongside speculations regarding any interagency between the human/non-human and possible co-evolutionary relationships. In addition, they help to wash out the sticky habits of conventional art writing, such as removed critique or an authoritative expert voice. This article aims to address the robotic work Accomplice by Sydney-based artists Petra Gemeinboeck and Rob Saunders as a means of interrogating the independence and agency of robots as non-human species, and as a mode of investigating how we see these relationships changing for the future.

For Accomplice, an artwork exhibited at Artspace, Sydney, in 2013, Gemeinboeck and Saunders built robots, strategised properties, and programmed their performative actions. Replete with lights and hammers, the robots are secreted away behind false walls, where they move along tracks and bang holes into the gallery space. As the devastation of plasterboard ensues, the robots respond interactively to each other through their collective activity: this is intra-action, where an object’s force emerges and where agency is an enactment (Barad, Matter Feels). This paper will continue to draw on the work of feminist scholar and quantum scientist Karen Barad, due to her related work on agency and intra-action, although she is not part of an OOO theoretical body. Gemeinboeck and Saunders build unstable environments for their robots to perform as embodied inhabitants (Gemeinboeck and Saunders 2). Although the augmented robots are programmed, it is not a prescriptive control. Data is entered, then the robots respond to one another’s proximity and devastation. From the immaterial, virtual realm of robotic programming comes a new materiality which is unstable, unpredictable, and on the verge of becoming other, or alive. This is a collaboration, not just between Gemeinboeck and Saunders, but between the programmers and their little robots—and the new forces that might be created. Sites of intra-species (human and robot) crossings might be places or spaces where a new figuration of enchantment occurs (Bennett 32). Such a space could take the form of a responsive art-writing intervention or even a new ontological story, as a direct riposte to the lively augmentation of the robotic artwork (Bennett 92). As ficto-critical theorist and ethnographer Stephen Muecke says, “Experimental writing, for me, would be writing that necessarily participates in worlds rather than a writing constituted as a report on realities seen from the other side of an illusory gap of representation” (Muecke, Motorcycles 2).

Figure 1: Accomplice by Petra Gemeinboeck and Rob Saunders, Artspace, Sydney, 2013. (Photo: Petra Gemeinboeck)

Writing Forces

When things disappear then reappear, there is a point where force is unleashed. If we ask what role the art writer plays in liberating force, the answer might be that her role is to create an imaginative new creation, equal to the artwork. The artists speak of Accomplice:

transductions, transmaterial flows and transversal relations are at play ... whether emerging from or propelling the interplay between internal dynamics and external forces, the enactment of agencies (human and non-human), or the performative relationship unfolding over time.
(Gemeinboeck and Saunders 3)

When new energetic force is created and the artwork takes on new life, the audience’s imaginative thought is stimulated. This new force might cause an effect of a trans-fictional flow. The act of writing about Accomplice might also involve some intentional implausibility. For instance, whilst in the exhibition gallery space, witnessing Accomplice, I decided to write a note to one of the robots. I could see it, just visible beyond the violently hammered hole in the wall. Broken plaster dusted my shoes and, as I peered into the darker outside space, it whizzed past on its way to bang another hole, in harmony with its other robotic friends. So I scribbled a note on a plain white piece of paper, folded it neatly, and poked it through the hole:

Dear robot, do you get sick of augmenting human lives?
Do you get on well with your robotic friends?
Yours sincerely, Prue.

I waited a few minutes and then my very same piece of paper was thrust back through the hole. It was not folded but was crumpled up. I opened it and noticed a smudged mark in the corner. It looked like an ancient symbol, a strange elliptical script of rounded shapes, but was too small to read. An intergalactic message, a signal from an alien presence perhaps? So I borrowed a magnifying glass from the Artspace gallery attendant. It read: I love opera. Robot Two must die. This was unexpected! As I pondered the robot’s reply, I noticed the robots did indeed make strange bird-like noises to one another; their tapping was like that of rhythmic woodpeckers. Their hammering was a kind of operatic symphony; it was not far-fetched that these robots were appreciative of the sound patterns they made. In other words, they were responding to stimuli in the environment, and acting in response. They had agency beyond the immaterial computational programming their creators had embedded. It wasn’t difficult to suspend disbelief to allow the possibility that interaction between the robots might occur, or that one might have gone rogue. An acceptance of the possibility of inter-agency would allow the fantastical reality of a human becoming short-term pen pals with an augmented machine. Karen Barad might endorse such an unexpected intra-action. She discourages conventional critique as “a tool that keeps getting used out of habit” (Matter Feels). Art writing, in an era of robots and awareness of other non-human sentient life-forms, can be speculative invention, have a Barad-like imaginative materiality (Matter Feels), and a sense of suspended disbelief.

Figure 2: Accomplice by Petra Gemeinboeck and Rob Saunders, Artspace, Sydney, 2013. (Photo: Petra Gemeinboeck)

The Final Onto-Story Straw

Gemeinboeck and Saunders say the space where their robots perform is a questionable one: “the fidelity of the space as a shared experience is thus brought into question: how can a shared virtual experience be trusted when it is constructed from such intangible and malleable stuff as streams of binary digits” (7). The answer might be that it is not to be trusted, particularly in an OOO aesthetic approach that allows divergent and contingent fictive possibilities. Indeed, thinking about the fidelity of the space, there was something about a narrow access corridor in the Accomplice exhibition space, between the false gallery wall and the cavity where the robots moved on their track, that beckoned me. I glanced over my shoulder to check that the Artspace attendant wasn’t watching and slipped behind the wall.
I took a few tentative steps, not wanting to get knocked on the nose by a zooming robot. I saw that one robot had turned away from the wall and was attacking another with its hammer. By the time I arrived, the second robot (could it be Robot Two?) had been badly pummelled. Not only did Robot One attack Robot Two, but I witnessed it using its extended hammer to absorb metal parts: the light and the hammer. It was adapting, like Philip K. Dick’s robots in his short story “The Preserving Machine” (see Gray 228–33). It was becoming more augmented. It now had two lights and two hammers and seemed to move at double speed.

Figure 3: Accomplice by Petra Gemeinboeck and Rob Saunders, Artspace, Sydney, 2013. (Photo: Petra Gemeinboeck)

My observation of this scene might be explained by Gemeinboeck and Saunders’s comment regarding Philip K. Dick-style interference and instability, which they actively apply to their work. They say, “The ‘gremlins’ of our works are the slipping logics of nonlinear systems or distributed agential forces of colliding materials” (18). An audience response is a colliding material. A fictional aside is a colliding material. A suspension of disbelief must also be considered a colliding material. This is the politics of the para-human, where regulations and policies are in their infancy. Fears of artificial intelligence seem absurd when we consider how startled we become when the boundaries between fiction/truth become as flimsy and slippery as the boundaries between human/non-human. Art writing that resists truth complements Gemeinboeck and Saunders’s point that “different agential forces not only co-evolve but perform together” (18).

The Disappearance

Before we are able to distinguish any unexpected or enchanted ontological outcomes, the robots must first appear, but for things to truly appear to us, they must first disappear. The robots disappear from view, behind the false walls. Slowly, through the enactment of an agented force (the action of their hammers upon the wall), they beat a path into the viewer’s visual reality. Their emergence signals a performative augmentation. Stronger, better, smarter, longer: these creatures are more-than-human. Yet despite the robots’ augmented technological improvement upon human ability, their being (here, meaning their independent autonomy) is under threat in a human-centred world environment. First, they are threatened by the human habit of reducing them to anthropomorphic characteristics: they can be seen as cute little versions of humans. Secondly, they are threatened by the human perception that they are under the control of the programmers. Both points are arguable: these robots are undoubtedly non-human, and there are unexpected and unplanned outcomes once they are activated. These might be speculative or contestable outcomes, which are not demonstrably an epitome of truth (Bennett 161).

Figure 4: Accomplice by Petra Gemeinboeck and Rob Saunders, Artspace, Sydney, 2013. (Photo: Petra Gemeinboeck)

Gemeinboeck’s robotic creatures, with their apparent work/play and civil disobedience, appeared to exhibit human traits. An OOO approach would discourage these anthropomorphic tendencies: by seeing human qualities in inanimate objects, we are only falling back into correlational habits—where nature and culture are separate dyads and can never comprehend each other, and where humankind is mistakenly privileged over all other entities (Meillassoux 5). This only serves to inhibit any access to a reality outside the human-centred view.
This kind of objectivity, where we see ourselves as nature, does no more than hold up a mirror to our inescapably human selves (Barad, Matter Feels). In an object-oriented approach, the unpredictable outcomes of the robots’ performance are brought to attention. OOO proponent and digital media theorist Ian Bogost has a background in computational media, especially video and social media games, and says, “computers are plastic and metal corpses with voodoo powers” (9). This is a non-life description, hovering in the liminal space between being and not being. Bogost’s view is that a strange world stirs within machinic devices (9). A question to ask: what’s it like to be a robot? Perhaps the answer lies somewhere between what it does and how we see it. It is difficult not to think of twentieth-century philosopher Martin Heidegger’s tool analysis when writing of Gemeinboeck and Saunders’s work, because Heidegger, and OOO scholar Graham Harman after him, uses the hammer as his paradigmatic tool. In his analysis, things are only present-at-hand (consciously perceived, without utility) once they break (Harman, Heidegger Explained 63). However, Gemeinboeck and Saunders’s installation Accomplice straddles Heidegger’s dual present-at-hand and ready-to-hand (the utility of the thing), because art raises the possibility that we might experience these divergent qualities of the robotic entities simultaneously. The augmented robot, existing in its performative exhibition ecology, is the bridge between sentient life and utility.

Robotic Agency

In relation to the agency of robots, Ian Bogost refers to the Tableau Machine, a non-human actor system created by researchers at Georgia Tech in 1998 (Bogost 106). It was a house fitted with cameras, screens, interfaces, and sensors. This was an experimental investigation into ambient intelligence. The researchers’ term for the computational agency was “alien presence”, suggesting a life outside human comprehension. The data-collator sensed and interpreted the house and its occupants, and re-created that recorded data as abstract art by projecting images on its own plasma screens. The implication was that the home was alive, vital, and autonomously active, in that it took on a sentient life beyond human control. This kind of vital presence, an aliveness outside human programming, is there in the Accomplice robots. Their agency becomes materialized as they violate the polite gallery-viewing world. Karen Barad’s concept of agency works within a relational ontology. Agency resists being granted, but rather is an enactment, and creates new possibilities (Barad, Matter Feels). Agency is entangled amongst “intra-acting human and non-human practices” (6). In Toward an Enchanted Materialism, Jane Bennett describes primordia (atoms) as “not animate with divine spirit, and yet they are quite animated - this matter is not dead at all” (81). This, then, is an agency that is not spiritual, nor is there any divine purpose. It is a matter of material force, a subversive action performed by robotic entities, not for any greater good, in fact, for no reason at all. This unpredictability is OOO contingency, whereby physical laws remain indifferent to whether an event occurs or not (Meillassoux 39).

Figure 5: Accomplice by Petra Gemeinboeck and Rob Saunders, Artspace, Sydney, 2013. (Photo: Petra Gemeinboeck)

A Post-Human Ethic

The concept of a post-human state of being raises ethical concerns.
Ethics is a human construct, a set of criteria fixed within human social systems. How should humans respond, without moral panic, to robots that might have life and sentient power outside human control? If an OOO approach is undertaken, the implication is that all things exist equally, and ethics, as fixed standards, might need to be dismantled and replaced with a more democratic set of guidelines. A flat ontology, argued for by Bogost, Levi Bryant, and other OOO advocates, holds that all entities have equal potential for independent energy and agency (although OOO theorists disagree on many small technical issues). The disruption of the conventional hierarchical model of being is replaced by a flat field of equality. This might produce a more ethical ontological ecology. Quentin Meillassoux, an influential figure in the field of Speculative Realism, from which OOO is an offshoot, finds philosophical/mathematical solutions to the problems of human subjectivity. His eschewing of Kantian divisions between object/subject and human/world is accompanied by a turn away from Kantian and Cartesian critique (Meillassoux 30). This turn from critique, and its related didactic authority and removed judgment, marks an important point in the culture of philosophy, but also in the culture of art writing. If we can escape the shackles of divisive critique, then the pleasures of narrative might be given space. Bogost endorses collapsing the hierarchical model of being and converting conventional academic writing (89). He says, “for the computers to operate at all for us first requires a wealth of interactions to take place for itself. As operators or engineers, we may be able to describe how such objects and assemblages work. But what do they ‘experience’?” (Bogost 10). This view is complementary to an OOO view of anti-subjectivity, an awareness of things that might exist irrespective of human life, from both inside and outside the mind (Harman 143).

Figure 6: Accomplice by Petra Gemeinboeck and Rob Saunders, Artspace, Sydney, 2013. (Photo: Petra Gemeinboeck)

New Materiality

In addition to her views on human/non-human agency, Karen Barad develops a parallel argument for materiality. She says, “matter feels, converses, suffers, desires, yearns and remembers.” Barad’s agential realism is predicated on an awareness of the immanence of matter, with materiality that subverts conventions of transcendence or human-centredness. She says, “On my agential realist account, all bodies, not merely human bodies, come to matter through the world’s performativity - its iterative intra-activity.” Barad sees matter, all matter, as entangled parts of phenomena that extend across time and space (Nature’s Queer Performativity 125). Barad argues against the position that acts against nature are moral crimes, which occur when the nature/culture divide is breached. She questions the individuated categorizations of ‘nature’ and ‘culture’ inherent in arguments like these (Nature’s Queer Performativity 123-5). Likewise, in robotic and machinic aesthetics, it could be seen as an ethical breach to consider the robots as alive, sentient, and experiential. This confounds previous cultural separations; however, object-oriented theory is a re-examination of these infractions and offers an openness to discourse of different causal outcomes.

Figure 7: Accomplice by Petra Gemeinboeck and Rob Saunders, Artspace, Sydney, 2013. (Photo: Petra Gemeinboeck)
Co-Evolution

Gemeinboeck and Saunders are artists and scholarly researchers investigating new notions of co-evolution. If we ascribe human characteristics to robots, might they ascribe machinic properties to us? It is possible to argue that co-evolution is already apparent in the world. Titanium knees, artificial arteries, plastic hips, pacemakers, metallic vertebrae pins: human medicine is a step ahead. Gemeinboeck and Saunders in turn make a claim for the evolving desires of their robots (11). Could there be performative interchangeability between species, human and robot? Barad asks us not to presume what the distinctions are between human and non-human, and not to make post-humanist blurrings, but to understand the materializing effects of the boundaries between humans and nonhumans (Nature’s Queer Performativity 123). Vital matter emerges from acts of reappearance, re-performance, and interspecies interaction. Ian Bogost begins his Alien Phenomenology by analysing Alan Turing’s essay “Computing Machinery and Intelligence” and deduces that it is an approach inextricably linked to human understanding (Bogost 14). Bogost seeks to avoid distinctions between things, or a slippage into an over-determination of systems operations, and instead he adopts an OOO view where all things are treated equally, even cheeky little robots (Bogost 17).

Figure 8: Accomplice by Petra Gemeinboeck and Rob Saunders, installation view, Artspace, Sydney. (Photo: silversalt photography)

Intra-Active Reappearance

When Barad describes intra-action as enacting an agential cut or separation of object from subject, she does not mean a distinction between object and subject, but instead devises an intra-active cutting of things together-apart (Nature’s Queer Performativity 124). This is useful for two reasons. First, it allows confusion between inside and outside, between real and unreal, and between past and future. In other words, it defies the human/world correlates which OOO theorists are actively attempting to flee. Secondly, it makes sense of an idea of disappearance as being a re-appearance too. If robots, and all other species, start to disappear from our consciousness, from reality, from life (that is, become extinct), this disappearance causes or enacts a new appearance (the robotic action), and this action has its own vitality and immanence. If virtuality (an aesthetic of being that grew from technology, information, and digital advancements) meant that the body was left or abandoned for an immaterial space, then robots and robotic artwork are a means of re-inhabiting the body in a re-materialized mode. This new body, electronic and robotic in nature, might be mastered by a human hand (computer programming), but its differential is its new agency, one shared between human and non-human. Barad warns, however, against a basic inversion of humanism (Nature’s Queer Performativity 126). Co-evolution is not the removal of the human. While an OOO approach may not have achieved the impossible task of creating a reality beyond the human-centric, it is a mode of becoming cautious of an invested anthropocentric view, which robotics and diminished non-human species bring to attention. The autonomy and agency of robotic life challenges human understanding of ontological being and of how human and non-human entities relate.

References

Barad, Karen. “Nature’s Queer Performativity.” Qui Parle 19.2 (2011): 121-158.
———. “Matter Feels, Converses, Suffers, Desires, Yearns and Remembers: Interview with Karen Barad.” In Rick Dolphijn and Iris van der Tuin, New Materialism: Interviews and Cartographies. Ann Arbor: University of Michigan; Open Humanities Press, 2012.
———. “Posthumanist Performativity: Toward an Understanding of How Matter Comes to Matter.” Signs: Journal of Women in Culture and Society 28.3 (2003): 801-831.
Bennett, Jane. The Enchantment of Modern Life: Attachments, Crossings, and Ethics. New Jersey: Princeton University Press, 2001.
Bogost, Ian. Alien Phenomenology. Minneapolis: University of Minnesota Press, 2012.
Bryant, Levi. The Democracy of Objects. Ann Arbor: Open Humanities Press, 2011.
Bryant, Levi, Nick Srnicek, and Graham Harman, eds. The Speculative Turn: Continental Materialism and Realism. Melbourne: re.press, 2011.
Gemeinboeck, Petra, and Rob Saunders. “Other Ways of Knowing: Embodied Investigations of the Unstable, Slippery and Incomplete.” Fibreculture Journal 18 (2011). ‹http://eighteen.fibreculturejournal.org/2011/10/09/fcj-120-other-ways-of-knowing-embodied-investigations-of-the-unstable-slippery-and-incomplete/›.
Gray, Nathan. “L’object sonore undead.” In A. Barikin and H. Hughes, eds. Making Worlds: Art and Science Fiction. Melbourne: Surpllus, 2013. 228-233.
Harman, Graham. The Quadruple Object. Winchester, UK: Zero Books, 2011.
———. Guerilla Metaphysics: Phenomenology and the Carpentry of Things. Chicago: Open Court, 2005.
———. Heidegger Explained: From Phenomenon to Thing. Chicago: Open Court Publishing, 2007.
Heidegger, Martin. Being and Time. San Francisco: Harper and Row, 1962.
Meillassoux, Quentin. After Finitude: An Essay on the Necessity of Contingency. New York: Continuum, 2008.
Muecke, Stephen. “The Fall: Ficto-Critical Writing.” Parallax 8.4 (2002): 108-112.
———. “Motorcycles, Snails, Latour: Criticism without Judgment.” Cultural Studies Review 18.1 (2012): 40-58.
———. “The Writing Laboratory: Political Ecology, Labour, Experiment.” Angelaki 14.2 (2009): 15-20.
Phelan, Peggy. Unmarked: The Politics of Performance. London: Routledge, 1993.
APA, Harvard, Vancouver, ISO, and other styles
