
Journal articles on the topic "Email-Driven Processes"

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles


See the 39 best journal articles for research on the topic "Email-Driven Processes".

Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read its abstract online, if it is present in the metadata.

Browse journal articles from a wide range of scientific fields and compile an accurate bibliography.

1

Kumar, Deepak. "AI-DRIVEN AUTOMATION IN ADMINISTRATIVE PROCESSES: ENHANCING EFFICIENCY AND ACCURACY". International Journal of Engineering Science and Humanities 14, Special Issue 1 (May 27, 2024): 256–65. http://dx.doi.org/10.62904/qg004437.

Full text
Abstract:
This paper explores the transformative impact of AI-driven automation on administrative processes, emphasizing the dual objectives of enhancing efficiency and accuracy. Through an in-depth examination of various applications, from document management to dynamic task prioritization, the study showcases how artificial intelligence can revolutionize traditional workflows. Special attention is given to the integration of natural language processing for email triage, virtual assistants for administrative support, and facial recognition for secure access control. The implementation of predictive analytics, sentiment analysis, and predictive maintenance further contributes to the paper’s focus on predictive decision-making and improved resource allocation. The abstract underscores the pivotal role of AI in meeting contemporary administrative challenges, offering solutions that streamline tasks, reduce errors, and optimize resource utilization. Additionally, the paper addresses ethical considerations associated with AI implementation and highlights the need for a balanced approach that aligns technological advancements with organizational goals. In essence, this research provides a comprehensive overview of how AI can be harnessed to reshape administrative landscapes, fostering heightened efficiency and accuracy in contemporary workplaces.
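The email triage the paper highlights maps onto a standard text-classification pipeline. A minimal sketch, assuming scikit-learn is available; the routing categories and training emails below are hypothetical, not taken from the paper:

```python
# Minimal sketch of NLP-based email triage. Categories and samples are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: (email text, routing label)
emails = [
    ("Invoice attached for April services", "finance"),
    ("Server down since 9am, please escalate", "it_support"),
    ("Requesting annual leave next week", "hr"),
    ("Payment reminder: invoice overdue", "finance"),
    ("Cannot log in to the VPN", "it_support"),
    ("Question about my pension contributions", "hr"),
]
texts, labels = zip(*emails)

# TF-IDF features plus a linear classifier form a common triage baseline.
triage = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
triage.fit(texts, labels)

print(triage.predict(["The database server is unreachable"]))  # likely ['it_support']
```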
2

Fedor, Julie, and Rolf Fredheim. "“We need more clips about Putin, and lots of them:” Russia's state-commissioned online visual culture". Nationalities Papers 45, no. 2 (March 2017): 161–81. http://dx.doi.org/10.1080/00905992.2016.1266608.

Full text
Abstract:
In this article, we examine how the Putin government is attempting to respond and adapt to the YouTube phenomenon and the vibrant oppositional online visual culture on Runet. We show how these processes are giving rise to new forms of state propaganda, shaped and driven above all by the quest for high-ranking search-engine results and the concomitant desire to appeal to the perceived new sensibilities of the Internet generation through the commissioning and production of “viral videos.” We focus in particular on the videos created by Iurii Degtiarev, a pioneer in the development of this genre, whose works we explore in light of the “Kremlingate” email leaks, which offer inside information on the strategies and aims being pursued on the online visual front of the campaign to manage the Russian mediascape, and Degtiarev's own reflections on this subject. Examining the output of young creatives patronized by the Kremlin offers a “bottom-up” view to supplement studies of the Russian ideological and media landscape as shaped by “political technologists” such as Vladislav Surkov and Gleb Pavlovskii.
3

Dhebe, Shreyas, Harsh Dhalge, Vaishnavi Suryavanshi, and Hemangi Shinde. "Flood Monitoring and Alerting System". International Journal for Research in Applied Science and Engineering Technology 11, no. 8 (August 30, 2023): 223–30. http://dx.doi.org/10.22214/ijraset.2023.55145.

Full text
Abstract:
Floods pose significant threats to communities worldwide, necessitating the development of efficient monitoring and alerting systems. This abstract presents an IoT-based Flood Monitoring and Alerting System that leverages the power of LoRaWAN (Long Range Wide Area Network) technology to provide real-time flood detection, monitoring, and timely alerts. The system consists of strategically placed water level sensors that use ultrasonic or pressure-based technology to accurately measure water levels. The sensor data is wirelessly transmitted to a central gateway through the LoRaWAN network, known for its long-range and low-power capabilities. A cloud-based platform processes and analyzes the collected data, identifying abnormal water level patterns and potential flood conditions. In case of a flood, the system triggers alerts via mobile notifications, email, and SMS to relevant authorities and stakeholders. Users can access real-time flood information, historical data, and visualizations through a user-friendly web or mobile interface. The system offers cost-effectiveness, extended range, low power consumption, early flood detection, timely response, data-driven decision-making, and strong potential for future enhancements.
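The alerting logic described here reduces to threshold checks on incoming sensor readings with multi-channel notification. A minimal sketch, with hypothetical thresholds and a stubbed notification channel standing in for the SMS/email gateways:

```python
# Minimal sketch of flood-alerting logic. Sensor I/O, the LoRaWAN transport,
# and the real notification channels are stubbed out; thresholds are invented.
WARNING_CM = 250   # hypothetical warning threshold (cm)
DANGER_CM = 300    # hypothetical flood threshold (cm)

def classify(level_cm: float) -> str:
    if level_cm >= DANGER_CM:
        return "FLOOD"
    if level_cm >= WARNING_CM:
        return "WARNING"
    return "NORMAL"

def send_alert(status: str, level_cm: float) -> None:
    # Stand-in for the SMS/email/mobile push fan-out described in the paper.
    print(f"ALERT [{status}]: water level at {level_cm} cm")

previous = "NORMAL"
for reading in [120.0, 248.5, 262.0, 301.4]:  # readings received via the gateway
    status = classify(reading)
    if status != "NORMAL" and status != previous:
        send_alert(status, reading)           # alert only on state changes
    previous = status
```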
4

Ahuja, Sanjay P., and Naveen Mupparaju. "Performance Evaluation and Comparison of Distributed Messaging Using Message Oriented Middleware". Computer and Information Science 7, no. 4 (August 19, 2014): 9. http://dx.doi.org/10.5539/cis.v7n4p9.

Full text
Abstract:
Message Oriented Middleware (MOM) is an enabling technology for modern event-driven applications that are typically based on publish/subscribe communication (Eugster, 2003). Enterprises typically contain hundreds of applications operating in environments with diverse databases and operating systems. Integration of these applications is required to coordinate the business process. Unfortunately, this is no easy task. Enterprise Integration, according to the authors in (Brosey et al, 2001), "aims to connect and combine people, processes, systems, and technologies to ensure that the right people and the right processes have the right information and the right resources at the right time". Communication between different applications can be achieved by using synchronous and asynchronous communication tools. In synchronous communication, both parties involved must be online (for example, a telephone call), whereas in asynchronous communication, only one member needs to be online (email). Middleware is software that helps two applications communicate with one another. Remote Procedure Calls (RPC) and Object Request Brokers (ORB) are two types of synchronous middleware: when they send a request, they must wait for an immediate reply. This can decrease an application's performance when there is no need for synchronous communication. Even though asynchronous distributed messaging using message oriented middleware is widely used in industry, little work has been done on evaluating the performance of the various open source message oriented middleware products. The objective of this work was to benchmark and evaluate the performance of three open source MOMs in the publish/subscribe and point-to-point domains, and to provide a functional comparison and qualitative study from a developer's perspective.
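The two messaging domains the study benchmarks differ in delivery semantics: point-to-point delivers each message to exactly one consumer, while publish/subscribe fans it out to every subscriber. A broker-free toy illustration of the distinction (a real open source MOM would replace the in-process queues):

```python
# Toy illustration of the two MOM domains the paper benchmarks. A real broker
# replaces these in-process structures; this only shows delivery semantics.
import queue

# Point-to-point: one shared queue, competing consumers.
p2p = queue.Queue()
p2p.put("order-42")
print("consumer A got:", p2p.get())   # only one consumer receives the message

# Publish/subscribe: the broker fans each message out to all subscriber queues.
subscribers = {"billing": queue.Queue(), "audit": queue.Queue()}

def publish(message: str) -> None:
    for sub_queue in subscribers.values():
        sub_queue.put(message)         # every subscriber gets its own copy

publish("order-42")
for name, sub_queue in subscribers.items():
    print(name, "got:", sub_queue.get())
```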
5

Bornmann, Lutz, Christian Ganser, and Alexander Tekles. "Anchoring effects in the assessment of papers: The proposal for an empirical survey of citing authors". PLOS ONE 16, no. 9 (September 29, 2021): e0257307. http://dx.doi.org/10.1371/journal.pone.0257307.

Full text
Abstract:
In our planned study, we shall empirically study the assessment of cited papers within the framework of the anchoring-and-adjustment heuristic. We are interested in the question whether citation decisions are (mainly) driven by the quality of cited references. The design of our study is oriented towards the study by Teplitskiy, Duede [10]. We shall undertake a survey of corresponding authors with an available email address in the Web of Science database. The authors are asked to assess the quality of papers that they cited in previous papers. Some authors will be assigned to three treatment groups that receive further information alongside the cited paper: citation information, information on the publishing journal (journal impact factor), or a numerical access code to enter the survey. The control group will not receive any further numerical information. In the statistical analyses, we estimate how (strongly) the quality assessments of the cited papers are adjusted by the respondents to the anchor value (citation, journal, or access code). Thus, we are interested in whether possible adjustments in the assessments can not only be produced by quality-related information (citation or journal), but also by numbers that are not related to quality, i.e. the access code. The results of the study may have important implications for quality assessments of papers by researchers and the role of numbers, citations, and journal metrics in assessment processes.
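The planned analysis comes down to estimating how strongly quality ratings shift with the anchor value shown to respondents. A minimal sketch on simulated data; the anchoring strength of 0.15 and all distributions are invented for illustration:

```python
# Minimal sketch of the planned estimation: regress quality ratings on the
# numeric anchor shown to each treatment group. All data here are simulated;
# the real study uses survey responses from citing authors.
import numpy as np

rng = np.random.default_rng(0)
n = 200
anchor = rng.uniform(0, 100, n)        # e.g. citation count shown to the respondent
true_quality = rng.normal(60, 10, n)   # respondent's underlying assessment
adjustment = 0.15                      # hypothetical anchoring strength
rating = true_quality + adjustment * (anchor - anchor.mean()) + rng.normal(0, 5, n)

# OLS slope of rating on anchor estimates how strongly assessments are
# pulled toward the anchor value (0 = no anchoring effect).
slope, intercept = np.polyfit(anchor, rating, 1)
print(f"estimated anchoring slope: {slope:.3f}")   # should be close to 0.15
```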
6

Sankaram, Mosa, Ms Roopesh, Sasank Rasetti, and Nourin Nishat. "A COMPREHENSIVE REVIEW OF ARTIFICIAL INTELLIGENCE APPLICATIONS IN ENHANCING CYBERSECURITY THREAT DETECTION AND RESPONSE MECHANISMS". GLOBAL MAINSTREAM JOURNAL 3, no. 5 (July 10, 2024): 1–14. http://dx.doi.org/10.62304/jbedpm.v3i05.180.

Full text
Abstract:
This literature review explores the transformative impact of artificial intelligence (AI) on enhancing cybersecurity measures across various domains. The study systematically examines the integration of AI in Intrusion Detection Systems (IDS), malware detection, phishing detection, threat intelligence, network security, and endpoint protection. Key findings reveal that AI-driven techniques significantly outperform traditional methods, particularly in real-time threat detection, accuracy, and adaptive response capabilities. Network-based IDS benefit from supervised and unsupervised learning algorithms, improving the identification of malicious network traffic and novel attack patterns. In malware detection, AI-enhanced static and dynamic analysis methods surpass signature-based approaches by detecting previously unknown malware and complex behaviors. Phishing detection has seen substantial improvements with AI applications in email filtering and URL analysis, reducing phishing incidents despite challenges like false positives. AI's role in threat intelligence is critical, automating data analysis to uncover hidden threats and employing predictive analytics to anticipate and mitigate cyber attacks. AI techniques in network security and endpoint protection enhance real-time monitoring and authentication processes, providing robust defenses against cyber intrusions. Despite these advancements, challenges such as handling high data volumes and the need for continuous learning to adapt to emerging threats remain. This review underscores the significant advancements, practical implementations, and ongoing challenges of leveraging AI in cybersecurity, highlighting its potential to fortify digital defenses and address the complexities of contemporary cyber threats.
7

Schommer, Jon C., Caroline A. Gaither, Nancy A. Alvarez, SuHak Lee, April M. Shaughnessy, Vibhuti Arya, Lourdes G. Planas, Olajide Fadare, and Matthew J. Witry. "Pharmacy Workplace Wellbeing and Resilience: Themes Identified from a Hermeneutic Phenomenological Analysis with Future Recommendations". Pharmacy 10, no. 6 (November 23, 2022): 158. http://dx.doi.org/10.3390/pharmacy10060158.

Full text
Abstract:
This study applied a hermeneutic phenomenological approach to better understand pharmacy workplace wellbeing and resilience using respondents’ written comments along with a blend of the researchers’ understanding of the phenomenon and the published literature. Our goal was to apply this understanding to recommendations for the pharmacy workforce and corresponding future research. Data were obtained from the 2021 APhA/NASPA National State-Based Pharmacy Workplace Survey, launched in the United States in April 2021. Promotion of the online survey to pharmacy personnel was accomplished through social media, email, and online periodicals. Responses continued to be received through the end of 2021. A data file containing 6973 responses was downloaded on 7 January 2022 for analysis. Usable responses were from those who wrote an in-depth comment detailing stories and experiences related to pharmacy workplace and resilience. There were 614 respondents who wrote such comments. The findings revealed that business models driven by mechanized assembly line processes, business metrics that supersede patient outcomes, and reduction of pharmacy personnel’s professional judgement have contributed to the decline in the experience of providing patient care in today’s health systems. The portrait of respondents’ lived experiences regarding pharmacy workplace wellbeing and resilience was beyond the individual level and revealed the need for systems change. We propose several areas for expanded inquiry in this domain: (1) shared trauma, (2) professional responsibility and autonomy, (3) learned subjection, (4) moral injury and moral distress, (5) sociocultural effects, and (6) health systems change.
8

Chopra, Bhuvi. "Revolutionizing Cybersecurity with Machine Learning: A Comprehensive Review and Future Directions". Journal of Artificial Intelligence General science (JAIGS) ISSN:3006-4023 4, no. 1 (May 19, 2024): 195–207. http://dx.doi.org/10.60087/jaigs.v4i1.133.

Full text
Abstract:
In the realm of computing, data science has revolutionized cybersecurity operations and technologies. The key to creating automated and intelligent security systems lies in extracting patterns or insights from cybersecurity data and building data-driven models. Data science, encompassing various scientific approaches, machine learning techniques, processes, and systems, studies real-world occurrences through data analysis. Machine learning techniques, known for their flexibility, scalability, and adaptability to new and unknown challenges, have been applied across many scientific fields. Cybersecurity is rapidly expanding due to significant advancements in social networks, cloud and web technologies, online banking, mobile environments, smart grids, and more. Various machine learning techniques have effectively addressed a wide range of computer security issues. This article reviews several machine learning applications in cybersecurity, including phishing detection, network intrusion detection, keystroke dynamics authentication, cryptography, human interaction proofs, spam detection in social networks, smart meter energy consumption profiling, and security concerns associated with machine learning techniques themselves. The methodology involves collecting a large dataset of phishing and legitimate instances, extracting relevant features such as email headers, content, and URLs, and training a machine learning model using supervised learning algorithms. These models can effectively identify phishing emails and websites with high accuracy and low false positive rates. To enhance phishing detection, it is recommended to continuously update the training dataset to include new phishing techniques and employ ensemble methods that combine multiple machine learning models for improved performance.
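The methodology sketched in this abstract is a conventional supervised pipeline: extract features, train a classifier, and prefer ensembles. A minimal illustration with hypothetical lexical URL features and a toy dataset (not the paper's data or feature set):

```python
# Minimal sketch of supervised phishing detection with an ensemble model.
# The features and tiny dataset are illustrative only.
from sklearn.ensemble import RandomForestClassifier

def url_features(url: str) -> list:
    # Simple lexical features commonly used in phishing detection.
    return [
        len(url),                      # phishing URLs tend to be long
        url.count("."),                # many subdomains are suspicious
        int("@" in url),               # '@' can hide the real host
        int(url.startswith("https")),  # missing TLS is a weak signal
    ]

urls = [
    ("https://www.example.com/login", 0),
    ("http://secure-update.example.accounts.verify-now.ru/@login", 1),
    ("https://mybank.com/account", 0),
    ("http://paypa1.com.secure.session.id.xyz/confirm", 1),
]
X = [url_features(u) for u, _ in urls]
y = [label for _, label in urls]

# An ensemble of decision trees, matching the abstract's recommendation
# to combine multiple models for better performance.
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(model.predict([url_features("http://account-verify.example.bad.ru/@signin")]))
```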
9

Saketh Reddy Cheruku, (Dr.) Punit Goel, and Ujjawal Jain. "Leveraging Salesforce Analytics for Enhanced Business Intelligence". Innovative Research Thoughts 9, no. 5 (December 30, 2023): 165–77. http://dx.doi.org/10.36676/irt.v9.i5.1462.

Full text
Abstract:
Salesforce Analytics is a strong business intelligence (BI) solution that turns raw data into actionable insights. Today's data-driven world requires fast, reliable data analysis for business choices. Salesforce Analytics' broad range of solutions helps organizations use data for better decision-making, operational efficiency, and strategic planning. Integration of Salesforce Analytics into company processes has several benefits. First, it gives organizations real-time knowledge to adapt quickly to market developments and client requests. Organizations may track KPIs and trends using customisable dashboards, automatic reporting, and predictive analytics. Real-time visibility empowers decision-makers to prevent concerns from becoming major ones. Salesforce Analytics' unified data access and analysis platform improves departmental cooperation. Team members may easily exchange insights and reports, breaking down silos and promoting data-driven culture. On-demand report creation and sharing guarantees that all stakeholders have the same information, resulting in better aligned and informed decision-making. Customer relationship management is another important Salesforce Analytics function. The software analyzes touchpoint data to help companies understand their consumers. Businesses may detect client preferences, forecast behavior, and tailor marketing by using this data, improving customer happiness and loyalty. Salesforce's AI-powered analytics help foresee client wants and personalize offerings. Salesforce Analytics also improves forecasting and planning. Organizations may forecast future performance better by evaluating previous data and patterns. This skill is crucial in sales forecasting, inventory management, and financial planning, where precise projections ensure operational efficiency and profitability. Another benefit of Salesforce Analytics is third-party data integration. To get a complete picture of their operations, businesses may mix data from social media, email marketing, and e-commerce platforms. Integration improves strategic choices and corporate results by enabling more complete analysis. The platform's versatility lets analytics tools be customized to match corporate demands, boosting its BI solution value. Data security is crucial in the digital era, and Salesforce Analytics takes it seriously. The platform protects critical company data using encryption, access restrictions, and industry standards. Businesses may use their data confidently because of this security assurance. Finally, Salesforce Analytics improves business intelligence by offering real-time insights, collaboration, customer knowledge, and precise forecasting and planning. In today's fast-paced corporate world, its ability to interact with third-party data sources and secure data analysis makes it a useful tool. Salesforce Analytics will allow data-driven decision-making and strategic planning to help companies accomplish their objectives and expand sustainably as they navigate the contemporary market.
10

Windgassen, S., M. Artom, C. Norton, L. Sweeney, and R. Moss-Morris. "N02 The IBD-BOOST programme: developing a digital self-management intervention for symptoms of fatigue, pain and urgency in inflammatory bowel disease". Journal of Crohn's and Colitis 14, Supplement_1 (January 2020): S657–S658. http://dx.doi.org/10.1093/ecco-jcc/jjz203.986.

Full text
Abstract:
Background Fatigue, pain and urgency are among the most commonly reported and burdensome symptoms of inflammatory bowel disease (IBD). A disconnect between symptoms and inflammation has been documented and medical management does not always adequately resolve symptoms. Extensive research shows the relationship between IBD symptoms and psychosocial factors. This poster describes the development of a facilitator-supported, theory-driven, tailored web-based intervention for fatigue, pain and urgency in IBD. Methods The Medical Research Council (MRC) guidance and the person-based approach were used to guide intervention development. Literature reviews of psychosocial factors related to fatigue, pain and urgency in IBD and trials of behavioural interventions were used to create a cognitive-behavioural model of symptom perpetuation and impact. The model was tested and refined in large cross-sectional and qualitative studies to understand patients’ experiences of these symptoms and intervention needs. The refined model was mapped onto an intervention logic model to define the psychosocial processes to target in intervention techniques. Patient feedback on the logic model and session content was obtained. Usability of the website was assessed using think-aloud methods and survey data were collected on session content, design and functionality. Results 87 people with IBD and 68 IBD nurses participated in Patient and Public Involvement activities for intervention development. Five interviews were carried out to develop guiding principles and two focus groups provided feedback on a logic model and session plan. 54 people with IBD and 45 IBD nurses completed an initial discovery online survey. Results indicated preferences to receive facilitator support via email/online-messages rather than telephone. Five focus groups included 68 IBD-nurses to assess barriers/facilitators in supporting the intervention. Desirable functionalities included diagrams/aids, email reminders and links to external resources. 31 people with IBD were included in feasibility and acceptability testing. The final intervention includes 8 core sessions with tasks and 4 symptom-specific sessions, and facilitator support of one 30-minute call and in-site messaging. Core to all sessions is understanding and ‘breaking’ personal ‘vicious cycles’ of symptom interference. Conclusion We have used a person-based approach and systematic application of theory, evidence and stakeholder involvement to guide intervention development. BOOST is the first web-based intervention with the primary aim of targeting fatigue, pain and urgency and improving the quality of life of people with IBD. This is now being tested in a large randomised controlled trial.
11

Bharadiya, Jasmin. "Machine Learning in Cybersecurity: Techniques and Challenges". European Journal of Technology 7, no. 2 (June 2, 2023): 1–14. http://dx.doi.org/10.47672/ejt.1486.

Full text
Abstract:
In the computer world, data science is the force behind the recent dramatic changes in cybersecurity's operations and technologies. The secret to making a security system automated and intelligent is to extract patterns or insights related to security incidents from cybersecurity data and construct appropriate data-driven models. Data science, also known as diverse scientific approaches, machine learning techniques, processes, and systems, is the study of actual occurrences via the use of data. Due to its distinctive qualities, such as flexibility, scalability, and the capability to quickly adapt to new and unknowable obstacles, machine learning techniques have been used in many scientific fields. Due to notable advancements in social networks, cloud and web technologies, online banking, mobile environments, smart grids, etc., cyber security is a rapidly expanding sector that requires a lot of attention. Such a broad range of computer security issues have been effectively addressed by various machine learning techniques. This article covers several machine-learning applications in cyber security. Phishing detection, network intrusion detection, keystroke dynamics authentication, cryptography, human interaction proofs, spam detection in social networks, smart meter energy consumption profiling, and security concerns with machine learning techniques themselves are all covered in this study. The methodology involves collecting a large dataset of phishing and legitimate instances, extracting relevant features such as email headers, content, and URLs, and training a machine-learning model using supervised learning algorithms. Machine learning models can effectively identify phishing emails and websites with high accuracy and low false positive rates. To enhance phishing detection, it is recommended to continuously update the training dataset to include new phishing techniques and to employ ensemble methods that combine multiple machine learning models for better performance.
12

Jha, Sumi, and Som Sekhar Bhattacharyya. "Anand Automotive Limited: leadership development process for creating strategic impact". Emerald Emerging Markets Case Studies 3, no. 3 (June 28, 2013): 1–16. http://dx.doi.org/10.1108/eemcs-02-2013-0013.

Full text
Abstract:
Subject area Leadership development for strategic impact in high growth export driven organization. Study level/applicability The case is suitable for second and final year students of a two year post graduate management programme (Master's level) on the following courses: leadership – on development of organization wide leadership processes; talent management – for identifying, nurturing and retaining talent in an organization and for developing leadership capabilities in managers; and strategic human resources (HR) – regarding building leadership development and talent management initiatives for creating a strategic level impact in the organization and its joint ventures. Case overview In about 45 years since its inception Anand Automotive Limited (AAL) has established itself as one of the premium firms in auto ancillary manufacturing and export. This case demonstrates how AAL built its leadership development programme. Further, the case elaborates on the coach/coachee mentorship programme at AAL. The case further explores the various initiatives under the broad umbrella of the Anand Leadership Development Programme (ALDP). The ALDP process has been woven into the fabric of HR practices of the organization. AAL sales turnover was USD1.2 billion in 2012 and it has a goal to achieve a turnover of USD2 billion by 2015. Mr K.C. Bhullar, the group head HR, had to plan an HR system which will embed leadership in the tapestry of AAL as an organization. The amalgamation of ALDP in AAL has to be disseminated across all levels at the 19 plants spread across different locations in India. The ALDP is expected to sprout a large number of leaders in AAL who can usher in an extremely quality focused and conscious organization. Such leaders would in their day-to-day demonstration of leadership at AAL help AAL to become an excellent manufacturing organization. This would help AAL to have a leadership position in the global automobile market. ALDP is also expected to create a band of leaders who would help the organization from very senior level strategic management positions and play leadership roles in its joint ventures. Expected learning outcomes This case can help students to understand how HR practices integrate leadership development programme for the strategic gains of an organization. Students would also understand the role of mentorship in coach/coachee processes. Supplementary materials Teaching notes are available for educators only. Please contact your library to gain login details or email support@emeraldinsight.com to request teaching notes.
13

Patel, Pratish C., Ayne Adenew, Angela McKnight, Kevin Jeng, and Angelike P. Liappis. "537. Implementation of a Workflow for COVID-19 Monoclonal Antibody Infusions at a Veterans Affairs Medical Center". Open Forum Infectious Diseases 8, Supplement_1 (November 1, 2021): S370. http://dx.doi.org/10.1093/ofid/ofab466.736.

Full text
Abstract:
Background In the setting of the global pandemic due to COVID-19, high-risk patients with mild to moderate disease were identified as a group who would benefit from COVID-19 monoclonal antibody (mAB) treatment to mitigate progression to severe disease or hospitalization. The U.S. Food and Drug Administration (FDA), under Emergency Use Authorizations (EUA), approved multiple COVID-19 mAB therapies with specific criteria for eligibility of candidates, documentation of discussion with patients, and reporting of all errors and serious adverse events. Methods A cross-discipline working group implemented a mAB clinic at a complexity level 1a VA Medical Center in metropolitan Washington, D.C. through collaboration of personnel committed to patient care. The team successfully persuaded hospital leadership to provide space and leveraged technologies for rapid communication and dissemination of education. A stewardship-driven, medical-center-wide surveillance system rapidly identified outpatients for screening; primary care and ED providers were engaged through various electronic methods of education, including email, web-based team communication, intranet webpages and other electronic modalities. Within the EMR, an order panel was implemented to assure that the key requirements of the EUA were met and the provider was guided to the appropriate mAB, nursing, and PRN rescue medication orders. Results Of over 17,000 COVID-19 PCR tests performed at our medical center, 198 outpatients were screened and 16 received COVID-19 mAB infusions between January 2, 2021 and May 31, 2021. One patient experienced a reaction requiring the infusion to be stopped and supportive medications to be administered; there were no long-term sequelae reported as a result of this event. Conclusion A multidisciplinary collaboration is well suited to implement innovative processes and policies for novel therapies in the middle of a pandemic. An agile workflow, regular communications between members of the workgroup, and commitment of institutional leadership helped facilitate the changes necessary to provide our patients the opportunity to receive potentially life-saving therapies. Disclosures All Authors: No reported disclosures
14

Mishra, Jagriti. "Aavaran: creating niche through contemporary traditional textiles". Emerald Emerging Markets Case Studies 3, no. 2 (May 24, 2013): 1–12. http://dx.doi.org/10.1108/eemcs-aug-2012-0143.

Full text
Abstract:
Subject area Marketing. Study level/applicability The case is aimed at Business Administration students. Case overview Udaipur based Aavaran – the echos of rural India – is a concept by COS-V, a leading non-governmental organization (NGO), which aims at connecting the tribal women of rural India with the mainstream. The NGO, set up in 1988 by Smt. Girija Vyas, was initially involved in imparting vocational training to the rural poor. Later, COS-V was taken up by Alka Sharma, a graduate from the Indian Institute of Crafts and Design, Jaipur, who completely changed the direction of the NGO. Her interest in textiles and crafts led to the genesis of the concept “Aavaran”. Aavaran is a retail outlet which was opened with a vision to provide the Indian market with traditional yet contemporary textiles and clothing. It offers a collection of women's and children's clothing and home textiles using a variety of traditional textiles and crafts. It is an artisan driven concept where the supply chain incorporates the essence of Indian textiles and crafts at every level. From the dyeing, printing, sampling and assembly of garments everything is done by the local women trained by COS-V with the support of DC-Handicrafts. The raw materials – the textiles, grey fabrics, etc. – are sourced directly from the rural weavers and artisans across India. The case study discusses how Aavaran developed the unique positioning of a retail platform for contemporary products made from traditional techniques, skills and hand-based processes; how it could revive the diminishing arts of Dabu and Phetia and how it carved a niche through its channelized marketing efforts. Expected learning outcomes The case will familiarize management students with the concept of niche marketing with Udaipur based firm Aavaran as an example which developed a unique positioning through its traditionally developed products. It will also acquaint students with a basic understanding of a supply chain with a cooperative firm in focus. Supplementary materials Teaching notes are available for educators only. Please contact your library to gain login details or email support@emeraldinsight.com to request teaching notes.
15

Faiman, Beth M., Paul Jacobsen, Amy Callahan, Still Nadia, SaraLena Panzer, and Rachid Baz. "Feasibility of Implementing Innovative Supportive Care Plans for Symptom Management in Multiple Myeloma". Blood 126, no. 23 (December 3, 2015): 5620. http://dx.doi.org/10.1182/blood.v126.23.5620.5620.

Full text
Abstract:
Introduction: Treatment of Multiple Myeloma (MM) is undergoing transformation. With the rapid emergence of new agents and regimens has also come new toxicities that limit their tolerability and impact patient adherence. Innovative strategies are needed to proactively screen for, assess, and manage disease- and treatment-related symptoms, and to engage patients (pts) and families in their identification and self-management. To be sustainable, processes must be efficiently integrated within clinical workflows. Growing attention is thus being placed on the potential of novel health information technology (IT) solutions to transform cancer care delivery. Yet, most IT solutions to date have paid little attention to supportive care and facilitating self-management of symptoms. The Carevive (formerly On Q) Care Planning System™ (CPS) was developed to address this gap, by providing efficient and clinically integrated IT solutions that facilitate evidence-based supportive care, while concurrently promoting supported self-management. The primary objective of this study is to assess the feasibility of the Carevive CPS electronic symptom screening and care management system in pts receiving active treatment for MM. Methods: Ninety pts with MM and their healthcare providers (physicians and nurse practitioners) are planned for enrollment in this in-progress multi-center pilot intervention study (30 pts each at 3 sites). All participants must be 18 years or older with a pathologically confirmed diagnosis of MM and currently receiving treatment for at least 4 weeks prior to enrollment. At baseline and a 4-8 week follow-up, each patient completes an electronic patient-reported outcome (ePRO) questionnaire in the Carevive CPS about their symptoms. These, plus clinical data, are processed by the Carevive CPS proprietary rules engine to generate personalized, guideline-driven care plans with peer-reviewed and evidence-based recommendations for symptom management. Providers review, finalize, and deliver these care plans containing individually tailored education, resources, and referrals at the 2 clinical visits. Care plans can be delivered to the pts electronically (via secure email or thumb drive) or on paper. Feasibility of delivery of the Carevive CPS intervention was assessed by: a) patient enrollment and attrition, b) completion rates of key intervention components (e.g., completion of the questionnaire by pts, percent of pts who receive a care plan), and c) process variables (e.g. format of care plan delivery, time spent discussing the care plan). A tracking log was used to capture the percentage of pts enrolled and rates of completion of ePRO questionnaires. Process variables were captured using a study-designed worksheet, completed by research staff following each intervention visit. Achievement of feasibility of the intervention was defined as a questionnaire and care plan completion rate of 75% or higher. Data Analysis: To date 20 pts have enrolled at 2 participating sites, with a third site planned to open for enrollment in August 2015. Nine providers are participating to date. Updated data on both pts and providers will be presented at time of the symposium. Of the 25 pts approached to participate, 20 participated, for a consent rate of 80%. 20/20 (100%) of pts completed the questionnaire in full. The average time the patient spent independently completing the questionnaire was 7.36 minutes, and 100% of these pts received care plans. Healthcare providers spent an average of 5.55 minutes discussing the care plan with the patient. Care plan delivery has been predominately electronic, with 70% having been delivered via secure email or thumb drive and 30% being delivered via paper format. Conclusions: The Carevive CPS is a novel electronic platform designed to deliver evidence-based and personalized supportive care plans to pts at the point of care. Preliminary data, which will be updated at time of presentation, are supportive of its feasibility in MM. The 75% threshold for feasibility has been met for enrollment, ePRO questionnaire completion, and care plan delivery. Time spent in questionnaire completion and care plan review was low compared to other studies. These feasibility data provide initial evidence to support the Carevive CPS intervention. Future studies are planned to evaluate the impact on patient outcomes including symptom burden and adherence to therapy. Disclosures Faiman: Amgen/Onyx: Consultancy; Takeda/Millennium: Consultancy; Celgene: Consultancy, Speakers Bureau. Nadia: Carevive, Inc.: Employment. Panzer: Carevive, Inc.: Employment.
16

Hjollund, Niels Henrik I. "Fifteen Years’ Use of Patient-Reported Outcome Measures at the Group and Patient Levels: Trend Analysis". Journal of Medical Internet Research 21, no. 9 (September 30, 2019): e15856. http://dx.doi.org/10.2196/15856.

Full text
Abstract:
Background Since 2004, we have collected patient-reported outcome (PRO) data from several Danish patient populations for use at the group and patient levels. Objective The aim of this paper is to highlight trends during the last 15 years with respect to patient inclusion, the methods for collection of PRO data, the processing of the data, and the actual applications and use of the PRO measurements. Methods All PRO data have been collected using the AmbuFlex/WestChronic PRO system, which was developed by the author in 2004 and has been continuously updated since. The analysis of trends was based on a generic model applicable to any kind of clinical health data, according to which any application of clinical data may be divided into four processes: patient identification, data collection, data aggregation, and the actual data use. Data for analysis were generated by a specific application in the system and transferred for analysis to the R package. Results During the 15-year period, 78,980 patients within 28 different groups of chronic and malignant illnesses have answered 260,433 questionnaires containing a total of 13,538,760 responses. Several marked changes have taken place: (1) the creation of cohorts for clinical epidemiological research purposes has shifted towards cohorts defined by clinical use of PRO data at the patient level; (2) the development of AmbuFlex, where PRO data are used as the entire basis for outpatient follow-up instead of fixed appointments, has undergone exponential growth, and the system is currently in use in 47 International Statistical Classification of Diseases and Related Health Problems (ICD) groups, covering 16,000 patients and 94 departments throughout Denmark; (3) response rates (up to 92%) and low attrition rates have been reached in group level projects, and there are even higher response rates in AmbuFlex, where the patients are individually referred; (4) the answering method has shifted: while 66.5% of questionnaires were paper based in 2005, only 4.3% were in 2019; and (5) the approach methods for questionnaires and reminders have changed dramatically from letters, emails, and short message service text messaging to a national, secure electronic mail system through which 93.2% of the communication to patients took place in 2019. The combination of secure email and web-based answering has resulted in a low turnaround time, in which half of responses are now received within 5 days. Conclusions The demand for clinical use of PRO measurements has increased, driven by a wish among patients as well as clinicians to use PRO to promote better symptom assessment, more patient-centered care, and more efficient use of resources. Important technological changes have occurred, creating new opportunities and making PRO collection and use cheaper and more feasible. Several legal changes may constitute a barrier for further development as well as for better utilization of patients’ questionnaire data. The current legal restrictions on the joint use of health data imposed by the European Union’s General Data Protection Regulation make no distinction between use and misuse, and steps should be taken to alleviate these restrictions on the joint use of PRO data.
17

Heinrich, C. H., S. McCarthy, S. McHugh, G. M. Curran, and M. D. Donovan. "449 Multidisciplinary deprescribing review for frail older adults in long-term care: Development of a theory informed implementation strategy using a Delphi survey and Roundtable discussion with stakeholders". International Journal of Pharmacy Practice 31, Supplement_1 (April 1, 2023): i8–i9. http://dx.doi.org/10.1093/ijpp/riad021.010.

Full text
Abstract:
Introduction Deprescribing, the systematic process of stopping or altering inappropriate medicines, has been suggested as a safe, effective and appropriate process to optimise prescribing for older adults (1). There is still a lack of understanding of how best to implement sustainable deprescribing in long-term care (LTC). Aim To design a theory-driven implementation strategy, based on consensus from healthcare professionals (HCPs), to facilitate their engagement with deprescribing for frail older adults in LTC. Methods This study consisted of three phases and was designed in conjunction with HCPs working in LTC. Firstly, barriers and enablers to deprescribing in LTC previously identified by the research team were mapped to behaviour change techniques (BCTs) (2). Secondly, a Delphi survey of HCPs (General Practitioners (GPs), pharmacists, nurses, geriatricians and psychiatrists of old age) was conducted to select feasible BCTs to support deprescribing. HCPs were purposively sampled and recruited via email, divided into HCPs working in Ireland and internationally. Experts were identified from existing professional relationships, engagement with LTC or deprescribing research. Using the results from the Delphi process, the literature on deprescribing interventions and research team knowledge, the BCTs which could form components of an intervention in LTC were shortlisted based on acceptability, effectiveness, affordability, safety and equity (APEASE). Finally, a roundtable discussion was held with a purposeful, convenience sample of GPs, pharmacists and nurses working in LTC in Ireland, to prioritise the previously identified barriers/enablers, operationalise the proposed deprescribing strategies created from feasible BCTs and identify the most important strategy to facilitate deprescribing. Results Overall, 34 BCTs were mapped to previously identified barriers/enablers to deprescribing in LTC. For the Delphi survey, 33 HCPs were invited, 20 agreed, and it was completed by 16 participants. The Delphi consisted of two rounds. Participants reached consensus that 26 of the BCTs could feasibly be implemented in LTC. Following the APEASE assessment, 21 BCTs were considered eligible for operationalisation. The roundtable discussion, consisting of eight HCPs, identified that lack of resources (time, staffing, technology) was the most important barrier to address. The agreed implementation strategy to enhance engagement with deprescribing processes was an education session prior to a multidisciplinary team (MDT) meeting, led by a nurse from the LTC setting. This was designed from 11 BCTs, including action planning, social support and environmental restructuring, addressing the predominant barrier. Incorporating the MDT reduces the burden which would exist if the responsibility was placed on one HCP. Meeting at the LTC site addresses the insufficient technology, as patient information is available. Conclusion This study describes an implementation strategy design process following principles of behavioural and implementation science. Engaging targeted end users throughout the process allows for the creation of a strategy which is intended to address the main perceived barrier to deprescribing of insufficient resources. A limitation of this study is the specificity of the intervention to the Irish LTC context, which may have different staffing and organisational structures compared to international healthcare systems. References 1.
Ibrahim K, Cox NJ, Stevenson JM, Lim S, Fraser SDS, Roberts HC. A systematic review of the evidence for deprescribing interventions among older people living with frailty. BMC Geriatrics. 2021 Apr 17;21(1):258. 2. Michie S, Atkins L, West R. The Behaviour Change Wheel: A Guide to Designing Interventions [Internet]. 1st ed. Silverback Publishing; 2014 [cited 2022 Jan 11]. Available from: https://books.google.ie/books/about/The_behaviour_change_wheel_a_guide_to_de.html?id=1TGIrgEACAAJ&source=kp_book_description&redir_esc=y
18

Zamora, David, and Juan Carlos Barahona. "Data-driven innovation at the Grupo Pellas SER company". Emerald Emerging Markets Case Studies 6, no. 2 (June 10, 2016): 1–16. http://dx.doi.org/10.1108/eemcs-06-2015-0147.

Full text
Abstract:
Subject area Management of Innovation and Technology/Management Information Systems. Study level/applicability Information Systems. Case overview SER (Sugar, Energy & Rum) was a company belonging to the Grupo Pellas Corporation. The company operated in four countries, had six subsidiaries, employed more than 25,000 people, had more than 43,500 manzanas of sugarcane crops in Nicaragua alone and had global annual sales of more than US$400m. In 2008, due to the negative effects of the crisis on the company’s business model (increasing costs due to higher prices for fuel and decreasing income because of low international sugar prices), the company decided to implement a business intelligence (BI) system to optimize its processes to reduce costs and increase productivity. At that time, the company had more than 100 years of data, information systems that fed into their main business processes and a culture that appreciated data as the basis for decision-making. However, there were inconsistencies among data systems, users received highly complex reports in Excel or green screens, and process monitoring happened long after the tasks had been completed. As a response, SER used extract-transform-load (ETL) to collect and clean the data that would be used in the BI system (the case leaves the questions regarding the system's selection unresolved, for discussion). Based on their business model, they selected the most critical processes and defined key performance indicators to measure the impact of changes in those processes. They considered graphic design as a tool to make the system more accepted by users and worked together with users so that reports only offered the most important information. The result was improved costs and productivity. They decreased manual time spent by 14 per cent, automated time spent by 10 per cent, and eliminated 1,556 hours of dead time for equipment in the field, which allowed them to increase productivity by US$1m just in sugar. They saved 20,000 trips from the fields to the factories, which represented more than US$1m in savings, by monitoring the weight of wagons loaded with sugarcane in real time. They improved client perceptions about the company both locally and internationally by implementing a sugar traceability system. Expected learning outcomes The case “Business Intelligence at the Grupo Pellas SER Company” has as its objective to respond to the question: how does a company make its BI system implementation successful? As such, the case: discusses what a BI system is and what it provides to a business; analyses challenges, benefits and context when implementing a BI system; analyses success factors and recommendations in the BI system implementation process; and analyses the process of implementing a BI system, highlighting the importance of the system priority questions and technological alternatives. Supplementary materials Teaching notes are available for educators only. Please contact your library to gain login details or email support@emeraldinsight.com to request teaching notes. Subject code CSS 11: Strategy
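The extract-transform-load step at the heart of the case can be illustrated in miniature: pull raw field records, drop inconsistent rows, and compute a KPI. The field names, records, and load threshold below are invented:

```python
# Minimal ETL sketch of the kind the case describes: extract raw field
# records, clean inconsistencies, and load a KPI. All values are hypothetical.
import csv, io

raw = io.StringIO(
    "wagon_id,weight_kg,field\n"
    "W1,18500,north\n"
    "W2,,south\n"        # missing weight: dropped in the transform step
    "W3,21000,north\n"
)

# Extract
rows = list(csv.DictReader(raw))

# Transform: drop incomplete records, cast types, flag underloaded wagons.
MIN_LOAD_KG = 20000  # hypothetical target load per wagon
clean = [
    {**r, "weight_kg": int(r["weight_kg"]),
     "underloaded": int(r["weight_kg"]) < MIN_LOAD_KG}
    for r in rows if r["weight_kg"]
]

# Load: here a KPI aggregate; in the case, the BI system's data store.
underloaded_pct = 100 * sum(r["underloaded"] for r in clean) / len(clean)
print(f"underloaded wagons: {underloaded_pct:.0f}%")
```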
19

Bedir, Cemre. "Contract Law in the Age of Big Data". European Review of Contract Law 16, no. 3 (September 8, 2020): 347–65. http://dx.doi.org/10.1515/ercl-2020-0020.

Full text
Abstract:
In data-driven business models, users’ personal data is collected in order to determine the preferences of consumers and to tailor production and advertising to these preferences. In these business models, consumers do not pay a price but provide their data, such as IP numbers, locations, and email addresses, to benefit from the digital service or content. Contracts facilitate interactions between these providers and users. Their transactions are regulated by contracts in which their agreement on data use and data processing is stipulated. Data is always collected and processed through a contractual relationship, and in this paper, I will argue that there are problems arising from contracts involving data to which contract law applies, and that contract law can map these problems and offer insights. The scope of this study will be limited to issues where data is provided as counter-performance and where data is provided in addition to a monetary payment.
20

Păvăloaia, Vasile-Daniel, Ionuț-Daniel Anastasiei, and Doina Fotache. "Social Media and E-mail Marketing Campaigns: Symmetry versus Convergence". Symmetry 12, no. 12 (November 25, 2020): 1940. http://dx.doi.org/10.3390/sym12121940.

Full text
Abstract:
Companies use social business intelligence (SBI) to identify and collect strategically significant information from a wide range of publicly available data sources, such as social media (SM). This study is an SBI-driven analysis of a company operating in the insurance sector. It underlines the contribution of SBI technology to the sustainable profitability of a company by using an optimized marketing campaign on Facebook, in symmetry with a traditional e-mail campaign. Starting from a campaign on SM, the study identified a client portfolio, processed data, and applied a set of statistical methods, such as the index and the statistical significance test (T-test), which later enabled the authors to validate research hypotheses (RH) and led to relevant business decisions. The study outlines the preferences of the selected group of companies for the manner in which they run a marketing campaign on SM in symmetry with an e-mail-run campaign. Although the study focused on the practical field of insurance, the suggested model can be used by any company in any industry, proving that BI technologies are the nexus of collecting and interpreting results that are essential, globally applicable, and lead to sustainable development of companies operating in the age of globalization. The results of the study prove that symmetrical unfolding (time and opportunity symmetry) of SM and e-mail marketing campaigns can lead to better results than two separate marketing campaigns. Moreover, the outcomes of both campaigns showed convergence on SBI platforms, which led to more efficient management of the preferences of campaign beneficiaries in the insurance sector.
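The statistical-significance step the study applies is a two-sample T-test between the two campaign channels. A minimal sketch on simulated click-rate data, assuming SciPy is available; all figures are invented:

```python
# Minimal sketch of a two-sample t-test comparing a social-media campaign
# against an e-mail campaign. Data are simulated, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sm_clickrate = rng.normal(0.042, 0.010, 120)     # hypothetical Facebook campaign
email_clickrate = rng.normal(0.035, 0.010, 120)  # hypothetical e-mail campaign

t_stat, p_value = stats.ttest_ind(sm_clickrate, email_clickrate)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A p-value below the chosen threshold (e.g. 0.05) would support the
# hypothesis that the two channels perform differently.
```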
21

Alfianti, Sarahdillah, Lesi Hertati, Lili Syafitri, Aris Munandar, and Rum Hendarmin. "SOSIALISASI PENGEMBANGAN STRATEGI PEMASARAN DAN PENGELOLAAN UMKM POTENSI DESA PETANANG MELALUI PENINGKATAN KEMAMPUAN SUMBER DAYA MANUSIA PROGRAM KKN TEMATIK MBKM MAHASISWA UNIVERSITAS INDO GLOBAL MANDIRI". PRIMA : PORTAL RISET DAN INOVASI PENGABDIAN MASYARAKAT 1, no. 4 (August 27, 2022): 153–62. http://dx.doi.org/10.55047/prima.v1i4.337.

Full text
Abstract:
The natural resources produced by Petanang Village are very abundant, but they have not been fully utilized by the community, so there is a need for guidance or assistance to help the community take advantage of the existing potential. One option is to process cassava/sweet potato into foods with a high selling price. Given the problems and potential that exist, the village community needs to know more about how to process and benefit from cassava/sweet potato so that it can increase profits or serve as a source of additional income for the community. Digital marketing is the marketing of products or services using technology via the internet, social media, cell phones, or other digital media. It is also an umbrella term that can cover a variety of digital marketing strategies such as social media marketing, search engine optimization (SEO), and email marketing. This digital marketing strategy differs from traditional marketing such as print media, billboards, and TV, and it is data driven. The purpose of digital marketing is to attract consumers and potential customers quickly. Since the acceptance of technology and the internet in society is very broad, it is not surprising that digital marketing activities are the main choice for every company. As a result, companies compete with one another in creating interesting content to display in their online marketing.
22

Blankenship, Peter, David DeLaRosa, Marc Burris, Steven Cusson, Kayla Hendricks, Christopher Earwood, Suzanne Fields Jones, et al. "Dedicated clinical trial tissue tracking database to improve turn-around time at high-volume center." Journal of Clinical Oncology 39, no. 15_suppl (May 20, 2021): 1543. http://dx.doi.org/10.1200/jco.2021.39.15_suppl.1543.

Full text
Abstract:
1543 Background: Tissue requirements in oncology clinical trials are increasingly complex due to prescreening protocols for patient selection and serial biopsies to understand molecular-level treatment effects. Novel solutions for tissue processing are necessary for timely tissue procurement. Based on these needs, we developed a Tissue Tracker (TT), a comprehensive database for study-related tissue tasks at our high-volume clinical trial center. Methods: In this Microsoft Access database, patients are assigned an ID within the TT that is associated with their name, medical record number, and study that follows their request to external users: pathology departments, clinical trial coordinators and data team members. To complete tasks in the TT, relevant information is required to update the status. Due to the high number of archival tissue requests from unique pathology labs, the TT has a “Follow-Up Dashboard” that organizes information needed to conduct follow-up on all archival samples with the status “Requested”. This results in an autogenerated email and pdf report sent to necessary teams. The TT also includes a kit inventory system and a real-time read only version formatted for interdepartmental communication, metric reporting, and other data-driven efforts. The primary outcome in this study was to evaluate our average turnaround time (ATAT: average time from request to shipment) for archival and fresh tissue samples before and after TT development. Results: Before implementing the TT, between March 2016 and March 2018, we processed 2676 archival requests from 235 unique source labs resulting in 2040 shipments with an ATAT of 19.29 days. We also processed 1099 fresh biopsies resulting in 944 shipments with an ATAT of 7.72 days. After TT implementation, between April 2018 and April 2020, we processed 2664 archival requests from 204 unique source labs resulting in 2506 shipments (+28.0%) with an ATAT of 14.78 days (-23.4%). During that same period, we processed 1795 fresh biopsies (+63.3%) resulting in 2006 shipments (+112.5%) with an ATAT of 6.85 days (-11.3%). Conclusions: Oncology clinical trials continue to evolve toward more extensive tissue requirements for prescreening and scientific exploration of on-treatment molecular profiling. Timely results are required to optimize patient trial participation. During the intervention period, our tissue sample volume and shipments increased, but the development and implementation of an automated tracking system allowed improvement in ATAT of both archival and fresh tissue. This automation not only improves end-user expectations and experiences for patients and trial sponsors but this allows our team to adapt to the increasing interest in tissue exploration.
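The headline metric, ATAT, is the average number of days from tissue request to shipment. A minimal sketch of that computation over tracker-style records with invented dates:

```python
# Minimal sketch of the turn-around metric reported in the abstract:
# average time from tissue request to shipment (ATAT). Dates are illustrative.
from datetime import date

records = [
    {"requested": date(2021, 1, 4), "shipped": date(2021, 1, 18)},
    {"requested": date(2021, 1, 6), "shipped": date(2021, 1, 20)},
    {"requested": date(2021, 2, 1), "shipped": date(2021, 2, 17)},
]

days = [(r["shipped"] - r["requested"]).days for r in records]
atat = sum(days) / len(days)
print(f"ATAT: {atat:.2f} days")   # the study reports e.g. 14.78 days post-implementation
```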
23

McLeod, M., S. Farah, K. Macaulay, T. Sheth, M. Patel, A. Ghafour, M. Denning, et al. "Designing a continuous data-driven feedback and learning initiative to improve electronic prescribing: an interdisciplinary quality improvement study". International Journal of Pharmacy Practice 29, Supplement_1 (March 26, 2021): i18–i19. http://dx.doi.org/10.1093/ijpp/riab016.023.

Full text of the source
Abstract:
Introduction The WHO Global Patient Safety Challenge aims to reduce severe avoidable medication-related harm by 50% by 2023 [1]. Research suggests that providing timely, trusted feedback that incorporates relevant action can improve practice. However, a key barrier is the lack of prescribing error data. Hospital electronic prescribing (EP) data may help address this gap. Aims To explore approaches for continuously monitoring medication safety signals using existing or new EP data, and to deliver personalised prescribing feedback and learning to improve patient safety. Methods We conducted a feasibility study (November 2019 - February 2020) on a 28-bed adult gastroenterology ward, chosen because of its high prescribing error rate. All foundation year 1 and 2 doctors, and pharmacists on the ward, participated in the study. The study team comprised pharmacists, doctors, quality improvement experts and clinical analysts, and used a quality improvement approach to design and test (i) methods for extracting electronic data to calculate prescribing accuracy rates, (ii) ways to refine a paper prototype of an electronic pharmacists’ interventions form, (iii) potential digital medication safety indicators, and (iv) approaches to feedback for doctors to augment existing verbal feedback from pharmacists. Data were documented in accordance with local information governance and analysed using Excel. Acceptability and usability were assessed through verbal feedback from participants during weekly huddles. Outcome measures: feasibility of using EP data to determine prescribing accuracy; user acceptability and usability of data collection, feedback and learning by pharmacists and doctors. We also measured changes in prescribing accuracy rate, pharmacists’ interventions, and quality of prescribing for targeted problematic medications. Results Extracting EP data required multiple data linkages to be configured and validated, and not all required data were available. The potential digital medication safety indicators (use of the reason code ‘prescribed in error’ and actions by pharmacists to modify medications) were limited by underuse and lack of data granularity. After testing different ways to extract relevant EP data, we eventually used a combination of EP data and manual retrospective review of electronic patient records to determine prescribing accuracy rates. An intervention form was redesigned to tally interventions and capture details for contextual learning, feeding email feedback to doctors and weekly prescribing improvement huddles. Doctors reported the emails as timely and helpful for gaining new prescribing- and system-related knowledge. Pharmacists reported intervention data as providing invaluable evidence to drive improvement. Statistical process control charts showed no special cause variation around a mean prescription accuracy rate of 98% for inpatient orders and 87% for discharge orders. By contrast, pharmacists recorded a mean of 10 interventions/day, with 7 days of special cause variation (above the upper control limit of 19) in the first two months. Omission of venous thromboembolism prophylaxis was identified as a priority medication issue. Specific prescriber- and system-based improvements were suggested (Jan 2020), some implemented (Feb 2020) and others fed back to the thrombosis committee (Feb 2020). Conclusion Harnessing the potential of EP data to improve medication safety requires the workforce to have a deeper understanding of the EP data structure and processes.
Using a quality improvement approach, we developed a feedback and learning model that is acceptable and useful to pharmacists and doctors. Further research should explore adapting the approach to other clinical areas. Reference 1. Sheikh, A., Dhingra-Kumar, N., Kelley, E., Kieny, M. and Donaldson, L. The Third Global Patient Safety Challenge: Tackling Medication-Related Harm. Bulletin of the World Health Organization. World Health Organisation. 2017;95:546-546A.
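The control limit quoted for pharmacists' interventions is consistent with a standard c-chart for count data, where the limits are the mean plus or minus three times its square root: with a mean of 10 interventions/day this gives an upper limit of about 19.5, matching the reported value of 19. The abstract does not name the chart type, so the sketch below is a plausible reconstruction, with illustrative daily counts rather than study data.

```python
# c-chart limits for daily pharmacist intervention counts (assumed chart type).
import math

c_bar = 10                                     # reported mean interventions per day
ucl = c_bar + 3 * math.sqrt(c_bar)             # upper control limit: c̄ + 3√c̄
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))   # lower limit, floored at zero

print(f"UCL = {ucl:.1f}, LCL = {lcl:.1f}")     # UCL = 19.5, LCL = 0.5

# Days whose counts exceed the UCL are flagged as special cause variation.
daily_counts = [8, 12, 21, 9, 23, 11]          # illustrative data only
special_cause = [c for c in daily_counts if c > ucl]
print("Special cause days:", special_cause)    # [21, 23]
```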
24

Rajasekhar, Karanam, and Zeeshan Khan. "Structural Health Monitoring Using IOT". International Journal of Scientific Research in Engineering and Management 08, no. 07 (July 26, 2024): 1–14. http://dx.doi.org/10.55041/ijsrem36802.

Full text of the source
Abstract:
In the construction industry, maintaining structural integrity is pivotal for safety, efficiency, and economic viability. Traditional inspection methods, often sporadic and reliant on visual assessments, can overlook critical issues, especially in challenging environments where access is restricted or hazardous. The integration of IoT (Internet of Things) technology has revolutionized structural health monitoring by enabling continuous, remote data collection and analysis through sophisticated sensor networks. These networks, comprising wireless sensors strategically placed across buildings or infrastructure, monitor a range of parameters including temperature, humidity, light levels, vibration, and structural strain. This real-time data is transmitted wirelessly to central hubs or gateways, typically utilizing cost-effective solutions like Raspberry Pi devices programmed with Python for efficient data management. The collected data is then processed and stored in cloud servers, leveraging the scalability and accessibility of cloud computing to facilitate advanced signal processing and analysis. MATLAB is utilized for its robust capabilities in numerical computing and visualization, presenting the data in graphical formats that highlight trends, anomalies, and potential deterioration patterns. Crucially, this system incorporates an alert mechanism, notifying stakeholders via email of critical sensor readings or emerging issues, enabling swift responses to prevent accidents or structural failures. The adoption of IoT-enabled structural health monitoring offers multifaceted benefits to the construction industry and broader economic landscape. By continuously monitoring infrastructure health, this approach allows for early detection of defects or wear, facilitating proactive maintenance interventions that can significantly extend the service life of buildings and infrastructure. This proactive maintenance not only enhances safety and reliability but also reduces long-term costs associated with reactive repairs and unplanned downtime. Moreover, by minimizing the need for frequent physical inspections, IoT technology contributes to environmental sustainability by reducing carbon emissions associated with transportation and improving operational efficiency through data-driven decision-making. These efficiencies translate into tangible economic gains, as stakeholders can optimize resource allocation, prioritize maintenance efforts, and mitigate the financial impacts of unexpected structural failures or degradation. From a safety perspective, IoT-enabled monitoring systems enhance risk management by providing real-time insights into structural conditions. By identifying potential hazards or weaknesses early on, stakeholders can implement targeted interventions to mitigate risks and ensure compliance with stringent safety regulations. This proactive approach not only protects human lives but also safeguards investments in infrastructure by preemptively addressing issues before they escalate into costly emergencies. Furthermore, by leveraging cloud-based data storage and analytics, these systems empower stakeholders with unprecedented access to comprehensive, actionable insights.
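The alert path described above (sensor reading, Raspberry Pi gateway, email notification) is straightforward to prototype in Python, which the abstract names as the gateway language. The sketch below is a minimal illustration assuming a placeholder sensor driver, threshold, and SMTP details; none of these values come from the paper.

```python
# Minimal SHM alert loop: check a strain reading and email stakeholders
# when it crosses a critical threshold. All names and addresses are placeholders.
import smtplib
from email.message import EmailMessage

STRAIN_THRESHOLD = 1500.0  # microstrain; illustrative limit only

def read_strain_sensor() -> float:
    """Stub for a wireless strain-gauge driver; replace with real sensor code."""
    return 1620.0

def send_alert(value: float) -> None:
    msg = EmailMessage()
    msg["Subject"] = f"SHM alert: strain {value:.0f} exceeds {STRAIN_THRESHOLD:.0f}"
    msg["From"] = "gateway@example.org"
    msg["To"] = "engineer@example.org"
    msg.set_content("Critical sensor reading detected; please inspect the structure.")
    with smtplib.SMTP("smtp.example.org") as server:  # placeholder SMTP host
        server.send_message(msg)

reading = read_strain_sensor()
if reading > STRAIN_THRESHOLD:
    send_alert(reading)
```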
25

Maassen, Oliver, Sebastian Fritsch, Julia Palm, Saskia Deffge, Julian Kunze, Gernot Marx, Morris Riedel, Andreas Schuppert, and Johannes Bickenbach. "Future Medical Artificial Intelligence Application Requirements and Expectations of Physicians in German University Hospitals: Web-Based Survey". Journal of Medical Internet Research 23, no. 3 (March 5, 2021): e26646. http://dx.doi.org/10.2196/26646.

Full text of the source
Abstract:
Background The increasing development of artificial intelligence (AI) systems in medicine driven by researchers and entrepreneurs goes along with enormous expectations for medical care advancement. AI might change the clinical practice of physicians from almost all medical disciplines and in most areas of health care. While expectations for AI in medicine are high, practical implementations of AI for clinical practice are still scarce in Germany. Moreover, physicians’ requirements and expectations of AI in medicine and their opinion on the usage of anonymized patient data for clinical and biomedical research have not been investigated widely in German university hospitals. Objective This study aimed to evaluate physicians’ requirements and expectations of AI in medicine and their opinion on the secondary usage of patient data for (bio)medical research (eg, for the development of machine learning algorithms) in university hospitals in Germany. Methods A web-based survey was conducted addressing physicians of all medical disciplines in 8 German university hospitals. Answers were given using Likert scales and general demographic responses. Physicians were asked to participate locally via email in the respective hospitals. Results The online survey was completed by 303 physicians (female: 121/303, 39.9%; male: 173/303, 57.1%; no response: 9/303, 3.0%) from a wide range of medical disciplines and work experience levels. Most respondents either had a positive (130/303, 42.9%) or a very positive attitude (82/303, 27.1%) towards AI in medicine. There was a significant association between the personal rating of AI in medicine and the self-reported technical affinity level (H(4)=48.3, P<.001). A vast majority of physicians expected the future of medicine to be a mix of human and artificial intelligence (273/303, 90.1%) but also requested a scientific evaluation before the routine implementation of AI-based systems (276/303, 91.1%). Physicians were most optimistic that AI applications would identify drug interactions (280/303, 92.4%) to improve patient care substantially but were quite reserved regarding AI-supported diagnosis of psychiatric diseases (62/303, 20.5%). Of the respondents, 82.5% (250/303) agreed that there should be open access to anonymized patient databases for medical and biomedical research. Conclusions Physicians in stationary patient care in German university hospitals show a generally positive attitude towards using most AI applications in medicine. Along with this optimism come several expectations and hopes that AI will assist physicians in clinical decision making. Especially in fields of medicine where huge amounts of data are processed (eg, imaging procedures in radiology and pathology) or data are collected continuously (eg, cardiology and intensive care medicine), physicians’ expectations of AI to substantially improve future patient care are high. In the study, the greatest potential was seen in the application of AI for the identification of drug interactions, assumedly due to the rising complexity of drug administration to polymorbid, polypharmacy patients. However, for the practical usage of AI in health care, regulatory and organizational challenges still have to be mastered.
26

Randive, Sangeeta, and Jayashri Nalkar. "The Transformative Use of Artificial Intelligence in Augmenting Email Writing Skills in the Corporate Workplace". International Journal For Multidisciplinary Research 6, no. 5 (September 26, 2024). http://dx.doi.org/10.36948/ijfmr.2024.v06i05.27843.

Full text of the source
Abstract:
The emergence of Artificial Intelligence (AI) has significantly reshaped various elements of the corporate workplace, particularly in the realm of email communication. This paper delves into the revolutionary role of AI in improving the efficiency, effectiveness, and personalization of emails within professional settings. It offers a comprehensive review of contemporary AI applications, including tools like ChatGPT, automated content creation systems, and intelligent response mechanisms, demonstrating how these technologies streamline repetitive tasks, optimize content for greater engagement, and ensure communication is error-free. By integrating AI into email writing processes, companies not only alleviate cognitive burdens on employees but also foster more dynamic and meaningful interactions with clients and colleagues. Moreover, this paper addresses the ethical concerns and challenges posed by AI-driven communication, such as issues of privacy, potential biases, and the importance of transparency in automated interactions. Through a nuanced examination of AI's strengths and limitations in email correspondence, the study emphasizes AI's transformative potential in corporate communication practices, ultimately contributing to improved workplace efficiency and productivity.
The advent of Artificial Intelligence (AI) has brought profound changes to many sectors, and one of the most notable areas of transformation is the corporate workplace, particularly in email communication. As businesses increasingly rely on digital communication, the role of AI in enhancing both the quality and speed of email interactions has grown substantially. This paper examines the revolutionary impact of AI on professional email communication, focusing on how AI-driven tools are improving efficiency, effectiveness, and personalization in this domain.
AI in Professional Email Communication: AI's contribution to email communication goes beyond just automating basic tasks. Advanced AI systems, such as ChatGPT, automated content creation tools, and intelligent response mechanisms, have introduced new ways to handle the repetitive nature of emails while enhancing the quality of the communication itself.
Streamlining Repetitive Tasks: AI-powered systems can automate routine and repetitive email tasks, such as responding to common inquiries, organizing inboxes, scheduling meetings, and sending reminders. By taking over these low-value but time-consuming activities, AI frees up employees to focus on more strategic and creative work.
Optimizing Content for Engagement: AI tools are also capable of optimizing the content of emails for greater engagement. Through natural language processing (NLP) and machine learning algorithms, these tools analyze factors like tone, style, and word choice, tailoring emails to specific audiences. This ensures that emails are not only professional and polished but also resonate better with recipients, fostering stronger relationships with clients, partners, and colleagues.
Error-Free Communication: Another key advantage of AI is its ability to ensure error-free communication. By integrating grammar and style correction tools, AI systems detect and correct errors in real time, which is especially useful in corporate settings where professionalism is paramount. This reduces the risk of misunderstandings, maintains the company's credibility, and minimizes the need for human review, making email writing faster and more reliable.
Personalization at Scale: AI enables a higher degree of personalization in email communication by analyzing previous interactions, customer preferences, and behavioral data. This allows companies to craft highly personalized emails that feel tailored to each recipient, even when sent in large volumes. Personalized emails are shown to increase engagement rates, improve client satisfaction, and create a more human-like connection between businesses and their clients.
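As one concrete illustration of the drafting tools surveyed here, the snippet below sketches how an application might ask a large language model for a polished reply. It uses the OpenAI Python SDK as an example; the paper endorses no particular vendor, and the model name, prompt, and helper function are assumptions for illustration only.

```python
# Hypothetical AI-assisted email drafting helper (not from the paper).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_reply(incoming_email: str, tone: str = "professional") -> str:
    """Ask the model for a concise, error-free reply in the requested tone."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice
        messages=[
            {"role": "system",
             "content": f"You draft concise, {tone} business email replies."},
            {"role": "user", "content": incoming_email},
        ],
    )
    return response.choices[0].message.content

print(draft_reply("Hi, could you confirm the delivery date for order #4821?"))
```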
27

Johanes, Budi, and Manahan Parlindungan Saragih Siallagan. "Driving Business Growth through Data Decision Making: The Role of Marketing Automation". International Journal of Current Science Research and Review 07, no. 08 (August 5, 2024). http://dx.doi.org/10.47191/ijcsrr/v7-i8-16.

Full text of the source
Abstract:
The business landscape has transformed significantly with the abundance of information and rapid technological advances, creating numerous opportunities for data-driven decision-making to foster company growth. This research evaluates the impact of marketing automation systems on marketing communication channels using a quantitative analysis approach, with data from organizations that have implemented these systems. Marketing automation has become essential for businesses, automating repetitive tasks such as email marketing and customer segmentation. This software personalizes messages for different segments, increasing customer engagement and adoption rates, accelerating company growth, and enhancing profit margins. This research contributes new knowledge by demonstrating the positive effects of marketing automation on organizational growth rates. It shows that companies adopting automated platforms see dramatic efficiency increases. Businesses using these platforms for data-driven decision-making experience higher effectiveness rates and attract more customers, improving overall performance. The study also addresses potential challenges during implementation, such as integrating data management systems and customizing processes for organizational readiness. In examining how marketing automation supports business growth, the study offers insights relevant to both academia and practitioners. This research highlights the importance of information-based marketing strategies and the potential of marketing automation to foster sustainable business growth. The findings provide practical guidance for optimizing marketing activities to achieve a competitive edge through appropriate marketing automation systems.
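To make the two automated tasks named above concrete (customer segmentation and personalised messaging), the sketch below shows a rule-based version in Python. The segment rule, customer fields, and templates are invented for illustration and are not drawn from the study.

```python
# Toy marketing-automation step: segment customers, then personalise a template.
customers = [
    {"name": "Ana",  "purchases_last_year": 14, "email": "ana@example.com"},
    {"name": "Budi", "purchases_last_year": 1,  "email": "budi@example.com"},
]

def segment(customer: dict) -> str:
    """A simple data-driven rule: frequent buyers vs. lapsed customers."""
    return "loyal" if customer["purchases_last_year"] >= 5 else "win-back"

TEMPLATES = {
    "loyal":    "Hi {name}, thanks for being a regular - here is an early-access offer.",
    "win-back": "Hi {name}, we miss you! Here is a discount to welcome you back.",
}

for c in customers:
    body = TEMPLATES[segment(c)].format(name=c["name"])
    # A production system would hand this off to an email service provider.
    print(f"To: {c['email']}\n{body}\n")
```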
28

Tigchelaar, Magda. "‘I do the peer review by myself’". Writing and Pedagogy 12, no. 2-3 (August 15, 2021). http://dx.doi.org/10.1558/wap.20355.

Full text of the source
Abstract:
In response to increasing interest in Vygotskian sociocultural theory in second-language learning (Lantolf and Thorne, 2006; Swain, Kinnear, and Steinman, 2015) and the call for understanding language-learning processes in relation to contexts surrounding individuals (e.g., Polio and Williams, 2009; Ferris and Hedgcock, 2014), this study adopts a sociocultural approach – more specifically, an activity theory (Leont’ev, 1981) framework – to explore an undergraduate student’s approach to L2 writing in a preparatory writing course. Using a single case study design (Duff, 2014), I investigated how a student from China learned to write academic papers that met the academic norms in an English as a second language (ESL) writing class in an American university. Specifically, I analyzed how his writing activity aligned with his instructor’s proposed approach to a writing task. Through the analysis of course materials, the participant’s written work, observations, email communications, and interviews, I tracked how his agency (Bhowmik, 2016; Casanave, 2012; Lee, 2008; Saenkhum, 2016) as a writer developed over his first semester in the ESL program. Findings indicate that while the participant did not follow the operations assigned by the instructor, he acted strategically to accomplish selected parts of his writing assignments. His mediated actions were driven by his goals and motives that were understood from within his social and cultural environments, and interacted with each other in a dynamic and constructive manner. Overall, the study underscores the need for flexible approaches to writing instruction and the usefulness of employing activity theory as a framework in studying L2 writing processes.
29

Edmonds, Christopher, and Ashwini Panse. "The new world of meta finance and its yet to be tested efficiencies". Journal of Securities Operations & Custody, January 1, 2023. http://dx.doi.org/10.69554/ndza3616.

Full text of the source
Abstract:
Technology has been a long-standing catalyst for change, innovation and the emergence of new business models. As technology evolves and matures, the financial services industry revisits its current processes and capabilities to assess whether leveraging more modern technologies can drive additional client and business value. Some proposed use cases for distributed ledger technology (DLT) would disintermediate the entire financial industry. There is no doubt the broader financial industry agrees DLT presents an opportunity to shape the future vision of capital markets; it recognises the value inherent in a shared DLT platform that can build security, privacy and auditability into every financial transaction and could potentially eliminate costly reconciliation. However, DLT, like any emerging technology, must be thoroughly vetted through rigorous testing. Moreover, regulators across the globe are promoting responsible innovation and fair competition among markets and market participants, and for innovation to be responsible and competition to be fair, it must comply with regulations. Meta finance, also known as decentralised finance (‘DeFi’), which runs on decentralised infrastructure, remains immature and volatile, with several economic, technical, ethical and public policy issues still waiting to be addressed. DeFi enthusiasts claim that meta finance is doing to money what email did to postal services, with a promise to provide a secure financial platform that is open to anyone with access to a computer and an internet connection. It has the potential to transform global finance, but activity to date has focused on the community of digital asset owners. DeFi offers efficiencies driven by automation and disintermediation, powered by blockchains and smart contracts, with a vision of a more efficient payment system offering instant transactions and lower costs no matter where on the globe one is located. Its efficiencies and safeguards, however, are yet to be tested, and the broader community feels safe and secure with the belts and braces traditional finance offers today. DeFi is not devoid of risks relating to high volatility, market manipulation, fraud, illicit finance and lack of governance, which collectively could severely damage market integrity and investor confidence.
30

Naji, Gehad Mohammed Ahmed, Khairul Shafee Kalid, and K. S. Savita. "The Moderating Effect of App Trustworthiness and User Attitudes on Intention to Use/Adopt Mobile Applications Among Employees in the Oil and Gas Industry". Sage Open 14, no. 4 (October 2024). http://dx.doi.org/10.1177/21582440241286300.

Full text of the source
Abstract:
In today’s technologically advanced world, the adoption of mobile applications has become a transformative force across various industries, significantly impacting the way businesses operate and manage their processes. One area where mobile applications are making a remarkable impact is in the realm of Occupational Health and Safety (OHS). The integration of mobile applications in OHS practices has ushered in a new era of efficiency, accessibility, and data-driven decision-making, elevating workplace safety standards to unprecedented heights. Occupational health and safety are paramount concerns for organizations as they strive to create safe and healthy work environments for their employees. Historically, OHS management relied on manual and paper-based processes, which could be time-consuming, error-prone, and challenging to track comprehensively. However, with the advent of mobile technology, organizations now have the opportunity to streamline their OHS protocols, enhance communication, and proactively manage workplace risks, all through the convenience of mobile applications. This research analyzes the effects of perceived ease of use and perceived usefulness on attitude towards use of the mobile application, and of that attitude on intention to use the application. The population of this research is the employees from the oil and gas industry in Malaysia. The theory of the Technology Acceptance Model (TAM) was applied. Perceived danger, perceived utility, perceived ease of use, and perceived intention to use were the antecedents for mobile apps. Employees’ intentions to use mobile applications were shown to be significantly influenced by their satisfaction with the information system and their inventiveness. In this study, devices were categorized as large or small according to their screen size, while respondents were categorized as young (<30 years old) or old (>30 years old) according to their age. A total of 545 responses to a 32-question online survey were obtained for analysis; the survey was distributed via email to employees. The factors impacting the intention to use mobile applications were investigated using structural equation modeling. We looked at the moderating impact of the app’s trustworthiness on users’ intentions to use/adopt it. The variables’ responses for perceived ease of use, perceived usefulness, attitude towards the use of the App, trust (integrity) of the App, and intention to use/adopt the application varied significantly between the subgroups, according to the results. The study also showed the existence of moderating effects on the intention to use/adopt an app related to the trust integrity of the app. This paper discusses the survey, the findings, and the ramifications of the observations. This research contributes to the current literature by presenting empirical evidence on the importance of trustworthiness and user attitudes in mobile app adoption in the oil and gas industry. It also discusses the broader implications for improving workplace safety through the strategic use of mobile devices in OHS practices.
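A moderating effect of the kind reported here is, in regression terms, an interaction between attitude and trust. The study used structural equation modeling; as a simpler stand-in, the sketch below simulates data and fits an ordinary least squares model with an interaction term using statsmodels. All values are simulated; only the sample size of 545 comes from the abstract.

```python
# Testing a moderation effect (trust moderating attitude -> intention)
# as an interaction term in OLS regression; a simplified stand-in for SEM.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 545  # matching the reported sample size
attitude = rng.normal(0, 1, n)
trust = rng.normal(0, 1, n)
# Intention depends on attitude more strongly when trust is high.
intention = 0.4 * attitude + 0.3 * trust + 0.2 * attitude * trust + rng.normal(0, 1, n)

df = pd.DataFrame({"attitude": attitude, "trust": trust, "intention": intention})
model = smf.ols("intention ~ attitude * trust", data=df).fit()
print(model.params)  # a significant attitude:trust term indicates moderation
```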
31

Leaver, Tama. "The Social Media Contradiction: Data Mining and Digital Death". M/C Journal 16, no. 2 (March 8, 2013). http://dx.doi.org/10.5204/mcj.625.

Full text of the source
Abstract:
Introduction Many social media tools and services are free to use. This fact often leads users to the mistaken presumption that the associated data generated whilst utilising these tools and services is without value. Users often focus on the social and presumed ephemeral nature of communication – imagining something that happens but then has no further record or value, akin to a telephone call – while corporations behind these tools tend to focus on the media side, the lasting value of these traces which can be combined, mined and analysed for new insight and revenue generation. This paper seeks to explore this social media contradiction in two ways. Firstly, a cursory examination of Google and Facebook will demonstrate how data mining and analysis are core practices for these corporate giants, central to their functioning, development and expansion. Yet the public rhetoric of these companies is not about the exchange of personal information for services, but rather the more utopian notions of organising the world’s information, or bringing everyone together through sharing. The second section of this paper examines some of the core ramifications of death in terms of social media, asking what happens when a user suddenly exists only as recorded media fragments, at least in digital terms. Death, at first glance, renders users (or post-users) without agency or, implicitly, value to companies which data-mine ongoing social practices. Yet the emergence of digital legacy management highlights the value of the data generated using social media, a value which persists even after death. The question of a digital estate thus illustrates the cumulative value of social media as media, even on an individual level. The ways Facebook and Google approach digital death are examined, demonstrating policies which enshrine the agency and rights of living users, but become far less coherent posthumously. Finally, along with digital legacy management, I will examine the potential for posthumous digital legacies which may, in some macabre ways, actually reanimate some aspects of a deceased user’s presence, such as the Lives On service which touts the slogan “when your heart stops beating, you'll keep tweeting”. Cumulatively, mapping digital legacy management by large online corporations, and the affordances of more focussed services dealing with digital death, illustrates the value of data generated by social media users, and the continued importance of the data even beyond the grave. Google While Google is universally synonymous with search, and is the world’s dominant search engine, it is less widely understood that one of the core elements keeping Google’s search results relevant is a complex operation mining user data. Different tools in Google’s array of services mine data in different ways (Zimmer, “Gaze”). Gmail, for example, uses algorithms to analyse an individual’s email in order to display the most relevant related advertising. This form of data mining is comparatively well known, with most Gmail users knowingly and willingly accepting more personalised advertising in order to use Google’s email service. However, the majority of people using Google’s search engine are unaware that search, too, is increasingly driven by the tracking, analysis and refining of results on the basis of user activity (Zimmer, “Externalities”). 
As Alexander Halavais (160–180) quite rightly argues, recent focus on the idea of social search – the deeper integration of social network information in gauging search results – is oxymoronic; all search, at least for Google, is driven by deep analysis of personal and aggregated social data. Indeed, the success of Google’s mining of user data has led to concerns that often invisible processes of customisation and personalisation will mean that the supposedly independent or objective algorithms producing Google’s search results will actually yield a different result for every person. As Siva Vaidhyanathan laments: “as users in a diverse array of countries train Google’s algorithms to respond to specialized queries with localised results, each place in the world will have a different list of what is important, true, or ‘relevant’ in response to any query” (138). Personalisation and customisation are not inherently problematic, and frequently do enhance the relevance of search results, but the main objection raised by critics is not Google’s data mining, but the lack of transparency in the way data are recorded, stored and utilised. Eli Pariser, for example, laments the development of a ubiquitous “filter bubble” wherein all search results are personalised and subjective but are hidden behind the rhetoric of computer-driven algorithmic objectivity (Pariser). While data mining informs and drives many of Google’s tools and services, the cumulative value of these captured fragments of information is best demonstrated by the new service Google Now. Google Now is a mobile app which delivers an ongoing stream of search results but without the need for user input. Google Now extrapolates the rhythms of a person’s life, their interests and their routines in order to algorithmically determine what information will be needed next, and automatically displays it on a user’s mobile device. Clearly Google Now is an extremely valuable and clever tool, and the more information a user shares, the better the ongoing customised results will be, demonstrating the direct exchange value of personal data: total personalisation requires total transparency. Each individual user will need to judge whether they wish to share with Google the considerable amount of personal information needed to make Google Now work. The pressing ethical question that remains is whether Google will ensure that users are sufficiently aware of the amount of data and personal privacy they are exchanging in order to utilise such a service. Facebook Facebook began as a closed network, open only to students at American universities, but has transformed over time to a much wider and more open network, with over a billion registered users. Facebook has continually reinvented their interface, protocols and design, often altering both privacy policies and users’ experience of privacy, and often meeting significant and vocal resistance in the process (boyd). The data mining performed by social networking service Facebook is also extensive, although primarily aimed at refining the way that targeted advertising appears on the platform. In 2007 Facebook partnered with various retail loyalty services and combined these records with Facebook’s user data. This information was used to power Facebook’s Beacon service, which added details of users’ retail history to their Facebook news feed (for example, “Tama just purchased a HTC One”). 
The impact of all of these seemingly unrelated purchases turning up in many people’s feeds suddenly revealed the complex surveillance, data mining and sharing of these data that was taking place (Doyle and Fraser). However, as Beacon was turned on, without consultation, for all Facebook users, there was a sizable backlash that meant that Facebook had to initially switch the service to opt-in, and then discontinue it altogether. While Beacon has been long since erased, it is notable that in early 2013 Facebook announced that they had strengthened partnerships with data mining and profiling companies, including Datalogix, Epsilon, Acxiom, and BlueKai, which harness customer information from a range of loyalty cards, to further refine the targeting ability offered to advertisers using Facebook (Hof). Facebook’s data mining, surveillance and integration across companies is thus still going on, but no longer directly visible to Facebook users, except in terms of the targeted advertisements which appear on the service. Facebook is also a platform, providing a scaffolding and gateway to many other tools and services. In order to use social games such as Zynga’s Farmville, Facebook users agree to allow Zynga to access their profile information, and use Facebook to authenticate their identity. Zynga has been unashamedly at the forefront of user analytics and data mining, attempting to algorithmically determine the best way to make virtual goods within their games attractive enough for users to pay for them with real money. Indeed, during a conference presentation, Zynga Vice President Ken Rudin stated outright that Zynga is “an analytics company masquerading as a games company” (Rudin). I would contend that this masquerade succeeds, as few Farmville players are likely to consider how their every choice and activity is being algorithmically scrutinised in order to determine what virtual goods they might actually buy. As an instance of what is widely being called ‘big data’, the data mining operations of Facebook, Zynga and similar services lead to a range of ethical questions (boyd and Crawford). While users may have ostensibly agreed to this data mining after clicking on Facebook’s Terms of Use agreement, the fact that almost no one reads these agreements when signing up for a service is the Internet’s worst kept secret. Similarly, the extension of these terms when Facebook operates as a platform for other applications is a far from transparent process. While examining the recording of user data leads to questions of privacy and surveillance, it is important to note that many users are often aware of the exchange to which they have agreed. Anders Albrechtslund deploys the term ‘social surveillance’ to usefully emphasise the knowing, playful and at times subversive approach some users take to the surveillance and data mining practices of online service providers. Similarly, E.J. Westlake notes that performances of self online are often not only knowing but deliberately false or misleading with the aim of exploiting the ways online activities are tracked. However, even users well aware of Facebook’s data mining on the site itself may be less informed about the social networking company’s mining of offsite activity. The introduction of ‘like’ buttons on many other Websites extends Facebook’s reach considerably.
The various social plugins and ‘like’ buttons expand both active recording of user activity (where the like button is actually clicked) and passive data mining (since a cookie is installed or updated regardless of whether a button is actually pressed) (Gerlitz and Helmond). Indeed, because cookies – tiny packets of data exchanged and updated invisibly in browsers – assign each user a unique identifier, Facebook can either combine these data with an existing user’s profile or create profiles about non-users. If that person ever joins Facebook, their account is connected with the existing, data-mined record of their Web activities (Roosendaal). As with Google, the significant issue here is not users knowingly sharing their data with Facebook, but the often complete lack of transparency in terms of the ways Facebook extracts and mines user data, both on Facebook itself and increasingly across applications using Facebook as a platform and across the Web through social plugins. Google after Death While data mining is clearly a core element in the operation of Facebook and Google, the ability to scrutinise the activities of users depends on those users being active; when someone dies, the question of the value and ownership of their digital assets becomes complicated, as does the way companies manage posthumous user information. For Google, the Gmail account of a deceased person becomes inactive; the stored email still takes up space on Google’s servers, but with no one using the account, no advertising is displayed and thus Google can earn no revenue from the account. However, the process of accessing the Gmail account of a deceased relative is an incredibly laborious one. In order to even begin the process, Google asks that someone physically mail a series of documents including a photocopy of a government-issued ID, the death certificate of the deceased person, evidence of an email the requester received from the deceased, along with other personal information. After Google have received and verified this information, they state that they might proceed to a second stage where further documents are required. Moreover, if at any stage Google decide that they cannot proceed in releasing a deceased relative’s Gmail account, they will not reveal their rationale. As their support documentation states: “because of our concerns for user privacy, if we determine that we cannot provide the Gmail content, we will not be able to share further details about the account or discuss our decision” (Google, “Accessing”). Thus, Google appears to enshrine the rights and privacy of individual users, even posthumously; the ownership or transfer of individual digital assets after death is neither a given, nor enshrined in Google’s policies. Yet, ironically, the economic value of that email to Google is likely zero, but the value of the email history of a loved one or business partner may be of substantial financial and emotional value, probably more so than when that person was alive. For those left behind, the value of email accounts as media, as a lasting record of social communication, is heightened. The question of how Google manages posthumous user data has been further complicated by the company’s March 2012 rationalisation of over seventy separate privacy policies for various tools and services they operate under the umbrella of a single privacy policy accessed using a single unified Google account.
While this move was ostensibly to make privacy more understandable and transparent at Google, it had other impacts. For example, one of the side effects of a singular privacy policy and single Google identity is that deleting one of a recently deceased person’s services may inadvertently delete them all. Given that Google’s services include Gmail, YouTube and Picasa, this means that deleting an email account inadvertently erases all of the Google-hosted videos and photographs that individual posted during their lifetime. As Google warns, for example: “if you delete the Google Account to which your YouTube account is linked, you will delete both the Google Account AND your YouTube account, including all videos and account data” (Google, “What Happens”). A relative having gained access to a deceased person’s Gmail might sensibly delete the email account once the desired information is exported. However, it seems less likely that this executor would realise that in doing so all of the private and public videos that person had posted on YouTube would also permanently disappear. While material possessions can be carefully dispersed to specific individuals following the instructions in someone’s will, such affordances are not yet available for Google users. While it is entirely understandable that the ramifications of policy changes are aimed at living users, as more and more online users pass away, the question of their digital assets becomes increasingly important. Google, for example, might allow a deceased person’s executor to elect which of their Google services should be kept online (perhaps their YouTube videos), which traces can be exported (perhaps their email), and which services can be deleted. At present, the lack of fine-grained controls over a user’s digital estate at Google makes this almost impossible. While it violates Google’s policies to transfer ownership of an account to another person, if someone does leave their passwords behind, this provides their loved ones with the best options in managing their digital legacy with Google. When someone dies and their online legacy is a collection of media fragments, the value of those media is far more apparent to the loved ones left behind than to the companies housing those media. Facebook Memorialisation In response to users complaining that Facebook was suggesting they reconnect with deceased friends who had left Facebook profiles behind, in 2009 the company instituted an official policy of turning the Facebook profiles of departed users into memorial pages (Kelly). Technically, loved ones can choose between memorialisation and erasing an account altogether, but memorialisation is the default. This entails setting the account so that no one can log into it, and that no new friends (connections) can be made. Existing friends can access the page in line with the user’s final privacy settings, meaning that most friends will be able to post on the memorialised profile to remember that person in various ways (Facebook). Memorialised profiles (now Timelines, after Facebook’s redesign) thus become potential mourning spaces for existing connections. Since memorialised pages cannot make new connections, public memorial pages are increasingly popular on Facebook, frequently set up after a high-profile death, often involving young people, accidents or murder.
Recent studies suggest that both of these Facebook spaces are allowing new online forms of mourning to emerge (Marwick and Ellison; Carroll and Landry; Kern, Forman, and Gil-Egui), although public pages have the downside of potentially inappropriate commentary and outright trolling (Phillips). Given Facebook has over a billion registered users, estimates already suggest that the platform houses 30 million profiles of deceased people, and this number will, of course, continue to grow (Kaleem). For Facebook, while posthumous users do not generate data themselves, the fact that they were part of a network means that their connections may interact with a memorialised account, or memorial page, and this activity, like all Facebook activities, allows the platform to display advertising and further track user interactions. However, at present Facebook’s options – to memorialise or delete accounts of deceased people – are fairly blunt. Once Facebook is aware that a user has died, no one is allowed to edit that person’s Facebook account or Timeline, so Facebook literally offers an all (memorialisation) or nothing (deletion) option. Given that Facebook is essentially a platform for performing identities, it seems a little short-sighted that executors cannot clean up or otherwise edit the final, lasting profile of a deceased Facebook user. As social networking services and social media become more ingrained in contemporary mourning practices, it may be that Facebook will allow more fine-grained control, positioning a digital executor also as a posthumous curator, making the final decision about what does and does not get kept in the memorialisation process. Since Facebook is continually mining user activity, the popularity of mourning as an activity on Facebook will likely mean that more attention is paid to the question of digital legacies. While the user themselves can no longer be social, the social practices of mourning, and the recording of a user as a media entity highlights the fact that social media can be about interactions which in significant ways include deceased users. Digital Legacy Services While the largest online corporations have fairly blunt tools for addressing digital death, there are a number of new tools and niche services emerging in this area which are attempting to offer nuanced control over digital legacies. Legacy Locker, for example, offers to store the passwords to all of a user’s online services and accounts, from Facebook to Paypal, and to store important documents and other digital material. Users designate beneficiaries who will receive this information after the account holder passes away, and this is confirmed by preselected “verifiers” who can attest to the account holder’s death. Death Switch similarly provides the ability to store and send information to users after the account holder dies, but tests whether someone is alive by sending verification emails; fail to respond to several prompts and Death Switch will determine a user has died, or is incapacitated, and executes the user’s final instructions. Perpetu goes a step further and offers the same tools as Legacy Locker but also automates existing options from social media services, allowing users to specify, for example, that their Facebook, Twitter or Gmail data should be downloaded and this archive should be sent to a designated recipient when the Perpetu user dies. 
These tools attempt to provide a more complex array of choices in terms of managing a user’s digital legacy, providing similar choices to those currently available when addressing material possessions in a formal will. At a broader level, the growing demand for these services attests to the ongoing value of online accounts and social media traces after a user’s death. Bequeathing passwords may not strictly follow the Terms of Use of the online services in question, but it is extremely hard to track or intervene when a user has the legitimate password, even if used by someone else. More to the point, this finely-grained legacy management allows far more flexibility in the utilisation and curation of digital assets posthumously. In the process of signing up for one of these services, or digital legacy management more broadly, the ongoing value and longevity of social media traces becomes more obvious to both the user planning their estate and those who ultimately have to manage it. The Social Media Afterlife The value of social media beyond the grave is also evident in the range of services which allow users to communicate in some fashion after they have passed away. Dead Social, for example, allows users to schedule posthumous social media activity, including the posting of tweets, sending of email, Facebook messages, or the release of online photos and videos. The service relies on a trusted executor confirming someone’s death, and after that releases these final messages effectively from beyond the grave. If I Die is a similar service, which also has an integrated Facebook application which ensures a user’s final message is directly displayed on their Timeline. In a bizarre promotional campaign around a service called If I Die First, the company is promising that the first user of the service to pass away will have their posthumous message delivered to a huge online audience, via popular blogs and mainstream press coverage. While this is not likely to appeal to everyone, the notion of a popular posthumous performance of self further complicates the question of what social media can mean after death. Illustrating the value of social media legacies in a quite different but equally powerful way, the Lives On service purports to algorithmically learn how a person uses Twitter while they are alive, and then continue to tweet in their name after death. Internet critic Evgeny Morozov argues that Lives On is part of a Silicon Valley ideology of ‘solutionism’ which casts every facet of society as a problem in need of a digital solution (Morozov). In this instance, Lives On provides some semblance of a solution to the problem of death. While far from defeating death, the very fact that it might be possible to produce any meaningful approximation of a living person’s social media after they die is powerful testimony to the value of data mining and the importance of recognising that value. While Lives On is an experimental service in its infancy, it is worth wondering what sort of posthumous approximation might be built using the robust data profiles held by Facebook or Google. If Google Now can extrapolate what a user wants to see without any additional input, how hard would it be to retool this service to post what a user would have wanted after their death? Could there, in effect, be a Google After(life)?
Conclusion Users of social media services have differing levels of awareness regarding the exchange they are agreeing to when signing up for services provided by Google or Facebook, and often value the social affordances without necessarily considering the ongoing media they are creating. Online corporations, by contrast, recognise and harness the informatic traces users generate through complex data mining and analysis. However, the death of a social media user provides a moment of rupture which highlights the significant value of the media traces a user leaves behind. More to the point, the value of these media becomes most evident to those left behind precisely because that individual can no longer be social. While beginning to address the issue of posthumous user data, Google and Facebook both have very blunt tools; Google might offer executors access while Facebook provides the option of locking a deceased user’s account as a memorial or removing it altogether. Neither of these responses do justice to the value that these media traces hold for the living, but emerging digital legacy management tools are increasingly providing a richer set of options for digital executors. While the differences between material and digital assets provoke an array of legal, spiritual and moral issues, digital traces nevertheless clearly hold significant and demonstrable value. For social media users, the death of someone they know is often the moment where the media side of social media – their lasting, infinitely replicable nature – becomes more important, more visible, and casts the value of the social media accounts of the living in a new light. For the larger online corporations and service providers, the inevitable increase in deceased users will likely provoke more fine-grained controls and responses to the question of digital legacies and posthumous profiles. It is likely, too, that the increase in online social practices of mourning will open new spaces and arenas for those same corporate giants to analyse and data-mine. References Albrechtslund, Anders. “Online Social Networking as Participatory Surveillance.” First Monday 13.3 (2008). 21 Apr. 2013 ‹http://firstmonday.org/article/view/2142/1949›. boyd, danah. “Facebook’s Privacy Trainwreck: Exposure, Invasion, and Social Convergence.” Convergence 14.1 (2008): 13–20. ———, and Kate Crawford. “Critical Questions for Big Data.” Information, Communication & Society 15.5 (2012): 662–679. Carroll, Brian, and Katie Landry. “Logging On and Letting Out: Using Online Social Networks to Grieve and to Mourn.” Bulletin of Science, Technology & Society 30.5 (2010): 341–349. Doyle, Warwick, and Matthew Fraser. “Facebook, Surveillance and Power.” Facebook and Philosophy: What’s on Your Mind? Ed. D.E. Wittkower. Chicago, IL: Open Court, 2010. 215–230. Facebook. “Deactivating, Deleting & Memorializing Accounts.” Facebook Help Center. 2013. 7 Mar. 2013 ‹http://www.facebook.com/help/359046244166395/›. Gerlitz, Carolin, and Anne Helmond. “The Like Economy: Social Buttons and the Data-intensive Web.” New Media & Society (2013). Google. “Accessing a Deceased Person’s Mail.” 25 Jan. 2013. 21 Apr. 2013 ‹https://support.google.com/mail/answer/14300?hl=en›. ———. “What Happens to YouTube If I Delete My Google Account or Google+?” 8 Jan. 2013. 21 Apr. 2013 ‹http://support.google.com/youtube/bin/answer.py?hl=en&answer=69961&rd=1›. Halavais, Alexander. Search Engine Society. Polity, 2008. Hof, Robert. 
“Facebook Makes It Easier to Target Ads Based on Your Shopping History.” Forbes 27 Feb. 2013. 1 Mar. 2013 ‹http://www.forbes.com/sites/roberthof/2013/02/27/facebook-makes-it-easier-to-target-ads-based-on-your-shopping-history/›. Kaleem, Jaweed. “Death on Facebook Now Common as ‘Dead Profiles’ Create Vast Virtual Cemetery.” Huffington Post. 7 Dec. 2012. 7 Mar. 2013 ‹http://www.huffingtonpost.com/2012/12/07/death-facebook-dead-profiles_n_2245397.html›. Kelly, Max. “Memories of Friends Departed Endure on Facebook.” The Facebook Blog. 27 Oct. 2009. 7 Mar. 2013 ‹http://www.facebook.com/blog/blog.php?post=163091042130›. Kern, Rebecca, Abbe E. Forman, and Gisela Gil-Egui. “R.I.P.: Remain in Perpetuity. Facebook Memorial Pages.” Telematics and Informatics 30.1 (2012): 2–10. Marwick, Alice, and Nicole B. Ellison. “‘There Isn’t Wifi in Heaven!’ Negotiating Visibility on Facebook Memorial Pages.” Journal of Broadcasting & Electronic Media 56.3 (2012): 378–400. Morozov, Evgeny. “The Perils of Perfection.” The New York Times 2 Mar. 2013. 4 Mar. 2013 ‹http://www.nytimes.com/2013/03/03/opinion/sunday/the-perils-of-perfection.html?pagewanted=all&_r=0›. Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You. London: Viking, 2011. Phillips, Whitney. “LOLing at Tragedy: Facebook Trolls, Memorial Pages and Resistance to Grief Online.” First Monday 16.12 (2011). 21 Apr. 2013 ‹http://firstmonday.org/ojs/index.php/fm/article/view/3168›. Roosendaal, Arnold. “We Are All Connected to Facebook … by Facebook!” European Data Protection: In Good Health? Ed. Serge Gutwirth et al. Dordrecht: Springer, 2012. 3–19. Rudin, Ken. “Actionable Analytics at Zynga: Leveraging Big Data to Make Online Games More Fun and Social.” San Diego, CA, 2010. Vaidhyanathan, Siva. The Googlization of Everything. 1st ed. Berkeley: University of California Press, 2011. Westlake, E.J. “Friend Me If You Facebook: Generation Y and Performative Surveillance.” TDR: The Drama Review 52.4 (2008): 21–40. Zimmer, Michael. “The Externalities of Search 2.0: The Emerging Privacy Threats When the Drive for the Perfect Search Engine Meets Web 2.0.” First Monday 13.3 (2008). 21 Apr. 2013 ‹http://firstmonday.org/ojs/index.php/fm/article/view/2136/1944›. ———. “The Gaze of the Perfect Search Engine: Google as an Infrastructure of Dataveillance.” Web Search. Eds. Amanda Spink & Michael Zimmer. Berlin: Springer, 2008. 77–99.
32

Sully, Nicole, Timothy O'Rourke, and Andrew Wilson. "Design". M/C Journal 24, no. 4 (August 13, 2021). http://dx.doi.org/10.5204/mcj.2848.

Full text of the source
Abstract:
Conventional definitions of design rarely capture its reach into our everyday lives. The Design Council, for example, estimates that more than 2.5 million people use design-related skills, principles, and practices on a daily basis in UK workplaces (Design Council 5, 8). Further, they calculate that these workers contribute £209 billion to the economy annually (8). The terrain of design professions extends from the graphic design of online environments, the business models that make them economically viable, and the algorithms that enable them to function, through to the devices we use, the clothes we wear, the furniture we sit on and the spaces where we live and work. Yet paradoxically a search of online dictionaries reiterates the connection of design primarily to drawing and making plans for buildings. As we witness the adoption of practices of “design thinking” in non-traditional design disciplines, it is interesting to note that the Italian renaissance term disegno referred to both drawing and aspects of thinking. Giorgio Vasari claimed that design was “the animating principle of all creative processes” (Sorabello). Buckminster Fuller was just as florid and even more expansive when he argued that “the opposite of design is chaos” (Papanek 2). The Oxford English Dictionary captures a broad sense of design as “a plan or scheme conceived in the mind and intended for subsequent execution” (OED Online). This issue of M/C Journal offered contributors the opportunity to consider “design” in its broadest sense. The articles in this issue cast a wide net over design in both practice and theory, and emerge from varied disciplinary bases including material culture, graphic design, media studies, and architecture. The authors critique diverse design practices and pedagogy as well as the social reach of design and its political potential. Design Canons and the Economy While design histories begin with the earliest accounts of toolmaking (Margolin), the Industrial Revolution reinforced the more abstract intellectual dimensions of the discipline. Changing methods of production distinguished making from thinking and led to the emergence of the design profession (Spark). During the twentieth century, New York’s Museum of Modern Art (MoMA) was not only instrumental in raising awareness of design; its exhibitions and acquisitions also endorsed and canonised the styles and figures associated with “good design”. From its promotion of modern architecture in 1932, to the Good Design exhibition in 1950, MoMA’s advocacy reinforced a selective (and exclusive) canon for modernist design, educating the design-conscious consumer and reaching out to public taste, even advising of local stockists. Design became a means to mediate the art of the past with contemporary furnishings, which they accurately predicted would become the art of the future (MoMA 1). Drawing from this context, in this issue, Curt Lund’s essay interrogates a porcelain toy tea set from 1968 through the lens of material culture analysis to confirm its role in mediating relationships, transmitting values, and embodying social practices, tastes, and beliefs. MoMA was not unique in recognising both the artistic merit and commercial potential of design. Few design activities are separable from the market economy, and the language of design has infiltrated business more broadly. Design processes are used in seemingly novel ways across businesses and governments seeking to improve their digital and real-world services.
A 2018 report by the UK’s Design Council recognised the expanding reach of design and the competitive advantage of design-based economies: the skills, principles and practices of design are now widely used from banking to retail. Designers, too, have always drawn on a range of different skills, tools and technologies to deliver new ideas, goods and services. This is what makes design unique, and is how it makes products, services and systems more useful, usable and desirable in advanced economies around the world. (Design Council 5) Underpinned by design, the global gaming market, for example, is an expanding multi-billion-dollar industry. In this issue, Heather Blakey explores connections made in the digital world. Her article asks how the design of interactions between characters in the game world can align the player experience with the designer’s objectives. The reality of working within the design economy is also addressed by Yaron Meron, whose article in this issue examines the frequent absence of the brief in the graphic design context. Meron highlights problems that arise due to a recurring failure to define the scope of the brief and its significance to a formal collaborative framework between designers and their clientele. Recognition of the design economy’s value has translated to the educational sector. With the expansion of design practice beyond its traditional twentieth-century silos, higher education managers are seeking to harness design as a source of innovation, driven by the perceptions of value in disparate industries. This desire for interdisciplinarity raises significant questions about pedagogy and the future of the design studio, which has anchored design tuition since the late nineteenth century. Mark Sawyer and Philip Goldswain’s article proposes the employment of concepts from open design literature, “meta-design”, and design “frames” to inform a toolkit to enable shared meaning in an architectural studio setting.

Design for a Better World

The American industrial designer George Nelson described design as “an attempt to make a contribution through change” (Packard 69), echoing the perception that progress represents improvement in a teleological sense. Many designers have long pursued social agendas and explored solutions to inequities. What loosely unites the disparate design disciplines is a shared sense that design improves the world we live in. But even with the best intentions, design does not inevitably lead to a better world. Accounts of design frequently recognise its shortcomings. These might include narratives that document or delight in famous design failures, such as the complex circumstances that led to the famed demolition of the Pruitt-Igoe housing complex in the 1970s (Bristol). We also regularly encounter design flaws in the digital environment—whether an encryption algorithm open to compromise or online forms that do not recognise apostrophes or umlauts. Although on one level this leads to frustration, it also leads to other types of exclusion. Lisa Hackett’s article on 1950s-style fashion shows how the failure of the fashion industry to accommodate varying body shapes has led some women to seek solutions in vintage-style fashion choices. Hackett’s article brings to mind the serious concerns that arise when the standards that define the normative fail to account for large parts of the population. Overlooking gender and race can have a cumulative and significant impact on the everyday lives of women and minorities (Criado-Perez).
The global pandemic has emphasised the dangers arising from PPE ill-designed for racial and gender diversity (Porterfield). The idea that design was a means of progressive improvement began to be prominently debunked in the 1960s with the discussion of the design life of machines, objects, and buildings. Planned obsolescence—or designed obsolescence, as it was also known—came to attention in the early 1960s when Vance Packard’s The Waste Makers called into question the ethics of post-war consumerism. Packard’s work drew attention to the ethical responsibilities of designers by revealing their complicity in the phenomenon of planned obsolescence. Packard’s critique linked the problem of “growthmanship” with issues of saturation and disposal (Packard 5). Large digital libraries replace physical objects but introduce new types of clutter. The anxieties produced by alerts that one’s device is “out of memory” may be easily dismissed as “first world problems”, but the carbon footprint of digital communication and storage is a global concern (Tsukayama; Chan). Digital clutter is explored in Ananya’s article “Minimalist Design in the Age of Archive Fever” in this issue. Ananya contrasts minimalist aesthetics, and Marie Kondo-style decluttering, with our burgeoning prosthetic memory and its attendant digital footprint. In the late 1960s, Victor Papanek considered the ethics of design choices, and, in particular, the nexus between design and consumerism, acknowledging Thorstein Veblen’s coruscating critique of conspicuous consumption. But Papanek also drew attention to contemporary environmental crises. He railed against industrial designers, architects, and planners, attributing blame for profligate consumerism and environmental degradation, and arguing that “in all pollution, designers are implicated, at least partially” (14).

Inclusive Design

Papanek’s influential advocacy acknowledged the political dimensions of design and the inherent biases of the time. In response to his teaching, Danish student Susanne Koefoed designed the now ubiquitous International Symbol of Access (ISA), which Guffey suggests is the most widely exported work of Scandinavian design (358). In this issue’s feature article, Sam Holleran explores the connection between visual literacy and civic life, and the design of an international symbol language, which aimed to ameliorate social disadvantage and cultural barriers. Discussions of inclusive design acknowledge that design history is most often Eurocentric, and frequently exclusionary of diversity. Articles in this issue examine more inclusive approaches to design. These efforts to make design more inclusive extend beyond the object or product to examining techniques and processes that might improve society. Poiner and Drake, for example, explore the potential and challenges of participatory approaches in the design of buildings for a remote Indigenous community. Fredericks and Bradfield, in this issue, argue that Indigenous memes can provoke audiences and demand recognition of First Nations peoples. The meme offers a more inclusive critique of a national government’s intransigence to constitutional change that recognises Indigenous sovereignty and self-determination. Further, they advocate for co-design of policy that will enshrine an Indigenous Voice to the Australian Parliament. There are many reasons to be grateful for design and optimistic about its future: the swift design and production of efficacious vaccines come to mind.
But as Papanek recognised 50 years ago, designers, most often handmaidens of capital, are still implicated in the problems of the Anthropocene. How can design be used to repair the legacies of a century of profligacy, pollution, and climate change? Design needs its advocates, but the preaching and practice of design are best tempered with continuous forms of critique, analysis, and evaluation.

Acknowledgements

The editors thank the scholars who submitted work for this issue and the blind referees for their thoughtful and generous responses to the articles.

References

Bristol, Katharine G. “The Pruitt-Igoe Myth.” Journal of Architectural Education 44.3 (1991): 163–171.
Chan, Delle. “Your Website Is Killing the Planet.” Wired, 22 Mar. 2021. <https://www.wired.co.uk/article/internet-carbon-footprint>.
Criado-Perez, Caroline. Invisible Women: Exposing Data Bias in a World Designed for Men. Chatto & Windus, 2019.
“design, n.” OED Online. Oxford University Press, June 2021. <http://www.oed.com/view/Entry/50840>. Accessed 8 Aug. 2021.
Design Council UK. Designing a Future Economy: Developing Design Skills for Productivity and Innovation. Feb. 2018. <https://www.designcouncil.org.uk/sites/default/files/asset/document/Designing_a_future_economy18.pdf>.
Guffey, Elizabeth. “The Scandinavian Roots of the International Symbol of Access.” Design and Culture 7.3 (2015): 357–76. DOI: 10.1080/17547075.2015.1105527.
Margolin, Victor. World History of Design. Vol. 1. Bloomsbury, 2017.
Museum of Modern Art. “First Showing of Good Design Exhibition in New York.” Press Release. 16 Nov. 1950. <https://assets.moma.org/documents/moma_press-release_325754.pdf>.
Packard, Vance. The Waste Makers. David McKay, 1960.
Papanek, Victor J. Design for the Real World: Human Ecology and Social Change. Bantam Books, 1973.
Porterfield, Carlie. “A Lot of PPE Doesn’t Fit Women—and in the Coronavirus Pandemic, It Puts Them in Danger.” Forbes, 29 Apr. 2020. <https://www.forbes.com/sites/carlieporterfield/2020/04/29/a-lot-of-ppe-doesnt-fit-women-and-in-the-coronavirus-pandemic-it-puts-them-in-danger/>.
Sorabella, Jean. “Venetian Color and Florentine Design.” Heilbrunn Timeline of Art History. The Metropolitan Museum of Art, 2000–. Oct. 2002. <http://www.metmuseum.org/toah/hd/vefl/hd_vefl.htm>.
Sparke, Penny. “Design.” Grove Art Online, 2003. <https://doi.org/10.1093/gao/9781884446054.article.T022395>.
Tsukayama, Hayley. “How Bad Is Email for the Environment?” Washington Post, 25 Jan. 2017. <https://www.washingtonpost.com/news/the-switch/wp/2017/01/25/how-bad-is-email-for-the-environment/>.
33

Dieter, Michael. "Amazon Noir". M/C Journal 10, no. 5 (1 October 2007). http://dx.doi.org/10.5204/mcj.2709.

Full text of the source
Abstract:
There is no diagram that does not also include, besides the points it connects up, certain relatively free or unbounded points, points of creativity, change and resistance, and it is perhaps with these that we ought to begin in order to understand the whole picture. (Deleuze, “Foucault” 37)

Monty Cantsin: Why do we use a pervert software robot to exploit our collective consensual mind?
Letitia: Because we want the thief to be a digital entity.
Monty Cantsin: But isn’t this really blasphemic?
Letitia: Yes, but god – in our case a meta-cocktail of authorship and copyright – can not be trusted anymore. (Amazon Noir, “Dialogue”)

In 2006, some 3,000 digital copies of books were silently “stolen” from online retailer Amazon.com by targeting vulnerabilities in the “Search inside the Book” feature on the company’s website. Over several weeks, between July and October, a specially designed software program bombarded the Search Inside!™ interface with multiple requests, assembling full versions of texts and distributing them across peer-to-peer (P2P) networks. Rather than a purely malicious and anonymous hack, however, the “heist” was publicised as a tactical media performance, Amazon Noir, produced by self-proclaimed super-villains Paolo Cirio, Alessandro Ludovico, and Ubermorgen.com. While controversially directed at highlighting the infrastructures that materially enforce property rights and access to knowledge online, the exploit additionally interrogated its own interventionist status as theoretically and politically ambiguous. That the “thief” was represented as a digital entity or machinic process (operating on the very terrain where exchange is differentiated) and the emergent act of “piracy” was fictionalised through the genre of noir conveys something of the indeterminacy or immensurability of the event. In this short article, I discuss some political aspects of intellectual property in relation to the complexities of Amazon Noir, particularly in the context of control, technological action, and discourses of freedom.

Software, Piracy

As a force of distribution, the Internet is continually subject to controversies concerning flows and permutations of agency. While often directed by discourses cast in terms of either radical autonomy or control, the technical constitution of these digital systems is more regularly a case of establishing structures of operation, codified rules, or conditions of possibility; that is, of guiding social processes and relations (McKenzie, “Cutting Code” 1-19). Software, as a medium through which such communication unfolds and becomes organised, is difficult to conceptualise as a result of being so event-orientated. There lies a complicated logic of contingency and calculation at its centre, a dimension exacerbated by the global scale of informational networks, where the inability to comprehend an environment that exceeds the limits of individual experience is frequently expressed through desires, anxieties, paranoia. Unsurprisingly, cautionary accounts and moral panics on identity theft, email fraud, pornography, surveillance, hackers, and computer viruses are as commonplace as those narratives advocating user interactivity.
When analysing digital systems, cultural theory often struggles to describe forces that dictate movement and relations between disparate entities composed by code, an aspect heightened by the intensive movement of informational networks where differences are worked out through the constant exposure to unpredictability and chance (Terranova, “Communication beyond Meaning”). Such volatility partially explains the recent turn to distribution in media theory, as once durable networks for constructing economic difference – organising information in space and time (“at a distance”), accelerating or delaying its delivery – appear contingent, unstable, or consistently irregular (Cubitt 194). Attributing actions to users, programmers, or the software itself is a difficult task when faced with these states of co-emergence, especially in the context of sharing knowledge and distributing media content. Exchanges between corporate entities, mainstream media, popular cultural producers, and legal institutions over P2P networks represent an ongoing controversy in this respect, with numerous stakeholders competing between investments in property, innovation, piracy, and publics. Beginning to understand this problematic landscape is an urgent task, especially in relation to the technological dynamics that organise and propel such antagonisms. In the influential fragment, “Postscript on the Societies of Control,” Gilles Deleuze describes the historical passage from modern forms of organised enclosure (the prison, clinic, factory) to the contemporary arrangement of relational apparatuses and open systems as being materially provoked by – but not limited to – the mass deployment of networked digital technologies. In his analysis, the disciplinary mode most famously described by Foucault is spatially extended to informational systems based on code and flexibility. According to Deleuze, these cybernetic machines are connected into apparatuses that aim for intrusive monitoring: “in a control-based system nothing’s left alone for long” (“Control and Becoming” 175). Such a constant networking of behaviour is described as a shift from “molds” to “modulation,” where controls become “a self-transmuting molding changing from one moment to the next, or like a sieve whose mesh varies from one point to another” (“Postscript” 179). Accordingly, the crisis underpinning civil institutions is consistent with the generalisation of disciplinary logics across social space, forming an intensive modulation of everyday life, but one ambiguously associated with socio-technical ensembles. The precise dynamics of this epistemic shift are significant in terms of political agency: while control implies an arrangement capable of absorbing massive contingency, a series of complex instabilities actually mark its operation. Noise, viral contamination, and piracy are identified as key points of discontinuity; they appear as divisions or “errors” that force change by promoting indeterminacies in a system that would otherwise appear infinitely calculable, programmable, and predictable. The rendering of piracy as a tactic of resistance, a technique capable of levelling out the uneven economic field of global capitalism, has become a predictable catch-cry for political activists.
In their analysis of multitude, for instance, Antonio Negri and Michael Hardt describe the contradictions of post-Fordist production as conjuring forth a tendency for labour to “become common.” That is, the more productivity depends on flexibility, communication, and cognitive skills, directed by the cultivation of an ideal entrepreneurial or flexible subject, the greater the possibilities for self-organised forms of living that significantly challenge its operation. In this case, intellectual property exemplifies such a spiralling paradoxical logic, since “the infinite reproducibility central to these immaterial forms of property directly undermines any such construction of scarcity” (Hardt and Negri 180). The implications of the filesharing program Napster, accordingly, are read as not merely directed toward theft, but in relation to the private character of the property itself; a kind of social piracy is perpetuated that is viewed as radically recomposing social resources and relations. Ravi Sundaram, a co-founder of the Sarai new media initiative in Delhi, has meanwhile drawn attention to the existence of “pirate modernities” capable of being actualised when individuals or local groups gain illegitimate access to distributive media technologies; these are worlds of “innovation and non-legality,” of electronic survival strategies that partake in cultures of dispersal and escape simple classification (94). Meanwhile, pirate entrepreneurs Magnus Eriksson and Rasmus Fleische – associated with the notorious Piratbyrån – have promoted the bleeding away of Hollywood profits through fully deployed P2P networks, with the intention of pushing filesharing dynamics to an extreme in order to radicalise the potential for social change (“Copies and Context”). From an aesthetic perspective, such activist theories are complemented by the affective register of appropriation art, a movement broadly conceived in terms of antagonistically liberating knowledge from the confines of intellectual property: “those who pirate and hijack owned material, attempting to free information, art, film, and music – the rhetoric of our cultural life – from what they see as the prison of private ownership” (Harold 114). These “unruly” escape attempts are pursued through various modes of engagement, from experimental performances with legislative infrastructures (i.e. Kembrew McLeod’s trademarking of the phrase “freedom of expression”) to musical remix projects, such as the work of Negativland, John Oswald, RTMark, Detritus, Illegal Art, and the Evolution Control Committee. Amazon Noir, while similarly engaging with questions of ownership, is distinguished by specifically targeting information communication systems and finding “niches” or gaps between overlapping networks of control and economic governance. Hans Bernhard and Lizvlx from Ubermorgen.com (meaning ‘Day after Tomorrow,’ or ‘Super-Tomorrow’) actually describe their work as “research-based”: “we not are opportunistic, money-driven or success-driven, our central motivation is to gain as much information as possible as fast as possible as chaotic as possible and to redistribute this information via digital channels” (“Interview with Ubermorgen”). This has led to experiments like Google Will Eat Itself (2005) and the construction of the automated software thief against Amazon.com, as process-based explorations of technological action.
Agency, Distribution

Deleuze’s “postscript” on control has proven massively influential for new media art by introducing a series of key questions on power (or desire) and digital networks. As a social diagram, however, control should be understood as a partial rather than totalising map of relations, referring to the augmentation of disciplinary power in specific technological settings. While control is a conceptual regime that refers to open-ended terrains beyond the architectural locales of enclosure, implying a move toward informational networks, data solicitation, and cybernetic feedback, there remains a peculiar contingent dimension to its limits. For example, software code is typically designed to remain cycling until user input is provided. There is a specifically immanent and localised quality to its actions that might be taken as exemplary of control as a continuously modulating affective materialism. The outcome is a heightened sense of bounded emergencies that are either flattened out or absorbed through reconstitution; however, these are never linear gestures of containment. As Tiziana Terranova observes, control operates through multilayered mechanisms of order and organisation: “messy local assemblages and compositions, subjective and machinic, characterised by different types of psychic investments, that cannot be the subject of normative, pre-made political judgments, but which need to be thought anew again and again, each time, in specific dynamic compositions” (“Of Sense and Sensibility” 34). This event-orientated vitality accounts for the political ambitions of tactical media as opening out communication channels through selective “transversal” targeting. Amazon Noir, for that reason, is pitched specifically against the material processes of communication. The system used to harvest the content from “Search inside the Book” is described as “robot-perversion-technology,” based on a network of four servers around the globe, each with a specific function: one located in the United States that retrieved (or “sucked”) the books from the site, one in Russia that injected the assembled documents onto P2P networks, and two in Europe that coordinated the action via intelligent automated programs (see “The Diagram”). According to the “villains,” the main goal was to steal all 150,000 books from Search Inside!™ and then use the same technology to steal books from the “Google Print Service” (the exploit was limited only by the amount of technological resources financially available, but there are apparent plans to improve the technique by reinvesting the money received through the settlement with Amazon.com in exchange for not publicising the hack). In terms of informational culture, this system resembles a machinic process directed at redistributing copyright content; “The Diagram” visualises key processes that define digital piracy as an emergent phenomenon within an open-ended and responsive milieu. That is, the static image foregrounds something of the activity of copying being a technological action that complicates any analysis focusing purely on copyright as content. In this respect, intellectual property rights are revealed as being entangled within information architectures as communication management and cultural recombination – dissipated and enforced by a measured interplay between openness and obstruction, resonance and emergence (Terranova, “Communication beyond Meaning” 52).
To understand data distribution requires an acknowledgement of these underlying nonhuman relations that allow for such informational exchanges. It requires an understanding of the permutations of agency carried along by digital entities. According to Lawrence Lessig’s influential argument, code is not merely an object of governance, but has an overt legislative function itself. Within the informational environments of software, “a law is defined, not through a statute, but through the code that governs the space” (20). These points of symmetry are understood as concretised social values: they are material standards that regulate flow. Similarly, Alexander Galloway describes computer protocols as non-institutional “etiquette for autonomous agents,” or “conventional rules that govern the set of possible behavior patterns within a heterogeneous system” (7). In his analysis, these agreed-upon standardised actions operate as a style of management fostered by contradiction: progressive though reactionary, encouraging diversity by striving for the universal, synonymous with possibility but completely predetermined, and so on (243-244). Needless to say, political uncertainties arise from a paradigm that generates internal material obscurities through a constant twinning of freedom and control. For Wendy Hui Kyong Chun, these Cold War systems subvert the possibilities for any actual experience of autonomy by generalising paranoia through constant intrusion and reducing social problems to questions of technological optimisation (1-30). In confrontation with these seemingly ubiquitous regulatory structures, cultural theory requires a critical vocabulary differentiated from computer engineering to account for the sociality that permeates through and concatenates technological realities. In his recent work on “mundane” devices, software and code, Adrian McKenzie introduces a relevant analytic approach in the concept of technological action as something that both abstracts and concretises relations in a diffusion of collective-individual forces. Drawing on the thought of French philosopher Gilbert Simondon, he uses the term “transduction” to identify a key characteristic of technology in the relational process of becoming, or ontogenesis. This is described as bringing together disparate things into composites of relations that evolve and propagate a structure throughout a domain, or “overflow existing modalities of perception and movement on many scales” (“Impersonal and Personal Forces in Technological Action” 201). Most importantly, these innovative diffusions or contagions occur by bridging states of difference or incompatibilities. Technological action, therefore, arises from a particular type of disjunctive relation between an entity and something external to itself: “in making this relation, technical action changes not only the ensemble, but also the form of life of its agent. Abstraction comes into being and begins to subsume or reconfigure existing relations between the inside and outside” (203). Here, reciprocal interactions between two states or dimensions actualise disparate potentials through metastability: an equilibrium that proliferates, unfolds, and drives individuation. While drawing on cybernetics and dealing with specific technological platforms, McKenzie’s work can be extended to describe the significance of informational devices throughout control societies as a whole, particularly as a predictive and future-orientated force that thrives on staged conflicts.
Moreover, being a non-deterministic technical theory, it additionally speaks to new tendencies in regimes of production that harness cognition and cooperation through specially designed infrastructures to enact persistent innovation without any end-point, final goal or natural target (Thrift 283-295). Here, the interface between intellectual property and reproduction can be seen as a site of variation that weaves together disparate objects and entities by imbrication in social life itself. These are specific acts of interference that propel relations toward unforeseen conclusions by drawing on memories, attention spans, material-technical traits, and so on. The focus lies on performance, context, and design “as a continual process of tuning arrived at by distributed aspiration” (Thrift 295). This latter point is demonstrated in recent scholarly treatments of filesharing networks as media ecologies. Kate Crawford, for instance, describes the movement of P2P as processual or adaptive, comparable to technological action, marked by key transitions from partially decentralised architectures such as Napster to the fully distributed systems of Gnutella and seeded swarm-based networks like BitTorrent (30-39). Each of these technologies can be understood as a response to various legal incursions, producing radically dissimilar socio-technological dynamics and emergent trends for how agency is modulated by informational exchanges. Indeed, even these aberrant formations are characterised by modes of commodification that continually spill over and feed back on themselves, repositioning markets and commodities in doing so, from MP3s to iPods, P2P to broadband subscription rates. However, one key limitation of this ontological approach is apparent when dealing with the sheer scale of activity involved, where mass participation elicits certain degrees of obscurity and relative safety in numbers. This represents an obvious problem for analysis, as dynamics can easily be identified in the broadest conceptual sense, without any understanding of the specific contexts of usage, political impacts, and economic effects for participants in their everyday consumptive habits. Large-scale distributed ensembles are “problematic” in their technological constitution, as a result. They are sites of expansive overflow that provoke an equivalent individuation of thought, as the Recording Industry Association of America observes on its educational website: “because of the nature of the theft, the damage is not always easy to calculate but not hard to envision” (“Piracy”). The politics of the filesharing debate, in this sense, depends on the command of imaginaries; that is, being able to conceptualise an overarching structural consistency to a persistent and adaptive ecology. As a mode of tactical intervention, Amazon Noir dramatises these ambiguities by framing technological action through the fictional sensibilities of narrative genre.

Ambiguity, Control

The extensive use of imagery and iconography from “noir” can be understood as an explicit reference to the increasing criminalisation of copyright violation through digital technologies. However, the term also refers to the indistinct or uncertain effects produced by this tactical intervention: who are the “bad guys” or the “good guys”? Are positions like ‘good’ and ‘evil’ (something like freedom or tyranny) so easily identified and distinguished?
As Paolo Cirio explains, this political disposition is deliberately kept obscure in the project: “it’s a representation of the actual ambiguity about copyright issues, where every case seems to lack a moral or ethical basis” (“Amazon Noir Interview”). While user communications made available on the site clearly identify culprits (describing the project as jeopardising arts funding, as both irresponsible and arrogant), the self-description of the artists as political “failures” highlights the uncertainty regarding the project’s qualities as a force of long-term social renewal: Lizvlx from Ubermorgen.com had daily shootouts with the global mass-media, Cirio continuously pushed the boundaries of copyright (books are just pixels on a screen or just ink on paper), Ludovico and Bernhard resisted kickback-bribes from powerful Amazon.com until they finally gave in and sold the technology for an undisclosed sum to Amazon. Betrayal, blasphemy and pessimism finally split the gang of bad guys. (“Press Release”) Here, the adaptive and flexible qualities of informatic commodities and computational systems of distribution are knowingly posited as critical limits; in a certain sense, the project fails technologically in order to succeed conceptually. From a cynical perspective, this might be interpreted as guaranteeing authenticity by insisting on the useless or non-instrumental quality of art. However, through this process, Amazon Noir illustrates how forces confined as exterior to control (virality, piracy, noncommunication) regularly operate as points of distinction to generate change and innovation. Just as hackers are legitimately employed to challenge the durability of network exchanges, malfunctions are relied upon as potential sources of future information. Indeed, the notion of demonstrating ‘autonomy’ by illustrating the shortcomings of software is entirely consistent with the logic of control as a modulating organisational diagram. These so-called “circuit breakers” are positioned as points of bifurcation that open up new systems and encompass a more general “abstract machine” or tendency governing contemporary capitalism (Parikka 300). As a consequence, the ambiguities of Amazon Noir emerge not just from the contrary articulation of intellectual property and digital technology, but additionally through the concept of thinking “resistance” simultaneously with regimes of control. This tension is apparent in Galloway’s analysis of the cybernetic machines that are synonymous with the operation of Deleuzian control societies – i.e. “computerised information management” – where tactical media are posited as potential modes of contestation against the tyranny of code, “able to exploit flaws in protocological and proprietary command and control, not to destroy technology, but to sculpt protocol and make it better suited to people’s real desires” (176). While pushing a system into a state of hypertrophy to reform digital architectures might represent a possible technique that produces a space through which to imagine something like “our” freedom, it still leaves unexamined the desire for reformation itself as nurtured by and produced through the coupling of cybernetics, information theory, and distributed networking. This draws into focus the significance of McKenzie’s Simondon-inspired cybernetic perspective on socio-technological ensembles as being always-already predetermined by and driven through asymmetries or difference. 
As Chun observes, consequently, there is no paradox between resistance and capture since “control and freedom are not opposites, but different sides of the same coin: just as discipline served as a grid on which liberty was established, control is the matrix that enables freedom as openness” (71). Why “openness” should be so readily equated with a state of being free represents a major unexamined presumption of digital culture, and leads to the associated predicament of attempting to think of how this freedom has become something one cannot not desire. If Amazon Noir has political currency in this context, however, it emerges from a capacity to recognise how informational networks channel desire, memories, and imaginative visions rather than just cultivated antagonisms and counterintuitive economics. As a final point, it is worth observing that the project was initiated without publicity until the settlement with Amazon.com. There is, as a consequence, nothing to suggest that this subversive “event” might have actually occurred, a feeling heightened by the abstractions of software entities. To the extent that we believe in “the big book heist,” that such an act is even possible, is a gauge through which the paranoia of control societies is illuminated as a longing or desire for autonomy. As Hakim Bey observes in his conceptualisation of “pirate utopias,” such fleeting encounters with the imaginaries of freedom flow back into the experience of the everyday as political instantiations of utopian hope. Amazon Noir, with all its underlying ethical ambiguities, presents us with a challenge to rethink these affective investments by considering our profound weakness in mastering the complexities and constant intrusions of control. It provides an opportunity to conceive of a future that begins with limits and limitations as immanently central, even foundational, to our deep interconnection with socio-technological ensembles.

References

“Amazon Noir – The Big Book Crime.” <http://www.amazon-noir.com/>.
Bey, Hakim. T.A.Z.: The Temporary Autonomous Zone, Ontological Anarchy, Poetic Terrorism. New York: Autonomedia, 1991.
Chun, Wendy Hui Kyong. Control and Freedom: Power and Paranoia in the Age of Fibre Optics. Cambridge, MA: MIT Press, 2006.
Crawford, Kate. “Adaptation: Tracking the Ecologies of Music and Peer-to-Peer Networks.” Media International Australia 114 (2005): 30-39.
Cubitt, Sean. “Distribution and Media Flows.” Cultural Politics 1.2 (2005): 193-214.
Deleuze, Gilles. Foucault. Trans. Seán Hand. Minneapolis: U of Minnesota P, 1986.
———. “Control and Becoming.” Negotiations 1972-1990. Trans. Martin Joughin. New York: Columbia UP, 1995. 169-176.
———. “Postscript on the Societies of Control.” Negotiations 1972-1990. Trans. Martin Joughin. New York: Columbia UP, 1995. 177-182.
Eriksson, Magnus, and Rasmus Fleische. “Copies and Context in the Age of Cultural Abundance.” Online posting. 5 June 2007. Nettime, 25 Aug. 2007.
Galloway, Alexander. Protocol: How Control Exists after Decentralization. Cambridge, MA: MIT Press, 2004.
Hardt, Michael, and Antonio Negri. Multitude: War and Democracy in the Age of Empire. New York: Penguin Press, 2004.
Harold, Christine. OurSpace: Resisting the Corporate Control of Culture. Minneapolis: U of Minnesota P, 2007.
Lessig, Lawrence. Code and Other Laws of Cyberspace. New York: Basic Books, 1999.
McKenzie, Adrian. Cutting Code: Software and Sociality. New York: Peter Lang, 2006.
———. “The Strange Meshing of Impersonal and Personal Forces in Technological Action.” Culture, Theory and Critique 47.2 (2006): 197-212.
Parikka, Jussi. “Contagion and Repetition: On the Viral Logic of Network Culture.” Ephemera: Theory & Politics in Organization 7.2 (2007): 287-308.
“Piracy Online.” Recording Industry Association of America. 28 Aug. 2007. <http://www.riaa.com/physicalpiracy.php>.
Sundaram, Ravi. “Recycling Modernity: Pirate Electronic Cultures in India.” Sarai Reader 2001: The Public Domain. Delhi: Sarai Media Lab, 2001. 93-99. <http://www.sarai.net>.
Terranova, Tiziana. “Communication beyond Meaning: On the Cultural Politics of Information.” Social Text 22.3 (2004): 51-73.
———. “Of Sense and Sensibility: Immaterial Labour in Open Systems.” DATA Browser 03 – Curating Immateriality: The Work of the Curator in the Age of Network Systems. Ed. Joasia Krysa. New York: Autonomedia, 2006. 27-38.
Thrift, Nigel. “Re-inventing Invention: New Tendencies in Capitalist Commodification.” Economy and Society 35.2 (2006): 279-306.
34

Crouch, David, and Katarina Damjanov. "Extra-Planetary Digital Cultures". M/C Journal 18, no. 5 (20 August 2015). http://dx.doi.org/10.5204/mcj.1020.

Full text of the source
Abstract:
Digital culture, as we know it, owes much to space exploration. The technological pursuit of outer space has fuelled innovations in signal processing and automated computing that have left an impact on the hardware and software that make our digital present possible. Developments in satellite technologies, for example, produced far-reaching improvements in digital image processing (Gonzalez and Woods) and the demands of the Apollo missions advanced applications of the integrated circuit – the predecessor to the microchip (Hall). All the inventive digital beginnings in space found their way back to earth and contributed to the development of contemporary formations of culture composed around practices dependent on and driven by digital technologies. Their terrestrial adoption and adaptation supported a revolution in information, mediation and communication technologies, increasing the scope and speed of global production, exchange and use of data and advancing techniques of imaging, mapping, navigation, surveillance, remote sensing and telemetry to a point that could only be imagined before the arrival of the space age. Steadily knotted with contemporary scientific, commercial and military endeavours and the fabric of the quotidian, digital devices and practices now have a bearing upon all aspects of our pursuits, pleasures and politics. Our increasing reliance upon the digital shaped the shared surfaces of human societies and produced cultures in their own right. While aware of the uneasy baggage of the term ‘culture’, we use it here to designate all digitally grounded objects, systems and processes which are materially and socially inflecting our ways of life. In this sense, we consider both what Michael Hardt and Antonio Negri describe as “those results of social production that are necessary for social interaction and further production, such as knowledges, languages, codes, information, affects, and so forth” (viii), and the material contexts of these products of the social. The effects of digital technologies on the socio-material ambits of human life are many and substantial and – as we want to suggest here – evolving through their ‘extraterrestrial’ beginnings. The contemporary courses of digital cultures not only continue to develop through investments in space exploration, they are themselves largely contingent on the technologies that we have placed in outer space, for instance, global telecommunications infrastructure, GPS, Google Maps, weather and climate monitoring facilities and missile grids all rely on the constellation of satellites orbiting the earth. However, we have been increasingly witnessing something new: modes of social production that developed on earth from the technical demands of the space age are now being directed, or rather returned, to have new beginnings beyond the globe. Our focus in this paper is this outward momentum of digital cultures. We do not aim to overview the entire history of the digital in outer space, but instead to frame the extraterrestrial extension of human technologies in terms of the socio-material dimensions of extra-planetary digital cultures. Hannah Arendt described how the space age accelerated the already rapid pace of techno-scientific development, denying us pause during which to grasp its effects upon the “human condition”.
Our treacherously fast technological conquest of outer space leaves in its wake an aporia in language and “the trouble”, as Arendt puts it, is that we will “forever be unable to understand, that is, to think and speak about the things which nevertheless we are able to do” (3). This crisis in language has at its core a problem of ontology: a failure to recognise that the words we use to describe ourselves are always, and have always been, bound up in our technological modes of being. As thinkers such as Gilbert Simondon and Bernard Stiegler argued and Arendt derided (but could not deny), our technologies are inseparably bound up with the evolutionary continuum of the human, and the migration of our digital ways of life into outer space still further complicates the articulation of our techno-logic condition. In Stiegler’s view the technical is the primordial supplement to the human into which we have been “exteriorising” our “interiors” of social memory and shared culture to alter, assert and advance the material-social ambits of our living milieu, and which has consequently been changing the idea of what it is to be human (141). Without technologies – what Stiegler terms “organised inorganic matter” (17), which mediate our relationships to the world – there is no human in the inhuman extraterrestrial environment and so, effectively, it is only through the organisation of inert matter that culture or social life can exist outside the earth. Offering the possibility of digitally abstracting and processing the complexities and perils of outer space, space technologies are not only a means of creating a human milieu ‘out there’, but of expediting potentially endless extra-planetary progress. The transposition of digital culture into outer space occasions a series of beginnings (and returns). In this paper, we explore extra-planetary digital culture as a productive trajectory in broader discussions of the ontological status of technologies that are socially and materially imbricated in the idea of the human. We consider the digital facilitation of exchanges between earth and outer space and assign them a place in an evolving discourse concerned with expressing the human in relation to the technological. We suggest that ontological questions occasioned by the socio-material effects of technologies require consideration of the digital in outer space and that the inhuman milieu of the extraterrestrial opens up a unique perspective from which to consider the nascent shape of what might be the emerging extra-planetary beginnings of the post-human.

Digital Exurbias

The unfolding of extra-planetary digital cultures necessitates the simultaneous exteriorisation of our production of the social into outer space and the domestication of our extraterrestrial activities here on earth. Caught in the processes of mediated exploration, the moon, Mars, Pluto and other natural or human-made celestial bodies such as the International Space Station are almost becoming remote outer suburbs – exurbias of earth. Digital cultures are reaching toward and expanding into outer space through the development of technologies, but more specifically through advancing the reciprocal processes of social exchanges between terrestrial and extraterrestrial space.
Whether it be through public satellite tracking via applications such as Heavens-Above or the High Definition Earth Viewing system’s continuous video feed from the camera attached to the ISS (NASA, "High Definition") – which streams us back an image of our planetary habitat from an Archimedean point of view – we are being encouraged to embrace a kind of digital enculturation of extraterrestrial space. The production of social life outside our own planet has already had many forms, but perhaps can be seen most clearly aboard the International Space Station, presently the only extraterrestrial environment physically occupied by humans. Amongst its many landmark events, the ISS has become a vigorous node of social media activity. For example, in 2013 Chris Hadfield became a Twitter phenomenon while living aboard the ISS; the astronaut gathered over a million Twitter followers, he made posts on Facebook, Tumblr and Reddit, multiple mini-vids, and his rendition of David Bowie’s Space Oddity on YouTube (Hadfield) has thus far been viewed over 26 million times. His success, as has been noted, was not merely due to his use of social media in the unique environment of outer space, but rather that he was able to make the highly technical lives of those in space familiar by revealing to a global audience “how you make a sandwich in microgravity, how you get a haircut” (Potter). This techno-mediation of the everyday onboard the ISS is, from a Stieglerian perspective, a gesture toward the establishment of “the relation of the living to its milieu” (49). As part of this process, the new trends and innovations of social media on earth are, for example, continuously replayed and rehearsed in outer space, with a litany of ‘digital firsts’ such as the first human-sent extraterrestrial ‘tweet’, first Instagram post, first Reddit AMA and first Pinterest ‘pin’ (Knoblauch), betraying our obsessions with serial digital beginnings. The constitution of an extra-planetary milieu progresses with the ability to conduct real-time interactions between those on and outside the earth. This, in essence, collapses all social aspects of the physical barrier and the ISS becomes merely a high-tech outer suburb of the globe. Yet fluid, uninterrupted, real-time communications with the station have only just become possible. Previously, the Internet connections between earth and the ISS were slow and troublesome, akin to early dial-up, but the recently installed Optical Payload for Lasercomm Science (OPALS), a laser communications system, now enables the incredible speeds needed to effortlessly communicate with the human orbital outpost in real-time. After OPALS was affixed to the ISS, it was first tested using the now-traditional system test, “hello, world” (NASA, "Optical Payload"); referencing the early history of digital culture itself, and in doing so, perhaps making the most apt use of this phrase, ever.

Open to Beginnings

Digital technologies have become vital in sustaining social life, facilitating the immaterial production of knowledge, information and affects (Hardt and Negri), but we have also become increasingly attentive to their materialities; or rather, the ‘matter of things’ never went away, it was only partially occluded by the explosion of social interactivities sparked by the ‘digital revolution’.
Within the ongoing ‘material turn’, there has been a gamut of inquiries into the material contexts of the ‘digital’, for example, in the fields of digital anthropology (Horst and Miller), media studies (Kirschenbaum, Fuller, Parikka) and science and technology studies (Gillespie, Boczkowski, and Foot) – to mention only a very few of these works. Outside the globe material things are again insistent, they contain and maintain the terrestrial life from which they were formed. Outer space quickens our awareness of the materiality underpinning the technical apparatus we use to mediate and communicate and the delicate support that it provides for the complex of digital practices built upon it. Social exchanges between earth and its extra-planetary exurbias are made possible through the very materiality of digital signals within which these immaterial interactions can take place. In the pared down reality of contemporary life in outer space, the sociality of the digital is also harnessed to bring forth forms of material production. For example, when astronauts in space recently needed a particular wrench, NASA was able to email them a digital file from which they were then able to print the required tool (Shukman). Through technologies such as the 3D printer, the line between products of the social and the creation of material objects becomes blurred. In extra-planetary space, the ‘thingness’ of technologies is at least as crucial as it is on earth and yet – as it appears – material production in space might eventually rely on the infrastructures occasioned by the immaterial exchanges of digital culture. As technical objects, like the 3D printer, are evolving, so too are conceptions of the relationship that humans have with technologies. One result of this is the idea that technologies themselves are becoming capable of producing social life; in this conception, the relationships and interrelationships of and with technologies become a potential field of study. We suggest here that the extra-planetary extension of digital cultures will not only involve, but help shape, the evolution of these relationships, and as such, our conceptions and articulations of a future beyond the globe will require a re-positioning of the human and technical objects within the arena of life. This will require new beginnings. Yet beginnings are duplicitous, as Maurice Blanchot wrote – “one must never rely on the word beginning”; technologies have always been part of the human, our rapport is in some sense what defines the human. To successfully introduce the social in outer space will involve an evolution in both the theory and practice of this participation. And it is perhaps through the extra-planetary projection of digital culture that this will come about. In outer space the human partnership with the objects of technology, far from being a utopian promise or dystopian end, is not only a necessity but also a productive force shaping the collective beginnings of our historical co-evolution. Objects of technology that migrate into space appear designed to smooth the ontological misgivings that might arise from our extra-planetary progress. While they are part of the means for producing the social in outer space and physical fortifications against human frailty, they are perhaps also the beginnings of the extraterrestrial enculturation of technologies, given form. One example of such technologies is the anthropomorphic robots currently developed by the Dexterous Robotics Laboratory for NASA.
The latest iteration of these, Robonaut 2, was the first humanoid robot in space; it is a “highly dexterous” robot that works beside astronauts performing a wide range of manual and sensory activities (NASA, "Robonaut"). The Robonaut 2 has recorded its own series of ‘firsts’, including being the “first robot inside a human space vehicle operating without a cage, and first robot to work with human-rated tools in space” (NASA, "Robonaut"). One of the things which mark it as a potential beginning is this ability to use the same tools as astronauts. This suggests the image of a tool using a tool – at first glance, something now quite common in the operation of machines – however, in this case the robot is able to manipulate a tool that was not designed for it. This then might also include the machine itself in our own origins, in that evolutionary moment of grasping a tool or stealing fire from the gods. As an exteriorisation of the human, these robots also suggest that a shared extra-planetary culture would involve acknowledging the participation of technologic entities, recognising that they share these beginnings with us, and thus are participating in the origins of our potential futures beyond the globe – the prospects of which we can only imagine now. Identifiably human-shaped, Robonauts are created to socialise with, and labour together with, astronauts; they share tools and work on the same complex tasks in the same environment aboard the International Space Station. In doing so, their presence might break down the separation between the living and the nonliving, giving form to Stiegler’s hypothesis regarding the ontology of technical objects, and coming to represent a mode of “being” described as “organized inert matter” (49). The robonaut is not dominated by the human, like a hand-held tool, nor is it dominating like a faceless system; it is engineered to be conducted, ‘organised’ rather than controlled. In addition to its anthropomorphic tendencies – which, among other things, make it appear more human than astronauts wearing space suits – is the robonaut’s existence as part of an assemblage of networked life that links technical objects with wet bodies into an animate system of information and matter. While this “heralds the possibility of making the technical being part of culture” (Simondon 16), it also suggests that extra-planetary digital cultures will harness what Simondon formulates as an “ensemble” of “open machines” – a system of sensitive technologies toward which the human acts as “organizer and as a living interpreter” (13). In the design of our extra-planetary envoys we are evolving toward this openness; the Robonaut, a technical object that shares in digital culture and its social and material production, might be the impetus through which the human and technological acquire a language that expresses a kind of evolutionary dialectic. As a system of inclusions that uses technologies to incorporate/socialise everything it can, including its own relationship with technical objects, digital culture in outer space clarifies how technologies might relate and “exchange information with each other through the intermediacy of the human interpreter” (Simondon 14). The Robonaut, like the tweeting astronaut, provides the test signals for what might eventually become points of communication between different modes of being.
In this context, culture is collective cumulative memory; the ‘digital’ form of culture suggests an evolution of both technologic life and human life because it incorporates the development of more efficient means of storing and transmitting memory as cultural knowledge, while recognising the experience of both. Social learning and memory will first define the evolution of the Robonaut. Digital culture and the social expressed through technology – toward a shared social life and cultural landscape established in outer space – will involve the conservation, transmission and setting of common patterns that pool a composite interplay of material, neurobiologic and technologic variables. This will in turn require new practices of enculturation, conviviality with technologies, a sharing, incorporation and care. Only then might this transform into a discussion concerning the ontologies of the ‘we’.

(Far from) Conclusions

Hannah Arendt wrote that technologic progress could not find full expression in “normal” (3) language and that we must constantly be aware that our knowledge, politics, ethics and interactions with regard to technologies are incomplete, unformulated or unexpressed. It could be said then that our relationship with technologies is constantly beginning, that this need to keep finding new language to grasp it means that it actually progresses through its rehearsal of beginnings, through the need to maintain the productive inquisitive force of a pleasant first meeting. Yet Arendt’s idea emerges from a kind of contempt for technology and her implied separation between ‘normal’ and what could be called ‘technical’ language suggests that she privileges the lay ‘human’ tongue as the only one in which meaningful ideas can be properly expressed. What this fails to acknowledge is an appreciation of the potential richness of technical language and Arendt instead establishes a hierarchy that privileges one’s ‘natural’ language. The invocation of the term ‘normal’ is itself an admission of unequal relations with technologies. For a language to develop in which we can truly begin to express and understand the human relationship with ever-changing but ever-present technologies, we must first allow the entrance of the language of technology into social life – it must be incorporated, learnt or translated. In the future, this might ultimately give technology a voice in a dialogue that might be half-composed of binary code. Digital culture is perhaps a forerunner of such a conversation and perhaps it is in the milieu of outer space that it could be possible to see advances in our ideas about the mutually co-constitutive relationship between the human and technical. The ongoing extra-planetary extension of digital cultures has the productive potential to sculpt the material and social ambits of our world, and it is this capacity that may precipitate beginnings which will leave lasting imprints upon the prospects of our shared post-human futures.

References

Arendt, Hannah. The Human Condition. 2nd ed. Chicago: University of Chicago Press, 1958.
Blanchot, Maurice. Friendship. Trans. Elizabeth Rottenberg. Stanford: Stanford University Press, 1997. Originally published in French in 1971 under the title L’Amitié.
Fuller, Matthew. Media Ecologies: Materialist Energies in Art and Technoculture. Cambridge, MA: MIT Press, 2005.
Gillespie, Tarleton, Pablo J. Boczkowski, and Kirsten A. Foot (eds.). Media Technologies: Essays on Communication, Materiality, and Society. Cambridge, MA: MIT Press, 2014.
Cambridge, MA: MIT Press, 2014. Gonzalez, Rafael, and Richard E. Woods. Digital Image Processing. 2nd ed. New Jersey: Prentice Hall, 2002. Hadfield, Chris. “Space Oddity.” YouTube, 12 May 2013. 10 Aug. 2015 ‹https://www.youtube.com/watch?v=KaOC9danxNo›. Hall, Eldon C. Journey to the Moon: The History of the Apollo Guidance Computer. Reston: American Institute of Aeronautics and Astronautics, 1996. Hardt, Michael, and Antonio Negri. Commonwealth. Cambridge, MA: Harvard University Press, 2009. Heavens-Above. ‹http://www.heavens-above.com›. Horst, Heather, and Daniel Miller. Digital Anthropology. London and New York: Berg, 2012. Kirschenbaum, Matthew. Mechanisms: New Media and the Forensic Imagination. Cambridge, MA: MIT Press, 2008. Knoblauch, Max. “The 8 First Social Media Posts from Space.” Mashable 13 Aug. 2013. ‹http://mashable.com/2013/08/13/space-social-media-firsts/›. NASA. “High Definition Earth-Viewing.” ‹http://www.nasa.gov/mission_pages/station/research/experiments/917.html›. NASA. “Optical Payload for Lasercomm Science (OPALS).” 13 May 2015. ‹http://www.nasa.gov/mission_pages/station/research/experiments/861.html›. NASA. “Robonaut Homepage.” ‹http://robonaut.jsc.nasa.gov/default.asp›. Parikka, Jussi. “Dust and Exhaustion: The Labour of New Materialism.” C-Theory 2 Oct. 2013. ‹http://www.ctheory.net/articles.aspx?id=726›. Potter, Ned. “How Chris Hadfield Conquered Social Media from Outer Space.” Forbes 28 Jul. 2013. ‹http://www.forbes.com/sites/forbesleadershipforum/2013/06/28/how-chris-hadfield-conquered-social-media-from-outer-space›. Shukman, David. “NASA Emails Spanner to Space Station - Analysis.” BBC News 19 Dec. 2014. ‹http://www.bbc.com/news/science-environment-30549341›. Simondon, Gilbert. On the Mode of Existence of Technical Objects. Paris: Aubier, Editions Montaigne, 1958. Trans. Ninian Mellamphy. University of Western Ontario, 1980. Stiegler, Bernard. Technics and Time 1: The Fault of Epimetheus. Stanford: Stanford University Press, 1998.
35

Meese, James. "“It Belongs to the Internet”: Animal Images, Attribution Norms and the Politics of Amateur Media Production". M/C Journal 17, no. 2 (24 February 2014). http://dx.doi.org/10.5204/mcj.782.

Full text of the source
Abstract:
Cute pictures of animals feature as an inoffensive and adorable background to the contemporary online experience, with cute content regularly shared on social media platforms. Indeed the demand for cuteness is so strong in the current cultural milieu that some animals become recognisable animal celebrities in the process (Hepola). However, despite the existence of this professionalisation in some sections of the cute economy, amateurs produce the majority of cute content that circulates online. This is largely because one of the central contributors to this steady stream of cute animal pictures is the subforum Aww, hosted on the online community Reddit. Aww is wholly dedicated to pictures of cute things and allows users to submit cute content directly to the site. Aww is one of the default subforums that new Reddit users are automatically subscribed to and is immensely popular, featuring over 4.2 million dedicated subscribers as well as untold casual visits. The section is self-described as: “Things that make you go AWW! -- like puppies, and bunnies, and so on...Feel free to post pictures, videos and stories of cute things” ("The cutest things on the internet!"). Users upload cute animal photos that they have taken and wait for the Reddit community to vote on their favourite pictures. The voting mechanism helps users to acknowledge their favourite posts, with the most popular featured on the front page of Aww (for a detailed critique of this process see van der Nagel 2013). The user-generated model of the site means that instead of visitors being confronted with a formally curated selection of cute animal photos, Aww offers a constantly changing mixture of amateur, semi-pro and professional content. Aww – and Reddit more generally – stand as an emblematic example of participatory culture (Jenkins 2006), with users playing an active role in the production and curation of online content. However, given the commercial nature of many user-generated content sites, this amateur media activity is becoming increasingly subject to intellectual property claims and conflicts (see Burgess; Kennedy). Across the internet there are growing tensions between website operators and amateur producers. As Jenny Kennedy (132) notes, while these platforms promote a public rhetoric of “sharing”, these corporate narratives “downplay their economic power” and imply “that they do not control the practices contained within their sites”. Consequently, the expectations of users regarding how content is managed and organised can differ substantially from the corporate goals of social media companies. This paper contributes to the growing body of literature interested in the politics of amateur media production (see Hunter and Lastowka; Benkler; Burgess; Kennedy) by exploring the emergence of attribution norms and informal enforcement measures in and around the Aww online community. In contrast to professional content creators, amateurs often have fewer resources on hand to protect their copyrighted work and are also challenged by a pervasive online rhetoric that suggests that popular content essentially “belongs to the Internet” (Douglas). A number of communities on Reddit have questioned the company’s handling of amateur content, with users suggesting that Reddit actively seeks to de-contextualise original content and not attribute original creators.
By examining how amateur creators and online communities regulate content online, I interrogate the power relations that exist between social media platforms and users and explore how the corporate rhetoric of participatory culture interacts with the legal framework of copyright law. This article also contributes to existing legal scholarship on communities of practice and norms-based intellectual property systems. This literature has explored how social norms effectively regulate the protection of, among other things, recipes (Fauchart and Von Hippel), fashion design (Raustiala and Sprigman) and stand-up comedy routines (Oliar and Sprigman), in situations where copyright law does not function as an effective regulatory mechanism. Often these norms are in line with copyright law protections, but in other cases they diverge from these legal principles. In this paper I suggest that particular sections of Reddit function in a similar way, with their own set of self-governing norms, and that these norms largely align with the philosophical aims of copyright law. The paper begins by outlining a series of recent debates that have occurred between amateur media creators and Reddit, before exploring how norms are regulated on the Reddit subforums Aww and Karma Court. I then offer some brief conclusions on the value of paying attention to how social norms structure forms of “sharing” (see Kennedy) and provide a useful way for amateur media producers to protect their content without going through formal legal processes. Introducing Reddit and the Confused Politics of Amateur Content Reddit is a social news site, a vibrant community and one of the most popular websites online. It stands as the most visible iteration of a long-standing tradition of user-generated and managed news, one that goes back to websites like Slashdot, which operated in the mid-to-late 1990s. Founded in 2005, Reddit was launched after only one funding round of venture capital, receiving $100k in seed funding from Y Combinator (Miller). Despite some early rivalry between Reddit and competitor site Digg, Reddit had enough potential to be purchased by Condé Nast for an estimated $20 million (Carr). Reddit’s audience numbers have grown exponentially in the last few years, with the site currently receiving over 5 billion page views and 114 million unique visitors per month (“About Reddit”). It has also changed focus significantly in the last few years, with the site now “as much about posting interesting or funny pictures as it is about news” (Sepponen). Reddit hosts a number of individual subforums (called subreddits), which focus on a particular topic and function essentially like online bulletin boards. The front page of Reddit showcases the most popular content from across the whole website, and user-generated content features heavily here. Amateur media cannot spread without the structural support of social media platforms, but this support is qualified in particular ways. Reddit stands as a paradigmatic case. Users on Reddit are “incentivized to submit direct links to images, because viewers can get to them more easily” (Douglas) and the website encourages amateur creators to use a preferred content server – Imgur – to host images. The Imgur service provides a direct public link to an image – even bypassing the Reddit discussion page – and with its free hosting and limited ads it has become a popular service and is used by most Reddit users (Slater-Robins). For the majority of Reddit users this is an unproblematic partnership.
Imgur is free, effective and fast. However, a vocal minority of Reddit users and amateur creators claim that the partnership between Reddit and Imgur has created the equivalent of an online ghetto (Douglas). As Nick Douglas explains, when using services like Imgur there is no requirement to either provide an external link to a creator’s website or to attribute the creator, limiting the ability for an amateur creator to gain exposure. It also bypasses existing revenue streams that may have been set up by creators, including ad-supported websites or online stores offering merchandise. As a result creators have little opportunity to benefit either economically or reputationally from this system. This occurs to such an extent that “there are actually warnings against submitting your own [original] work” to particular subforums on Reddit (Douglas). For example, some forum moderators require submissions to either “link directly to a specific image file or to a website with minimal ads” (“Reddit Pics”). It is in this context that the posting of original content without attribution is not actively policed. There are a number of complaints circulating within the Reddit community about these practices (see “Ok, look people. I know you heart Imgur, but webcomics? Just link to the freaking site”; “The problem with reddit”). Many creators have directly protested against this aspect of Reddit’s structural organisation. Blogger Benjamin Grelle (a.k.a. The Frogman) and writer Chris Menning are two notable examples. Grelle’s protest was witty and dramatic. He wrote a blog post featuring a picture of an email he sent to Imgur offering the company a choice: send him a huge novelty check for $10,000 or, alternatively, add a proper attribution system that allows artists, photographers and content creators to properly credit their work. Grelle estimates that his work generated around $20,000 in ad revenue for Imgur; however, the structure of Reddit and Imgur meant he earned little income from the “viral” success of his content. Grelle claimed he was happy for his work to be shared, but attribution meant that it was more likely a fan would follow the link to his website and provide him with some financial recompense for his work. Unsurprisingly, Grelle didn’t receive a paycheck and so in response has developed a unique way to gain exposure. He has started to insert himself into his work, “[s]o when you see a stolen Frogman piece, you still see Ben Grelle’s face” (Douglas). Chris Menning posted a blog about being banned from Reddit, hoping to bring to light some of the inequalities that persist around Reddit’s current structure. He began by noting that he had received a significant amount of traffic from Reddit in the past. He had responded in kind by looking to create original content for particular subforums, knowing what a particular community would enjoy. However, his habit of providing the link to his own website along with the content he posted saw him get labelled as a spammer and banned by administrators. Menning chose not to fight the ban: “It seems that the only way I could avoid [getting banned] is if I were to relinquish any rights to my original content and post it exclusively to Imgur. In effect, reddit punishes the creation of original content, and rewards content theft” (Menning). Instead he decided to quit Reddit, claiming that Reddit’s approach would carry long-term consequences as the platform provided little incentive for creators to produce wholly original content.
It is worth noting that neither Menning nor Grelle turned to legal avenues in order to gain financial restitution. Considering the nature of the practices they were complaining about, a remedy in the form of an injunction or damages would certainly have been possible. In Grelle’s case, a user had combined a number of his copyrighted works into one image and posted the image to Imgur without attribution – this infringed Grelle’s copyright in his work as well as his moral right to be attributed as the creator of the work. However, the public comments of both creators suggest that despite the possibility of legal success, their issue was not so much to do with their individual cases but rather the broader structural issues at play within Reddit. While they might gain individually from a successful legal challenge, over the long term Reddit would continue to be a fraught place for amateur and semi-professional content creators. Certain parts of the Reddit community appear to be sympathetic to these issues, and the complaints of dissenting users like Menning and Grelle have received active support from some users and moderators on the site. This has led to changes in the way content is being posted and managed on Aww, and has also driven the emergence of a satirical user-run court entitled Karma Court. In these spaces moderators and members establish community norms, regularly police the correct attribution of works and challenge the de-contextualisation of content overtly encouraged by Reddit, Imgur and other subforums. In the following section I will examine both Aww and Karma Court in order to explore how these norms are established and negotiated by moderators and users alike. reddit.com/r/aww: The Online Hub of Cute Animal Pictures As we have seen, the design of Reddit and Imgur creates a number of problems for amateur creators who wish to protect their intellectual property. To address these shortcomings, the Aww community has created its own informal regulatory systems. Volunteer moderators play a crucial role: they establish informal codes of conduct for the Aww community and enforce various rules about how the site should be used. One of these rules relates to attribution. Users are asked to “post original content whenever possible or attribute original content creators” ("The cutest things on the internet!"). Due to the volunteer nature of the work and the size of the Aww sub-reddit, moderator enforcement is haphazard. Consequently, responsibility falls on the wider user community to self-police. Despite its informal nature, this process manages to facilitate a fairly consistent standard of attribution. In this way it functions as an informal method of intellectual property protection. It is worth noting however that this commitment to original content is not solely due to the moral character of Aww users. A significant motivation is the distribution of karma points amongst Reddit users. Karma, which represents your good standing within the Reddit community, can be earned through user likes and votes – these push the most popular content to the front page of each subforum. Thus karma stands as a numerical representation of a user’s value to Reddit. This ostensibly democratic system has the paradoxical effect of fuelling intellectual property violations on the site. Users often repost other users’ jpegs, animated gifs, and other content, in order to reap the social and cultural capital that comes with posting a popular picture.
In some cases they claim authorship of the content; in other cases they simply re-post content that they feel “belongs to the internet” (Douglas). Some content is so popular or pervasive online (content that is often described as “viral”) that users feel there is little reason or need to attribute it. This helps to explain the persistence of ownership and attribution conflicts on Reddit. In the eyes of some users and moderators the management of these rights and the correct distribution of karma are seen to be vital to the long-term functioning of the site. The karma system offers a numerical representation of each contributor’s value. Re-posting already successful content and claiming it as your own challenges the proper functioning of the karma system and potentially ‘inhibits the innovative potential of contributions’ (Richterich). On Aww the re-posting of original content is viewed as a taboo act that breaches these norms. The poster is seen to have engaged in deceptive conduct in order to gain karma for their user profile. In addition there is a strong ethic that runs through these comment threads that the original creator deserves attribution. There is a presumption that this attribution is vital in order to increase the possible marketability of the posted content and to recognise and encourage creators within the community. This sort of community-driven regulation contrasts with the aforementioned site design of Reddit and Imgur, which frustrates effective authorship attribution practices. Aww users, in contrast, have shown a willingness to defend what they see as the intellectual property rights of content creators. A series of recent examples outline how this process works in practice. User “moonlikeme123” posted a picture of a cat with its hands on the steering wheel of a car. The picture was entitled “we don’t need to ask for directions, Helen”. Within the same day, three separate users identified the picture as a repost, with one noting that the same picture was already on the front page of Aww. “moonlikeme123” received no karma points for the picture. In a second example, the user “nibblur” posted a photo of a kitten “hunting” a toy mouse. Within a day, one enterprising user had identified the original photographer – “torode”, an amateur photographer – and linked to his Reddit profile (“Ferocious cat hunting its prey: aww”). One further example: on 15 July 2013 “Cuzacelmare” posted a picture of two dogs comforting each other – an image which had originally been posted by “lauface”. Again, users were quick to point out the lack of attribution and the attempt to claim someone else’s content as their own (“Comforting her sister during a storm: aww”). It is worth noting that some Reddit users consider attributing content to be entirely without benefit. Some deride karma as “meaningless” and suggest that as a significant amount of content online is regularly reposted elsewhere, there is little harm done in re-posting what is essentially amateur content destined to be lost in the bowels of the internet. For example, the comments that follow Cuzacelmare’s post reflect an ambivalence about reposting, suggesting that users weigh up the benefits of exposure gained by the re-posting against the lack of attribution granted and the increasingly decontextualised nature of the photo itself: “Why does everyone get so bitchy about reposts. Not everyone is on ALL the time or has been on reddit since it was created. I mean if you've seen it already ignore it.
It's just picture you aren't forced to click the link. [sic]” (“Comforting her sister during a storm: aww”) “We're arguing semantics, but any content that gets attention can benefit the creator, whether it's reddit or Youtube” (“Comforting her sister during a storm: aww”). Such discussions are common on comment threads following re-posts by other users. They underline the conflicted status of this ephemeral media and the underlying frictions that are part of these processes. These discussions underline the fact that on Reddit the “sharing” (Kennedy) and “spreading” (Jenkins et al.) of content are not seen as an unquestioned positive but rather as contestable structural features that need to be constantly negotiated and discussed. These informal methods of identification, post-hoc attribution and criticism in comment threads have been the long-standing method used to redress questions of attribution and ownership of content on Reddit. However, in recent times, Reddit users have turned to satirical methods of formal adjudication for particularly egregious cases. A sub-reddit, Karma Court, now functions as an informal tribunal in which punishment is meted out for “the abuse of karma and general contemptible actions heretofore identified as wrongdoing” (“Constitution and F.A.Q of the Karma Court”). Due to its double function as both an adjudicator and a satire of users overly invested in online debates, there is no limit to the possible “crimes” a user may be charged with. The following charges are only presented as guidelines and speak to common negative experiences online:
(1). Douchebaggery - When one is being a douche.
(2). Defamation - Tarnishing another redditor's [user’s] username.
(3). Public Indecency - When a user flexes his or her 'e-peen' with the intent to shame other users.
(4). OhShit.exe - Intentional reposting that results in reddit Gold.
(5). GrandTheft.jpg - Reposting while claiming credit for the post.
(6). Obstruction of Justice - Impeding or interfering with an investigation, such as submitting false screenshots, deleting evidence, or providing false evidence to the court.
(7). Other - Literally anything else you want. We like creative names for charges.
(“Constitution and F.A.Q of the Karma Court”) In Karma Court, legal representation can be sourced from a list of attorneys and judges, populated by users who volunteer to help adjudicate the case. They are required to have been a Reddit member for over six months. The only punishment is a public shaming. Interestingly, Karma Court has developed a fair reposting clause that attempts to manage the complex debates around reposting and attribution. Under the non-binding satirical clause, users are able to repost content if it has not featured on the front page of a sub-reddit for seven or more days, if the re-poster acknowledges in the title or description that they are re-posting, or if the original poster has less than 30,000 link karma (which means that the original poster has not substantially contributed to the Reddit community). If a re-poster does not adhere to these rules and claims a re-post as their own original content (or “OC”), they can be charged with “grandtheft.jpg” and brought to trial by another Reddit user. As Aww is one of the most popular subforums, a number of cases have emerged from it. The aforementioned re-poster “Cuzacelmare” (“I am bringing /U/ Cuzacelmare to trial …”) was “charged” through this process and served with a summons after denying “cute and innocent animals of that subreddit of their much deserved karma”.
Similar cases to do with re-posting without attribution on Aww involve “FreshCorio” (“Reddit vs. U/FreshCorio …”) and “ninjacollin” (“People of Reddit vs. /U/ ninjacollin”), who were also brought to Karma Court. In each case prosecutors were adamant that false authorship claims needed to be punished. With these mock trials run by volunteers it takes time for arguments to be heard and judgment to occur; however, “ninjacollin” expedited the legal process by offering a full confession. As a new user, “ninjacollin” was reprimanded severely for his actions and the users on Karma Court underlined the consequences of not identifying original content creators when re-posting content. Ownership and Attribution: Amateur Media, Distribution and Law The practices outlined above offer a number of alternate ways to think about amateur media and how it is distributed. An increasingly complex picture of content attribution and circulation emerges once we take into account the structural operation of Reddit, the intellectual property norms of users, and the various formal and informal systems of regulation that are appearing on the site. Such practices require users to negotiate complex questions of ownership between each other and in relation to corporate bodies. These negotiations often lead to informal agreements around a set of norms to regulate the spread of content within a particular community, suggesting that the lack of a formal legal process in these debates does not mean that there is an absence of regulation. As noted throughout this paper, the spread of online content often involves progressive de-contextualisation. Website design features often support this process in the hope of encouraging content to spread in a fashion amenable to corporate goals. Considering this tendency for content to be decontextualised online, the presence of attribution norms on subforums like Aww is significant. Instead of remixing, spreading and re-purposing content indiscriminately, users retain a concept of ownership and attribution that tracks closely to the basic principles of copyright law. Rather than users radically redefining concepts of attribution and ownership, as prefigured in some of the more utopian accounts of participatory media, the dominant norms of the Reddit community extend a discourse of copyright and ownership. As well as providing a greater level of detail to contemporary debates around amateur media and its viral or spreadable nature (Burgess; Jenkins; Jenkins et al.), this analysis offers some lessons for copyright law. The emergence of norms in particular Reddit subforums which govern the use of copyrighted content, and the use of a mock court structure, suggests that online communities have the capacity to engage in forms of redress for amateur creators. These organic forms of copyright management operate adjacent to formal legal structures of copyright law. However, they are more accessible and practical for amateur creators, who do not always have the money to hire lawyers, especially when the market value of their content might be negligible. The informal regulatory systems outlined above may not operate perfectly but they reveal communities who are willing to engage in foundational conversations around the importance of attribution and ownership. Following the existing literature (Fauchart and Von Hippel; Raustiala and Sprigman; Schultz; Oliar and Sprigman), I suggest that these online social norms provide a useful form of alternative protection for amateur creators.
Acknowledgements Thanks to Ramon Lobato and Emily van der Nagel for comments and productive discussions around these issues. I am also grateful to the two anonymous peer reviewers for their assistance in developing this argument. References “About Reddit.” Reddit, 2014. 29 Apr. 2014 ‹http://www.reddit.com/about/›. Benkler, Yochai. The Wealth of Networks: How Social Production Transforms Markets and Freedom. New Haven: Yale University Press, 2006. Burgess, Jean. “YouTube and the Formalisation of Amateur Media.” Amateur Media: Social, Cultural and Legal Perspectives. Ed. Dan Hunter, Ramon Lobato, Megan Richardson, and Julian Thomas. Oxford: Routledge, 2012. Carr, Nicholas. “Left Alone by Its Owner, Reddit Soars.” The New York Times: Business, 2 Sep. 2012. “Comforting Her Sister during a Storm: aww.” reddit: the front page of the internet, 15 July 2013. “Constitution and F.A.Q of the Karma Court.” reddit: the front page of the internet, 2014. Douglas, Nick. “Everything on the Internet Gets Stolen: Here’s How You Should Feel about That.” Slacktory, 8 Sep. 2009. Fauchart, Emmanuelle, and Eric von Hippel. “Norms-Based Intellectual Property Systems: The Case of French Chefs.” Organization Science 19.2 (2008): 187–201. “Ferocious Cat Hunting Its Prey: aww.” reddit: the front page of the internet, 4 April 2013. 29 Apr. 2014 ‹http://www.reddit.com/r/aww/comments/1bobcp/ferocious_cat_hunting_its_prey/›. Hepola, Sarah. “The Internet Is Made of Kittens.” Salon.com, 11 Feb. 2009. 29 Apr. 2014 ‹http://www.salon.com/2009/02/10/cat_internet/›. Hunter, Dan, and Greg Lastowka. “Amateur-to-Amateur.” William & Mary Law Review 46 (2004): 951–1030. “I Am Bringing /U/ Cuzacelmare to Trial on the Basis of Being One of the Biggest _______ I’ve Ever Seen, by Reposting Cute Animal Pictures to /R/Awww. Feels.Jpg.” reddit: the front page of the internet, 21 March 2013. Jenkins, Henry. Convergence Culture: Where Old and New Media Collide. New York: New York University Press, 2006. Jenkins, Henry, Sam Ford, and Joshua Green. Spreadable Media: Creating Value and Meaning in a Networked Culture. New York: New York University Press, 2013. Menning, Chris. “So I Got Banned from Reddit.” Modern Primate, 23 Aug. 2012. Miller, Kerry. “How Y Combinator Helped Shape Reddit.” Bloomberg Businessweek, 26 Sep. 2007. 29 Apr. 2014 ‹http://www.businessweek.com/stories/2007-09-26/how-y-combinator-helped-shape-redditbusinessweek-business-news-stock-market-and-financial-advice›. “Ok, Look People. I Know You Heart Imgur, But Webcomics? Just Link to the Freaking Site.” reddit: the front page of the internet, 22 Aug. 2011. Oliar, Dotan, and Christopher Sprigman. “There’s No Free Laugh (Anymore): The Emergence of Intellectual Property Norms and the Transformation of Stand-Up Comedy.” Virginia Law Review 94.8 (2009): 1787–1867. “People of reddit vs. /U/Ninjacollin for Grandtheft.jpg.” reddit: the front page of the internet, 30 Jan. 2013. Raustiala, Kal, and Christopher Sprigman. “The Piracy Paradox: Innovation and Intellectual Property in Fashion Design.” Virginia Law Review 92.8 (2006): 1687–1777. “Reddit v. U/FreshCorio. User Uploads Popular Repost Picture of R/AWW and Claims It Is His Sister’s Cat. Falsely Claims It Is His Cakeday for Good Measure.” reddit: the front page of the internet, 12 Apr. 2013. 29 Apr. 2014 ‹http://www.reddit.com/r/KarmaCourt/comments/1c7vxz/reddit_vs_ufreshcorio_user_uploads_popular_repost/›. “Reddit Pics.” reddit: the front page of the internet, 2014. 29 Apr. 2014 ‹http://www.reddit.com/r/pics/›.
Richterich, Annika. “‘Karma, Precious Karma!’ Karmawhoring on Reddit and the Front Page’s Econometrisation.” Journal of Peer Production 4 (2014). 29 Apr. 2014 ‹http://peerproduction.net/issues/issue-4-value-and-currency/peer-reviewed-articles/karma-precious-karma/›. Schultz, Mark. “Fear and Norms and Rock & Roll: What Jambands Can Teach Us about Persuading People to Obey Copyright Law.” Berkeley Technology Law Journal 21.2 (2006): 651–728. Sepponen, Bemmu. “Why Redditors Gave Imgur a Chance.” Social Media Today, 20 July 2011. Slater-Robins, Max. “From Rags to Riches: The Story of Imgur.” Neowin, 21 Apr. 2013. “The Cutest Things on the Internet!” reddit: the front page of the internet, n.d. “The Problem with reddit.” reddit: the front page of the internet, 23 Aug. 2012. 29 Apr. 2014 ‹http://www.reddit.com/r/technology/comments/ypbe2/the_problem_with_reddit/›. Van der Nagel, Emily. “Faceless Bodies: Negotiating Technological and Cultural Codes on reddit gonewild.” Scan: Journal of Media Arts Culture 10.2 (2013). “We Don’t Need to Ask for Directions, Helen: aww.” reddit: the front page of the internet, 30 June 2013. 29 Apr. 2014 ‹http://www.reddit.com/r/aww/comments/1heut6/we_dont_need_to_ask_for_directions_helen/›.
36

Brien, Donna Lee. "Climate Change and the Contemporary Evolution of Foodways". M/C Journal 12, no. 4 (5 September 2009). http://dx.doi.org/10.5204/mcj.177.

Full text of the source
Abstract:
Introduction Eating is one of the most quintessential activities of human life. Because of this primacy, eating is, as food anthropologist Sidney Mintz has observed, “not merely a biological activity, but a vibrantly cultural activity as well” (48). This article posits that the current awareness of climate change in the Western world is animating such cultural activity as the Slow Food movement and is, as a result, stimulating what could be seen as an evolutionary change in popular foodways. Moreover, this paper suggests that, in line with modelling provided by the Slow Food example, an increased awareness of the connections of climate change to the social injustices of food production might better drive social change in such areas. This discussion begins by proposing that contemporary foodways—defined as “not only what is eaten by a particular group of people but also the variety of customs, beliefs and practices surrounding the production, preparation and presentation of food” (Davey 182)—are changing in the West in relation to current concerns about climate change. Such modification has a long history. Since long before the inception of modern Homo sapiens, natural climate change has been a crucial element driving hominid evolution, both biologically and culturally in terms of social organisation and behaviours. Macroevolutionary theory suggests evolution can dramatically accelerate in response to rapid shifts in an organism’s environment, followed by slow to long periods of stasis once a new level of sustainability has been achieved (Gould and Eldredge). There is evidence that ancient climate change has also dramatically affected the rate and course of cultural evolution. Recent work suggests that the end of the last ice age drove the cultural innovation of animal and plant domestication in the Middle East (Zeder), not only due to warmer temperatures and increased rainfall, but also to a higher level of atmospheric carbon dioxide which made agriculture increasingly viable (McCorriston and Hole, cited in Zeder). Megadroughts during the Paleolithic might well have been stimulating factors behind the migration of hominid populations out of Africa and across Asia (Scholz et al.). Thus, it is hardly surprising that modern anthropogenically induced global warming—in all its climate-altering manifestations—may be driving a new wave of cultural change and even evolution in the West as we seek a sustainable homeostatic equilibrium with the environment of the future. In 1962, Rachel Carson’s Silent Spring exposed some of the threats that modern industrial agriculture poses to environmental sustainability. This prompted a public debate from which the modern environmental movement arose and, with it, an expanding awareness and attendant anxiety about the safety and nutritional quality of contemporary foods, especially those that are grown with chemical pesticides and fertilizers and/or are highly processed. This environmental consciousness led to some modification in eating habits, manifested in some embracing wholefood and vegetarian dietary regimes (or elements of them). Most recently, a widespread awareness of climate change has forced rapid change in contemporary Western foodways, while in other climate-related areas of socio-political and economic significance, such as energy production and usage, there is little evidence of real acceleration of change.
Ongoing research into the effects of this expanding environmental consciousness continues in various disciplinary contexts such as geography (Eshel and Martin) and health (McMichael et al.). In food studies, Vileisis has proposed that the 1970s environmental movement’s challenge to the polluting practices of industrial agri-food production, concurrent with the women’s movement (asserting women’s right to know about everything, including food production), has led to both cooks and eaters becoming increasingly knowledgeable about the links between agricultural production and consumer and environmental health, as well as the various social justice issues involved. As a direct result of such awareness, alternatives to the industrialised, global food system are now emerging (Kloppenberg et al.). The Slow Food (R)evolution The tenets of the Slow Food movement, now some two decades old, are today synergetic with the growing consternation about climate change. In 1983, Carlo Petrini formed the Italian non-profit food and wine association Arcigola and, in 1986, founded Slow Food as a response to the opening of a McDonald’s in Rome. From these humble beginnings, which at the time unashamedly posited a return to the food systems of the past, Slow Food has grown into a global organisation with much more future-focused objectives animating its challenges to the socio-cultural and environmental costs of industrial food. Slow Food does have some elements that could be classed as reactionary and, therefore, the opposite of evolutionary. In response to the increasing homogenisation of culinary habits around the world, for instance, Slow Food’s Foundation for Biodiversity has established the Ark of Taste, which expands upon the idea of a seed bank to preserve not only varieties of food but also local and artisanal culinary traditions. In this, the Ark aims to save foods and food products “threatened by industrial standardization, hygiene laws, the regulations of large-scale distribution and environmental damage” (SFFB). Slow Food International’s overarching goals and activities, however, extend far beyond the preservation of past foodways, extending to the sponsoring of events and activities that are attempting to create new cuisine narratives for contemporary consumers who have an appetite for such innovation. Such events as the Salone del Gusto (Salon of Taste) and Terra Madre (Mother Earth), held in Turin every two years, for example, while celebrating culinary traditions, also focus on contemporary artisanal foods and sustainable food production processes that incorporate the most current agricultural knowledge and new technologies into this production. Attendees at these events are also driven by both an interest in tradition and their own very current concerns with health, personal satisfaction and environmental sustainability, to change their consumer behaviour through an expanded self-awareness of the consequences of their individual lifestyle choices. Such events have, in turn, inspired similar events in other locations, moving Slow Food from local to global relevance, and affecting the intellectual evolution of foodway cultures far beyond its headquarters in Bra in Northern Italy. This includes the developing world, where millions of farmers continue to follow many traditional agricultural practices by necessity. The Slow Food movement’s forward-looking values are codified in the International Commission on the Future of Food and Agriculture’s 2006 publication, Manifesto on the Future of Food.
This calls for changes to the World Trade Organisation’s rules that promote the globalisation of agri-food production as a direct response to the “climate change [which] threatens to undermine the entire natural basis of ecologically benign agriculture and food preparation, bringing the likelihood of catastrophic outcomes in the near future” (ICFFA 8). It does not call, however, for a complete return to past methods. To further such foodway awareness and evolution, Petrini founded the University of Gastronomic Sciences at Slow Food’s headquarters in 2004. The university offers programs that are analogous with Slow Food’s overall aim of forging sustainable partnerships between the best of old and new practice: to, in the organisation’s own words, “maintain an organic relationship between gastronomy and agricultural science” (UNISG). In 2004, Slow Food had over sixty thousand members in forty-five countries (Paxson 15), with major events now held each year in many of these countries and membership continuing to grow apace. One of the frequently cited successes of the Slow Food movement is in relation to the tomato. Until recently, supermarkets stocked only a few mass-produced hybrids. These cultivars were bred for their disease resistance, ease of handling, tolerance to artificial ripening techniques, and display consistency, rather than any culinary values such as taste, aroma, texture or variety. In contrast, the vine-ripened, ‘farmer’s market’ tomato has become the symbol of an “eco-gastronomically” sustainable, local and humanistic system of food production (Jordan) which melds the best of past practice with the most up-to-date knowledge regarding such farming matters as water conservation. Although the term ‘heirloom’ is widely used in relation to these tomatoes, there is a distinctively contemporary edge to the way they are produced and consumed (Jordan), and they are, along with other organic and local produce, increasingly available in even the largest supermarket chains. Instead of a wholesale embrace of the past, it is the connection to, and the maintenance of that connection with, the processes of production and, hence, to the environment as a whole, which is the animating premise of the Slow Food movement. ‘Slow’ thus creates a gestalt in which individuals integrate their lifestyles with all levels of the food production cycle and, hence, with the environment and, importantly, with the inherently related social justice issues. ‘Slow’ approaches emphasise how the accelerated pace of contemporary life has weakened these connections, while offering a path to the restoration of a sense of connectivity to the full cycle of life and its relation to place, nature and climate. In this, the Slow path demands that every consumer takes responsibility for all components of his/her existence—a responsibility that includes becoming cognisant of the full story behind each of the products that are consumed in that life. The Slow movement is not, however, a regime of abstention or self-denial. Instead, the changes in lifestyle necessary to support responsible sustainability, and the sensual and aesthetic pleasure inherent in such a lifestyle, exist in a mutually reinforcing relationship (Pietrykowski). This positive feedback loop enhances the potential for promoting real and long-term evolution in social and cultural behaviour.
Indeed, the Slow zeitgeist now informs many areas of contemporary culture, with Slow Travel, Homes, Design, Management, Leadership and Education, and even Slow Email, Exercise, Shopping and Sex attracting adherents. Mainstreaming Concern with Ethical Food Production The role of the media in “forming our consciousness—what we think, how we think, and what we think about” (Cunningham and Turner 12)—is self-evident. It is, therefore, revealing in relation to the above outlined changes that even the most functional cookbooks and cookery magazines (those dedicated to practical information such as recipes and instructional technique) in Western countries such as the USA, UK and Australia are increasingly reflecting and promoting an awareness of ethical food production as part of this cultural change in food habits. While such texts have largely been considered useful but socio-politically relatively banal publications, they are beginning to be recognised as a valid source of historical and cultural information (Nussel). Cookbooks and cookery magazines commonly include discussion of a surprising range of issues around food production and consumption including sustainable and ethical agricultural methods, biodiversity, genetic modification and food miles. In this context, they indicate how rapidly the recent evolution of foodways has been absorbed into mainstream practice. Much of such food-related media content is, at the same time, closely identified with celebrity mass marketing and embodied in the television chef with his or her range of branded products including their syndicated articles and cookbooks. This commercial symbiosis makes each such cuisine-related article in a food or women’s magazine or cookbook, in essence, an advertorial for a celebrity chef and their named products. Yet, at the same time, a number of these mass media food celebrities are raising public discussion that is leading to consequent action around important issues linked to climate change, social justice and the environment. An example is Jamie Oliver’s efforts to influence public behaviour and government policy, a number of which have gained considerable traction. Oliver’s 2004 exposure of the poor quality of school lunches in Britain (see Jamie’s School Dinners), for instance, caused public outrage and pressured the British government to commit considerable extra funding to these programs. A recent study by Essex University has, moreover, found that the academic performance of 11-year-old pupils eating Oliver’s meals improved, while absenteeism fell by 15 per cent (Khan). Oliver’s exposé of the conditions of battery-raised hens in 2007 and 2008 (see Fowl Dinners) resulted in increased sales of free-range poultry, decreased sales of factory-farmed chickens across the UK, and complaints that free-range chicken sales were limited by supply. Oliver encouraged viewers to lobby their local councils, and as a result, a number banned battery hen eggs from schools, care homes, town halls and workplace cafeterias (see, for example, LDP). The popular penetration of these ideas needs to be understood in a historical context where industrialised poultry farming has been an issue in Britain since at least 1848, when it was one of the contributing factors to the establishment of the RSPCA (Freeman).
A century after Upton Sinclair’s The Jungle (published in 1906) exposed the realities of the slaughterhouse, and several decades since Peter Singer’s landmark Animal Liberation (1975) and Tom Regan’s The Case for Animal Rights (1983) posited the immorality of the mistreatment of animals in food production, it could be suggested that Al Gore’s film An Inconvenient Truth (released in 2006) added considerably to the recent concern regarding the ethics of industrial agriculture. Consciousness-raising bestselling books such as Jim Mason and Peter Singer’s The Ethics of What We Eat and Michael Pollan’s The Omnivore’s Dilemma (both published in 2006) do indeed ‘close the loop’ in this way in their discussions, by concluding that intensive food production methods used since the 1950s are not only inhumane and damage public health, but are also damaging an environment under pressure from climate change. In comparison, the use of forced labour and human trafficking in food production has attracted far less mainstream media, celebrity or public attention. It could be posited that this is, in part, because no direct relationship to the environment and climate change, and therefore no direct link to our own existence in the West, has been popularised. Kevin Bales, who has been described as a modern abolitionist, estimates that there are currently more than 27 million people living in conditions of slavery and exploitation against their wills—twice as many as during the 350-year long trans-Atlantic slave trade. Bales also chillingly reveals that, worldwide, the number of slaves is increasing, with contemporary individuals so inexpensive to purchase in relation to the value of their production that they are disposable once the slaveholder has used them. Alongside sex slavery, many other prevalent examples of contemporary slavery are concerned with food production (Weissbrodt et al.; Miers). Bales and Soodalter, for example, describe how across Asia and Africa, adults and children are enslaved to catch and process fish and shellfish for both human consumption and cat food. Other campaigners have similarly exposed how the cocoa in chocolate is largely produced by child slave labour on the Ivory Coast (Chalke; Off), and how considerable amounts of exported sugar, cereals and other crops are slave-produced in certain countries. In 2003, some 32 per cent of US shoppers identified themselves as LOHAS “lifestyles of health and sustainability” consumers, who were, they said, willing to spend more for products that reflected not only ecological, but also social justice responsibility (McLaughlin). Research also confirms that “the pursuit of social objectives … can in fact furnish an organization with the competitive resources to develop effective marketing strategies”, with Doherty and Meehan showing how “social and ethical credibility” are now viable bases of differentiation and competitive positioning in mainstream consumer markets (311, 303). In line with this recognition, Fair Trade Certified goods are now available in British, European, US and, to a lesser extent, Australian supermarkets, and a number of global chains including Dunkin’ Donuts, McDonald’s, Starbucks and Virgin airlines utilise Fair Trade coffee and teas in all, or parts of, their operations. Fair Trade Certification indicates that farmers receive a higher than commodity price for their products, workers have the right to organise, men and women receive equal wages, and no child labour is utilised in the production process (McLaughlin).
Yet, despite some Western consumers reporting such issues having an impact upon their purchasing decisions, social justice has not become a significant issue of concern for most. The popular cookery publications discussed above devote little space to Fair Trade product marketing, much of which is confined to supermarket-produced adverzines promoting the Fair Trade products they stock, and international celebrity chefs have yet to focus attention on this issue. In Australia, discussion of contemporary slavery in the press is sparse, having surfaced in 2000–2001, prompted by UNICEF campaigns against child labour, and in 2007 and 2008 with the visit of a series of high-profile anti-slavery campaigners (including Bales) to the region. The public awareness of food produced by forced labour, and of the troubling issue of human enslavement in general, is still far below the level that climate change and ecological issues have achieved thus far in driving foodway evolution. This may change, however, if a ‘Slow’-inflected connection can be made between Western lifestyles and the plight of peoples hidden from our daily existence, but contributing daily to them. Concluding Remarks At this time of accelerating techno-cultural evolution, due in part to the pressures of climate change, it is the creative potential that human conscious awareness brings to bear on these challenges that is most valuable. Today, as in the caves at Lascaux, humanity is evolving new images and narratives to provide rational solutions to emergent challenges. As an example of this, new foodways and ways of thinking about them are beginning to evolve in response to the perceived problems of climate change. The current conscious transformation of food habits by some in the West might be, therefore, in James Lovelock’s terms, a moment of “revolutionary punctuation” (178), whereby rapid cultural adaptation is being induced by the growing public awareness of impending crisis. It remains to be seen whether other urgent human problems can be similarly and creatively embraced, and whether this trend can spread to offer global solutions to them. References An Inconvenient Truth. Dir. Davis Guggenheim. Lawrence Bender Productions, 2006. Bales, Kevin. Disposable People: New Slavery in the Global Economy. Berkeley: University of California Press, 2004 (first published 1999). Bales, Kevin, and Ron Soodalter. The Slave Next Door: Human Trafficking and Slavery in America Today. Berkeley: University of California Press, 2009. Carson, Rachel. Silent Spring. Boston: Houghton Mifflin, 1962. Chalke, Steve. “Unfinished Business: The Sinister Story behind Chocolate.” The Age 18 Sep. 2007: 11. Cunningham, Stuart, and Graeme Turner. The Media and Communications in Australia Today. Crows Nest: Allen & Unwin, 2002. Davey, Gwenda Beed. “Foodways.” The Oxford Companion to Australian Folklore. Ed. Gwenda Beed Davey and Graham Seal. Melbourne: Oxford University Press, 1993. 182–85. Doherty, Bob, and John Meehan. “Competing on Social Resources: The Case of the Day Chocolate Company in the UK Confectionery Sector.” Journal of Strategic Marketing 14.4 (2006): 299–313. Eshel, Gidon, and Pamela A. Martin. “Diet, Energy, and Global Warming.” Earth Interactions 10, paper 9 (2006): 1–17. Fowl Dinners. Exec. Prod. Nick Curwin and Zoe Collins. Dragonfly Film and Television Productions and Fresh One Productions, 2008. Freeman, Sarah. Mutton and Oysters: The Victorians and Their Food. London: Gollancz, 1989. Gould, S. J., and N. Eldredge.
“Punctuated Equilibrium Comes of Age.” Nature 366 (1993): 223–27. (ICFFA) International Commission on the Future of Food and Agriculture. Manifesto on the Future of Food. Florence, Italy: Agenzia Regionale per lo Sviluppo e l’Innovazione nel Settore Agricolo Forestale and Regione Toscana, 2006. Jamie’s School Dinners. Dir. Guy Gilbert. Fresh One Productions, 2005. Jordan, Jennifer A. “The Heirloom Tomato as Cultural Object: Investigating Taste and Space.” Sociologia Ruralis 47.1 (2007): 20–41. Khan, Urmee. “Jamie Oliver’s School Dinners Improve Exam Results, Report Finds.” Telegraph 1 Feb. 2009. 24 Aug. 2009 ‹http://www.telegraph.co.uk/education/educationnews/4423132/Jamie-Olivers-school-dinners-improve-exam-results-report-finds.html›. Kloppenberg, Jack, Jr, Sharon Lezberg, Kathryn de Master, G. W. Stevenson, and John Henrickson. “Tasting Food, Tasting Sustainability: Defining the Attributes of an Alternative Food System with Competent, Ordinary People.” Human Organization 59.2 (Jul. 2000): 177–86. (LDP) Liverpool Daily Post. “Battery Farm Eggs Banned from Schools and Care Homes.” Liverpool Daily Post 12 Jan. 2008. 24 Aug. 2009 ‹http://www.liverpooldailypost.co.uk/liverpool-news/regional-news/2008/01/12/battery-farm-eggs-banned-from-schools-and-care-homes-64375-20342259›. Lovelock, James. The Ages of Gaia: A Biography of Our Living Earth. New York: Bantam, 1990 (first published 1988). Mason, Jim, and Peter Singer. The Ethics of What We Eat. Melbourne: Text Publishing, 2006. McLaughlin, Katy. “Is Your Grocery List Politically Correct? Food World’s New Buzzword Is ‘Sustainable’ Products.” The Wall Street Journal 17 Feb. 2004. 29 Aug. 2009 ‹http://www.globalexchange.org/campaigns/fairtrade/coffee/1732.html›. McMichael, Anthony J., John W. Powles, Colin D. Butler, and Ricardo Uauy. “Food, Livestock Production, Energy, Climate Change, and Health.” The Lancet 370 (6 Oct. 2007): 1253–63. Miers, Suzanne. “Contemporary Slavery.” A Historical Guide to World Slavery. Ed. Seymour Drescher and Stanley L. Engerman. New York: Oxford University Press, 1998. Mintz, Sidney W. Tasting Food, Tasting Freedom: Excursions into Eating, Culture, and the Past. Boston: Beacon Press, 1994. Nussel, Jill. “Heating Up the Sources: Using Community Cookbooks in Historical Inquiry.” History Compass 4/5 (2006): 956–61. Off, Carol. Bitter Chocolate: Investigating the Dark Side of the World's Most Seductive Sweet. St Lucia: U of Queensland P, 2008. Paxson, Heather. “Slow Food in a Fat Society: Satisfying Ethical Appetites.” Gastronomica: The Journal of Food and Culture 5.1 (2005): 14–18. Pietrykowski, Bruce. “You Are What You Eat: The Social Economy of the Slow Food Movement.” Review of Social Economy 62.3 (2004): 307–21. Pollan, Michael. The Omnivore’s Dilemma: A Natural History of Four Meals. New York: The Penguin Press, 2006. Regan, Tom. The Case for Animal Rights. Berkeley: University of California Press, 1983. Scholz, Christopher A., Thomas C. Johnson, Andrew S. Cohen, John W. King, John A. Peck, Jonathan T. Overpeck, Michael R. Talbot, Erik T. Brown, Leonard Kalindekafe, Philip Y. O. Amoako, Robert P. Lyons, Timothy M. Shanahan, Isla S. Castañeda, Clifford W. Heil, Steven L. Forman, Lanny R. McHargue, Kristina R. Beuning, Jeanette Gomez, and James Pierson. “East African Megadroughts between 135 and 75 Thousand Years Ago and Bearing on Early-modern Human Origins.” PNAS: Proceedings of the National Academy of the Sciences of the United States of America 104.42 (16 Oct. 2007): 16416–21. Sinclair, Upton. The Jungle.
New York: Doubleday, Page & Company, 1906. Singer, Peter. Animal Liberation. New York: HarperCollins, 1975. (SFFB) Slow Food Foundation for Biodiversity. “Ark of Taste.” 2009. 24 Aug. 2009 ‹http://www.fondazioneslowfood.it/eng/arca/lista.lasso›. (UNISG) University of Gastronomic Sciences. “Who We Are.” 2009. 24 Aug. 2009 ‹http://www.unisg.it/eng/chisiamo.php›. Vileisis, Ann. Kitchen Literacy: How We Lost Knowledge of Where Food Comes From and Why We Need to Get It Back. Washington: Island Press/Shearwater Books, 2008. Weissbrodt, David, and Anti-Slavery International. Abolishing Slavery and Its Contemporary Forms. New York and Geneva: Office of the United Nations High Commissioner for Human Rights, United Nations, 2002. Zeder, Melinda A. “The Neolithic Macro-(R)evolution: Macroevolutionary Theory and the Study of Culture Change.” Journal of Archaeological Research 17 (2009): 1–63.
Estilos ABNT, Harvard, Vancouver, APA, etc.
37

Robards, Brady. "Digital Traces of the Persona through Ten Years of Facebook". M/C Journal 17, no. 3 (11 June 2014). http://dx.doi.org/10.5204/mcj.818.

Full text of the source
Abstract:
When I think, rarely, about the articulation of the set of traces that I am leaving, I have the immediate apprehension that it is not the real me that’s out there on the Web. I know the times when I have censored myself (oh problematic concept!) and when I have performed actions to complement—and frequently to confound—a trace. […] Taken globally, the set of traces that we leave in the world does without doubt add up to something. It is through operations on sets of traces that I understand an event that I take part in. (Bowker 23) Over the past decade, Facebook has become integrated into the everyday lives of many of its 1.28 billion active users to the point that Facebook can no longer be considered “new media.” The site is driven by the “disclosures” (Stutzman, Gross and Acquisti) users make on the site—by uploading photos, writing status updates, commenting on posts made by others, sharing news items, entering biographical details, and so on. These digital traces of life are archived by default, persisting indefinitely as etches in Facebook’s servers around the world. Especially for young users who have grown up using Facebook, significant parts of their social and cultural lives have been played out on the site. As spaces in which the persona is enacted and made visible, social network sites like Facebook also effectively capture growing up stories through a chronicle of mediated, transitional experiences: birthdays, graduations, the beginning (and end) of relationships, first jobs, travel, and so on. For these reasons, Facebook also comes to serve as a site of memorialisation for users who have passed away. To mark its tenth anniversary (2014), Facebook drew attention to the great depth and wealth of experiences users had traced upon its pages through the release of one-minute “look back” videos, chronicling the life of individual users over their time on Facebook. These videos have become short manifestations of the personas presented on the site, crafted through an algorithmic selection of critical moments in the user’s life (as shared on the site) to tell that user’s story. To turn Bowker’s musings in the above quote into a question, what do these sets of traces that we leave in the world add up to? In this article, I undertake a critical reading of Facebook’s look back videos to argue that they serve as the strongest reminder yet about the function of Facebook as memory archive. I draw on several sources: my own analysis of the structure of the videos themselves, the Facebook corporate blog describing the roll out of the videos, and the public campaign played out on YouTube by John Berlin to have a look back video generated for his deceased son. I argue that Facebook comes to serve two critical functions for users, as both the site upon which life narratives are performed and organised, and also the site through which the variously public and private disclosures that constitute a persona are recalled and reflected upon. In setting out these arguments, I divide this paper into three parts: first, a description and reflection upon my own experience of the look back video; second, a consideration of critical moments selected for inclusion in the look back videos by algorithm as persona; and third, a discussion of death and memorialisation, as a sharp example of the significance of the digital traces we leave behind.
The Look Back Video
Gentle piano music rises as the “camera” pans across an assortment of photos.
The flute joins the piano, and you are reminded that you started your Facebook journey in 2006. Here is your first profile picture—you with your arm around one of your good mates when you were twenty years old. Faster now, and here are “your first moments,” presented as images you have shared: March 2008, some of your closest friends who you met during your undergraduate studies, standing around sharing a drink; April 2008, a photo of a friend eating a biscuit, mid-conversation (she’d hate this one); and one last photo from April 2008, the biscuit-eating friend’s ex-boyfriend looking coy (you no longer speak to him, but he is still on your Friends list). Now enter the violins, seventeen seconds in. Things are getting nostalgic. Here are “your most liked posts”: July 2012, “thesis submitted for examination, yo” (46 likes); November 2012, “Trust me, I’m a Doctor… of Philosophy” (98 likes); February 2013, a mess of text announcing that you’ve found a job and you’ll be leaving your hometown (106 likes). Thirty-five seconds in now, and the pace of the music changes—look how far you have come. Here are some photos you have shared: December 2008, you at a bowling alley with your arm around one of your best friends who now lives overseas; October 2009, friends trying to sleep on your couch, being disturbed by the flash of your camera; June 2010, a family shot at your mother’s birthday. The pace quickens now, as we move into the final quarter of the video: September 2010, you on the beach with friends visiting from overseas; October 2011, you with some people you met in Canada whose names you don’t recall; (images now moving faster and faster) November 2011, ice skating with friends; March 2012, a wedding in Hawaii where you were the best man; December 2012, celebrating the conferral of your PhD with two colleagues; and finally July 2013, farewelling colleagues at a going away party. In the final ten seconds, the music reaches its crescendo and the camera pans backwards to reveal a bigger collage of photos of you and your nearest and dearest. Facebook’s trademark “thumbs up”/like symbol signals the end of the retrospective, looking back on the critical moments from the last eight and a half years of your life. Underneath the video, as if signing off a card accompanying a birthday present, is “Mark” (Zuckerberg, Facebook CEO, in a faux hand-written font) “and the Facebook Team.” Facebook is you, the note seems to imply; for our anniversary, we present you back to yourself (see fig. 1). On 4 February 2014, the look back video feature was made available to all Facebook users. Some 200 million watched their videos, and more than 50% shared them with their networks (Spiridonov and Bandaru). In other words, around 100 million Facebook users held up their own individually generated look back videos as a record of the persona they had crafted through the site, and shared that persona retrospective with their networks. The videos work in the same way that television news programs piece together memorial clips for celebrities who have passed away, blending emotive music with visuals that conjure up memories and reflections. The first point of difference is that Facebook’s look back videos were intended for the living (although this function shifted as I will explain in a case study towards the end of this piece) to reflect on their own personas presented through the site, and then (about half the time) shared with their networks. 
The second difference is the technical, automated process of piecing together, rendering, storing, and streaming these videos on a large scale. Spiridonov and Bandaru, two Facebook engineers writing on the site’s Engineering Blog, describe the rapid development and rollout of the videos. They explain the enormous pool of technical resources and human capital that were brought to bear on the project, including thirty teams across the company, in just 25 days. They end their explanatory post with an homage to “the things [they] love about Facebook culture” that the project represented for them, including “helping hundreds of millions of people connect with those who are important to them” (Spiridonov and Bandaru). The look back videos also serve a deeper purpose that isn’t addressed explicitly in any explanatory notes or press releases: to demonstrate the great depth of disclosures users make and are implicated in by others on the site. In a one-minute look back video, these disclosures come to serve as the very digital traces that Bowker was interested in, forming a longitudinal record of the persona.
Algorithms and Critical Moments
Although the explanatory post by Spiridonov and Bandaru did not go into details, the algorithm that determines which photos and status updates go into the look back videos appears to consider the quantity of likes and (potentially) comments on posts, while also seeking to sample disclosures made across the user’s time on the site. The latter consideration works to reinforce the perception of the longitudinal nature of the site’s memory, and the extent to which the life of the user has become entangled with, enmeshed in, and mediated through Facebook. Through the logic of the look back algorithm, critical moments in the user’s life course—those experiences that mark out narratives of growing up—become measured not in terms of their value for individuals, but instead through a quantitative metric of “likes.” While Facebook did, after the initial release of the look back feature, give users some limited control over which images could be featured in their videos, the default selection was determined by the algorithm (a speculative sketch of this selection logic follows this entry’s references). Social network sites have come to serve as spaces for reflexive identity work, for the development of personas for young people (boyd; Livingstone; Hodkinson and Lincoln; Lincoln; Robards). The transition towards adulthood is punctuated and shaped by “critical moments” (Thomson et al.) such as moving out of home, dropping out of school, entering a relationship, learning to drive, a death in the family, going clubbing for the first time, and so on. In Giddens’ terms, the “fateful moment” (from which Thomson et al. borrow in conceptualising the critical moment) is “highly consequential for a person’s destiny” (121), and should be understood as distinct from but certainly affecting the inconsequential goings-on of daily life. When these critical moments are articulated and made visible on social network sites like Facebook, and then subsequently archived by way of the persistent nature of these sites, they become key markers in a mediated growing up story for young people. Livingstone points towards the role of these sites for young people who are “motivated to construct identities, to forge new social groupings, and to negotiate alternatives to given cultural meanings” (4).
Sharing, discussing, and remembering these critical moments becomes an important activity on social network sites, and thus the look back video serves to neatly capture critical moments in a one-minute retrospective. Facebook has also started prompting users to record critical moments through predetermined, normative categories (see fig. 2) such as romance (a first kiss), health (losing weight and not smoking), purchases (buying a house and a car), and civic duty (voting and military service). These disclosure prompts operate at a deeper level than the logic of sharing whatever you are doing right now, and instead feed into that longitudinal memory of the site. As I have argued elsewhere (see Robards), it is clear that not all critical moments are disclosed equally on social network sites. Users may choose not to disclose some critical moments – such as breakups and periods of depression or anxiety – instead preferring to present an “idealised self.” Goffman explains that idealised presentations are aspirational, and that individuals will perform the best version of themselves (44). This isn’t a fake persona or a deception, but simply a presentation of what the individual regards to be the best qualities and appearances, contingent upon what Goffman described as the standards of the region (110). What constitutes an “authentic” persona on Facebook is clearly subjective, and dependent on those region-specific standards. In my earlier research on MySpace, the quantity of friends one had was an indicator of popularity, or a quantitative measure of social capital, but over time and with the shift to Facebook this appeared to change, such that smaller networks became more “authentic” (Robards). Similarly, the kinds of disclosures users make on Facebook will vary depending on the conventions of use they have established within their own networks. Importantly, the look back algorithm challenges the user’s capacity to value their own critical moments, or indeed any moments or disclosures that might mark out a narrative of self, and instead chooses moments for the user. In this scenario, at least initially, the look back algorithm co-constructs the retrospective persona summary for the user. Only with effort, and only to a certain extent, can the user exercise curatorial control over that process.
Death and Other Conclusions
Although the initial function of the look back videos was for users to reflect on their own personas presented through Facebook, users who had lost loved ones quickly sought look back videos for the deceased. John Berlin, a Facebook user who had lost his son Jesse in 2012, tried to access a look back video for his son but was unsuccessful. He posted his plea to YouTube, which received almost three million views, and was eventually successful, after his request “touched the hearts of everyone who heard it” including Facebook staff (Price and DiSclafani). After receiving numerous similar requests, Facebook established a form through which people could request to have videos for deceased users rendered. In the words of Facebook staff, this was part of the site’s commitment to “preserve legacies on Facebook” (Price and DiSclafani). There is a growing body of research on the digital traces we leave behind after death. Leaver points out that when social media users die, the “significant value of the media traces a user leaves behind” is highlighted. Certainly, this has been the case with the look back videos, further supporting Leaver’s claim.
John Berlin’s plea to have his deceased son’s look back video made available to him was presented as a key factor in Facebook’s decision to make these videos available to loved ones. Although the video’s narrative was unchanged (still pitched to users themselves, rather than their loved ones) John Berlin shared his son’s look back video on YouTube to a much wider network than he or his son may have previously imagined. Indeed, Gibson has argued that “digital remains cannot easily be claimed back into a private possessive sphere of ownership” (214). Although Jesse Berlin’s look back video did not reach the millions of viewers his father’s plea reached, on YouTube it still had some 423,000 views, clearly moving beyond Gibson’s “private possessive sphere” (214) to became a very public memorial. Bowker makes the observation that his friends and acquaintances who died before 1992 are sparsely represented online. In 1992, the first widely adopted web browser Mosaic made the Internet accessible for ordinary people in an everyday context. Bowker goes on to explain that his friends who died post-Mosaic “carry on a rich afterlife [… they] still receive email messages; links to their website rot very slowly; their informal thoughts are often captured on list-serv archives, on comments they have left on a website” (23). For Bowker, the rise of the Internet has brought about a “new regime of memory practices” (34). The implications of this new “paradigm of the trace” for Facebook users are only now becoming clear, multiplied in depth and complexity compared to the forms of digital traces Bowker was discussing. The dead, of course, have always left traces—letters, bureaucratic documents, photographs, and so on. There is nothing particularly new about the social and cultural traces that the dead leave behind, only in the way these traces persist and are circulated as the Berlin case study makes clear. The look back video brings the significance of the digital trace into a new light, challenging concepts of personal histories and the longevity of everyday personas. Now that Facebook has developed the infrastructure and the processes for rolling out these look back features, there is the possibility that we will see more in the future. The site already provides annual summaries of the user’s year on Facebook in December. It is possible that look back videos could mark out other moments, too: birthdays, new relationships, potentially even the deaths of loved ones. Might Facebook look back videos – in future forms and iterations, no doubt distinct from the ten-year anniversary video described here – come to serve as a central mechanism for memory, nostalgia, and memorialisation? I don’t have the same kind of apprehension that Bowker expresses in the quote at the top of this article, where he reflects on whether or not it is the “real” him out there on the web. Through Goffman’s dramaturgical lens, I am convinced that there is no single “authentic” persona, but rather many sides to the personas we present to others and to ourselves. The Facebook look back video figures into that presentation and that reflection, albeit through an algorithm that projects a curated set of critical moments back to us. In this sense, these videos become mirrors through which Facebook users experience the personas they have mediated on the site. 
Facebook is surely aware of this significance, and will no doubt continue to build the importance and depth of the digital traces users inscribe on the site into its plans for the future.
References
Bowker, Geoffrey C. “The Past and the Internet.” Structures of Participation in Digital Culture. New York: Social Science Research Council, 2007. 20-36. boyd, danah. “Social Network Sites as Networked Publics: Affordances, Dynamics, and Implications.” A Networked Self: Identity, Community, and Culture on Social Network Sites. New York: Routledge, 2011. 39-58. Gibson, Margaret. “Digital Objects of the Dead: Negotiating Electronic Remains.” The Social Construction of Death: Interdisciplinary Perspectives. Ed. Leen van Brussel and Nico Carpentier. Palgrave, 2014. 212-229. Giddens, Anthony. Modernity and Self-Identity: Self and Society in the Late Modern Age. London: Palgrave Macmillan, 1993. Goffman, Erving. The Presentation of Self in Everyday Life. London: Penguin, 1959. Hodkinson, Paul, and Sian Lincoln. “Online Journals as Virtual Bedrooms? Young People, Identity and Personal Space.” Young 16.1 (2008): 27-46. Leaver, Tama. “The Social Media Contradiction: Data Mining and Digital Death.” M/C Journal 16.2 (2013). Lincoln, Siân. Youth Culture and Private Space. London: Palgrave Macmillan, 2012. Livingstone, Sonia. “Taking Risky Opportunities in Youthful Content Creation: Teenagers’ Use of Social Networking Sites for Intimacy, Privacy and Self-Expression.” New Media & Society 10.3 (2008): 393-411. Robards, Brady. “Leaving MySpace, Joining Facebook: ‘Growing Up’ on Social Network Sites.” Continuum 26.3 (2012): 385-398. Stutzman, Fred, Robert Capra, and Jamila Thompson. “Factors Mediating Disclosure in Social Network Sites.” Computers in Human Behavior 27.1 (2011): 590-598. Thomson, Rachel, et al. “Critical Moments: Choice, Chance and Opportunity in Young People’s Narratives of Transition.” Sociology 36.2 (2002): 335-354.
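A minimal Python sketch of the selection heuristic the article infers (posts ranked by likes, with comments as a possible secondary signal, sampled across the user's whole time on the site). This is a speculative reconstruction under those assumptions only: Facebook never published the actual algorithm, and every name, field and weighting below is hypothetical.

from collections import defaultdict
from datetime import datetime

def select_look_back_posts(posts, per_year=3):
    # Speculative sketch of the inferred "look back" heuristic: rank posts
    # by engagement within each year, keep the top few per year, and return
    # them in chronological order so the video spans the whole account
    # history. Hypothetical reconstruction, not Facebook's actual code.
    by_year = defaultdict(list)
    for post in posts:
        by_year[post["timestamp"].year].append(post)
    selected = []
    for year in sorted(by_year):
        # Likes dominate; comments are weighted as a weaker signal (assumed).
        ranked = sorted(by_year[year],
                        key=lambda p: p["likes"] + 0.5 * p["comments"],
                        reverse=True)
        selected.extend(ranked[:per_year])
    # Chronological order matches the videos' oldest-to-newest narrative arc.
    return sorted(selected, key=lambda p: p["timestamp"])

# Example: the three "most liked posts" described in the abstract above.
posts = [
    {"timestamp": datetime(2012, 7, 1), "likes": 46, "comments": 5},
    {"timestamp": datetime(2012, 11, 1), "likes": 98, "comments": 12},
    {"timestamp": datetime(2013, 2, 1), "likes": 106, "comments": 20},
]
print(select_look_back_posts(posts, per_year=2))

Sampling per year rather than taking a single global top list is what would produce the longitudinal effect the article describes: a quiet year still contributes its strongest moments, so the video stretches across the user's entire history on the site.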
38

Bowles, Kate. "Academia 1.0: Slow Food in a Fast Food Culture? (A Reply to John Hartley)". M/C Journal 12, no. 3 (15 July 2009). http://dx.doi.org/10.5204/mcj.169.

Full text of the source
Abstract:
"You could think of our kind of scholarship," he said, "as something like 'slow food' in a fast-food culture."— Ivan Kreilkamp, co-editor of Victorian Studies(Chronicle of Higher Education, March 2009) John Hartley’s entertaining and polemical defense of a disappearing art form (the print copy journal designed to be ripped eagerly from its envelope and read from cover to cover like a good book) came my way via the usual slightly disconcerting M/C Journal overture: I believe that your research interests and background make you a potential expert reviewer of the manuscript, "LAMENT FOR A LOST RUNNING ORDER? OBSOLESCENCE AND ACADEMIC JOURNALS," which has been submitted to the '' [sic] issue of M/C Journal. The submission's extract is inserted below, and I hope that you will consider undertaking this important task for us. Automated e-mails like these keep strange company, with reminders about overdue library items and passwords about to expire. Inevitably their tone calls to mind the generic flattery of the internet scam that announces foreign business opportunities or an unexpectedly large windfall from a deceased relative. At face value, this e-mail confirms John Hartley’s suspicions about the personalised craft of journal curation. Journal editing, he implies, is going the way of drywalling and smithying—by the time we realise these ancient and time-intensive skills have been lost, it’ll be too late. The usual culprit is to the fore—the internet—and the risk presented by obsolescence is very significant. At stake is the whole rich and messy infrastructure of academic professional identity: scholarly communication, goodwill, rank, trust, service to peers, collegiality, and knowledge itself. As a time-poor reader of journals both online and in print I warmed to this argument, and enjoyed reading about the particularities of journal editing: the cultivation and refinement of a specialised academic skill set involving typefaces, cover photographs and running order. Journal editors are our creative directors. Authors think selfishly and not always consistently about content, position and opportunity, but it’s the longer term commitment of editors to taking care of their particular shingle in the colourful and crowded bazaar of scholarly publishing, that keeps the market functioning in a way that also works for inspectors and administrators. Thinking of all the print journals I’ve opened and shut and put on shelves (sometimes still in their wrappers) and got down again, and photocopied, and forgotten about, I realised that I do retain a dim sense of their look and shape, and that in practical ways this often helps me remember what was in them. Nevertheless, even having been through the process he describes, whereby “you have to log on to some website and follow prompts in order to contribute both papers and the assessment of papers; interactions with editors are minimal,” I came to the conclusion that he had underestimated the human in the practice of refereeing. I wasn’t sure made me an expert reviewer for this piece, except perhaps that in undertaking the review itself I was practising a kind of expertise that entitled me to reflect on what I was doing. 
So as a way of wrestling with the self-referentiality of the process of providing an anonymous report on an article whose criticism of blind refereeing I shared, I commented on the corporeality and collegiality of the practice: I knew who I was writing about (and to), and I was conscious of both disagreeing and wondering how to avoid giving offence. I was also cold in my office, and wondering about a coffee. “I suspect the cyborg reviewer is (like most cyborgs) a slightly romantic, or at least rhetorical, fantasy,” I added, a bit defensively. “Indeed, the author admits to practising editorship via a form of human intersubjectivity that involves email, so the mere fact that the communication in some cases is via a website doesn’t seem to render the human obsolete.” The cyborg reviewer wasn’t the only thing bothering me about the underlying assumptions concerning electronic scholarly publishing, however. The idea that the electronic disaggregation of content threatens the obsolescence of the print journal and its editor is a little disingenuous. Keyword searches do grab articles independently of issues, it’s true, but it’s a stretch to claim that this functionality is what’s turning diligent front-to-back readers and library flaneurs into the kinds of online mercenaries we mean when we say “users”. Quite the opposite: journal searches are highly seductive invitations to linger and explore. Setting out from the starting point of a single article, readers can now follow a citation trail, or chase up other articles by the same author or on similar topics, all the while keeping in plain sight the running order that was designed by the editors as an apt framework for the piece when it first appeared. Journal publishers have the keenest investment in nurturing the distinctive brand of each of their titles, and as a result the journal name is never far from view. Even the cover photo and layout is now likely to be there somewhere, and to crop up often as readers retrace their steps and set out again in another direction. So to propose that online access makes the syntactical form of a journal issue irrelevant to readers is to underestimate both the erotics of syntax, and the capacity of online readers to cope with a whole new libidinous economy of searching characterised by multiple syntactical options. And if readers are no longer sequestered within the pages of an individual hard copy journal—there really is a temptation to mention serial monogamy here—their freedom to operate more playfully only draws attention to the structural horizontalities of the academic public sphere, which is surely the basis of our most durable claims to profess expertise. Precisely because we are hyperlinked together across institutions and disciplines, we can justly argue that we are perpetually peer-reviewing each other, in a fairly disinterested fashion, and no longer exclusively in the kinds of locally parochial clusters that have defined (and isolated) the Australian academy. So although disaggregation irritates journal editors, a more credible risk to their craft comes from the disintermediation of scholarly communication that is one of the web’s key affordances. The shift towards user generated content, collaboratively generated, openly accessible and instantly shareable across many platforms, does make traditional scholarly publishing, with its laborious insistence on double blind refereeing, look a bit retro. 
How can this kind of thing not become obsolete given how long it takes for new ideas to make their way into print, what with all that courtly call and response between referees, editors and authors, and the time consumed in arranging layout and running order and cover photos? Now that the hegemons who propped up the gold standard journals are blogging and podcasting their ideas, sharing their bookmarks, and letting us know what they’re doing by the hour on Twitter, with presumably no loss of quality to their intellectual presence, what kind of premium or scarcity value can we place on the content they used to submit to print and online journals? So it seems to me that the blogging hegemon is at least as much of a problem for the traditional editor as the time challenged browser hoping for a quick hit in a keyword search. But there are much more complicated reasons why the journal format itself is not at risk, even from www.henryjenkins.org. Indeed, new “traditional” journals are being proposed and launched all the time. The mere award of an A* for the International Journal of Cultural Studies in the Australian journal rankings (Australian Research Council) confirms that journals are persistently evaluated in their own right, that the brand of the aggregating instrument still outranks the bits and pieces of disaggregated content, and that the relative standing of different journals depends precisely on the quantification of difficulty in meeting the standards (or matching the celebrity status) of their editors, editorial boards and peer reviewing panels. There’s very little indication in this process that either editors or reviewers are facing obsolescence; too many careers still depend on their continued willingness to stand in the way of the internet’s capacity to let anyone have a go at presenting ideas and research in the public domain. As the many inputs to the ERA exercise endlessly, and perhaps a bit tediously, confirmed, it’s the reputation of editors and their editorial practices that signals the exclusivity of scholarly publishing: in the era of wikis and blogs, an A* journal is one club that’s not open to all. Academia 1.0 is resilient for all these straightforward reasons. Not only in Australia do tenure and promotion depend on it. As a result, since the mid 1990s, editors, publishers, librarians and other stakeholders in scholarly communication have been keeping a wary eye on the pace and direction of change to either its routines or its standards. Their consistent attention has been on the proposition that the risk comes from something loosely defined as “digital”. But as King, Tenopir and Clarke point out in their study of journal readership in the sciences, the relevance of journal content itself has been extensively disputed and investigated across the disciplines since the 1960s. Despite the predictions of many authors in the 1990s that electronic publishing and pre-publishing would challenge the professional supremacy of the print journal, it seems just as likely that the simple convenience of filesharing has made more vetted academic material available, more easily, to more readers. As they note in a waspish footnote, even the author of one of the most frequently cited predictions that scholarly journals were on the way out had to modify his views, “perhaps due to the fact that his famous 1996 [sic] article "Tragic Loss or Good Riddance?
The Impending Demise of Traditional Scholarly Journals" has had thousands of hits or downloads on his server alone.” (King et al.; see also Odlyzko, “Tragic Loss” and “Rapid Evolution”). In other words, all sides now seem to agree that “digital” has proved to be both opportunity and threat to scholarly publication. Odlyzko’s prediction of the disappearance of the print journal and its complex apparatus of self-perpetuation was certainly premature in 1996. So is John Hartley right that it’s time to ask the question again? Earlier this year, the Chronicle of Higher Education’s article “Humanities Journals Confront Identity Crisis”, which covered much of the same ground, generated brisk online discussion among journal editors in the humanities (Howard; see also the EDITOR-L listserv archive). The article summarised the views of a number of editors of “traditional” journals, and offset these with the views of a group representing the Council of Editors of Learned Journals, canvassing the possibility that scholarly publishing could catch up to the opportunities that we tend to shorthand as “web 2.0”. The short-lived CELJ blog discussion led by Jo Guldi in February 2009 proposed four principles we might expect to shape the future of scholarly publishing in the humanities: technical interoperability, which is pretty uncontroversial; the expansion of scholarly curation to a role in managing and making sense of “the noise of the web”; diversification of content types and platforms; and a more inclusive approach to the contribution of non-academic experts. (Guldi et al.) Far from conceding the inexorability of their own obsolescence, the four authors of this blog (each of them journal editors) have re-imagined the craft of editing, and have drafted an ambitious but also quite achievable manifesto for the renovation of scholarly communication. This is focused on developing a new and more confident role for the academy in the next phase of the development of the knowledge-building capacity of the web. Rather than confining themselves to being accessed only by their professional peers (and students) via university libraries in hardcopy or via institutional electronic subscription, scholars should be at the forefront of the way knowledge is managed and developed in the online public sphere. This would mean developing metrics that worked as well for delicious and diigo as they do for journal rankings; and it would mean a more upfront contribution to quality assurance and benchmarking of information available on the web, including information generated from outside the academy. This resonates with John Hartley’s endorsement of wiki-style open refereeing, which as an idea contains a substantial backwards nod to Ginsparg’s system of pre-publication of the early 1990s (see Ginsparg). It also suggests a more sophisticated understanding of scholarly collaboration than the current assumption that this consists exclusively of a shift to multiply-authored content, the benefit of which has tended to divide scholars in the humanities (Young). But it was not as a reviewer or an author that this article really engaged me in thinking about the question of human obsolescence. Recently I’ve been studying the fragmentation, outsourcing and automation of work processes in the fast food industry or, as it calls itself, the Quick Service Restaurant trade.
I was drawn into this study by thinking about the complex reorganisation of time and communication brought about by the partial technologisation of the McDonalds drive-thru in Australia. Now that drive-thru orders are taken through a driveway speaker, the order window (and its operator) has been rendered obsolete, and this now permanently closed window is usually stacked high with cardboard boxes. Although the QSR industry in the US has experimented with outsourcing ordering to call centres at other locations (“May I take your order?”), in Australia the task itself has simply been added to the demands of customer engagement at the paying window, with the slightly odd result that the highest goal of customer service at this point is to be able to deal simultaneously with two customers at two different stages of the drive-thru process—the one who is ordering three Happy Meals and a coffee via your headset, and the one who is sitting in front of you holding out money—without offending or confusing either. This formal approval of a shift from undivided customer attention to the time-efficiency of multitasking is a small but important reorientation of everyday service culture, making one teenager redundant and doubling the demands placed on the other. The management of quick service restaurant workers and their productivity offers us a new perspective on the pressures we are experiencing in the academic labour market. Like many of my colleagues, I have been watching with a degree of ambivalence the way in which the national drive to quantify excellence in research in Australia has resulted in some shallow-end thinking about how to measure what it is that scholars do, and how to demonstrate that we are doing it competitively. Our productivity is shepherded by the constant recalibration of our workload, conceived as a bundle of discrete and measurable tasks, by anxious institutions trying to stay ahead in the national game of musical chairs, which only offers a limited number of seats at the research table—while still keeping half an eye on their enterprise bargaining obligations. Or, as the Quick Service Restaurant sector puts it: Operational margins are narrowing. While you need to increase the quality, speed and accuracy of service, the reality is that you also need to control labor costs. If you reduce unnecessary labor costs and improve workforce productivity, the likelihood of expanding your margins increases. Noncompliance can cost you. (Kronos) In their haste to increase quality, speed and accuracy of academic work, while lowering labor costs and fending off the economic risk of noncompliance, our institutions have systematically overlooked the need to develop meaningful ways to accommodate the significant scholarly work of reading, an activity that takes real time, and that in its nature is radically incompatible with the kinds of multitasking we are all increasingly using to manage the demands placed on us. Without a measure of reading, we fall back on the exceptionally inadequate proxy of citation. As King et al. point out, citation typically skews towards a small number of articles, and the effect of using this as a measure of reading is to suggest that the majority of articles are never read at all.
Their long-term studies of what scientists read, and why, have been driven by the need to challenge this myth, and they have demonstrated that while journals might not be unwrapped and read with quite the Christmas-morning eagerness that John Hartley describes, their content is eventually read more than once, and often more than once by the same person. Both electronic scholarly publishing and the digital redistribution of material originally published in print have greatly assisted traditional journals in acquiring something like the pass-on value of popular magazines in dentists’ waiting rooms. But for all this to work, academics have to be given time to sit and read, and as it would be absurd to try to itemise and remunerate this labour specifically, this time needs to be built into the normative workload for anyone who is expected to engage in any of the complex tasks involved in the collaborative production of knowledge. With that in mind, I concluded my review on what I hoped was a constructive note of solidarity. “What’s really under pressure here—forms of collegiality, altruism and imaginative contributions to a more outward-facing type of scholarship—is not at risk from search engines, it seems to me. What is being pressured into obsolescence, risking subscriptions to journals as much as purchases of books, is the craft and professional value placed on reading. This pressure is not coming from the internet, but from all the other bureaucratic rationalities described in this paper, that for the time being do still value journals selectively above other kinds of public contribution, but fail to appreciate the labour required to make them appear in any form, and completely overlook the labour required to absorb their contents and respond.” For obvious reasons, my warm thanks are due to John Hartley and to the two editors of this M/C Journal issue for their very unexpected invitation to expand on my original referee’s report.
References
Australian Research Council. “The Excellence in Research for Australia (ERA) Initiative: Journal Lists.” 2009. 3 July 2009 ‹http://www.arc.gov.au/era/era_journal_list.htm›. Ginsparg, Paul. “Can Peer Review be Better Focused?” 2003. 1 July 2009 ‹http://people.ccmr.cornell.edu/~ginsparg/blurb/pg02pr.html›. Guldi, Jo, Michael Widner, Bonnie Wheeler, and Jana Argersinger. The Council of Editors of Learned Journals Blog. 2009. 1 July 2009 ‹http://thecelj.blogspot.com›. Howard, Jennifer. “Humanities Journals Confront Identity Crisis.” The Chronicle of Higher Education 27 Mar. 2009. 1 July 2009 ‹http://chronicle.com/free/v55/i29/29a00102.htm›. King, Donald, Carol Tenopir, and Michael Clarke. “Measuring Total Reading of Journal Articles.” D-Lib Magazine 12.10 (2006). 1 July 2009 ‹http://www.dlib.org/dlib/october06/king/10king.html›. Kronos Incorporated. “How Can You Reduce Your Labor Costs without Sacrificing Speed of Service?” (2009). 1 July 2009 ‹http://www.qsrweb.com/white_paper.php?id=1738&download=1›. “May I Take Your Order? Local McDonald's Outsources to a Call Center.” Billings Gazette, Montana, 5 July 2006. SharedXpertise Forum. 1 July 2009 ‹http://www.sharedxpertise.org/file/3433/mcdonalds-outsourcing-to-call-center.html›. Odlyzko, Andrew. “The Rapid Evolution of Scholarly Publishing.” Learned Publishing 15.1 (2002): 7-19. ———. “Tragic Loss or Good Riddance? The Impending Demise of Traditional Scholarly Journals.” International Journal of Human-Computer Studies 42 (1995): 71-122. Young, Jeffrey.
“Digital Humanities Scholars Collaborate More on Journal Articles than 'Traditional' Researchers.” The Chronicle of Higher Education 27 April 2009. 1 July 2009 ‹http://chronicle.com/wiredcampus/article/3736/digital-humanities-scholars-collaborate-more-on-journal-articles-than-on-traditional-researchers›.
39

Khamis, Susie. "Nespresso: Branding the "Ultimate Coffee Experience"". M/C Journal 15, no. 2 (2 May 2012). http://dx.doi.org/10.5204/mcj.476.

Full text of the source
Abstract:
Introduction
In December 2010, Nespresso, the world’s leading brand of premium-portioned coffee, opened a flagship “boutique” in Sydney’s Pitt Street Mall. This was Nespresso’s fifth boutique opening of 2010, after Brussels, Miami, Soho, and Munich. The Sydney debut coincided with the mall’s upmarket redevelopment, which explains Nespresso’s arrival in the city: strategic geographic expansion is key to the brand’s growth. Rather than panoramic ubiquity, a retail option favoured by brands like McDonalds, KFC and Starbucks, Nespresso opts for iconic, prestigious locations. This strategy has been highly successful: since 2000 Nespresso has recorded year-on-year growth of 30 per cent. This has been achieved, moreover, despite a global financial downturn and an international coffee market replete with brand variety. In turn, Nespresso marks an evolution in the coffee market over the last decade.
The Nespresso Story
Founded in 1986, Nespresso is the fastest growing brand in the Nestlé Group. Its headquarters are in Lausanne, Switzerland, with over 7,000 employees worldwide. In 2012, Nespresso had 270 boutiques in 50 countries. The brand’s growth strategy involves three main components: premium coffee capsules, “mated” with specially designed machines, and accompanied by exceptional customer service through the Nespresso Club. Each component requires some explanation. Nespresso offers 16 varieties of Grands Crus coffee: 7 espresso blends, 3 pure origin espressos, 3 lungos (for larger cups), and 3 decaffeinated coffees. Each 5.5 grams of portioned coffee is cased in a hermetically sealed aluminium capsule, or pod, designed to preserve the complex, volatile aromas (between 800 and 900 per pod), and prevent oxidation. These capsules are designed to be used exclusively with Nespresso-branded machines, which are equipped with a patented high-pressure extraction system designed for optimum release of the coffee. These machines, of which there are 28 models, are developed with 6 machine partners, and Antoine Cahen, from Ateliers du Nord in Lausanne, designs most of them. For its consumers, members of the Nespresso Club, the capsules and machines guarantee perfect espresso coffee every time, within seconds and with minimum effort—what Nespresso calls the “ultimate coffee experience.” The Nespresso Club promotes this experience as an everyday luxury, whereby café-quality coffee can be enjoyed in the privacy and comfort of Club members’ homes. This domestic focus is a relatively recent turn in its history. Nestlé patented some of its pod technology in 1976; the compatible machines, initially made in Switzerland by Turmix, were developed a decade later. Nespresso S. A. was set up as a subsidiary unit within the Nestlé Group with a view to target the office and fine restaurant sector. It was first test-marketed in Japan in 1986, and rolled out the same year in Switzerland, France and Italy. However, by 1988, low sales prompted Nespresso’s newly appointed CEO, Jean-Paul Gillard, to rethink the brand’s focus. Gillard subsequently repositioned Nespresso’s target market away from the commercial sector towards high-income households and individuals, and introduced a mail-order distribution system; these elements became the hallmarks of the Nespresso Club (Markides 55). The Nespresso Club was designed to give members who had purchased Nespresso machines 24-hour customer service, by mail, phone, fax, and email. By the end of 1997 there were some 250,000 Club members worldwide.
The boom in domestic, user-friendly espresso machines from the early 1990s helped Nespresso’s growth in this period. The cumulative efforts by the main manufacturers—Krups, Bosch, Braun, Saeco and DeLonghi—lowered the machines’ average price to around US $100 (Purpura, “Espresso” 88; Purpura, “New” 116). This paralleled consumers’ growing sophistication, as they became increasingly familiar with café-quality espresso, cappuccino and latte—for reasons to be detailed below. Nespresso was primed to exploit this cultural shift in the market and forge a charismatic point of difference: an aspirational, luxury option within an increasingly accessible and familiar field. Between 2006 and 2008, Nespresso sales more than doubled, prompting a second production factory to supplement the original plant in Avenches (Simonian). In 2008, Nespresso grew 20 times faster than the global coffee market (Reguly B1). As Nespresso sales exceeded AU$1.3 billion in 2009, with 4.8 billion capsules shipped out annually and 5 million Club members worldwide, it became Nestlé’s fastest growing division (Canning 28). According to Nespresso’s Oceania market director, Renaud Tinel, the brand now represents 8 per cent of the total coffee market; of Nespresso specifically, he reports that 10,000 cups (using one capsule per cup) were consumed worldwide each minute in 2009, and that increased to 12,300 cups per minute in 2010 (O’Brien 16). Given such growth in such a brief period, the atypical dynamic between the boutique, the Club and the Nespresso brand warrants closer consideration. Nespresso opened its first boutique in Paris in 2000, on the Avenue des Champs-Élysées. It was a symbolic choice and signalled the brand’s preference for glamorous precincts in cosmopolitan cities. This has become the design template for all Nespresso boutiques, what the company calls “brand embassies” in its press releases. More like art gallery-style emporiums than retail spaces, these boutiques perform three main functions: they showcase Nespresso coffees, machines and accessories (all elegantly displayed); they enable Club members to stock up on capsules; and they offer excellent customer service, which invariably equates to detailed production information. The brand’s revenue model reflects the boutique’s role in the broader business strategy: 50 per cent of Nespresso’s business is generated online, 30 per cent through the boutiques, and 20 per cent through call centres. Whatever floor space these boutiques dedicate to coffee consumption is—compared to the emphasis on exhibition and ambience—minimal and marginal. In turn, this tightly monitored, self-focused model inverts the conventional function of most commercial coffee sites. For several hundred years, the café has fostered a convivial atmosphere, served consumers’ social inclinations, and overwhelmingly encouraged diverse, eclectic clientele. The Nespresso boutique is the antithesis to this, and instead actively limits interaction: the Club “community” does not meet as a community, and is united only in atomised allegiance to the Nespresso brand. In this regard, Nespresso stands in stark contrast to another coffee brand that has been highly successful in recent years—Starbucks. Starbucks famously recreates the aesthetics, rhetoric and atmosphere of the café as a “third place”—a term popularised by urban sociologist Ray Oldenburg to describe non-work, non-domestic spaces where patrons converge for respite or recreation.
These liminal spaces (cafés, parks, hair salons, book stores and such locations) might be private, commercial sites, yet they provide opportunities for chance encounters, even therapeutic interactions. In this way, they aid sociability and civic life (Kleinman 193). Long before the term “third place” was coined, coffee houses were deemed exemplars of egalitarian social space. As Rudolf P. Gaudio notes, the early coffee houses of Western Europe, in Oxford and London in the mid-1600s, “were characterized as places where commoners and aristocrats could meet and socialize without regard to rank” (670). From this sanguine perspective, they both informed and animated the modern public sphere. That is, and following Habermas, as a place where a mixed cohort of individuals could meet and discuss matters of public importance, and where politics intersected society, the eighteenth-century British coffee house both typified and strengthened the public sphere (Karababa and Ger 746). Moreover, and even from their early Ottoman origins (Karababa and Ger), there has been an historical correlation between the coffee house and the cosmopolitan, with the latter at least partly defined in terms of demographic breadth (Luckins). Ironically, and insofar as Nespresso appeals to coffee-literate consumers, the brand owes much to Starbucks. In the two decades preceding Nespresso’s arrival, Starbucks played a significant role in refining coffee literacy around the world, gauging mass-market trends, and stirring consumer consciousness. For Nespresso, this constituted major preparatory phenomena, as its strategy (and success) since the early 2000s presupposed the coffee market that Starbucks had helped to create. According to Nespresso’s chief executive Richard Giradot, central to Nespresso’s expansion is a focus on particular cities and their coffee culture (Canning 28). In turn, it pays to take stock of how such cities developed a coffee culture amenable to Nespresso—and therein lies the brand’s debt to Starbucks. Until the last few years, and before celebrity ambassador George Clooney was enlisted in 2005, Nespresso’s marketing was driven primarily by Club members’ recommendations. At the same time, though, Nespresso insisted that Club members were coffee connoisseurs, whose knowledge and enjoyment of coffee exceeded conventional coffee offerings. In 2000, Henk Kwakman, one of Nestlé’s Coffee Specialists, explained the need for portioned coffee in terms of guaranteed perfection, one that demanding consumers would expect. “In general”, he reasoned, “people who really like espresso coffee are very much more quality driven. When you consider such an intense taste experience, the quality is very important. If the espresso is slightly off quality, the connoisseur notices this immediately” (quoted in Butler 50). What matters here is how this corps of connoisseurs grew to a scale big enough to sustain and strengthen the Nespresso system, in the absence of a robust marketing or educative drive by Nespresso (until very recently). Put simply, the brand’s ascent was aided by Starbucks, specifically by the latter’s success in changing the mainstream coffee market during the 1990s. In establishing such a strong transnational presence, Starbucks challenged smaller, competing brands to define themselves with more clarity and conviction.
Indeed, working with data that identified just 200 freestanding coffee houses in the US prior to 1990 compared to 14,000 in 2003, Kjeldgaard and Ostberg go so far as to state: “Put bluntly, in the US there was no local coffee consumptionscape prior to Starbucks” (Kjeldgaard and Ostberg 176). Starbucks effectively redefined the coffee world for mainstream consumers in ways that were directly beneficial for Nespresso.
Starbucks: Coffee as Ambience, Experience, and Cultural Capital
While visitors to Nespresso boutiques can sample the coffee, with highly trained baristas and staff on site to explain the Nespresso system, in the main there are few concessions to the conventional café experience. Primarily, these boutiques function as material spaces for existing Club members to stock up on capsules, and therefore they complement the Nespresso system with a suitably streamlined space: efficient, stylish and conspicuously upmarket. Outside at least one Sydney boutique for instance (Bondi Junction, in the fashionable eastern suburbs), visitors enter through a club-style cordon, something usually associated with exclusive bars or hotels. This demarcates the boutique from neighbouring coffee chains, and signals Nespresso’s claim to more privileged patrons. This strategy though, the cultivation of a particular customer through aesthetic design and subtle flattery, is not unique. For decades, Starbucks also contrived a “special” coffee experience. Moreover, while the Starbucks model strikes a very different sensorial chord to that of Nespresso (in terms of décor, target consumer and so on) it effectively groomed and prepped everyday coffee drinkers to a level of relative self-sufficiency and expertise—and therein is the link between Starbucks’s mass-marketed approach and Nespresso’s timely arrival. Starbucks opened its first store in 1971, in Seattle. Three partners founded it: Jerry Baldwin and Zev Siegl, both teachers, and Gordon Bowker, a writer. In 1982, as they opened their sixth Seattle store, they were joined by Howard Schultz. Schultz’s trip to Italy the following year led to an entrepreneurial epiphany to which he now attributes Starbucks’s success. Inspired by how cafés in Italy, particularly the espresso bars in Milan, were vibrant social hubs, Schultz returned to the US with a newfound sensitivity to ambience and attitude. In 1987, Schultz bought Starbucks outright and stated his business philosophy thus: “We aren’t in the coffee business, serving people. We are in the people business, serving coffee” (quoted in Ruzich 432). This was articulated most clearly in how Schultz structured Starbucks as the ultimate “third place”, a welcoming amalgam of aromas, music, furniture, textures, literature and free WiFi. This transformed the café experience twofold. First, sensory overload masked the dull homogeny of a global chain with an air of warm, comforting domesticity—an inviting, everyday “home away from home.” To this end, in 1994, Schultz enlisted interior design “mastermind” Wright Massey; with his team of 45 designers, Massey created the chain’s decor blueprint, an “oasis for contemplation” (quoted in Scerri 60). At the same time though, and second, Starbucks promoted a revisionist, airbrushed version of how the coffee was produced. Patrons could see and smell the freshly roasted beans, and read about their places of origin in the free pamphlets. In this way, Starbucks merged the exotic and the cosmopolitan.
The global supply chain underwent an image makeover, helped by a “new” vocabulary that familiarised its coffee drinkers with the diversity and complexity of coffee, and such terms as aroma, acidity, body and flavour. This strategy had a decisive impact on the coffee market, first in the US and then elsewhere: Starbucks oversaw a significant expansion in coffee consumption, both quantitatively and qualitatively. In the decades following the Second World War, coffee consumption in the US reached a plateau. Moreover, as Steven Topik points out, the rise of this type of coffee connoisseurship actually coincided with declining per capita consumption of coffee in the US—so the social status attributed to specialised knowledge of coffee “saved” the market: “Coffee’s rise as a sign of distinction and connoisseurship meant its appeal was no longer just its psychoactive role as a stimulant nor the democratic sociability of the coffee shop” (Topik 100). Starbucks’s singular triumph was to not only convert non-coffee drinkers, but also train them to a level of relative sophistication. The average “cup o’ Joe” thus gave way to the latte, cappuccino, macchiato and more, and a world of coffee hitherto beyond (perhaps above) the average American consumer became both regular and routine. By 2003, Starbucks’s revenue was US $4.1 billion, and by 2012 there were almost 20,000 stores in 58 countries. As an idealised “third place,” Starbucks functioned as a welcoming haven that flattened out and muted the realities of global trade. The variety of beans on offer (Arabica, Latin American, speciality single origin and so on) bespoke a generous and bountiful modernity; while brochures schooled patrons in the nuances of terroir, an appreciation for origin and distinctiveness that encoded cultural capital. This positioned Starbucks within a happy narrative of the coffee economy, and drew patrons into this story by flattering their consumer choices. Against the generic sameness of supermarket options, Starbucks promised distinction, in Pierre Bourdieu’s sense of the term, and diversity in its coffee offerings. For Greg Dickinson, the Starbucks experience—the scent of the beans, the sound of the grinders, the taste of the coffees—negated the abstractions of postmodern, global trade: by sensory seduction, patrons connected with something real, authentic and material. At the same time, Starbucks professed commitment to the “triple bottom line” (Savitz), the corporate mantra that has morphed into virtual orthodoxy over the last fifteen years. This was hardly surprising; companies that trade in food staples typically grown in developing regions (coffee, tea and sugar) felt the “political-aesthetic problematization of food” (Sassatelli and Davolio). This saw increasingly cognisant consumers trying to reconcile the pleasures of consumption with environmental and human responsibilities. The “triple bottom line” approach, which ostensibly promotes best business practice for people, profits and the planet, was folded into Starbucks’s marketing. The company heavily promoted its range of civic engagement, such as donations to nurses’ associations, literacy programs, clean water programs, and fair dealings with its coffee growers in developing societies (Simon). This boded well for its target market. As Constance M. Ruzich has argued, Starbucks sought the burgeoning and lucrative “bobo” class, a term Ruzich borrows from David Brooks.
A portmanteau of “bourgeois bohemians,” “bobo” describes the educated elite that seeks the ambience and experience of a counter-cultural aesthetic, but without the political commitment. Until the last few years, it seemed Starbucks had successfully grafted this cultural zeitgeist onto its “third place.” Ironically, the scale and scope of the brand’s success has meant that Starbucks’s claim to an ethical agenda draws frequent and often fierce attack. As a global behemoth, Starbucks evolved into an iconic symbol of advanced consumer culture. For those critical of how such brands overwhelm smaller, more local competition, the brand is now synonymous with insidious, unstoppable retail spread. This in turn renders Starbucks vulnerable to protests that, despite its gestures towards sustainability (human and environmental), and by virtue of its size, ubiquity and ultimately conservative philosophy, it has lost whatever cachet or charm it supposedly once had. As Bryant Simon argues, in co-opting the language of ethical practice within an ultimately corporatist context, Starbucks only ever appealed to a modest form of altruism: modest not just in the funds committed to worthy causes, but also in shifting thorny issues to “the most non-contentious middle-ground,” lest conservative customers feel alienated (Simon 162). Yet, having flagged itself as an ethical brand, Starbucks became an even bigger target for anti-corporatist sentiment, and for the charge that, as a multinational giant, it remained complicit in (and one of the biggest beneficiaries of) a starkly inequitable and asymmetric global trade. It remains a major presence in the world coffee market, and arguably the most famous of the coffee chains. Over the last decade though, the speed and intensity with which Nespresso has grown, coupled with its atypical approach to consumer engagement, suggest that, in terms of brand equity, it now offers a more compelling point of difference than Starbucks.

Brand “Me”

Insofar as the Nespresso system depends on a consumer market versed in the intricacies of quality coffee, Starbucks can be at least partly credited with nurturing a more refined palate amongst everyday coffee drinkers. Yet while Starbucks courted the “average” consumer in its quest for market control, saturating the suburban landscape with thousands of virtually indistinguishable stores, Nespresso marks a very different sensibility. Put simply, Nespresso inverts the logic of the coffee house as a “third place”: patrons are drawn not to socialise and relax but to pursue their own highly individualised interests. The difference from Starbucks could not be starker. One visitor to the Bloomingdale’s boutique (in New York’s fashionable SoHo district) described it as having “the feel of Switzerland rather than Seattle. Instead of velvet sofas and comfy music, it has hard surfaces, bright colours and European hostesses” (Gapper 9). By creating a system that narrows the gap between production and consumption, to the point where Nespresso boutiques advertise the coffee brand but do not promote on-site coffee drinking, the boutiques are blithely indifferent to the historical, romanticised image of the coffee house as a meeting place. The result is a coffee experience that exploits the sophistication and vanity of aspirational consumers, but ignores the socialising scaffold by which coffee houses historically, and perhaps naively, made some claim to community building.
If anything, Nespresso restricts patrons’ contemplative field: they consider only their relationships to the brand. In turn, Nespresso offers the ultimate expression of contemporary consumer capitalism, a hyper-individual experience for a hyper-modern age. By developing a global brand that is both luxurious and niche, Nespresso became “the Louis Vuitton of coffee” (Betts 14). Where Starbucks pursued retail ubiquity, Nespresso targets affluent, upmarket cities. As chief executive Richard Girardot put it, with no hint of embarrassment or apology: “If you take China, for example, we are not speaking about China, we are speaking about Shanghai, Hong Kong, Beijing because you will not sell our concept in the middle of nowhere in China” (quoted in Canning 28). For this reason, while Europe accounts for 90 per cent of Nespresso sales (Betts 15), its forays into the Americas, Asia and Australasia invariably spotlight cities that are already iconic or emerging economic hubs. The first boutique in Latin America, for instance, was opened in Jardins, a wealthy suburb of São Paulo, Brazil.

In Nespresso, Nestlé has popularised a coffee experience neatly suited to contemporary consumer trends: Club members inhabit a branded world as hermetically sealed as the aluminium pods they purchase and consume. Besides the Club’s phone, fax and online distribution channels, pods can only be bought at the boutiques, which minimise even the potential for serendipitous mingling. The baristas are there primarily for product demonstrations, whilst highly trained staff recite the machines’ strengths (be they in design or utility), or information about the actual coffees. For Club members, the boutique service is merely the human extension of Nespresso’s online presence, whereby product information becomes increasingly tailored to increasingly individualised tastes. In the boutique, this emphasis on the individual is sold in terms of elegance, expedience and privilege. Nespresso boasts that over 70 per cent of its workforce is “customer facing,” sharing their passion and knowledge with Club members. Having already received and processed the product information (through the website, boutique staff, and promotional brochures), Club members need not do anything more than purchase their pods. In some of the more recently opened boutiques, such as in Paris-Madeleine, there is even an Exclusive Room where only Club members may enter—curious tourists (or potential members) are kept out. Club members, though, can select their preferred Grands Crus and check out automatically, thanks to RFID (radio frequency identification) technology inserted in the capsule sleeves. So, where Starbucks exudes an inclusive, hearth-like hospitality, the Nespresso Club appears more like a pampered clique, albeit a growing one. As described in the Financial Times, “combine the reception desk of a designer hotel with an expensive fashion display and you get some idea what a Nespresso ‘coffee boutique’ is like” (Wiggins and Simonian 10).

Conclusion

Instead of sociability, Nespresso puts a premium on exclusivity and the knowledge gained through that exclusive experience. The more Club members know about the coffee, the faster and more individualised (and “therefore” better) the transaction they have with the Nespresso brand.
This in turn confirms Zygmunt Bauman’s contention that, in a consumer society, being free to choose requires competence: “Freedom to choose does not mean that all choices are right—there are good and bad choices, better and worse choices. The kind of choice eventually made is the evidence of competence or its lack” (Bauman 43–44). Consumption here becomes an endless process of self-fashioning through commodities; a process Eva Illouz considers “all the more strenuous when the market recruits the consumer through the sysiphian exercise of his/her freedom to choose who he/she is” (Illouz 392). In a status-based setting, the more finely graded the differences between commodities (various places of origin, blends, intensities, and so on), the harder the consumer works to stay ahead—which means to be sufficiently informed. Consumers are thus locked in a game of constant reassurance, having to show upward mobility to both themselves and society.

For all that, and like Starbucks, Nespresso shows some signs of corporate social responsibility. In 2009, the company announced its “Ecolaboration” initiative, a series of eco-friendly targets for 2013. By then, Nespresso aims to: source 80 per cent of its coffee through Sustainable Quality Programs and Rainforest Alliance Certified farms; triple its capacity to recycle used capsules, to 75 per cent; and reduce the overall carbon footprint required to produce each cup of Nespresso by 20 per cent (Nespresso). This information is conveyed through the brand’s website, press releases and brochures. However, since such endeavours are now de rigueur for many brands, it does not register as particularly innovative, progressive or challenging: it is an unexceptional (even expected) part of contemporary mainstream marketing.

Indeed, the use of actor George Clooney as Nespresso’s brand ambassador since 2005 shows a shrewd appraisal of consumers’ political and cultural sensibilities. As a celebrity who splits his time between Hollywood and Lake Como in Italy, Clooney embodies the glamorous, cosmopolitan lifestyle that Nespresso signifies. However, as an actor famous for backing political and humanitarian causes (having raised awareness of crises in Darfur and Haiti, and backed calls for the legalisation of same-sex marriage), Clooney’s meanings extend beyond cinema: as a celebrity, he is multi-coded. Through its association with Clooney, and his fusion of star power and worldly sophistication, the brand is imbued with semantic latitude. Still, in the television commercials in which Clooney appears for Nespresso, his role as the Hollywood heartthrob invariably overshadows that of the political campaigner. These commercials actually pivot on Clooney’s romantic appeal, an appeal which is ironically upstaged in the commercials by something even more seductive: Nespresso coffee.

References

Bauman, Zygmunt. “Collateral Casualties of Consumerism.” Journal of Consumer Culture 7.1 (2007): 25–56.
Betts, Paul. “Nestlé Refines its Arsenal in the Luxury Coffee War.” Financial Times 28 Apr. (2010): 14.
Bourdieu, Pierre. Distinction: A Social Critique of the Judgement of Taste. Cambridge: Harvard University Press, 1984.
Butler, Reg. “The Nespresso Route to a Perfect Espresso.” Tea & Coffee Trade Journal 172.4 (2000): 50.
Canning, Simon. “Nespresso Taps a Cultural Thirst.” The Australian 26 Oct. (2009): 28.
Dickinson, Greg. “Joe’s Rhetoric: Finding Authenticity at Starbucks.” Rhetoric Society Quarterly 32.4 (2002): 5–27.
Gapper, John. “Lessons from Nestlé’s Coffee Break.” Financial Times 3 Jan. (2008): 9.
Gaudio, Rudolf P. “Coffeetalk: Starbucks™ and the Commercialization of Casual Conversation.” Language in Society 32.5 (2003): 659–91.
Habermas, Jürgen. The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society. Cambridge: MIT Press, 1962.
Illouz, Eva. “Emotions, Imagination and Consumption: A New Research Agenda.” Journal of Consumer Culture 9 (2009): 377–413.
Karababa, Eminegül, and Güliz Ger. “Early Modern Ottoman Coffeehouse Culture and the Formation of the Consumer Subject.” Journal of Consumer Research 37.5 (2011): 737–60.
Kjeldgaard, Dannie, and Jacob Ostberg. “Coffee Grounds and the Global Cup: Global Consumer Culture in Scandinavia.” Consumption, Markets and Culture 10.2 (2007): 175–87.
Kleinman, Sharon S. “Café Culture in France and the United States: A Comparative Ethnographic Study of the Use of Mobile Information and Communication Technologies.” Atlantic Journal of Communication 14.4 (2006): 191–210.
Luckins, Tanja. “Flavoursome Scraps of Conversation: Talking and Hearing the Cosmopolitan City, 1900s–1960s.” History Australia 7.2 (2010): 31.1–31.16.
Markides, Constantinos C. “A Dynamic View of Strategy.” Sloan Management Review 40.3 (1999): 55.
Nespresso. “Ecolaboration Initiative Directs Nespresso to Sustainable Success.” Nespresso Media Centre 2009. 13 Dec. 2011. ‹http://www.nespresso.com›.
O’Brien, Mary. “A Shot at the Big Time.” The Age 21 Jun. (2011): 16.
Oldenburg, Ray. The Great Good Place: Cafés, Coffee Shops, Community Centers, Beauty Parlors, General Stores, Bars, Hangouts, and How They Get You Through the Day. New York: Paragon House, 1989.
Purpura, Linda. “Espresso: Grace under Pressure.” The Weekly Home Furnishings Newspaper 16 Dec. (1991): 88.
Purpura, Linda. “New Espresso Machines to Tempt the Palate.” The Weekly Home Furnishings Newspaper 3 May (1993): 116.
Reguly, Eric. “No Ordinary Joe: Nestlé Pulls off Caffeine Coup.” The Globe and Mail 6 Jul. (2009): B1.
Ruzich, Constance M. “For the Love of Joe: The Language of Starbucks.” The Journal of Popular Culture 41.3 (2008): 428–42.
Sassatelli, Roberta, and Federica Davolio. “Consumption, Pleasure and Politics: Slow Food and the Politico-aesthetic Problematization of Food.” Journal of Consumer Culture 10.2 (2010): 202–32.
Savitz, Andrew W. The Triple Bottom Line: How Today’s Best-run Companies are Achieving Economic, Social, and Environmental Success—And How You Can Too. San Francisco: Jossey-Bass, 2006.
Scerri, Andrew. “Triple Bottom-line Capitalism and the ‘Third Place’.” Arena Journal 20 (2002/03): 57–65.
Simon, Bryant. “Not Going to Starbucks: Boycotts and the Out-sourcing of Politics in the Branded World.” Journal of Consumer Culture 11.2 (2011): 145–67.
Simonian, Haig. “Nestlé Doubles Nespresso Output.” FT.com 10 Jun. (2009). 2 Feb. 2012 ‹http://www.ft.com/cms/s/0/0dcc4e44-55ea-11de-ab7e-00144feabdc0.html#axzz1tgMPBgtV›.
Topik, Steven. “Coffee as a Social Drug.” Cultural Critique 71 (2009): 81–106.
Wiggins, Jenny, and Haig Simonian. “How to Serve a Bespoke Cup of Coffee.” Financial Times 3 Apr. (2007): 10.