Journal articles on the topic "Composition et compatibilité des services web"




Consult the top 15 journal articles for your research on the topic "Composition et compatibilité des services web".




1

Lécué, Freddy, Alain Léger, and Ramy Ragab Hassen. "Les web services sémantiques, automate et intégration. II. Composition de services web, technologies et plateformes, applications industrielles". Techniques et sciences informatiques 28, no. 2 (February 2009): 263–93. http://dx.doi.org/10.3166/tsi.28.263-293.

2

Rouached, Mohsen, Walid Fdhila, and Claude Godart. "Web Services Compositions Modelling and Choreographies Analysis". International Journal of Web Services Research 7, no. 2 (April 2010): 87–110. http://dx.doi.org/10.4018/jwsr.2010040105.

Abstract
In Rouached et al. (2006) and Rouached and Godart (2007), the authors described the semantics of WSBPEL by mapping each of the WSBPEL (Arkin et al., 2004) constructs to the Event Calculus (EC) algebra and building a model of the process behaviour. With these mapping rules, the authors describe a modelling approach for a process defined as a single Web service composition. However, this modelling is limited to a local view and can only be used to model the behaviour of a single process. The authors extend the semantic mapping to cover Web service composition interactions by modelling Web service conversations and their choreography. This paper elaborates the models to support a view of interacting Web service compositions, extending the mapping from WSBPEL to EC and including Web service interfaces (WSDL) for use in modelling between services. Verification and validation techniques are also presented, with an automated induction-based theorem prover used as the verification back-end.
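To make the flavour of such a mapping concrete, here is a toy sketch (not the authors' formalization; all names are illustrative assumptions) of translating a BPEL-like `<sequence>` into Event Calculus-style `Happens` facts, so that the sequential ordering constraint becomes checkable on the derived model:

```python
# Toy illustration of a construct-to-Event-Calculus mapping, in the spirit of
# the approach described above. The names (Happens, translate_sequence,
# satisfies_order) are assumptions, not the paper's API.
from dataclasses import dataclass

@dataclass(frozen=True)
class Happens:
    event: str   # e.g. an <invoke> on a partner service
    time: int    # discrete time point

def translate_sequence(activities):
    """Map a BPEL-like <sequence> of activities to ordered Happens facts:
    activity i occurs at time point i, encoding the sequential constraint."""
    return [Happens(a, t) for t, a in enumerate(activities)]

def satisfies_order(facts, before, after):
    """Check the EC-style ordering property: `before` happens at an
    earlier time point than `after` in the derived model."""
    times = {f.event: f.time for f in facts}
    return times[before] < times[after]

facts = translate_sequence(["receive_order", "invoke_payment", "reply_client"])
assert satisfies_order(facts, "receive_order", "invoke_payment")
```

A real mapping covers all WSBPEL constructs (flow, pick, scopes, conversations) and hands the resulting axioms to a theorem prover rather than a direct check; this sketch only shows the shape of the translation.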
3

Pellier, Damien, and Humbert Fiorino. "Un modèle de composition automatique et distribuée de services web par planification". Revue d'intelligence artificielle 23, no. 1 (24 February 2009): 13–46. http://dx.doi.org/10.3166/ria.23.13-46.

4

Abraham, Ajith, Sung-Bae Cho, Thomas Hite, and Sang-Yong Han. "Special Issue on Web Services Practices". Journal of Advanced Computational Intelligence and Intelligent Informatics 10, no. 5 (20 September 2006): 703–4. http://dx.doi.org/10.20965/jaciii.2006.p0703.

Abstract
Web services – a new breed of self-contained, self-describing, modular applications published, located, and invoked across the Web – handle functions ranging from simple requests to complicated business processes. They are defined as network-based application components with a service-oriented architecture (SOA) using standard interface description languages and uniform communication protocols. SOA enables organizations to grasp and respond to changing trends and to adapt their business processes rapidly without major changes to the IT infrastructure. The Inaugural International Conference on Next-Generation Web Services Practices (NWeSP'05) attracted researchers who are also the world's most respected authorities on the semantic Web, Web-based services, and Web applications and services. NWeSP'05 was held in cooperation with the IEEE Computer Society Task Force on Electronic Commerce, the Technical Committee on Internet, and the Technical Committee on Scalable Computing. This special issue presents eight papers focused on different aspects of Web services and their applications. Papers were selected based on fundamental ideas and concepts rather than the thoroughness of techniques employed. The papers are organized as follows: Taher et al. present the first paper, on a Quality of Service Information and Computational framework (QoS-IC) supporting QoS-based service selection for SOA. The framework's functionality is expanded using a QoS constraints model that establishes an association relationship between different QoS properties and is used to govern QoS-based service selection in the underlying algorithm. Using a prototype implementation, the authors demonstrate how QoS constraints improve QoS-based service selection and save consumers valuable time. Due to the complex infrastructure of web applications, response times perceived by clients may be significantly longer than desired.
To overcome some of the current problems, Vilas et al., in the second paper, propose a cache-based extension that enhances the current web services architecture, which is mainly based on program-logic or protocol-dependent optimization. In the third paper, Jo and Yoo present authorization for securing XML sources on the Web. One disadvantage of existing access control is that the DOM tree must be loaded into memory while all XML documents are parsed to generate it, so that considerable memory is consumed in repeated tree searches to authorize access to all nodes in the DOM tree. The complex authorization evaluation process required thus lowers system performance. Existing access control also fails to consider information structure and semantics sufficiently due to basic HTML limitations. The authors overcome some of these limitations in the proposed model. In the fourth paper, Jung and Cho propose a novel behavior-network-based method for Web service composition. The behavior network selects services automatically through internal and external links with environmental information from sensors and goals. An optimal service is selected at each step, resulting in a globally optimal service sequence for achieving preset goals. The authors detail experimental results for the proposed model by comparing them with a rule-based system and user tests. Kong et al. present an efficient method in the fifth paper for merging heterogeneous ontologies – no ontology-building standard currently exists, and the many ontology-building tools available are based on different ontology languages, mostly focusing on how to create, edit, and infer the ontology efficiently. Even ontologies about the same domain differ because ontology experts hold different viewpoints. For these reasons, interoperability between ontologies is very low. The authors propose merging heterogeneous domain ontologies by overcoming some of the above limitations.
In the sixth paper, Chen and Che provide a polynomial-time tree pattern query minimization algorithm whose efficiency stems from two key observations: (i) inherent redundant "components" usually exist inside the rudimentary query provided by the user, and (ii) nonredundant nodes may become redundant when constraints such as co-occurrence and required child/descendant are given. They show that the algorithm obtained by first augmenting the input tree pattern using constraints, then applying minimization, invariably finds a unique minimal equivalent to the original query. Chen and Che present a polynomial-time algorithm for tree pattern query (TPQ) minimization without XML constraints in the seventh paper. The two-part algorithm is a dynamic programming strategy for finding all matching subtrees within a TPQ; it consists of one procedure for subtree recognition and a second for subtree deletion. In the last paper, Bagchi et al. present the mobile distributed virtual memory (MDVM) concept and architecture for cellular networks containing server-groups (SG). They detail a two-round randomized distributed algorithm to elect a unique leader and co-leader of the SG that is free of any assumptions about network topology and buffer space limitations, and is based on dynamically elected coordinators eliminating single points of failure. As guest editors, we thank all authors featured in this special issue for their contributions and the referees for critically evaluating the papers within the short time allotted. We sincerely believe that readers will share our enjoyment of this special issue and find the information it presents both timely and useful.
5

Ramachandra, T. V. "Innovative ecological approaches to ensure clean and adequate water for all". Journal of Environmental Biology 43, no. 03 (2 May 2022): i–ii. http://dx.doi.org/10.22438/jeb/43/3/editorial.

Abstract
The Western Ghats, a range of ancient hills, extends between 8° N and 21° N latitude and 73° E and 77° E longitude (from the tip of peninsular India at Kanyakumari to Gujarat). The Western Ghats runs parallel to the west coast of India, covering approximately 160,000 sq. km, which constitutes less than 5% of India's geographical extent. Numerous streams originate in the Western Ghats, draining millions of hectares and ensuring water and food security for 245 million people; the region is hence aptly known as the water tower of peninsular India (Ramachandra and Bharath, 2019; Bharath et al., 2021). The region is endowed with diverse ecological regions depending on altitude, latitude, rainfall, and soil characteristics. The Western Ghats are among the eight hottest hotspots of biodiversity and one of the 36 global biodiversity hotspots, with exceptional endemic flora and fauna. Natural forests of the Western Ghats have been providing various goods and services and are endowed with 4,600+ species of flowering plants (38% endemics), 330 butterflies (11% endemics), 156 reptiles (62% endemics), 508 birds (4% endemics), 120 mammals (12% endemics), 289 fishes (41% endemics), and 135 amphibians (75% endemics). The Western Ghats are gifted with enormous natural resource potential, yet the mandate of sustainable development founded on prudent management of ecosystems is yet to become a reality. Various unplanned developmental programs, which are proclaimed to be functioning on sustainability principles, have only been disrupting the complex web of life, impacting ecosystems, and causing a decline in overall productivity across four major sectors: forestry, fisheries, agriculture, and water (Ramachandra and Bharath, 2019). The prevalence of barren hilltops, conversion of perennial streams to intermittent or seasonal streams, frequent floods and droughts, changes in water quality, soil erosion and sedimentation, the decline of endemic flora and fauna, etc.
highlight the consequences of unplanned developmental activities, with a huge loss to the regional economy during the last century. The development goals need to be ecologically, economically, and socially sustainable, which can be achieved through the conservation and prudent management of ecosystems. Sustainability implies equilibrium between society, ecosystem integrity, and the sustenance of natural resources. Water sustenance in streams and rivers depends on the integrity of the catchment (watershed), as vegetation helps retard the velocity of water, allowing impoundment and recharging of groundwater through infiltration (Ramachandra et al., 2020). As water moves through the terrestrial ecosystem, part of it percolates (recharging groundwater resources and contributing to sub-surface flow during post-monsoon seasons), while another fraction returns to the atmosphere through evaporation and transpiration. Forests with native vegetation act as a sponge by retaining and regulating water transfer between land and the atmosphere. The mechanism by which vegetation controls the flow regime depends on various bio-physiographic characteristics, namely the type of vegetation, species composition, maturity, density, root density and depth, hydro-climatic condition, etc. Roots of vegetation help (i) bind the soil and (ii) improve soil structure by enhancing the stability of aggregates, which provide habitat for diverse microfauna and flora, leading to higher porosity of the soil and thereby creating conduits for infiltration. An undisturbed native forest has a consistent hydrologic regime with sustained flows during lean seasons. Native vegetation with an assemblage of diverse native species helps in recharging groundwater, mitigating floods, and supporting other hydro-ecological processes (Ramachandra et al., 2020; Bharath et al., 2021).
Hence, it is necessary to safeguard and maintain native forest patches and restore existing degraded lands to sustain the hydrological regime, which caters to biotic (ecological and societal) demands. A comparative assessment of people's livelihoods with soil water properties and water availability in sub-catchments of four major river basins in the Western Ghats reveals that streams in catchments with >60% vegetation cover of native species are perennial, with higher soil moisture (Ramachandra et al., 2020). The higher soil moisture, due to water availability during all seasons, facilitates the farming of commercial crops with higher economic returns to farmers, unlike farmers who face water crises during the lean season. In contrast, streams are intermittent (6-8 months of water) in catchments dominated by monoculture plantations and seasonal (4 months, the monsoon period) in catchments with vegetation cover lower than 30%. The study highlights the need to maintain ecosystem integrity to sustain water. Also, the lower incidence of COVID-19 in villages with native forests emphasizes the role of ecosystems in maintaining the health of biota. The need to maintain native vegetation in the catchment, and its potential to support people's livelihoods with water availability at local and regional levels, is evident from revenues of Rs. 2,74,658 ha-1 yr-1 (in villages with perennial streams, where farmers grow cash crops or three crops a year due to water availability), Rs. 1,50,679 ha-1 yr-1 (in villages with intermittent streams), and Rs. 80,000 ha-1 yr-1 (in villages with seasonal streams). Also, crop yield is higher (at least 1.5 to 1.8 times) in agricultural fields due to efficient pollination, given the prevalence of diverse pollinators in the vicinity of native forests.
The study emphasizes the need for maintaining the natural flow regime and prudent management of watersheds to (i) sustain higher faunal diversity, (ii) maintain the health of water bodies, and (iii) sustain people's livelihoods with higher revenues. Hence, a premium should be placed on conserving forests with native species to sustain water and biotic diversity in water bodies, vital for food security. There still exists a chance to restore lost natural ecosystems through appropriate ecological restoration approaches, with location-specific conservation and management practices to ensure adequate and clean water for all. GDP (Gross Domestic Product), a measure of the current economic well-being of a population based on the market exchange of material well-being, registers resource depletion/degradation only as a positive gain in the economy and does not represent the decline in these assets (wealth) at all. Thus, the existing GDP growth percentages used as yardsticks to measure the development and well-being of citizens in decision-making processes are substantially misleading, yet they continue to be used. Traditional national accounts need to include resource depletion and degradation due to developmental activities and climate change. The country should move toward adopting Green GDP by accounting for the environmental consequences of growth in conventional GDP, which entails monetizing the services provided by ecosystems and the degradation cost of ecosystems, and accounting for costs caused by climate change.
Forest ecosystems are under severe threat due to anthropogenic pressures, which are mostly related to GDP. The appraisal of forest ecosystem services and biodiversity can help clarify trade-offs among conflicting environmental, social, and economic goals in the development and implementation of policies, and improve management in order to conserve biodiversity. Natural capital accounting and valuation of ecosystem services reveal that forest ecosystems provide (i) provisioning services (timber, fuelwood, food, NTFP, medicines, genetic materials) of Rs 2,19,494 ha-1 yr-1, (ii) regulating services (global climate regulation - carbon sequestration, soil conservation and soil fertility, water regulation and groundwater recharge, water purification, pollination, waste treatment, air filtration, local climate regulation) of Rs 3,31,216 ha-1 yr-1, and (iii) cultural services (aesthetic, spiritual, tourism and recreation, education and scientific research) of Rs 1,04,561 ha-1 yr-1. Total ecosystem supply value (TESV), an aggregation of provisioning, regulating, and cultural services, amounts to Rs 6,56,172 ha-1 yr-1, and the Net Present Value (NPV) of one hectare of forest amounts to 16.88 million rupees. NPV helps in estimating ecological compensation when diverting forest lands for other purposes. The recovery of an ecosystem with respect to its health, integrity, and sustainability is evident from an initiative of planting 500 saplings of 49 native species in a degraded two-hectare landscape (dominated by invasive species) in the early 1990s at the Indian Institute of Science campus (Ramachandra et al., 2016); the region has now transformed into a mini forest with numerous benefits, such as improvements in groundwater levels to 3-6 m (compared to 30-40 m in 1990), a moderated microclimate (with lower temperature), and numerous fauna (including four families of Slender Loris).
While confirming the linkages of hydrology, ecology, and biodiversity, the experiment advocates the need for integrated watershed approaches based on sound ecological and engineering protocols to sustain water and ensure adequate water for all. A well-known and successful model of an integrated wetlands ecosystem (a secondary treatment plant integrated with constructed wetlands and an algae pond) at Jakkur Lake in Bangalore (Ramachandra et al., 2018) provides insights into the optimal treatment of wastewater and the mitigation of pollution. Complete removal of nutrients and chemical contaminants happens when partially treated (secondary treated) sewage passes through the constructed wetlands and algae pond (sedimentation pond) and undergoes bio-physical and chemical processes. The water in the lake is almost potable, with minimal nutrients and microbial counts. This model has been functioning successfully for the last ten years, after interventions to rejuvenate the lake. The system is one of the self-sustainable ways of lake management, benefitting all stakeholders - washing, fishing, irrigation, and local people. Wells in the buffer zone (500 m) now have higher water levels and are free of nutrients (nitrate). A groundwater quality assessment of 25 wells in the same region during 2005 (before the rejuvenation of Jakkur Lake) had shown higher nitrate values. Adopting this model ensures optimal sewage treatment at decentralized levels, and releasing treated water into the lake also provides nutrient-free and clean groundwater. The Jal Shakti Ministry, Government of India, through the Jal Jeevan Mission, has embarked on the noble and novel mission of providing tap water supply to all rural households and public institutions in villages, such as schools, health centers, panchayat buildings, etc. The success of this program depends on the availability of water.
The imminent threat of acute water scarcity due to climate change and global warming necessitates implementing integrated watershed development (planting native species in the watersheds of water bodies), rainwater harvesting (rooftop harvesting at the individual household level, and retaining rainwater in rejuvenated lakes, which also helps recharge groundwater), and reuse of wastewater through treatment at decentralized levels (a model similar to Jakkur Lake in Bangalore). These prudent management initiatives at decentralized levels throughout the country will aid in achieving the goal of providing clean and adequate water to local communities.
6

Cobanoglu, Cihan, Muhittin Cavusoglu, and Gozde Turktarhan. "A beginner’s guide and best practices for using crowdsourcing platforms for survey research: The Case of Amazon Mechanical Turk (MTurk)". Journal of Global Business Insights 6, no. 1 (March 2021): 92–97. http://dx.doi.org/10.5038/2640-6489.6.1.1177.

Abstract
Introduction Researchers around the globe are utilizing crowdsourcing tools to reach respondents for quantitative and qualitative research (Chambers & Nimon, 2019). Many social science and business journals are receiving studies that utilize crowdsourcing tools such as Amazon Mechanical Turk (MTurk), Qualtrics, MicroWorkers, ShortTask, ClickWorker, and Crowdsource (e.g., Ahn & Back, 2019; Ali et al., 2021; Esfahani & Ozturk, 2019; Jeong & Lee, 2017; Zhang et al., 2017). Even though the use of these tools presents a great opportunity for collecting large quantities of data quickly, some challenges must also be addressed. The purpose of this guide is to present the basic ideas behind the use of crowdsourcing for survey research and provide a primer on best practices that will increase its validity and reliability. What is crowdsourcing research? Crowdsourcing describes the collection of information, opinions, or other types of input from a large number of people, typically via the internet, who may or may not receive (financial) compensation (Hargrave, 2019; Oxford Dictionary, n.d.). Within the behavioral science realm, crowdsourcing is defined as the use of internet services for hosting research activities and for creating opportunities for a large population of participants. Applications of crowdsourcing techniques have evolved over the decades, establishing the strong informational power of crowds. The advent of Web 2.0 has expanded the possibilities of crowdsourcing, with new online tools such as online reviews, forums, Wikipedia, Qualtrics, or MTurk, but also other platforms such as Crowdflower and Prolific Academic (Peer et al., 2017; Sheehan, 2018). Crowdsourcing platforms in the age of Web 2.0 use remote labor recruited via the internet to assist employers in completing tasks that cannot be left to machines.
Key characteristics of crowdsourcing include payment for workers, their recruitment from any location, and the completion of tasks (Behrend et al., 2011). Crowdsourcing platforms also allow for relatively quick data collection compared to data collection in the field, and participants are rewarded with an incentive—often financial compensation. Crowdsourcing not only offers a large participation pool but also a streamlined process for study design, participant recruitment, and data collection, as well as an integrated participant compensation system (Buhrmester et al., 2011). Also, compared to traditional marketing firms, crowdsourcing makes it easier to detect possible sampling biases (Garrow et al., 2020). Due to advantages such as reduced costs, diversity of participants, and flexibility, crowdsourcing platforms have surged in popularity among researchers. Advantages MTurk is one of the most popular crowdsourcing platforms among researchers, allowing Requesters to submit tasks for Workers to complete (Cummings & Sibona, 2017). MTurk has been used as an online crowdsourcing platform for the recruitment of human subjects for research purposes (Paolacci & Chandler, 2014). Research has also shown MTurk to be a reliable and cost-effective tool, capable of providing representative data for research in the behavioral sciences (e.g., Crump et al., 2013; Goodman et al., 2013; Mason & Suri, 2012; Rand, 2012; Simcox & Fiez, 2014). In addition to its use in social science studies, the platform has been used in marketing, hospitality and tourism, psychology, political science, communication, and sociology contexts (Sheehan, 2018). To illustrate, between 2012 and 2017, more than 40% of the studies published in the Journal of Consumer Research used crowdsourcing websites for their data collection (Goodman & Paolacci, 2017).
Disadvantages Although researchers have assessed crowdsourcing platforms as reliable and cost-effective for data collection in the behavioral sciences, they are not exempt from flaws. One disadvantage is the possibility of unsatisfactory data quality. In fact, the virtual setting of the survey implies that the investigator is physically separated from the participant, and this lack of monitoring can lead to data quality issues (Sheehan, 2018). In addition, participants in survey research on crowdsourcing platforms are not always who they claim to be, creating issues of trust in the data provided and, ultimately, the quality of the research findings (McGonagle, 2015; Smith et al., 2016). A recurrent concern with MTurk workers, for instance, is that they are experienced survey takers (Chandler et al., 2015). This experience is mainly acquired through the completion of dozens of surveys per day, especially when workers are faced with similar items and scales. Smith et al. (2016) identified two types of problems in data collection using MTurk, namely cheaters and speeders. Compared to Qualtrics—which has strict screening and quality-control processes to ensure that participants are who they claim to be—MTurk appears to be less demanding regarding its workers. However, a downside of data collection with Qualtrics is its more expensive fees—about $5.00 per questionnaire on Qualtrics, against $0.50 to $1.50 on MTurk (Ford, 2017). Hence, few researchers have been able to conduct surveys and compare respondent pools with Qualtrics or other traditional marketing research firms (Garrow et al., 2020). Another challenge using MTurk arises when trying to collect a desired number of responses from a population targeted to a specific city or area (Ross et al., 2010).
The issues inherent in the selection process of MTurk have been the subject of investigation in several studies (e.g., Berinsky et al., 2012; Chandler et al., 2014; 2015; Harms & DeSimone, 2015; Paolacci et al., 2010; Rand, 2012). Feitosa et al. (2015) pointed out that international respondents may still identify themselves as U.S. respondents through the use of fake addresses and accounts; they found that 5% to 10% of participants identifying themselves as U.S. respondents were actually from overseas locations. Moreover, Babin et al. (2016) found that the use of trap questions allowed researchers to uncover that many respondents change their genders, ages, careers, or income within the course of a single survey. The issues of (a) experienced workers, which calls for quality control of questions, and (b) speeders, which for MTurk can be attributed to the platform being the main source of revenue for a given respondent, remain inherent to crowdsourcing platforms used for research purposes. Best practices Some best practices can be recommended for the use of crowdsourcing platforms for data collection. Worker IDs can be matched with IDs from previous studies, allowing researchers to exclude responses from workers who answered previous similar studies (Goodman & Paolacci, 2017). Furthermore, researchers can manually assign qualifications on MTurk prior to data collection (Litman et al., 2015; Park & Park, 2020). When dealing with experienced workers, it is also recommended to use multiple attention checks and to design the survey so that participants are exposed to the stimuli long enough to properly address the questions (Sheehan, 2018). In this sense, shorter surveys are preferred to longer ones, which tax the participant’s concentration and may, in turn, adversely impact the quality of their answers. Most importantly, pretest the survey to make sure that all parts are working as expected.
Researchers should also keep in mind that in the context of MTurk, the primary method of measurement is the web interface. Thus, to avoid method biases, researchers should consider whether or not method factors emerge in the latent measurement models (Podsakoff et al., 2012). As such, time-lagged research designs may be preferred, as predictor and criterion variables can be measured at different points in time or administered on different platforms, such as Qualtrics vs. MTurk (Cheung et al., 2017). In general, the use of crowdsourcing platforms including MTurk may be appropriate depending on the research question, and the quality of the data relies on the quality-control strategies used by researchers. Trade-offs between various validity types need to be prioritized according to the research objectives (Cheung et al., 2017). From our experience using crowdsourcing tools for our own research as editorial team members of several journals and chairs of several conferences, we offer the best practices outlined below: MTurk Worker (Respondent) Selection: Researchers should consider their study population before using MTurk for data collection, and use the platform only when it suits that population. For example, if the study targets restaurant owners or company CEOs, MTurk workers may not be suitable. However, if the target population is diners, hotel guests, grocery shoppers, online shoppers, students, or hourly employees, a sample from MTurk would be suitable. Researchers should use the selection tools in the software. For example, if you target workers from only one country, exclude responses that came from an internet protocol (IP) address outside the targeted country, and report the results in the method section. Researchers should also consider whether the demographics of workers on MTurk reflect the study's target population.
For example, if the study focuses on baby boomers’ use of technology, then the MTurk sample should include only baby boomers. Similarly, the gender balance, racial composition, and income of people on MTurk should mirror the targeted population. Researchers should use multiple screening tools that identify quality respondents and avoid problematic response patterns. For example, MTurk provides an approval rate for each respondent, which reflects how often a respondent’s work has been rejected for various reasons (e.g., a wrong code entered). We recommend using a 90% or higher approval rate. Researchers should include screening questions in different places, with different types of questions, to make sure that the respondents are appropriate for the study. One way is to use knowledge-based questions about the subject. For example, rather than asking “How experienced are you with accounting practices?”, a supplemental question such as “Which of the following is a component of an income statement?” should be integrated into a different section of the survey. Survey Validity: Researchers should conduct a pilot survey with MTurk workers to identify and fix any potential data quality and programming problems before the entire data set is collected. Researchers can estimate the time required to complete the survey from the pilot study. This average time should be used in calculating the incentive payment for workers, such that the payment equals or exceeds the minimum wage in the targeted country. Researchers should build multiple validity-check tools into the survey. One of them is to ask attention-check questions such as “Please click on ‘strongly agree’ in this question” or “What is 2+2? Please choose 5” (Cobanoglu et al., 2016). Even though these attention questions are useful and should be implemented, experienced survey takers or bots easily identify and answer them correctly, then give random answers to other questions.
Instead, we recommend building in more involved validity-check questions. One of the best is asking the same question in different places and in different forms. For example, asking the respondent's age at the beginning of the survey and their year of birth at the end is an effective way to check that they are replying honestly; exclude all those who answer the same question inconsistently, and report the results of these validity checks in the methodology. Cavusoglu (2019) found that almost 20% of surveys were eliminated due to failed validity-check questions embedded in different places and forms. Researchers should also be aware of internet bots, software that runs automated tasks; some respondents use bots to reply to surveys. To counter this, use Captcha verification, which forces respondents to perform tasks such as moving a bar to a certain area, clicking the boxes that contain cars, or checking a box to verify that the survey taker is not a bot. Whenever appropriate, researchers should use the time-limit options offered by online survey tools such as Qualtrics to control how long a survey taker must spend before advancing to the next question. We found this to be a great tool, especially when you want respondents to watch a video, read a scenario, or look at a picture before they respond to other questions. Researchers should also collect data on different days and at different times during the week to obtain a more diverse and representative sample.

Data Cleaning: Researchers should be aware that some respondents do not read the questions; they simply select random answers or type nonsense text. To exclude them, manually inspect the data and exclude anyone who filled out the survey too quickly. We recommend excluding all responses completed in less than 40% of the average completion time.
For example, if a survey takes 10 minutes on average, we exclude everyone who completes it in 4 minutes or less. After separating these two groups, we compared them and found that the speeders' (aka cheaters') data differed significantly from the regular group's. Researchers should always collect more data than needed; our rule of thumb is to collect 30% more than needed. For example, if 500 clean responses are wanted, collect at least 650, so that the targeted number of responses will still be available after cleaning. Report the data-cleaning process in the method section of your article, showing the editor and reviewers that you have taken steps to increase the validity and reliability of the survey responses. Calculating a conventional response rate for samples using MTurk is not possible; however, it is possible to calculate an active response rate (Ali et al., 2021), obtained by deducting all responses removed by screening and validity checks from the raw response count. For example, if you have 1,000 raw responses and you eliminate 100 responses for coming from IP addresses outside of the United States and another 100 for failing the validity-check questions, your active response rate is 800/1000 = 80%.
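The cleaning arithmetic described above (the 40%-of-average-time speeder cutoff and the active response rate) can be sketched in a few lines of Python. This is an illustrative sketch only; the function names and sample figures are invented, not taken from the article:

```python
# Illustrative sketch of the data-cleaning arithmetic described above.
# All names and figures are hypothetical examples.

def speeder_cutoff(avg_completion_minutes, fraction=0.4):
    """Responses completed faster than this threshold are flagged as speeders."""
    return avg_completion_minutes * fraction

def active_response_rate(raw_responses, excluded):
    """Active response rate: retained responses divided by raw responses."""
    retained = raw_responses - sum(excluded.values())
    return retained / raw_responses

cutoff = speeder_cutoff(10)  # a 10-minute survey gives a 4-minute cutoff
rate = active_response_rate(
    1000, {"foreign_ip": 100, "failed_validity": 100}
)
print(cutoff, rate)  # 4.0 0.8
```

The exclusion reasons are passed as a dictionary so each screening step's impact can be reported separately in the method section.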
7

Temglit, N., H. Aliane, and M. Ahmed Nacer. "Un modèle de composition des services web sémantiques". Revue Africaine de la Recherche en Informatique et Mathématiques Appliquées, Volume 11, 2009 - Special... (24 September 2009). http://dx.doi.org/10.46298/arima.1928.

Full text
Abstract
The work presented here aims to provide a composition model for semantic web services. The model is based on a semantic representation of the domain concepts handled by web services, namely the operations and the static concepts used to describe the static properties of web services. Different levels of abstraction are given to the concept of operation to allow gradual access to concrete services. Thus, two composition plans at different levels of granularity (abstract and concrete) are generated. This allows plans already constructed to be reused to meet similar needs, even with modified preferences.
8

Ould Mohamed, Mohamed Salem, Amor Keziou, Hassan Fenniri, and Georges Delaunay. "Nouveau critère de séparation aveugle de sources cyclostationnaires au second ordre". Revue Africaine de la Recherche en Informatique et Mathématiques Appliquées, Volume 12, 2010 (5 October 2010). http://dx.doi.org/10.46298/arima.1929.

Full text
9

Macklin, James, David Shorthouse, and Falko Glöckler. "I Know Something You Don’t Know: The annotation saga continues…". Biodiversity Information Science and Standards 7 (14 September 2023). http://dx.doi.org/10.3897/biss.7.112715.

Full text
Abstract
Over the past 20 years, the biodiversity informatics community has pursued components of the digital annotation landscape with varying degrees of success. We will provide an historical overview of the theory, the advancements made through a few key projects, and will identify some of the ongoing challenges and opportunities. The fundamental principles remain unchanged since annotations were first proposed. Someone (or something): (1) has an enhancement to make elsewhere from the source where original data or information are generated or transcribed; (2) wishes to broadcast these statements to the originator and to others who may benefit; and (3) expects persistence, discoverability, and attribution for their contributions alongside the source. The Filtered Push project (Morris et al. 2013) considered several use cases and pioneered development of services based on the technology of the day. The exchange of data between parties in a universally consistent way necessitated the development of a novel draft standard for data annotations via an extension of the World Wide Web Consortium’s Web Annotation Working Group standard (Sanderson et al. 2013) to be sufficiently informative for a data curator to confidently make a decision. Figure 2 from Morris et al. (2013), reproduced here as Fig. 1, outlines the composition of an annotation data package for a taxonomic identification. The package contains the data object(s) associated with an occurrence, an expression of the motivation(s) for updating, some evidence for an assertion, and a stated expectation for how the receiving entity should take action. The Filtered Push and Annosys (Tschöpe et al. 2013) projects also considered implementation strategies involving collection management systems (e.g., Symbiota) and portals (e.g., European Distributed Institute of Taxonomy, EDIT). 
However, there remain technological barriers for these systems to operate at scale, not least of which is the absence of globally unique, persistent, resolvable identifiers for shared objects and concepts. Major aggregation infrastructures like the Global Biodiversity Information Facility (GBIF) and the Distributed System of Scientific Collections (DiSSCo) rely on data enhancement to improve the quality of their resources and have annotation services in their work plans. More recently, the Digital Extended Specimen (DES) concept (Hardisty et al. 2022) will rely on annotation services as key components of the proposed infrastructure. Recent work on annotation services more generally has considered various new forms of packaging and delivery such as Frictionless Data (Fowler et al. 2018), Journal Article Tag Suite XML (Agosti et al. 2022), or nanopublications (Kuhn et al. 2018). There is risk in fragmentation of this landscape and disenfranchisement of both biological collections and the wider research community if we fail to align the purpose, content, and structure of these packages, or if these fail to remain aligned with FAIR principles. Institutional collection management systems currently represent the canonical data store that provides data to researchers and data aggregators. It is critical that information and/or feedback about the data they release be round-tripped back to them for consideration. However, the sheer volume of annotations that could be generated by both human and machine curation processes will overwhelm local data curators and the systems supporting them. One solution to this is to create a central annotation store with write and discovery services that best support the needs of all stewards of data. This will require an international consortium of parties with a governance and technical model to assure its sustainability.
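The annotation package described above (data object, motivation, evidence, expectation) can be sketched as a small structure in the spirit of the W3C Web Annotation data model that the abstract cites. This is a hedged illustration: all identifiers, field values, and the exact extension fields are invented, not taken from the Filtered Push specification:

```python
# Minimal sketch of an annotation package in the spirit of the W3C Web
# Annotation data model, as extended for taxonomic identifications.
# All URLs, names, and extension fields here are invented examples.
import json

annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "motivation": "editing",  # why the annotator proposes the change
    "target": {
        # the occurrence record being annotated (invented identifier)
        "source": "https://example.org/occurrence/12345",
    },
    "body": {
        # the proposed new determination, with supporting evidence and
        # an expectation of what the receiving curator should do
        "scientificName": "Acer saccharum Marshall",
        "evidence": "Leaf morphology matches the type description.",
        "expectation": "Update the determination in the source record.",
    },
    "creator": "https://orcid.org/0000-0000-0000-0000",
}

print(json.dumps(annotation, indent=2))
```

Serialising the package as JSON-LD keeps it broadcastable to both the originating collection and downstream aggregators.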
10

Soiland-Reyes, Stian, Leyla Jael Castro, Daniel Garijo, Marc Portier, Carole Goble, and Paul Groth. "Updating Linked Data practices for FAIR Digital Object principles". Research Ideas and Outcomes 8 (12 October 2022). http://dx.doi.org/10.3897/rio.8.e94501.

Full text
Abstract
Background
The FAIR principles (Wilkinson et al. 2016) are fundamental for data discovery, sharing, consumption and reuse; however, their broad interpretation and the many ways to implement them can lead to inconsistencies and incompatibility (Jacobsen et al. 2020). The European Open Science Cloud (EOSC) has been instrumental in maturing and encouraging FAIR practices across a wide range of research areas. Linked Data in the form of RDF (Resource Description Framework) is the common way to implement machine-readability in FAIR; however, the principles do not prescribe RDF or any particular technology (Mons et al. 2017).

FAIR Digital Object
FAIR Digital Object (FDO) (Schultes and Wittenburg 2019) has been proposed to improve researchers' access to digital objects through formalising their metadata, types and identifiers and exposing their computational operations, making them actionable FAIR objects rather than passive data sources. FDO is a set of principles (Bonino et al. 2019), implementable in multiple ways. Current realisations mostly use the Digital Object Interface Protocol (DOIPv2) (DONA Foundation 2018), with CORDRA as the main implementation. We can consider DOIPv2 a simplified combination of object-oriented (CORBA, SOAP) and document-based (HTTP, FTP) approaches. More recently, the FDO Forum has prepared detailed recommendations, currently open for comments, including a DOIP endorsement and updated FDO requirements. These point out Linked Data as another possible technology stack, which is the focus of this work.

Linked Data
Linked Data standards (LD), based on the Web architecture, are commonplace in sciences like bioinformatics, chemistry and medical informatics – in particular to publish Open Data as machine-readable resources. LD has become ubiquitous on the general Web: the schema.org vocabulary is used by over 10 million sites for indexing by search engines, and 43% of all websites use JSON-LD.
Although LD practices align to FAIR (Hasnain and Rebholz-Schuhmann 2018), they do not fully encompass active aspects of FDOs. The HTTP protocol is used heavily for applications (e.g. mobile apps and cloud services), with REST APIs of customised JSON structures. Approaches that merge the LD and REST worlds include the Linked Data Platform (LDP), Hydra and Web Payments.

Meeting FDO principles using Linked Data standards
Considering the potential of FDOs when combined with the mature technology stack of LD, here we briefly discuss how the FDO principles in Bonino et al. (2019) can be achieved using existing standards. The general principles (G1–G9) apply well: open standards, with HTTP being stable for 30 years; JSON-LD widely used; FAIR practitioners mainly using RDF; and a clear abstraction between the RDF model and stable bindings available in multiple serialisations. However, when considering the specific principles (FDOF1–FDOF12) we find that additional constraints and best practices need to be established – arbitrary LD resources cannot be assumed to follow FDO principles. This is equivalent to how existing use of DOIP is not FDO-compliant without additional constraints. Namely, persistent identifiers (PIDs) (McMurry et al. 2017) (FDOF1) are common in the LD world (e.g. using http://purl.org/ or https://w3id.org/), however they don't always have a declared type (FDOF2), or the PID may not even appear in the metadata. URL-based PIDs are resolvable (FDOF3), typically over HTTP using redirections and content-negotiation. One great advantage of RDF is that all attributes are defined semantic artefacts with PIDs (FDOF4), and attributes can be reused across vocabularies. While CRUD operations (FDOF6) are supported by native HTTP operations (GET/PUT/POST/DELETE) as in LDP, there is little consistency on how to define operation interfaces in LD (FDOF5).
Existing REST approaches like OpenAPI and URI templates are mature and good candidates, and should be related to defined types to support machine-actionable composition (FDOF7). HTTP error code 410 Gone is used in tombstone pages for removed resources (FDOF12), although 404 Not Found is more frequent. Metadata is resolved to HTTP documents with their own URIs, but these frequently don't have their own PID (FDOF8). RDF-Star and nanopublications (Kuhn et al. 2021) give ways to identify and trace provenance of individual assertions. Different metadata levels (FDOF9) are frequently developed for LD vocabularies across different communities (FDOF10), such as FHIR for health data, Bioschemas for bioinformatics, and more than 1,000 more specific bio-ontologies. Increased declaration and navigation of profiles is therefore essential for machine-actionability and consistent consumption across FAIR endpoints. Several standards exist for rich collections (FDOF11), e.g. OAI-ORE, DCAT, RO-Crate, LDP. These are used and extended heterogeneously across the Web, but consistent machine-actionable FDOs will need specific choices of core standards and vocabularies. Another challenge is when multiple PIDs refer to "almost the same" concept in different collections – significant efforts have created manual and automated semantic mappings (Baker et al. 2013, de Mello et al. 2022). Currently the FDO Forum has suggested the use of LDP as a possible alternative for implementing FAIR Digital Objects (Bonino da Silva Santos 2021), which proposes a novel approach of content-negotiation with custom media types.

Discussion
The Linked Data stack provides a set of specifications, tools and guidelines that help the FDO principles become a reality. This mature approach can accelerate uptake of FDO by scholars and existing research infrastructures such as the European Open Science Cloud (EOSC).
However, the number of standards and existing metadata vocabularies poses a potential threat to adoption and interoperability. Yet the challenges of agreeing on usage profiles apply equally to DOIP and LD approaches. We have worked with different scientific communities to define RO-Crate (Soiland-Reyes et al. 2022), a lightweight method to package research outputs along with their metadata. While RO-Crate's use of schema.org shows just one possible metadata model, it is powerful enough to express FDOs and familiar to web developers. We have also used FAIR Signposting (Van de Sompel et al. 2022) with HTTP Link: headers as a way to support navigation to the individual core properties of an FDO (PID, type, metadata, licence, bytestream) that does not require heuristics of content-negotiation and is agnostic to particular metadata vocabularies and serialisations. We believe that by adopting Linked Data principles, we can accelerate FDO today – and even start building practical ways to assist scientists in efficiently answering topical questions based on knowledge graphs.
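The Signposting approach described above exposes an FDO's core properties as typed links in an HTTP Link header. A minimal client-side sketch of parsing such a header is shown below; the header value is an invented example, and the simple regex-based parser is an illustration, not a full RFC 8288 implementation:

```python
# Hedged sketch: parsing a FAIR-Signposting-style HTTP "Link:" header
# into a map from relation type to targets, so a client can locate an
# object's PID and metadata without content negotiation.
# The header value below is an invented example; this parser handles
# only the simple comma-separated form, not full RFC 8288 syntax.
import re

def parse_link_header(value):
    links = {}
    for target, params in re.findall(r'<([^>]+)>([^,]*)', value):
        m = re.search(r'rel="?([^";]+)"?', params)
        if m:
            links.setdefault(m.group(1), []).append(target)
    return links

header = ('<https://w3id.org/example/record/1>; rel="cite-as", '
          '<https://example.org/record/1.jsonld>; rel="describedby"; '
          'type="application/ld+json"')
links = parse_link_header(header)
print(links["cite-as"])  # ['https://w3id.org/example/record/1']
```

Here `cite-as` points at the persistent identifier and `describedby` at the metadata document, mirroring the core FDO properties listed in the abstract.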
11

Hookway, Nicholas, and Tim Graham. "‘22 Push-Ups for a Cause’: Depicting the Moral Self via Social Media Campaign #Mission22". M/C Journal 20, no. 4 (16 August 2017). http://dx.doi.org/10.5204/mcj.1270.

Full text
Abstract
Introduction
In 2016, the online cause #Mission22 went viral on social media. Established to raise awareness about high suicide rates among US military veterans, the campaign involves users posting a video of themselves doing 22 push-ups for 22 days, and on some platforms, donating and recruiting others to do the same. Based on a ‘big data’ analysis of Twitter data (over 225,883 unique tweets) during the height of the campaign, this article uses #Mission22 as a site in which to analyse how people depict, self-represent and self-tell as moral subjects using social media campaigns. In addition to spotlighting how such movements are mobilised to portray moral selves in particular ways, the analysis focuses on how a specific online cause like #Mission22 becomes popularly supported from a plethora of possible causes, and how this selection and support is shaped by online networks. We speculate that part of the reason why Mission22 went ‘viral’ in the highly competitive attention economies of social media environments was related to visual depictions of affective bodily, fitness and moral practices.

Web 2.0 Culture: Self and Mass Depiction
Web 2.0 culture such as social networking sites (e.g., Facebook; Instagram), the advent of video sharing technologies (e.g., YouTube) and, more recently, micro-blogging services like Twitter have created new and transformative spaces to create, depict and display identity. Web 2.0 is primarily defined by user-generated content and interaction, whereby users are positioned as both consumers and producers, or ‘produsers’, of Web content (Bruns and Schmidt). Challenging traditional “broadcast” media models, Web 2.0 gives users a platform to produce their own content and for “the many” to communicate “with the many” (Castells).
The growth of mass self-communication, supported by broadband and wireless technologies, gives unprecedented power to individuals and groups to depict and represent their identities and relationships to a potential global audience.

The rise of user-generated communication technologies dovetails with broader analyses of the changing contours of self and identity in late-modern times. Individuals in the early decades of the 21st century must take charge of how they depict, portray and self-tell as distinctive, unique and individual subjects (Beck and Beck-Gernsheim; Giddens; Bauman). As contemporary lives become less bound to the strictures of tradition, community and religion, the self becomes a project to be worked out and developed. These theorists suggest that via processes of individualisation, detraditionalisation and globalisation, contemporary subjects have become disconnected from the traditional coordinates of community and are thus faced with the imperative of self-construction and reinvention (Elliott and Lemert).

More recently, theoretical and empirical work has attempted to interpret and evaluate how networks of mass self-depiction powered by new digital and wireless technologies are reshaping identity practices. For some theorists, like Bauman (Consuming 2) and Turkle, Web 2.0 is a worrying trend. Bauman suggests that in the “confessional society” – think reality TV, talk shows, social media – people are compelled to curate and reflect upon their lives in the public realm. These public acts of self-depiction are part of a move to treating the self as a brand to be consumed, “as products capable of drawing attention, and attracting demands and customers” (Bauman and Lyon 33). The consumer quality of new communications sees connections replace relationships as social bonds become short-term and brittle.
Turkle makes a similar argument, suggesting that our preoccupation with online curation centres on controlling our identities and depicting “perfect” versions of ourselves. The result is diminished forms of intimacy and connection; we preach authenticity and realness but practice self-curation and self-stylisation.

A more positive body of literature has examined how Web technologies work as tools for the formation of self. This literature is based on more close-up and detailed readings of particular platforms and practices rather than relying on sweeping claims about technology and social change. Following Foucault, Bakardjieva & Gaden argue that personal blogs and social networking site (SNS) profiles constitute a contemporary technology of the self, whereby users employ Web 2.0 technologies in everyday life as practices of self care and self-formation. In a similar way, Sauter argues that SNSs, and in particular Facebook, are tools for self-formation through the way in which status updates provide a contemporary form of self-writing. Eschewing the notion of social media activity as narcissistic or self-obsessive, Sauter argues that SNSs are a techno-social practice of self-writing that facilitate individuals to “form relations to self and others by exposing themselves to others and obtaining their feedback” (Sauter 836). Other research has explored young people’s sustained use of social media, particularly Facebook, and how these sites are used to tell and archive “growing up” narratives and key rites of passage (Robards and Lincoln).

One area of research that has been overlooked is how people use social media to construct and depict moral identity. Following Sauter’s arguments about the self work that occurs through networked self-writing, we can extend this to include the ethical self work performed and produced through online depictions.
One exception is work by Hookway which analyses how people use blogs – an earlier Web 2.0 form – to write and self-examine their moral experiences. This research shows how bloggers use blogging as a form of online self-writing to construct a do-it-yourself form of morality that emphasises the self, emotions, body and ideals of authenticity. Hookway highlights the idea that morality is less about obedience to a code of rules or following external laws, and more about becoming a particular moral person through a set of self-practices. Paralleling broader shifts in identity construction, people are no longer bound to the inherited guidelines of the past; morality becomes a project to be worked out, designed and depicted in relation to Others (Hookway).

In Foucault’s terms, morality involves a process of ethical self-stylisation – an “aesthetics of existence” – based on “the ethical work of the self on the self” (Foucault 91). “Care of the self” involves a “set of occupations” or “labours” that connect and link the self to the Other through guidance, counselling and communication (Foucault 50). For Foucault, self-creation and self-care imply “care for others” as individuals perform a mutual concern with achieving an “art of existence”. This is a reciprocated ethics that obligates the individual to care for others in order to help them care for themselves.

This stylisation of the ethical self has been drastically reshaped by the new opportunities for self-expression, belonging and communication offered in our digitally networked society. Digital worlds and spaces create new multi-media modes for individuals and groups to depict, perform and communicate particular moral identities and positions.
Web 2.0 technologies are seeing the boundaries between the private and public sphere collapse as more people are willing to share the most intimate parts of their moral lives with a diverse mix of strangers, friends, family and associates. The confessional quality of online spaces provides a unique opportunity to analyse “lay morality” – everyday moral understandings, constructions and depictions – and how this is co-produced in relation to new technological affordances. Following Sayer (951), morality is defined as “how people should treat others and be treated by them, which of course is crucial for their subjective and objective well-being”. Morality is understood as a relational and evaluative practice that involves being responsive to how people are faring and whether they are suffering or flourishing. In this article, we use the #Mission22 campaign – a campaign that went “viral” across multiple social media platforms – as a unique site to analyse and visualise lay moral depictions and constructions. Specifically, we analyse the #Mission22 campaign on Twitter using a big data analysis. Much of the empirical work on online self-construction and depiction is either purely theoretical in the vein of Bauman, Turkle and Sauter, or based on small qualitative samples such as the work by Robards and Lincoln, and Hookway. This article is unique not only in investigating the crafting of moral depictions in Web 2.0 forums, but also in the scale of the textual and visual representation of mass moral self-depictions it captures and analyses.

Big Data Analysis of #Mission22 on Twitter
In order to empirically examine the #Mission22 campaign on Twitter, we used the Twitter API to collect over three months of tweets that contained the campaign hashtag (from 20 Aug. 2016 to 1 Dec. 2016).
This resulted in a dataset of 2,908,559 tweets, of which 225,883 were non-duplicated (i.e., some tweets were collected multiple times by the crawler). There were 3,230 user accounts participating during this period, with each user tweeting 70 times on average. As Figure 1 shows, a sizeable percentage of users were quite active at the height of the campaign, although there is clearly a number of users who only tweeted once or twice. More specifically, there were 1,232 users (or 38%) who tweeted at least 100 times, and on the other hand 1,080 users (or 33%) who only tweeted two times or less. In addition, a tiny number of ‘power users’ (18, or 0.6%) tweeted more than 400 times during this period.

Figure 1: Frequency distribution of #Mission22 tweets for each user in the dataset

To get a sense of what users were talking about during the campaign, we constructed a wordcloud out of the text data extracted from the tweets (see Figure 2). To provide more information and context, usernames (preceded with @) and hashtags (preceded with #) were included along with the words, providing a set of terms. As a result, the wordcloud also shows the user accounts and hashtags that were mentioned most often (note that #Mission22 was excluded from the data as it, by definition of the data collection process, has to occur in every tweet). In order to remove meaningless terms from the dataset we applied several text processing steps. First, all terms were converted to lowercase, such that “Veteran” and “veteran” are treated as the same term. Next, we applied a technique known as term frequency-inverse document frequency (tf-idf) to the tweet text data. Tf-idf effectively removes terms that occur so frequently that they provide no interesting information (e.g., the term “mission22”), and also terms that occur extremely infrequently. Finally, we removed English “stop words” from the text data, thereby eliminating common words such as “the” and “and”.
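The text-processing pipeline described above (lowercasing, tf-idf weighting, stop-word removal) can be sketched in pure Python. This is an illustrative toy implementation, not the authors' actual code, and the two-document corpus is an invented stand-in for the tweet data:

```python
# Illustrative pure-Python tf-idf mirroring the processing steps
# described above: lowercase, remove stop words, weight by tf-idf.
# The corpus and stop-word list are invented miniature examples.
import math
from collections import Counter

STOP_WORDS = {"the", "and", "a", "to", "of"}

def tfidf(raw_docs):
    # lowercase and drop stop words
    docs = [[w.lower() for w in d.split() if w.lower() not in STOP_WORDS]
            for d in raw_docs]
    n = len(docs)
    # document frequency: in how many documents each term appears
    df = Counter(w for d in docs for w in set(d))
    scores = []
    for d in docs:
        tf = Counter(d)
        scores.append({w: (tf[w] / len(d)) * math.log(n / df[w])
                       for w in tf})
    return scores

docs = ["the veteran did push-ups", "support the veteran challenge"]
weights = tfidf(docs)
# "veteran" appears in every document, so its idf is log(2/2) = 0
print(weights[0]["veteran"])  # 0.0
```

Terms that appear in every document score zero, which is how tf-idf suppresses uninformative terms like "mission22" in the tweet corpus.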
Figure 2: Wordcloud of the #Mission22 tweet content

As Figure 2 shows, the most frequent terms revolve around the campaign message and call-to-action for suicide awareness, including, for example, “day”, “veteran”, “support”, “push-ups”, “band”, “challenge”, “suicide”, “fight”, and “alone”. A number of user accounts are also frequently mentioned, which largely relate to the heavily retweeted users (discussed further below). Furthermore, alongside the central #mission22 hashtag, a number of other popular hashtags were in circulation during the campaign, including “#veteran”, “#americasmission”, “#22kill”, and “#22adayis22toomany”. Table 1 provides the top 50 most frequently occurring terms in decreasing order.

Table 1: Top 50 words in the #Mission22 tweet content (decreasing order)
1–10: day, veteran, support, push-ups, @sandratxas, @defensebaron, @the_uso, @nbcnews, band, cimmunity [sic]
11–20: @mrbernarded, #veteran, everyday, challenge, suicide, veterans, fight, alone, vets, #americasmission
21–30: long, better, believe, today, haul, awareness, accepted, ptsd, 22k, shoutout
31–40: @uc_vets, @kappasigmauc, @ucthetachi, take, one, just, @piedmontlax, good, wrong, god
41–50: nothing, every, mission, help, #22kill, say, #veterans, weakness, #nevertrump, will

A surprising finding of our study is that the vast majority of tweets are simply retweets of other users. The number of retweets was 223,666, which accounts for about 99% of all tweets in the dataset. Even more surprising was that the vast majority of these retweets stem from a single tweet. Indeed, 221,088 (or 98%) of all tweets in the dataset were retweets of the following tweet, authored on 2 March 2015 by @SandraTXAS (see Figure 3). Clearly we can say that this tweet went ‘viral’ (Jenders et al.) in the sense that it became frequently retweeted and gained an increasing amount of attention due to its cumulative popularity and visibility over time.
Figure 3: #1 most retweeted #Mission22 tweet – @SandraTXAS (https://twitter.com/SandraTXAS)

This highly retweeted or viral #Mission22 tweet provides a point of departure to examine what aspects of the tweet content influence the virality or popularity of #Mission22 tweets during the height of the campaign. To do this, we extracted the next nine most retweeted tweets from our dataset, providing an analysis of the “top 10” retweets (including the @SandraTXAS tweet above).

Figure 4: #2 most retweeted – @mrbernarded (https://twitter.com/mrbernarded/status/776221040582295553). This tweet was retweeted 715 times in our dataset.
Figure 5: #3 most retweeted – @Mission22 (https://twitter.com/Mission22/status/799872548863414272). This was retweeted 317 times in our dataset.
Figure 6: #4 most retweeted – @UCThetaChi (https://twitter.com/UCThetaChi/status/784775641430384640). This was retweeted 180 times in our dataset.
Figure 7: #5 most retweeted – @PamKeith2016 (https://twitter.com/PamKeith2016/status/782975576550305792). This was retweeted 121 times in our dataset.
Figure 8: #6 most retweeted – @PiedmontLax (https://twitter.com/PiedmontLax/status/770749891698122752). This was retweeted 105 times in our dataset.
Figure 9: #7 most retweeted – @PiedmontLax (https://twitter.com/PiedmontLax/status/771181070066692098). This was retweeted 78 times in our dataset.
Figure 10: #8 most retweeted – @PatriotBrother (https://twitter.com/PatriotBrother/status/804387050728394752). This was retweeted 59 times in our dataset.
Figure 11: #9 most retweeted – @alexgotayjr (https://twitter.com/alexgotayjr/status/787112936644849664). This was retweeted 49 times in our dataset.
Figure 12: #10 most retweeted – @csjacobson89 (https://twitter.com/csjacobson89/status/772921614044233729). This was retweeted 45 times in our dataset.

Discussion
This article has provided the first “big data” analysis of the #Mission22 movement that went viral across multiple social media platforms in 2016.
We began by arguing that Web 2.0 has ushered in profound changes to how people depict and construct identities, changes that articulate with wider transformations in self and identity in conditions of late modernity. The “confessional” quality of Web 2.0 means individuals and groups are presented with unprecedented opportunities to “mass self-depict” through new communication and Internet technologies. We suggest that the question of how Web technologies are implicated in the formation of moral subjectivities has been overlooked in the extant research on identity and Web 2.0 technologies.

Filling this gap, we used the #Mission22 movement on Twitter as an empirical site to analyse how contemporary subjects construct and visually depict moral identities in online contexts. A central finding of our analysis of 225,883 Twitter posts is that most engagement with #Mission22 was through retweeting. Our data show that retweets were by far the most popular way to interact and engage with the movement. In other words, most people were not producing original or new content when they participated in the movement but were re-sharing – re-depicting – what others had shared. This finding highlights the importance of paying attention to the architectural affordances of social media platforms – in this case, the affordances of the ‘retweet’ button – and how they shape online identity practices and moral expression. We use moral expression here as a broad term to capture the different ways individuals and groups make moral evaluations based on a responsiveness to how people are faring and whether they are suffering or flourishing (Sayer).
This approach provides an emic account of everyday morality and precludes, for example, wider philosophical debates about whether patriotism or nationalistic solidarity can be understood as moral values.

The prominence of the retweet in driving the shape and nature of #Mission22 raises questions about the depth of moral engagement being communicated. Is the dominance of the retweet suggestive of a type of “moral slacktivism”? Like its online political equivalent, does the retweet highlight a shallow and cursory involvement with a cause or movement? Did online engagement translate to concrete moral actions, such as making a donation to the cause or engaging in some other form of civic activity to draw attention to the movement? These questions are beyond the scope of this article, but it is interesting to consider the link between the affordances of the platform, the capacity for moral expression, and how this translates to face-to-face moral action. Putting aside questions of depth, people are compelled not to ignore these posts; they move from “seeing” to “posting”, to taking action within the affordances of the architectural platform.

What, then, is moving Twitter users to morally engage with this content? How did this movement go viral? What helped this movement break out of the “long tail distribution” which characterises most movements – that is, the fact that few movements “take off” and become durable within the congested attention economies of social media environments? The Top 10 most retweeted tweets provide powerful answers here. All of them feature highly emotive and affective visual depictions: either high-impact photos and statements, or videos of people and groups doing push-ups in solidarity together. The images and videos align affective, bodily and fitness practices with nationalistic and patriotic themes to produce a powerful and moving moral cocktail.
The Top 50 words also capture the emotionally evocative use of moral language: words like alone, fight, challenge, better, believe, good, wrong, god, help, mission, weakness and will.

The emotional and embodied visual depictions that characterise the Top 10 retweets and Top 50 words highlight how moral identity is not just a cerebral practice, but one that is fundamentally emotional and bodily. We do morality not just with our minds and heads but also with our bodies and our hearts. Part of the power of this movement, then, is the way it mobilises interest and involvement through a physical and embodied practice – doing push-ups. Visually depicting oneself doing push-ups online is a powerful display of moral identity. The “lay morality” being communicated is that not only are you somebody who cares about the flourishing and suffering of Others, you are also a fit, active and engaged citizen. And of course, the subject who actively takes responsibility for their health and well-being is highly valued in neoliberal risk contexts (Lupton).

There is also a strong gendered dimension to the visual depictions used in #Mission22. All of the Top 10 retweets feature images of men, mostly doing push-ups in groups. In the case of the second most popular retweet, it is two men in suits doing push-ups while three sexualised female singers “look on” admiringly. Further analysis needs to be done to detail the gendered composition of movement participation, but it is interesting to speculate whether men were more likely to participate. The combination of demonstrating care for the Other via a strong assertion of physical strength makes this a potentially more masculinised form of moral self-expression.

Overall, Mission22 highlights how online self-work and cultivation can have a strong moral dimension. In Foucault’s language, the self-work involved in posting a video or image of yourself doing push-ups can be read as “an intensification of social relations”.
It involves an ethics that is about self-creation through visual and textual depictions. Following the more pessimistic line of Bauman or Turkle, posting images of oneself doing push-ups might be seen as evidence of narcissism or a consumerist self-absorption. Rather than narcissism, we want to suggest that Mission22 highlights how a self-based moral practice – based on bodily, emotional and visual depictions – can extend to Others in an act of mutual care and exchange. Again Foucault helps clarify our argument: “the intensification of the concern for the self goes hand in hand with a valorisation of the Other”. What our work does is show how this operates empirically on a large scale in the new confessional contexts of Web 2.0 and its cultures of mass self-depiction.

References

Bakardjieva, Maria, and Georgia Gaden. “Web 2.0 Technologies of the Self.” Philosophy & Technology 25.3 (2012): 399–413.
Bauman, Zygmunt. Liquid Modernity. Cambridge: Polity, 2000.
———. Consuming Life. Cambridge: Polity, 2007.
———, and David Lyon. Liquid Surveillance. Cambridge: Polity, 2013.
Beck, Ulrich, and Elizabeth Beck-Gernsheim. Individualisation. London: Sage, 2001.
Bruns, Axel, and Jan-Hinrik Schmidt. “Produsage: A Closer Look at Continuing Developments.” New Review of Hypermedia and Multimedia 17.1 (2011): 3–7.
Dutta-Bergman, Mohan J. “Primary Sources of Health Information: Comparisons in the Domain of Health Attitudes, Health Cognitions, and Health Behaviors.” Health Communication 16.3 (2004): 273–288.
Elliott, Anthony, and Charles Lemert. The New Individualism: The Emotional Costs of Globalization. New York: Routledge, 2006.
Foucault, Michel. The Care of the Self: The History of Sexuality. Vol. 3. New York: Random House, 1986.
Giddens, Anthony. Modernity and Self-Identity: Self and Society in the Late Modern Age. Cambridge: Polity, 1991.
Hookway, Nicholas. “The Moral Self: Class, Narcissism and the Problem of Do-It-Yourself Moralities.” The Sociological Review, 15 Mar. 2017. <http://journals.sagepub.com/doi/abs/10.1177/0038026117699540?journalCode=sora>.
Jenders, Maximilian, et al. “Analyzing and Predicting Viral Tweets.” Proceedings of the 22nd International Conference on World Wide Web (WWW). Rio de Janeiro, 13-17 May 2013.
Kata, Anna. “Anti-Vaccine Activists, Web 2.0, and the Postmodern Paradigm: An Overview of Tactics and Tropes Used Online by the Anti-Vaccination Movement.” Vaccine 30.25 (2012): 3778–89.
Lincoln, Sian, and Brady Robards. “Editing the Project of the Self: Sustained Facebook Use and Growing Up Online.” Journal of Youth Studies 20.4 (2017): 518–531.
Lupton, Deborah. The Imperative of Health: Public Health and the Regulated Body. London: Sage, 1995.
Sauter, Theresa. “‘What’s on Your Mind?’ Writing on Facebook as a Tool for Self-Formation.” New Media & Society 16.5 (2014): 823–839.
Sayer, Andrew. Why Things Matter to People: Social Science, Values and Ethical Life. Cambridge: Cambridge University Press, 2011.
Smith, Gavin J.D., and Pat O’Malley. “Driving Politics: Data-Driven Governance and Resistance.” The British Journal of Criminology 56.1 (2016): 1–24.
Turkle, Sherry. Reclaiming Conversation: The Power of Talk in a Digital Age. New York: Penguin, 2015.
APA, Harvard, Vancouver, ISO styles, etc.
12

Bac, Bui Van. "Effects of Land use Change on Coprini dung Beetles in Tropical Karst Ecosystems of Puluong Nature Reserve". VNU Journal of Science: Natural Sciences and Technology 35, no. 4 (23 December 2019). http://dx.doi.org/10.25073/2588-1140/vnunst.4930.

Full text
Abstract
I examined variation in community structure, species richness, biomass and abundance of Coprini dung beetles from 45 trapping sites in meadows, 35-year-old secondary forests and primary forests in tropical, high-elevation karst ecosystems of Pu Luong Nature Reserve, Thanh Hoa Province. My main aim was to explore community response to the influence of land-use change. By comparing the structure and community attributes of the beetles between 35-year-old secondary forests and primary forests, I expected to give indications of the conservation value of the old secondary forests for beetle conservation. Community structure differed significantly among land-use types. Species richness, abundance and biomass were significantly higher in forest habitats than in meadows. The cover of ground vegetation, soil clay content and tree diameter are important factors structuring Coprini communities in the karst ecosystems of Pu Luong. The secondary forests, after 35 years of regrowth, showed similarities in species richness, abundance and biomass to primary forests. This gives hope for the recovery of Coprini communities during forest succession.

Keywords: Coprini, dung beetles, karst ecosystems, land use change, Pu Luong.
APA, Harvard, Vancouver, ISO styles, etc.
13

Paull, John. "Beyond Equal: From Same But Different to the Doctrine of Substantial Equivalence". M/C Journal 11, no. 2 (1 June 2008). http://dx.doi.org/10.5204/mcj.36.

Full text
Abstract
A same-but-different dichotomy has recently been encapsulated within the US Food and Drug Administration’s ill-defined concept of “substantial equivalence” (USFDA, FDA). By invoking this concept, the genetically modified organism (GMO) industry has escaped the rigors of safety testing that might otherwise apply. The curious concept of “substantial equivalence” grants a presumption of safety to GMO food. This presumption has yet to be earned, and has been used to constrain labelling of both GMO and non-GMO food. It is an idea that well serves corporatism. It enables the claim of difference to secure patent protection, while upholding the contrary claim of sameness to avoid labelling and safety scrutiny. It offers the best of both worlds for corporate food entrepreneurs, and delivers the worst of both worlds to consumers. The term “substantial equivalence” has established its currency within the GMO discourse. As the opportunities for patenting food technologies expand, the GMO recruitment of this concept will likely be a dress rehearsal for the developing debates on the labelling and testing of other techno-foods – including nano-foods and clone-foods.

“Substantial Equivalence”

“Are the Seven Commandments the same as they used to be, Benjamin?” asks Clover in George Orwell’s “Animal Farm”. By way of response, Benjamin “read out to her what was written on the wall. There was nothing there now except a single Commandment. It ran: ALL ANIMALS ARE EQUAL BUT SOME ANIMALS ARE MORE EQUAL THAN OTHERS”. After this reductionist revelation, further novel and curious events at Manor Farm “did not seem strange” (Orwell, ch. X). Equality is a concept at the very core of mathematics, but beyond the domain of logic, equality becomes a hotly contested notion – and the domain of food is no exception.
A novel food has a regulatory advantage if it can claim to be the same as an established food – a food that has proven its worth over centuries, perhaps even millennia – and thus does not trigger new, perhaps costly and onerous, testing, compliance, and even new and burdensome regulations. On the other hand, such a novel food has an intellectual property (IP) advantage only in terms of its difference. And thus there is an entrenched dissonance for newly technologised foods, between claiming sameness and claiming difference. The same/different dilemma is erased, or so some would have it, by appeal to the curious new dualist doctrine of “substantial equivalence”, whereby sameness and difference are claimed simultaneously, thereby creating a win/win for corporatism, and a loss/loss for consumerism. This ground has been pioneered, and to some extent conquered, by the GMO industry. The conquest has ramifications for other cryptic food technologies, that is, technologies that are invisible to the consumer and not evident other than via labelling. Cryptic technologies pertaining to food include GMOs, pesticides, hormone treatments, irradiation and, most recently, manufactured nano-particles introduced into the food production and delivery stream. Genetic modification of plants was reported as early as 1984 by Horsch et al. The case of Diamond v. Chakrabarty resulted in a US Supreme Court decision that upheld the prior decision of the US Court of Customs and Patent Appeals that “the fact that micro-organisms are alive is without legal significance for purposes of the patent law”, and ruled that the “respondent’s micro-organism plainly qualifies as patentable subject matter”. This was a majority decision of nine judges, with four judges dissenting (Burger).
It was this Chakrabarty judgement that seriously opened the Pandora’s box of GMOs, because patenting rights make GMOs an attractive corporate proposition by offering potentially unique monopoly rights over food. The rearguard action against GMOs has most often focussed on health repercussions (Smith, Genetic), food security issues, and also the potential for corporate malfeasance to hide behind a cloak of secrecy citing commercial confidentiality (Smith, Seeds). Others have tilted at the foundational plank on which the economics of the GMO industry sits: “I suggest that the main concern is that we do not want a single molecule of anything we eat to contribute to, or be patented and owned by, a reckless, ruthless chemical organisation” (Grist 22). The GMO industry exhibits bipolar behaviour, invoking the concept of “substantial difference” to claim patent rights by way of “novelty”, and then claiming “substantial equivalence” when dealing with other regulatory authorities including food, drug and pesticide agencies; a case of “having their cake and eating it too” (Engdahl 8). This is a clever sleight-of-rhetoric, laying claim to the best of both worlds for corporations, and the worst of both worlds for consumers. Corporations achieve patent protection and no concomitant specific regulatory oversight; while consumers pay the cost of patent monopolization, and are not necessarily apprised, by way of labelling or otherwise, that they are purchasing and eating GMOs, and thereby financing the GMO industry. The lemma of “substantial equivalence” does not bear close scrutiny. It is a fuzzy concept that lacks a tight testable definition. It is exactly this fuzziness that allows lots of wriggle room to keep GMOs out of rigorous testing regimes. Millstone et al. argue that “substantial equivalence is a pseudo-scientific concept because it is a commercial and political judgement masquerading as if it is scientific.
It is moreover, inherently anti-scientific because it was created primarily to provide an excuse for not requiring biochemical or toxicological tests. It therefore serves to discourage and inhibit informative scientific research” (526). “Substantial equivalence” grants GMOs the benefit of the doubt regarding safety, and thereby leaves unexamined the ramifications for human consumer health, for farm labourer and food-processor health, for the welfare of farm animals fed a diet of GMO grain, and for the well-being of the ecosystem, both in general and in its particularities. “Substantial equivalence” was introduced into the food discourse by an Organisation for Economic Co-operation and Development (OECD) report: “safety evaluation of foods derived by modern biotechnology: concepts and principles”. It is from this document that the ongoing mantra of assumed safety of GMOs derives: “modern biotechnology … does not inherently lead to foods that are less safe … . Therefore evaluation of foods and food components obtained from organisms developed by the application of the newer techniques does not necessitate a fundamental change in established principles, nor does it require a different standard of safety” (OECD, “Safety” 10). This was at the time, and remains, an act of faith, a pro-corporatist and a post-cautionary approach. The OECD motto reveals where their priorities lean: “for a better world economy” (OECD, “Better”). The term “substantial equivalence” was preceded by the 1992 USFDA concept of “substantial similarity” (Levidow, Murphy and Carr) and was adopted from a prior usage by the US Food and Drug Agency (USFDA) where it was used pertaining to medical devices (Miller). Even GMO proponents accept that “Substantial equivalence is not intended to be a scientific formulation; it is a conceptual tool for food producers and government regulators” (Miller 1043). 
And there’s the rub – there is no scientific definition of “substantial equivalence”, no scientific test of proof of concept, and nor is there likely to be, since this is a ‘spinmeister’ term. And yet this is the cornerstone on which rests the presumption of safety of GMOs. Absence of evidence is taken to be evidence of absence. History suggests that this is a fraught presumption. By way of contrast, the patenting of GMOs depends on the antithesis of assumed ‘sameness’. Patenting rests on proven, scrutinised, challengeable and robust tests of difference and novelty. Lightfoot et al. report that transgenic plants exhibit “unexpected changes [that] challenge the usual assumptions of GMO equivalence and suggest genomic, proteomic and metanomic characterization of transgenics is advisable” (1).

GMO Milk and Contested Labelling

Pesticide company Monsanto markets the genetically engineered hormone rBST (recombinant Bovine Somatotropin; also known as rbST and rBGH, recombinant Bovine Growth Hormone; and by the brand name Prosilac) to dairy farmers, who inject it into their cows to increase milk production. This product is not approved for use in many jurisdictions, including Europe, Australia, New Zealand, Canada and Japan. Even Monsanto accepts that rBST leads to mastitis (inflammation and pus in the udder) and other “cow health problems”; however, it maintains that “these problems did not occur at rates that would prohibit the use of Prosilac” (Monsanto). A European Union study identified an extensive list of health concerns of rBST use (European Commission). The US Dairy Export Council, however, entertains no doubt. In their background document they ask “is milk from cows treated with rBST safe?” and answer “Absolutely” (USDEC). Meanwhile, Monsanto’s website raises and answers the question: “Is the milk from cows treated with rbST any different from milk from untreated cows? No” (Monsanto).
Injecting cows with genetically modified hormones to boost their milk production remains a contested practice, banned in many countries. It is the claimed equivalence that has kept consumers of US dairy products in the dark, shielded rBST dairy farmers from having to declare that their milk production is GMO-enhanced, and has inhibited non-GMO producers from declaring their milk as non-GMO, non-rBST, or not hormone enhanced. This is a battle that has simmered, and sometimes raged, for a decade in the US. Finally, there is a modest victory for consumers: the Pennsylvania Department of Agriculture (PDA) requires all labels used on milk products to be approved in advance by the department. The standard issued in October 2007 (PDA, “Standards”) signalled to producers that any milk labels claiming rBST-free status would be rejected. This advice was rescinded in January 2008, allowing new, specific, department-approved textual constructions, and ensuring that any “no rBST” style claim was paired with a PDA-prescribed disclaimer (PDA, “Revised Standards”). However, parsimonious labelling is prohibited: No labeling may contain references such as ‘No Hormones’, ‘Hormone Free’, ‘Free of Hormones’, ‘No BST’, ‘Free of BST’, ‘BST Free’, ‘No added BST’, or any statement which indicates, implies or could be construed to mean that no natural bovine somatotropin (BST) or synthetic bovine somatotropin (rBST) are contained in or added to the product. (PDA, “Revised Standards” 3) Difference claims are prohibited: In no instance shall any label state or imply that milk from cows not treated with recombinant bovine somatotropin (rBST, rbST, RBST or rbst) differs in composition from milk or products made with milk from treated cows, or that rBST is not contained in or added to the product.
If a product is represented as, or intended to be represented to consumers as, containing or produced from milk from cows not treated with rBST any labeling information must convey only a difference in farming practices or dairy herd management methods. (PDA, “Revised Standards” 3) The PDA-approved labelling text for non-GMO dairy farmers is specified as follows: ‘From cows not treated with rBST. No significant difference has been shown between milk derived from rBST-treated and non-rBST-treated cows’ or a substantial equivalent. Hereinafter, the first sentence shall be referred to as the ‘Claim’, and the second sentence shall be referred to as the ‘Disclaimer’. (PDA, “Revised Standards” 4) It is onto the non-GMO dairy farmer alone that the costs of compliance fall. These costs include label preparation and approval, proving non-usage of GMOs, and of creating and maintaining an audit trail. In nearby Ohio a similar consumer-versus-corporatist pantomime is playing out, this time with the Ohio Department of Agriculture (ODA) calling the shots, and again serving the GMO industry. The ODA-prescribed text allowed to non-GMO dairy farmers is “from cows not supplemented with rbST”, and this is to be conjoined with the mandatory disclaimer “no significant difference has been shown between milk derived from rbST-supplemented and non-rbST supplemented cows” (Curet). These are “emergency rules”: they apply for 90 days, and are proposed as permanent. Once again, the onus is on the non-GMO dairy farmers to document and prove their claims. GMO dairy farmers face no such governmental requirements, including no disclosure requirement, and thus an asymmetric regulatory impost is placed on the non-GMO farmer, which opens up new opportunities for administrative demands and technocratic harassment. Levidow et al.
argue, somewhat Eurocentrically, that from its 1990s adoption “as the basis for a harmonized science-based approach to risk assessment” (26) the concept of “substantial equivalence” has “been recast in at least three ways” (58). It is true that the GMO debate has evolved differently in the US and Europe, with other jurisdictions usually adopting intermediate positions, yet the concept persists. Levidow et al. nominate their three recastings as: firstly, an “implicit redefinition” by the appending of “extra phrases in official documents”; secondly, “it has been reinterpreted, as risk assessment processes have … required more evidence of safety than before, especially in Europe”; and thirdly, “it has been demoted in the European Union regulatory procedures so that it can no longer be used to justify the claim that a risk assessment is unnecessary” (58). Romeis et al. have proposed a decision-tree approach to GMO risks based on cascading tiers of risk assessment. However, the defects of the concept of “substantial equivalence” persist. Schauzu identified that such decisions are a matter of “opinion”; that there is “no clear definition of the term ‘substantial’”; that because genetic modification “is aimed at introducing new traits into organisms, the result will always be a different combination of genes and proteins”; and that “there is no general checklist that could be followed by those who are responsible for allowing a product to be placed on the market” (2). Benchmark for Further Food Novelties? The discourse, contestation, and debate about “substantial equivalence” have largely focussed on the introduction of GMOs into food production processes. GM can best be regarded as the test case, and proof of concept, for establishing “substantial equivalence” as a benchmark for evaluating new and forthcoming food technologies.
This is of concern because the concept of “substantial equivalence” is scientific hokum, and yet its persistence, even entrenchment, within regulatory agencies may be a harbinger of forthcoming same-but-different debates over nanotechnology and other future bioengineering. The appeal of “substantial equivalence” has been a brake on the creation of GMO-specific regulations and on rigorous GMO testing. The food nanotechnology industry can be expected to look to the precedent of the GMO debate to head off specific nano-regulations and nano-testing. As cloning becomes economically viable, this may be another wave of food innovation that muddies the regulatory waters with the confused – and ultimately self-contradictory – concept of “substantial equivalence”. Nanotechnology engineers particles in the size range 1 to 100 nanometres – a nanometre is one billionth of a metre. This is interesting for manufacturers because at this size chemicals behave differently, or as the Australian Office of Nanotechnology expresses it, “new functionalities are obtained” (AON). Globally, government expenditure on nanotechnology research reached US$4.6 billion in 2006 (Roco 3.12). While there are now many patents (ETC Group; Roco), regulation specific to nanoparticles is lacking (Bowman and Hodge; Miller and Senjen). The USFDA advises that nano-manufacturers “must show a reasonable assurance of safety … or substantial equivalence” (FDA). A recent inventory of nano-products already on the market identified 580 products. Of these, 11.4% were categorised as “Food and Beverage” (WWICS). This is at a time when public confidence in regulatory bodies is declining (HRA). In an Australian consumer survey on nanotechnology, 65% of respondents indicated they were concerned about “unknown and long term side effects”, and 71% agreed that it is important “to know if products are made with nanotechnology” (MARS 22).
Cloned animals are currently more expensive to produce than traditional animal progeny. In the course of 678 pages, the USFDA Animal Cloning: A Draft Risk Assessment has not a single mention of “substantial equivalence”. However the Federation of Animal Science Societies (FASS) in its single page “Statement in Support of USFDA’s Risk Assessment Conclusion That Food from Cloned Animals Is Safe for Human Consumption” states that “FASS endorses the use of this comparative evaluation process as the foundation of establishing substantial equivalence of any food being evaluated. It must be emphasized that it is the food product itself that should be the focus of the evaluation rather than the technology used to generate cloned animals” (FASS 1). Contrary to the FASS derogation of the importance of process in food production, for consumers both the process and provenance of production is an important and integral aspect of a food product’s value and identity. Some consumers will legitimately insist that their Kalamata olives are from Greece, or their balsamic vinegar is from Modena. It was the British public’s growing awareness that their sugar was being produced by slave labour that enabled the boycotting of the product, and ultimately the outlawing of slavery (Hochschild). When consumers boycott Nestle, because of past or present marketing practices, or boycott produce of USA because of, for example, US foreign policy or animal welfare concerns, they are distinguishing the food based on the narrative of the food, the production process and/or production context which are a part of the identity of the food. Consumers attribute value to food based on production process and provenance information (Paull). Products produced by slave labour, by child labour, by political prisoners, by means of torture, theft, immoral, unethical or unsustainable practices are different from their alternatives. 
The process of production is a part of the identity of a product, and consumers are increasingly interested in food narrative. It requires vigilance to ensure that these narratives are delivered with the product to the consumer, and are neither lost nor suppressed. Throughout the GM debate, the organic sector has successfully skirted the “substantial equivalence” debate by excluding GMOs from the certified organic food production process. This GMO-exclusion from the organic food stream is the one reprieve available to consumers worldwide who are keen to avoid GMOs in their diet. The organic industry carries the expectation of providing food produced without artificial pesticides and fertilizers, and by extension, without GMOs. Most recently, the Soil Association, the leading organic certifier in the UK, claims to be the first organisation in the world to exclude manufactured nanoparticles from their products (Soil Association). There have been calls for engineered nanoparticles to be excluded from organic standards worldwide, given that there is no mandatory safety testing and no compulsory labelling in place (Paull and Lyons). The twisted rhetoric of oxymorons does not make the ideal foundation for policy. Setting food policy on the shifting sands of “substantial equivalence” seems foolhardy when we consider the potentially profound ramifications of globally mass marketing a dysfunctional food. Of the 2×2 matrix of terms – “substantial equivalence”, substantial difference, insubstantial equivalence, insubstantial difference – only one corner is engaged for food policy; while its elements remain matters of opinion, rather than being testable by science or by some other regime, the public is the dupe, and potentially the victim.
“Substantial equivalence” has served the GMO corporates well and the public poorly, and this asymmetry is slated to escalate if nano-food and clone-food are also folded into the “substantial equivalence” paradigm. Only in Orwellian Newspeak is war peace, or is same different. It is time to jettison the pseudo-scientific doctrine of “substantial equivalence”, as a convenient oxymoron, and embrace full disclosure of provenance, process and difference, so that consumers are not collateral in a continuing asymmetric knowledge war.
References
Australian Office of Nanotechnology (AON). Department of Industry, Tourism and Resources (DITR), 6 Aug. 2007. 24 Apr. 2008 < http://www.innovation.gov.au/Section/Innovation/Pages/AustralianOfficeofNanotechnology.aspx >.
Bowman, Diana, and Graeme Hodge. “A Small Matter of Regulation: An International Review of Nanotechnology Regulation.” Columbia Science and Technology Law Review 8 (2007): 1-32.
Burger, Warren. “Sidney A. Diamond, Commissioner of Patents and Trademarks v. Ananda M. Chakrabarty, et al.” Supreme Court of the United States, decided 16 June 1980. 24 Apr. 2008 < http://caselaw.lp.findlaw.com/cgi-bin/getcase.pl?court=US&vol=447&invol=303 >.
Curet, Monique. “New Rules Allow Dairy-Product Labels to Include Hormone Info.” The Columbus Dispatch 7 Feb. 2008. 24 Apr. 2008 < http://www.dispatch.com/live/content/business/stories/2008/02/07/dairy.html >.
Engdahl, F. William. Seeds of Destruction. Montréal: Global Research, 2007.
ETC Group. Down on the Farm: The Impact of Nano-Scale Technologies on Food and Agriculture. Ottawa: Action Group on Erosion, Technology and Conservation, November 2004.
European Commission. Report on Public Health Aspects of the Use of Bovine Somatotropin. Brussels: European Commission, 15-16 March 1999.
Federation of Animal Science Societies (FASS). Statement in Support of FDA’s Risk Assessment Conclusion That Cloned Animals Are Safe for Human Consumption. 2007. 24 Apr. 2008 < http://www.fass.org/page.asp?pageID=191 >.
Grist, Stuart. “True Threats to Reason.” New Scientist 197.2643 (16 Feb. 2008): 22-23.
Hochschild, Adam. Bury the Chains: The British Struggle to Abolish Slavery. London: Pan Books, 2006.
Horsch, Robert, Robert Fraley, Stephen Rogers, Patricia Sanders, Alan Lloyd, and Nancy Hoffman. “Inheritance of Functional Foreign Genes in Plants.” Science 223 (1984): 496-498.
HRA. Awareness of and Attitudes toward Nanotechnology and Federal Regulatory Agencies: A Report of Findings. Washington: Peter D. Hart Research Associates, 25 Sep. 2007.
Levidow, Les, Joseph Murphy, and Susan Carr. “Recasting ‘Substantial Equivalence’: Transatlantic Governance of GM Food.” Science, Technology, and Human Values 32.1 (Jan. 2007): 26-64.
Lightfoot, David, Rajsree Mungur, Rafiqa Ameziane, Anthony Glass, and Karen Berhard. “Transgenic Manipulation of C and N Metabolism: Stretching the GMO Equivalence.” American Society of Plant Biologists Conference: Plant Biology, 2000.
MARS. “Final Report: Australian Community Attitudes Held about Nanotechnology – Trends 2005-2007.” Report prepared for Department of Industry, Tourism and Resources (DITR). Miranda, NSW: Market Attitude Research Services, 12 June 2007.
Miller, Georgia, and Rye Senjen. “Out of the Laboratory and on to Our Plates: Nanotechnology in Food and Agriculture.” Friends of the Earth, 2008. 24 Apr. 2008 < http://nano.foe.org.au/node/220 >.
Miller, Henry. “Substantial Equivalence: Its Uses and Abuses.” Nature Biotechnology 17 (7 Nov. 1999): 1042-1043.
Millstone, Erik, Eric Brunner, and Sue Mayer. “Beyond ‘Substantial Equivalence’.” Nature 401 (7 Oct. 1999): 525-526.
Monsanto. “Posilac, Bovine Somatotropin by Monsanto: Questions and Answers about bST from the United States Food and Drug Administration.” 2007. 24 Apr. 2008 < http://www.monsantodairy.com/faqs/fda_safety.html >.
Organisation for Economic Co-operation and Development (OECD). “For a Better World Economy.” Paris: OECD, 2008. 24 Apr. 2008 < http://www.oecd.org/ >.
———. “Safety Evaluation of Foods Derived by Modern Biotechnology: Concepts and Principles.” Paris: OECD, 1993.
Orwell, George. Animal Farm. Adelaide: ebooks@Adelaide, 2004 (1945). 30 Apr. 2008 < http://ebooks.adelaide.edu.au/o/orwell/george >.
Paull, John. “Provenance, Purity and Price Premiums: Consumer Valuations of Organic and Place-of-Origin Food Labelling.” Research Masters thesis, University of Tasmania, Hobart, 2006. 24 Apr. 2008 < http://eprints.utas.edu.au/690/ >.
Paull, John, and Kristen Lyons. “Nanotechnology: The Next Challenge for Organics.” Journal of Organic Systems (in press).
Pennsylvania Department of Agriculture (PDA). “Revised Standards and Procedure for Approval of Proposed Labeling of Fluid Milk.” Milk Labeling Standards (2.0.1.17.08). Bureau of Food Safety and Laboratory Services, Pennsylvania Department of Agriculture, 17 Jan. 2008.
———. “Standards and Procedure for Approval of Proposed Labeling of Fluid Milk, Milk Products and Manufactured Dairy Products.” Milk Labeling Standards (2.0.1.17.08). Bureau of Food Safety and Laboratory Services, Pennsylvania Department of Agriculture, 22 Oct. 2007.
Roco, Mihail. “National Nanotechnology Initiative – Past, Present, Future.” In William Goddard, Donald Brenner, Sergy Lyshevski and Gerald Iafrate, eds. Handbook of Nanoscience, Engineering and Technology. 2nd ed. Boca Raton, FL: CRC Press, 2007.
Romeis, Jorg, Detlef Bartsch, Franz Bigler, Marco Candolfi, Marco Gielkins, et al. “Assessment of Risk of Insect-Resistant Transgenic Crops to Nontarget Arthropods.” Nature Biotechnology 26.2 (Feb. 2008): 203-208.
Schauzu, Marianna. “The Concept of Substantial Equivalence in Safety Assessment of Food Derived from Genetically Modified Organisms.” AgBiotechNet 2 (Apr. 2000): 1-4.
Soil Association. “Soil Association First Organisation in the World to Ban Nanoparticles – Potentially Toxic Beauty Products That Get Right under Your Skin.” London: Soil Association, 17 Jan. 2008. 24 Apr. 2008 < http://www.soilassociation.org/web/sa/saweb.nsf/848d689047cb466780256a6b00298980/42308d944a3088a6802573d100351790!OpenDocument >.
Smith, Jeffrey. Genetic Roulette: The Documented Health Risks of Genetically Engineered Foods. Fairfield, Iowa: Yes! Books, 2007.
———. Seeds of Deception. Melbourne: Scribe, 2004.
U.S. Dairy Export Council (USDEC). Bovine Somatotropin (BST) Backgrounder. Arlington, VA: U.S. Dairy Export Council, 2006.
U.S. Food and Drug Administration (USFDA). Animal Cloning: A Draft Risk Assessment. Rockville, MD: Center for Veterinary Medicine, U.S. Food and Drug Administration, 28 Dec. 2006.
———. FDA and Nanotechnology Products. U.S. Department of Health and Human Services, U.S. Food and Drug Administration, 2008. 24 Apr. 2008 < http://www.fda.gov/nanotechnology/faqs.html >.
Woodrow Wilson International Center for Scholars (WWICS). “A Nanotechnology Consumer Products Inventory.” Data set as at Sep. 2007. Woodrow Wilson International Center for Scholars, Project on Emerging Technologies, Sep. 2007. 24 Apr. 2008 < http://www.nanotechproject.org/inventories/consumer >.
14

Potts, Jason. "The Alchian-Allen Theorem and the Economics of Internet Animals". M/C Journal 17, no. 2 (18 Feb. 2014). http://dx.doi.org/10.5204/mcj.779.

Full text
Abstract
Economics of Cute There are many ways to study cute: for example, neuro-biology (cute as adaptation); anthropology (cute in culture); political economy (cute industries, how cute exploits consumers); cultural studies (social construction of cute); media theory and politics (representation and identity of cute), and so on. What about economics? At first sight, this might point to a money-capitalism nexus (“the cute economy”), but I want to argue here that the economics of cute actually works through choice interacting with fixed costs and what economists call “the substitution effect”. Cute, in conjunction with the Internet, affects the trade-offs involved in the choices people make. Let me put that more starkly: cute shapes the economy. This can be illustrated with internet animals, which at the time of writing means Grumpy Cat. I want to explain how that mechanism works – but to do so I will need some abstraction. This is not difficult – a simple application of a well-known economics model, namely the Alchian-Allen theorem, or the “third law of demand”. But I am going to take some liberties in order to represent that model clearly in this short paper. Specifically, I will model just two extremes of quality (“opera” and “cat videos”) to represent the end-points of a spectrum. I will also assume that the entire effect of the internet is to lower the cost of cat videos. Now obviously these are just simplifying assumptions “for the purpose of the model”. And the purpose of the model is to illuminate a further aspect of how we might understand cute, by using an economic model of choice and its consequences. This is a standard technique in economics, but not so in cultural studies, so I will endeavour to explain these moments as we go, so as to avoid any confusion about analytic intent. The purpose of this paper is to suggest a way that a simple economic model might be applied to augment the cultural study of cute by seeking to unpack its economic aspect.
This can be elucidated by considering the rise of internet animals as a media-cultural force, as epitomized by “cat videos”. We can explain this through an application of price theory and the theory of demand first proposed by Armen Alchian and William Allen. They showed how a fixed cost imposed equally on high-quality and low-quality goods alike causes a shift in consumption toward the higher-quality good, because it becomes relatively cheaper. Alchian and Allen had in mind something like transport costs on agricultural goods (such as apples). But the same effect also works in reverse (Cowen), and the purpose of this paper is to develop that logic to contribute to explaining how certain structural shifts in production and consumption in digital media, particularly the rise of blog formats such as Tumblr, a primary supplier of kittens on the Internet, can in part be understood as a consequence of this economic mechanism. There are three key assumptions needed to build this argument. The first is that the cost of the internet is independent of what it carries. This is certainly true at the level of machine code, and largely true at higher levels. Content that might be judged aesthetically high quality or low quality – say a Bach cantata or a funny cat video – is treated the same way if both files have the same size. This is a physical and computational aspect of net-neutrality. The internet – or digitization – functions as a fixed cost imposed regardless of what cultural quality is moving across it. Second, while there are costs to using the internet (for example, in hardware or concerning digital literacy), these costs are lower than those of previous analog forms of information and cultural production and dissemination. This is not an empirical claim, but a logical one (revealed preference): if it were not so, people would not have chosen it.
The first two points – net neutrality and lowered cost – I want to take as working assumptions, although they can obviously be debated. But that is not the purpose of the paper, which is instead the third point – the “Alchian-Allen theorem”, or the third fundamental law of demand. The Alchian-Allen Theorem The Alchian-Allen theorem is an extension of the law of demand (Razzolini et al) that considers how the distribution of high quality and low quality substitutes of the same good (such as apples) is affected by the imposition of a fixed cost (such as transportation). It is also known as the “shipping the good apples out” theorem, after Borcherding and Silberberg explained why places that produce a lot of apples – such as Seattle in the US – often also have low supplies of high quality apples compared to places that do not produce apples, such as New York. The puzzle of “why can’t you get good apples in Seattle?” is a simple but clever application of price theory. When a place produces high quality and low quality items, it will be rational for those in faraway places to consume the high quality items, and it will be rational for the producers to ship them, leaving only the low quality items locally. Why? Assume preferences and incomes are the same everywhere and that the transport cost is the same regardless of whether the item shipped is high or low quality. Both high quality and low quality apples are more expensive in New York compared to Seattle, but because the fixed transport cost applies to both, the high quality apples are relatively less expensive. Rational consumers in New York will consume more high quality apples. This makes fewer available in Seattle.
Figure 1: Change in consumption ratio after the imposition of a fixed cost to all apples
Another example: Australians drink higher quality Californian wine than Californians, and vice versa, because it is only worth shipping the high quality wine out.
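The relative-price arithmetic behind the apples example can be sketched with a few lines of code. The prices here are purely illustrative assumptions, not figures from Alchian and Allen; what matters is the direction of the ratio change:

```python
def relative_price(p_high, p_low, fixed_cost=0.0):
    """Price of the high-quality good measured in units of the
    low-quality good, after the same fixed cost is added to both."""
    return (p_high + fixed_cost) / (p_low + fixed_cost)

# Hypothetical Seattle prices: good apples $1.00, ordinary apples $0.50.
in_seattle = relative_price(1.00, 0.50)         # 2.0 ordinary apples per good apple
# Ship both grades to New York at a flat $1.00 per apple:
in_new_york = relative_price(1.00, 0.50, 1.00)  # ~1.33 ordinary apples per good apple

# The fixed cost leaves good apples absolutely dearer but relatively cheaper,
# so rational New Yorkers substitute toward the high-quality good.
assert in_new_york < in_seattle
```

The same function captures the general theorem: any common additive cost compresses the price ratio between quality grades, which is the substitution effect the paper relies on.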
A counter-argument is that learning effects dominate: with high quality local product, local consumers learn to appreciate quality, and have different preferences (Cowen and Tabarrok). The Alchian-Allen theorem applies to any fixed cost that applies generally. For example, consider illegal drugs (such as alcohol during the US prohibition, or marijuana or cocaine presently) and the implication of a fixed penalty – such as a fine, or a prison sentence, which is like a cost – applied to trafficking or consumption. Alchian-Allen predicts a shift toward higher quality (or stronger) drugs, because with a fixed penalty and probability of getting caught, the relatively stronger substance is now relatively cheaper. Empirical work finds that this effect did occur during alcohol prohibition, and is currently occurring in narcotics (Thornton Economics of Prohibition, "Potency of illegal drugs"). Another application, proposed by Steven Cuellar, uses Alchian-Allen to explain a well-known statistical phenomenon: why women taking the contraceptive pill on average prefer “more masculine” men. This is once again a shift toward quality, predicated on a falling relative price given a common ‘fixed price’ (taking the pill) of sexual activity. Jean Eid et al show that the result also applies to racehorses (the good horses get shipped out), and Staten and Umbeck show it applies to students – the good students go to faraway universities, and the good students in those places do the same. So that’s apples, drugs, sex and racehorses. What about the Internet and kittens? Allen-Alchian Explains Why the Internet Is Made of Cats In analog days, before digitization and the Internet, the transactions costs involved with various consumption items, whether commodities or media, meant that the Alchian-Allen effect pushed in the direction of higher quality, bundled product.
Any additional fixed costs, such as higher transport costs, or taxes or duties, or transactions costs associated with search and coordination and payment, i.e. costs that affected all substitutes in the same way, would tend to make the higher quality item relatively less expensive, increasing its consumption. But digitisation and the Internet reverse the direction of these transactions costs. Rather than adding a fixed cost, such as transport costs, the various aspects of the digital revolution are equivalent to a fall in fixed costs, particularly access. These factors are not just one thing, but a suite of changes that add up to lowered transaction costs in the production, distribution and consumption of media, culture and games. These include:
- The internet and world-wide-web, and its unencumbered operation
- The growth and increasing efficacy of search technology
- Growth of universal broadband for fast, wide band-width access
- Growth of mobile access (through smartphones and other appliances)
- Growth of social media networks (Facebook, Twitter; Metcalfe’s law)
- Growth of developer and distribution platforms (iPhone, android, iTunes)
- Globally falling hardware and network access costs (Moore’s law)
- Growth of e-commerce (Ebay, Amazon, Etsy) and e-payments (paypal, bitcoin)
- Expansions of digital literacy and competence
- Creative commons
These effects do not simply shift us down a demand curve for each given consumption item. This effect alone simply predicts that we consume more. But the Alchian-Allen effect makes a different prediction, namely that we consume not just more, but also different. These effects function to reduce the overall fixed costs or transactions costs associated with any consumption, sharing, or production of media, culture or games over the internet (or in digital form). With this overall fixed cost component now reduced, it represents a relatively larger decline in cost at the lower-quality, more bite-sized or unbundled end of the media goods spectrum.
As such, this predicts a change in the composition of the overall consumption basket to reflect the changed relative prices that the above effects give rise to. See Figure 2 below (based on a blog post by James Oswald). The key to the economics of cute, in consequence of digitisation, is to follow through the qualitative change that, because of the Alchian-Allen effect, moves away from the high-quality, highly-bundled, high-value end of the media goods spectrum. The “pattern prediction” here is toward more, different, and lower quality: toward five minutes of “Internet animals”, rather than a full day at the zoo.
Figure 2: Reducing transaction costs lowers the relative price of cat videos
Consider five dimensions in which this more and different tendency plays out. Consumption These effects make digital and Internet-based consumption cheaper, shifting us down a demand curve, so we consume more. That’s the first law of demand in action: i.e. demand curves slope downwards. But a further effect – brilliantly set out in Cowen – is that we also consume lower-quality media. This is not a value judgment. These lower-quality media may well have much higher aesthetic value. They may be funnier, or more tragic and sublime; or faster, or not. This is not about absolute value; only about relative value. Digitization operating through Alchian-Allen skews consumption toward the lower quality ends in some dimensions: whether this is time, as in shorter – or cost, as in cheaper – or size, as in smaller – or transmission quality, as in gifs. This can also be seen as a form of unbundling, of dropping dimensions that are not valued so as to create a simplified product. So we consume different, with higher variance. We sample more than we used to. This means that we explore a larger information world. Consumption is bite-sized and assorted.
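The reverse direction of the theorem, which is the paper's central claim, can be illustrated with the same relative-price arithmetic. The numbers and the "opera versus cat video" framing are assumptions of this sketch, echoing the two quality extremes modelled above:

```python
def relative_price(p_high, p_low, access_cost):
    """Price of the high-quality good in units of the low-quality good,
    when both carry the same fixed access cost."""
    return (p_high + access_cost) / (p_low + access_cost)

# Hypothetical content prices: "opera" $20.00, "cat video" $0.50.
analog  = relative_price(20.00, 0.50, 10.00)  # high analog-era access cost
digital = relative_price(20.00, 0.50, 0.10)   # near-zero internet access cost

# Removing the common fixed cost makes the low-quality, bite-sized good
# relatively much cheaper, so the consumption basket tilts toward cat videos.
assert digital > analog
```

Running the theorem in reverse like this is exactly the "Internet is made of cats" prediction: the access cost falls for both quality extremes, but the relative price of the cheap, unbundled good falls furthest.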
This tendency is evident in the rise of apps and in the proliferation of media forms and devices, and in the value of interoperability. Production As consumption shifts (lower quality, greater variety), so must production. The production process has two phases: (1) figuring out what to do, or development; and (2) doing it, or making. The world of trade and globalization describes the latter part: namely, efficient production. The main challenge is the world of innovation: the entrepreneurial and experimental world of figuring out what to do, and how. It is this second world that is radically transformed by the implications of lowered transaction costs. One implication is the growth of user-communities based around collaborative media projects (such as open source software) and community-based platforms or common pool resources for sharing knowledge, such as the “Maker movement” (Anderson 2012). This phenomenon of user co-creation, or produsers, has been widely recognized as an important new phenomenon in the innovation and production process, particularly in those processes associated with new digital technologies. There are numerous explanations for this, particularly around preferences for cooperation, community-building, social learning and reputational capital, and entrepreneurial expectations (Quiggin and Potts; Banks and Potts). Business Models The Alchian-Allen effect on consumption and production follows through to business models. A business model is a way of extracting value that represents some strategic equilibrium between market forms, organizational structures, technological possibilities and institutional framework and environmental conditions, and that manifests in entrepreneurial patterns of business strategy and particular patterns of investment and organization. The discovery of effective business models is a key process of market capitalist development and competition. The Alchian-Allen effect impacts on the space of effective viable business models.
Business models that used to work will work less well, or not at all. And new business models will be required. It is a significant challenge to develop these “economic technologies”. Perhaps no less so than the development of the physical technologies, new business models are produced through experimental trial and error. They cannot be known in advance or planned. But business models will change, which will affect not only the constellation of existing companies and the value propositions that underlie them, but also the broader specializations based on these, in terms of skill sets held and developed by people, locations of businesses and people, and so on. New business models will emerge from a process of Schumpeterian creative destruction as it unfolds (Beinhocker). The large-production, high-development-cost, proprietary-intellectual-property and systems-based business model is not likely to survive, other than in niche areas. More experimental, discovery-focused, fast-development-then-scale-up business models are more likely to fit the new ecology. Social Network Markets & Novelty Bundling Markets The growth of variety and diversity of choice that comes with this change in the way media is consumed, reflecting a reallocation of consumption toward smaller, more bite-sized, lower-valued chunks (the Alchian-Allen effect), presents consumers with a problem, namely that they have to make more choices over novelty. Choice over novelty is difficult for consumers because it is experimental and potentially costly due to the risk of mistakes (Earl), but it also presents entrepreneurs with an opportunity to seek to help solve that problem. The problem is a simple consequence of bounded rationality and time scarcity. It is equivalent to saying that the cost of choice rises monotonically with the number of choices, and that because there is no way to make a complete rational choice, agents will use decision or choice heuristics.
These heuristics can be developed independently by agents themselves through experience, or they can be copied or adopted from others (Earl and Potts). What Potts et al. call “social network markets” and what Potts calls “novelty bundling markets” are both instances of the latter process of copying and adopting decision rules. Social network markets occur when agents use a “copy the most common” or “copy the highest rank” meta-level decision rule (Bentley et al.) to deal with uncertainty. Social network markets can be efficient aggregators of distributed information, but they can also be path-dependent and usually lead to winner-take-all situations and dynamics. These can result in huge pay-off differentials between first and second or fifth place, even when the initial quality differentials are slight or random. Diversity, rapid experimentation, and “fast failure” are likely to be effective strategies. This also points to the role of trust and reputation in using adopted decision rules, and to the information economics that underlies it: namely, that specialization and trade apply to the production and consumption of information as well as of commodities. Novelty bundling markets are an entrepreneurial response to this problem, observable in a range of new media and creative industries contexts. These include arts, music or food festivals or fairs, where entertainment and sociality are combined with low-opportunity-cost situations in which to try bundles of novelty and connect with experts. These are run by agents who have developed expert preferences through investment and experience in consumption of the particular segment or domain. They are expert consumers, selling their “decision rules” and not just the product. The more the production and consumption of media and digital information goods and services experience the Alchian-Allen effect, the greater the importance of novelty bundling markets.
Intellectual Property & Regulation

A further implication is that rent-seeking solutions may also emerge. This can be seen in two dimensions: the pursuit of intellectual property (Boldrin and Levine), and the demand for regulations (Stigler). The Alchian-Allen induced shift will affect markets and business models (and firms), and will therefore induce strategic defensive and aggressive responses from different organizations. Some organizations will seek to adapt to this new world through innovative competition. Other firms will fight through political connections. Most incumbent firms have substantial investments in IP, or in the business models it supports. Yet the intellectual property model is optimized for high-quality, large-volume, centralized production and global sales of undifferentiated product. Much industrial and labour regulation is built on that model, and how governments support such industries is predicated on its stability. The Alchian-Allen effect threatens to upset that model. Political pushback will invariably take the form of opposing most new business models and the new entrants they carry.

Conclusion

I have presented here a lesser-known but important theorem in applied microeconomics, the Alchian-Allen effect, and explained why its inverse is central to understanding the evolution of new media industries, and also why cute animals proliferate on the Internet. The theorem states that when a fixed cost is added to substitute goods, consumers will shift to the higher-quality item (now relatively less expensive). The theorem also holds in reverse: when a fixed cost is removed from substitute items, we expect a shift to lower-quality consumption. The Internet has dramatically lowered the fixed costs of access to media consumption, and various development platforms have similarly lowered the costs of production. Alchian-Allen predicts a shift to lower-quality, “bittier”, cuter consumption (Cowen).
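The relative-price arithmetic behind the theorem can be sketched in a few lines. The prices here are hypothetical, chosen only to make the ratio visible; nothing hangs on the particular numbers:

```python
# Hypothetical per-unit prices for two substitute media goods.
HIGH, LOW = 3.0, 1.0  # "high quality" and "low quality" items

def relative_price(fixed_cost: float) -> float:
    """Price of the high-quality good relative to the low-quality good,
    once a per-unit fixed cost (shipping, access, distribution) is
    added to both."""
    return (HIGH + fixed_cost) / (LOW + fixed_cost)

with_cost = relative_price(2.0)  # (3 + 2) / (1 + 2), about 1.67
no_cost = relative_price(0.0)    # 3 / 1 = 3.0

# Adding the fixed cost lowers the relative price of quality, shifting
# consumption toward the high-quality good; removing it (as the Internet
# does for media access) raises it again, shifting consumption toward
# cheaper, "bittier" goods -- the inverse effect described above.
assert with_cost < no_cost
```

The same comparison generalises to any pair of substitute goods: any strictly positive fixed cost added to both prices pulls the ratio toward 1, making quality relatively cheaper.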
References

Alchian, Armen, and William Allen. Exchange and Production. 2nd ed. Belmont, CA: Wadsworth, 1967.
Anderson, Chris. Makers. New York: Crown Business, 2012.
Banks, John, and Jason Potts. "Consumer Co-Creation in Online Games." New Media and Society 12.2 (2010): 253-70.
Beinhocker, Eric. Origin of Wealth. Cambridge, Mass.: Harvard University Press, 2005.
Bentley, R., et al. "Regular Rates of Popular Culture Change Reflect Random Copying." Evolution and Human Behavior 28 (2007): 151-158.
Borcherding, Thomas, and Eugene Silberberg. "Shipping the Good Apples Out: The Alchian and Allen Theorem Reconsidered." Journal of Political Economy 86.1 (1978): 131-6.
Cowen, Tyler. Create Your Own Economy. New York: Dutton, 2009. (Also published as The Age of the Infovore: Succeeding in the Information Economy. Penguin, 2010.)
Cowen, Tyler, and Alexander Tabarrok. "Good Grapes and Bad Lobsters: The Alchian and Allen Theorem Revisited." Economic Inquiry 33.2 (1995): 253-6.
Cuellar, Steven. "Sex, Drugs and the Alchian-Allen Theorem." Unpublished paper, 2005. 29 Apr. 2014 ‹http://www.sonoma.edu/users/c/cuellar/research/Sex-Drugs.pdf›.
Earl, Peter. The Economic Imagination. Cheltenham: Harvester Wheatsheaf, 1986.
Earl, Peter, and Jason Potts. "The Market for Preferences." Cambridge Journal of Economics 28 (2004): 619-33.
Eid, Jean, Travis Ng, and Terence Tai-Leung Chong. "Shipping the Good Horses Out." Working paper, 2012. ‹http://homes.chass.utoronto.ca/~ngkaho/Research/shippinghorses.pdf›.
Potts, Jason, et al. "Social Network Markets: A New Definition of Creative Industries." Journal of Cultural Economics 32.3 (2008): 166-185.
Quiggin, John, and Jason Potts. "Economics of Non-Market Innovation & Digital Literacy." Media International Australia 128 (2008): 144-50.
Razzolini, Laura, William Shughart, and Robert Tollison. "On the Third Law of Demand." Economic Inquiry 41.2 (2003): 292-298.
Staten, Michael, and John Umbeck. "Shipping the Good Students Out: The Effect of a Fixed Charge on Student Enrollments." Journal of Economic Education 20.2 (1989): 165-171.
Stigler, George. "The Theory of Economic Regulation." Bell Journal of Economics 2.1 (1971): 3-22.
Thornton, Mark. The Economics of Prohibition. Salt Lake City: University of Utah Press, 1991.
Thornton, Mark. "The Potency of Illegal Drugs." Journal of Drug Issues 28.3 (1998): 525-40.
15

Brien, Donna Lee, Leonie Rutherford, and Rosemary Williamson. "Hearth and Hotmail". M/C Journal 10, no. 4 (1 Aug. 2007). http://dx.doi.org/10.5204/mcj.2696.

Abstract
Introduction

It has frequently been noted that ICTs and social networking applications have blurred the once-clear boundary between work, leisure and entertainment, just as they have collapsed the distinction between public and private space. While each individual has a sense of what “home” means, both in terms of personal experience and more conceptually, the following three examples of online interaction (based on participants’ interest, or involvement, in activities traditionally associated with the home: pet care, craft and cooking) suggest that the utilisation of online communication technologies can lead to refined and extended definitions of what “home” is. These examples show how online communication can assist in meeting the basic human needs for love, companionship, shelter and food – needs traditionally supplied by the home environment. They also provide individuals with a considerably wider range of opportunities for personal expression and emotional connection, as well as for creative and commercial production, than that provided by the purely physical (and, no doubt, sometimes isolated and isolating) domestic environment. In this way, these case studies demonstrate the interplay and melding of physical and virtual “home” as domestic practices leach from the most private spaces of the physical home into the public space of the Internet (for discussion, see Gorman-Murray, Moss, and Rose). At the same time, online interaction can assert an influence on activity within the physical space of the home, through the sharing of advice about, and modelling of, domestic practices and processes.

A Dog’s (Virtual) Life

The first case study primarily explores the role of online communities in the formation and expression of affective values and personal identity – as traditionally happens in the domestic environment.
Garber described the 1990s as “the decade of the dog” (20), citing a spate of “new anthropomorphic” (22) dog books, Internet “dog chat” sites, remakes of popular classics such as Lassie Come Home, dog-friendly urban amenities, and the meteoric rise of services for pampered pets (28-9). Loving pets has become a lifestyle and culture, witnessed and commodified in Pet Superstores as well as in dog collectables and antiques boutiques, and in publications like The Bark (“the New Yorker of Dog Magazines”) and Clean Run, the international agility magazine, Website, online book store and information gateway for agility products and services. Available online resources for dog lovers have similarly increased rapidly during the decade since Garber’s book was published, with the virtual world now catering for serious hobby trainers, exhibitors and professionals as well as the home-based pet lover. At a recent count, Yahoo Groups – a personal communication portal that facilitates social networking, in this case enabling users to set up electronic mailing lists and Internet forums – boasted just over 9,600 groups servicing dog fanciers and enthusiasts. The list Dogtalk is now an announcement-only mailing list, but was a vigorous discussion forum until mid-2006. Members of Dogtalk were Australian-based “clicker trainers”: serious hobbyist dog trainers, many of whom operated micro-businesses providing dog training or other pet-related services. They shared an online community, but could also engage in “flesh-meets” at seminars, conferences and competitive dog sport meets. An author of this paper (Rutherford) joined this group two years ago because of her interest in clicker training. Clicker training is based on an application of animal learning theory, particularly psychologist B. F. Skinner’s operant conditioning, so called because of the trademark use of a distinctive “click” sound to mark a desired behaviour that is then rewarded.
Clicker trainers tend to dismiss anthropomorphic pack theory that positions the human animal as fundamentally opposed to non-human animals and, thus, foster a partnership (rather than a dominator) mode of social and learning relationships. Partnership and nurturance are common themes within the clicker community (as well as in more traditional “home” locations), as is recognising and valuing the specific otherness of other species. Typically, members regard their pets as affective equals or near-equals to the human animals that are recognised members of their kinship networks. A significant function of the episodic biographical narratives and responses posted to this list was thus to affirm and legitimate this intra-specific kinship as part of normative social relationship – a perspective that is not usually validated in the general population. One of the more interesting nexuses that evolved within Dogtalk links the narrativisation of the pet in the domestic sphere with the pictorial genre of the family album. Emergent technologies, such as digital cameras together with Web-based image manipulation software and hosting (as provided by portals like Photobucket and Flickr), democratise high-quality image creation and facilitate the sharing of these images. Increasingly, the Dogtalk list linked to images uploaded to free online galleries, discussed digital image composition and aesthetics, and shared technical information about cameras and online image distribution. Much of this cultural production and circulation was concerned with digitally inscribing particular relationships with individual animals into cultural memory: a form of family group biography (for a discussion of the family photograph as a display of extended domestic space, see Rose). The other major non-training thread of the community involves the sharing and witnessing of the trauma suffered due to the illness and loss of pets.
While mourning for human family members is supported in the off-line world – with social infrastructure, such as compassionate leave and/or bereavement counselling, part of professional entitlements – public mourning for pets is not similarly supported. Yet, both cultural studies (in its emphasis on cultural memory) and trauma theory have highlighted the importance of social witnessing, whereby traumatic memories must be narratively integrated into memory and legitimised by the presence of a witness in order to loosen their debilitating hold (Felman and Laub 57). Postings on the progress of a beloved animal’s illness or other misfortune and death were thus witnessed and affirmed by other Dogtalk list members – the sick or deceased pet becoming, in the process, a feature of community memory, not simply an individual loss. In terms of such biographical narratives, memory and history are not identical: “Any memories capable of being formed, retained or articulated by an individual are always a function of socially constituted forms, narratives and relations … Memory is always subject to active social manipulation and revision” (Halbwachs qtd. in Crewe 75). In this way, emergent technologies and social software provide sites, akin to that of physical homes, for family members to process individual memories into cultural memory. Dogzonline, the Australian Gateway site for purebred dog enthusiasts, has a forum entitled “Rainbow Bridge” devoted to textual and pictorial memorialisation of deceased pet dogs. Dogster hosts the For the Love of Dogs Weblog, in which images and tributes can be posted, and also provides links to other dog oriented Weblogs and Websites. 
An interesting combination of both therapeutic narrative and the commodification of affect is found in Lightning Strike Pet Loss Support which, while a memorial and support site, also provides links to the emerging profession of pet bereavement counselling and to suppliers of monuments and tributary urns for home or other use.

loobylu and Narratives of Everyday Life

The second case study focuses on online interactions between craft enthusiasts who are committed to the production of distinctive objects to decorate and provide comfort in the home, often using traditional methods. In the case of some popular craft Weblogs, online conversations about craft are interspersed with, or become secondary to, the narration of details of family life, the exploration of important life events or the recording of personal histories. As in the previous examples, the offering of advice and encouragement, and expressions of empathy and support, often characterise these interactions. The loobylu Weblog was launched in 2001 by illustrator and domestic crafts enthusiast Claire Robertson. Robertson is a toy maker and illustrator based in Melbourne, Australia, whose clients have included prominent publishing houses, magazines and the New York Public Library (Robertson “Recent Client List” online). She has achieved a measure of public recognition: her loobylu Weblog has won awards and been favourably commented upon in the Australian press (see Robertson “Press for loobylu” online). In 2005, an article in The Age placed Robertson in the context of a contemporary “craft revolution”, reporting her view that this “revolution” is in “reaction to mass consumerism” (Atkinson online). The hand-made craft objects featured in Robertson’s Weblogs certainly do suggest engagement with labour-intensive pursuits and the construction of unique objects that reject processes of mass production and consumption.
In this context, loobylu is a vehicle for the display and promotion of Robertson’s work as an illustrator and as a craft practitioner. While skills-based, it also promotes a family-centred lifestyle; it advocates the construction by hand of objects designed to enhance the appearance of the family home and the comfort of its inhabitants. Its specific subject matter extends to related aspects of home and family as, in addition to instructions, ideas and patterns for craft, the Weblog features information on commercially available products for home and family, recipes, child rearing advice and links to 27 other craft and other sites (including Nigella Lawson’s, discussed below). The primary member of its target community is clearly the traditional homemaker – the mother – as well as those who may aspire to this role. Robertson does not have the “celebrity” status of Lawson and Jamie Oliver (discussed below), nor has she achieved their market saturation. Indeed, Robertson’s online presence suggests a modest level of engagement that is placed firmly behind other commitments: in February 2007, she announced an indefinite suspension of her blog postings so that she could spend more time with her family (Robertson loobylu 17 February 2007). Yet, like Lawson and Oliver, Robertson has exploited forms of domestic competence traditionally associated with women and the home, and the non-traditional medium of the Internet has been central to her endeavours. The content of the loobylu blog is, unsurprisingly, embedded in, or an accessory to, a unifying running commentary on Robertson’s domestic life as a parent. Miles, who has described Weblogs as “distributed documentaries of the everyday” (66), sums this up neatly: “the weblogs’ governing discursive quality is the manner in which it is embodied within the life world of its author” (67).
Landmark family events are narrated on loobylu and some attract deluges of responses: the 19 June 2006 posting announcing the birth of Robertson’s daughter Lily, for example, drew 478 responses; five days later, one describing the difficult circumstances of her birth drew 232 comments. All of these comments are pithy, with many being simple empathetic expressions or brief autobiographically based commentaries on these events. Robertson’s news of her temporary retirement from her blog elicited 176 comments that both supported her decision and expressed a sense of loss. Frequent exclamation marks attest visually to the emotional intensity of the responses. By narrating aspects of major life events to which the target audience can relate, the postings represent a form of affective mass production and consumption: they are triggers for a collective outpouring of largely homogeneous emotional reaction (joy, in the case of Lily’s birth). As collections of texts, they can be read as auto/biographic records, arranged thematically, that operate at both the individual and the community levels. Readers of the family narratives and the affirming responses to them engage in a form of mass affirmation and consumerism of domestic experience that is easy, immediate, attractive and free of charge. These personal discourses blend fluidly with those of a commercial nature. Some three weeks after loobylu announced the birth of her daughter, Robertson shared on her Weblog news of her mastitis, Lily’s first smile and the family’s favourite television programs at the time, information that many of us would consider to be quite private details of family life. Three days later, she posted a photograph of a sleeping baby with a caption that skilfully (and negatively) links it to her daughter: “Firstly – I should mention that this is not a photo of Lily”.
The accompanying text points out that it is a photo of a baby with the “Zaky Infant Sleeping Pillow” and provides a link to the online pregnancystore.com, from which it can be purchased. A quotation from the manufacturer describing the merits of the pillow follows. Robertson then makes a light-hearted comment on her experiences of baby-induced sleep-deprivation, and the possible consequences of possessing the pillow. Comments from readers similarly alternate between the personal (sharing of experiences) and the commercial (comments on the product itself). One offshoot of loobylu suggests that the original community grew to an extent that it could support specialised groups within its boundaries. A Month of Softies began in November 2004, describing itself as “a group craft project which takes place every month” and an activity that “might give you a sense of community and kinship with other similar minded crafty types across the Internet and around the world” (Robertson A Month of Softies online). Robertson gave each month a particular theme, and readers were invited to upload a photograph of a craft object they had made that fitted the theme, with a caption. These were then included in the site’s gallery, in the order in which they were received. Added to the majority of captions was also a link to the site (often a business) of the creator of the object; another linking of the personal and the commercial in the home-based “cottage industry” sense. From July 2005, A Month of Softies operated through a Flickr site. Participants continued to submit photos of their craft objects (with captions), but also had access to a group photograph pool and public discussion board. This extension simulates (albeit in an entirely visual way) the often home-based physical meetings of craft enthusiasts that in contemporary Australia take the form of knitting, quilting, weaving or other groups.
Chatting with, and about, Celebrity Chefs

The previous studies have shown how the Internet has broken down many barriers between what could be understood as the separate spheres of emotional (that is, home-based private) and commercial (public) life. The online environment similarly enables the formation and development of fan communities by facilitating communication between those fans and, sometimes, between fans and the objects of their admiration. The term “fan” is used here in the broadest sense, referring to “a person with enduring involvement with some subject or object, often a celebrity, a sport, TV show, etc.” (Thorne and Bruner 52) rather than focusing on the more obsessive and, indeed, more “fanatical” aspects of such involvement, behaviour which is increasingly understood as a subculture of more variously constituted fandoms (Jenson 9-29). Our specific interest in fandom in relation to this discussion is how, while marketers and consumer behaviourists study online fan communities for clues on how to more successfully market consumer goods and services to these groups (see, for example, Kozinets, “I Want to Believe” 470-5; “Utopian Enterprise” 67-88; Algesheimer et al. 19-34), fans regularly subvert the efforts of those urging consumer consumption to utilise even the most profit-driven Websites for non-commercial home-based and personal activities. While it is obvious that celebrities use the media to promote themselves, a number of contemporary celebrity chefs employ the media to construct and market widely recognisable personas based on their own, often domestically based, life stories. As examples, Jamie Oliver and Nigella Lawson’s printed books and mass periodical articles, television series and other performances across a range of media continuously draw on, elaborate upon, and ultimately construct their own lives as the major theme of these works.
In this, these celebrity chefs – like many others – draw upon this revelation of their private lives to lend authenticity to their cooking, to the point where their work (whether cookbook, television show, advertisement or live chat room session with their fans) could be described as “memoir-illustrated-with-recipes” (Brien and Williamson). This generic tendency influences these celebrities’ communities, to the point where a number of Websites devoted to marketing celebrity chefs as product brands also enable their fans to share their own life stories with large readerships. Oliver and Lawson’s official Websites confirm the privileging of autobiographical and biographical information, but vary in tone and approach. Each is, for instance, deliberately gendered (see Hollows’ articles for a rich exploration of gender, Oliver and Lawson). Oliver’s hip, boyish, friendly, almost frantic site includes what are purported to be the self-revelatory “Diary” and “About me” sections, a selection of captioned photographs of the chef, his family, friends, co-workers and sponsors, and his Weblog, as well as footage streamed “live from Jamie’s phone”. This self-revelation – which includes significant details about Oliver’s childhood and his domestic life with his “lovely girls, Jools [wife Juliette Norton], Poppy and Daisy” – completely blurs the line between private life and the “Jamie Oliver” brand. While such revelation has been normalised in contemporary culture, this practice stands in great contrast to that of renowned chefs and food writers such as Elizabeth David, Julia Child, James Beard and Margaret Fulton, whose work across various media has largely concentrated on food, cooking and writing about cooking.
The difference here is because Oliver’s (supposedly private) life is the brand, used to sell “Jamie Oliver restaurant owner and chef”, “Jamie Oliver cookbook author and TV star”, “Jamie Oliver advertising spokesperson for Sainsbury’s supermarket” (from which he earns an estimated £1.2 million annually) (Meller online) and “Jamie Oliver social activist” (made MBE in 2003 after his first Fifteen restaurant initiative, Oliver was named “Most inspiring political figure” in the 2006 Channel 4 Political Awards for his intervention into the provision of nutritious British school lunches) (see biographies by Hildred and Ewbank, and Smith). Lawson’s site has a more refined, feminine appearance and layout and is more mature in presentation and tone, featuring updates on her (private and public) “News” and forthcoming public appearances, a glamorous selection of photographs of herself from the past 20 years, and a series of print and audio interviews. Although Lawson’s children have featured in some of her television programs and her personal misfortunes are well known and regularly commented upon by both herself and journalists (her mother, sister and husband died of cancer), discussions of these tragedies, and other widely known aspects of her private life such as her second marriage to advertising mogul Charles Saatchi, are not as overt as on Oliver’s site, and the user must delve to find them. The use of Lawson’s personal memoir, as a sales tool, is thus both present and controlled. This is in keeping with Lawson’s professional experience prior to becoming the “domestic goddess” (Lawson 2000) as an Oxford-educated journalist on the Spectator and deputy literary editor of the Sunday Times. Both Lawson’s and Oliver’s Websites offer readers various ways to interact with them “personally”. Visitors to Oliver’s site can ask him questions and can access a frequently asked question area, while Lawson holds (once monthly, now irregularly) a question and answer forum.
In contrast to this information about, and access to, Oliver and Lawson’s lives, neither of their Websites includes many recipes or other food and cooking focussed information – although there is detailed information profiling their significant number of bestselling cookbooks (Oliver has published 8 cookbooks since 1998, Lawson 5 since 1999), DVDs and videos of their television series and one-off programs, and their name branded product lines of domestic kitchenware (Oliver and Lawson) and foodstuffs (Oliver). Instruction on how to purchase these items is also featured. Both these sites, like Robertson’s, provide various online discussion fora, allowing members to comment upon these chefs’ lives and work, and also to connect with each other through posted texts and images. Oliver’s discussion forum section notes “this is the place for you all to chat to each other, exchange recipe ideas and maybe even help each other out with any problems you might have in the kitchen area”. Lawson’s front page listing states: “You will also find a moderated discussion forum, called Your Page, where our registered members can swap ideas and interact with each other”. The community participants around these celebrity chefs can be, as is the case with loobylu, divided into two groups. The first is “foodie (in Robertson’s case, craft) fans” who appear to largely engage with these Websites to gain, and to share, food, cooking and craft-related information. Such fans on Oliver and Lawson’s discussion lists most frequently discuss these chefs’ television programs and books and the recipes presented therein. They test recipes at home and discuss the results achieved, any problems encountered and possible changes. They also post queries and share information about other recipes, ingredients, utensils, techniques, menus and a wide range of food and cookery-related matters. 
The second group consists of “celebrity fans” who are attracted to the chefs (as to Robertson as craft maker) as personalities. These fans seek and share biographical information about Oliver and Lawson, their activities and their families. These two areas of fan interest (food/cooking/craft and the personal) are not necessarily or always separated, and individuals can be active members of both types of fandoms. Less foodie-orientated users, however (like users of Dogtalk and loobylu), also frequently post their own auto/biographical narratives to these lists. These narratives, albeit often fragmented, may begin with recipes and cooking queries or issues, but veer off into personal stories that possess only minimal or no relationship to culinary matters. These members also return to the boards to discuss their own revealed life stories with others who have commented on these narratives. Although research into this aspect is in its early stages, it appears that the amount of public personal revelation either encouraged, or allowed, is in direct proportion to the “open” friendliness of these sites. More are thus located in Oliver’s site and fewer in Lawson’s, and – as a kind of “control” in this case study, but not otherwise discussed – none in that of Australian chef Neil Perry, whose coolly sophisticated Website perfectly complements Perry’s professional persona as the epitome of the refined, sophisticated and, importantly in this case, unapproachable, high-end restaurant chef. Moreover, non-cuisine related postings are made despite clear directions to the contrary – Lawson’s site stating: “We ask that postings are restricted to topics relating to food, cooking, the kitchen and, of course, Nigella!” and Oliver making the plea, noted above, for participants to keep their discussions “in the kitchen area”.
Of course, all such contemporary celebrity chefs are supported by teams of media specialists who selectively construct the lives that these celebrities share with the public and the postings about others’ lives that are allowed to remain on their discussion lists. The intersection of the findings reported above with the earlier case studies suggests, however, that even these most commercially oriented sites can provide fruitful data regarding their function as home-like spaces where domestic practices and processes can be refined, and emotional relationships formed and fostered.

In Summary

As convergence results in what Turow and Kavanaugh call “the wired homestead”, our case studies show that physically home-based domestic interests and practices – what could be called “home truths” – are also contributing to a refiguration of the private/public interplay of domestic activities through online dialogue. In the case of Dogtalk, domestic space is reconstituted through virtual spaces to include new definitions of family and memory. In the case of loobylu, the virtual interaction facilitates a development of craft-based domestic practices within the physical space of the home, thus transforming domestic routines. Jamie Oliver’s and Nigella Lawson’s sites facilitate development of both skills and gendered identities by means of a bi-directional nexus between domestic practices, sites of home labour/identity production and public media spaces. As participants modify and redefine these online communities to best suit their own needs and desires, even if this is contrary to the stated purposes for which the community was instituted, online communities can be seen to be domesticated; but, equally, these modifications demonstrate that the activities and relationships that have traditionally defined the home are not limited to the physical space of the house.
While virtual communities are “passage points for collections of common beliefs and practices that united people who were physically separated” (Stone qtd. in Jones 19), these interactions can lead to shared beliefs – for example, through advice about pet-keeping, craft and cooking – that can significantly modify practices and routines in the physical home.

Acknowledgments

An earlier version of this paper was presented at the Association of Internet Researchers’ International Conference, Brisbane, 27-30 September 2006. The authors would like to thank the referees of this article for their comments and input. Any errors are, of course, our own.

References

Algesheimer, R., U. Dholakia, and A. Herrmann. “The Social Influence of Brand Community: Evidence from European Car Clubs.” Journal of Marketing 69 (2005): 19-34.
Atkinson, Frances. “A New World of Craft.” The Age (11 July 2005). 28 May 2007 <http://www.theage.com.au/articles/2005/07/10/1120934123262.html>.
Brien, Donna Lee, and Rosemary Williamson. “‘Angels of the Home’ in Cyberspace: New Technologies and Biographies of Domestic Production.” Paper. Biography and New Technologies conference. Humanities Research Centre, Australian National University, Canberra, ACT. 12-14 Sep. 2006.
Crewe, Jonathan. “Recalling Adamastor: Literature as Cultural Memory in ‘White’ South Africa.” In Acts of Memory: Cultural Recall in the Present, eds. Mieke Bal, Jonathan Crewe, and Leo Spitzer. Hanover, NH: Dartmouth College, 1999. 75-86.
Felman, Shoshana, and Dori Laub. Testimony: Crises of Witnessing in Literature, Psychoanalysis, and History. New York: Routledge, 1992.
Garber, Marjorie. Dog Love. New York: Touchstone/Simon and Schuster, 1996.
Gorman-Murray, Andrew. “Homeboys: Uses of Home by Gay Australian Men.” Social and Cultural Geography 7.1 (2006): 53-69.
Halbwachs, Maurice. On Collective Memory. Trans. Lewis A. Closer. Chicago: U of Chicago P, 1992.
Hildred, Stafford, and Tim Ewbank. Jamie Oliver: The Biography. London: Blake, 2001.
Hollows, Joanne. “Feeling like a Domestic Goddess: Post-Feminism and Cooking.” European Journal of Cultural Studies 6.2 (2003): 179-202.
———. “Oliver’s Twist: Leisure, Labour and Domestic Masculinity in The Naked Chef.” International Journal of Cultural Studies 6.2 (2003): 229-248.
Jenson, J. “Fandom as Pathology: The Consequences of Characterization.” The Adoring Audience: Fan Culture and Popular Media. Ed. L. A. Lewis. New York, NY: Routledge, 1992. 9-29.
Jones, Steven G., ed. Cybersociety: Computer-Mediated Communication and Community. Thousand Oaks, CA: Sage, 1995.
Kozinets, R. V. “‘I Want to Believe’: A Netnography of the X’Philes’ Subculture of Consumption.” Advances in Consumer Research 34 (1997): 470-5.
———. “Utopian Enterprise: Articulating the Meanings of Star Trek’s Culture of Consumption.” Journal of Consumer Research 28 (2001): 67-88.
Lawson, Nigella. How to Be a Domestic Goddess: Baking and the Art of Comfort Cooking. London: Chatto and Windus, 2000.
Meller, Henry. “Jamie’s Tips Spark Asparagus Shortages.” Daily Mail (17 June 2005). 21 Aug. 2007 <http://www.dailymail.co.uk/pages/live/articles/health/dietfitness.html?in_article_id=352584&in_page_id=1798>.
Miles, Adrian. “Weblogs: Distributed Documentaries of the Everyday.” Metro 143: 66-70.
Moss, Pamela. “Negotiating Space in Home Environments: Older Women Living with Arthritis.” Social Science and Medicine 45.1 (1997): 23-33.
Robertson, Claire. Claire Robertson Illustration. 2000-2004. 28 May 2007 .
Robertson, Claire. loobylu. 16 Feb. 2007. 28 May 2007 <http://www.loobylu.com>.
Robertson, Claire. “Press for loobylu.” Claire Robertson Illustration. 2000-2004. 28 May 2007 <http://www.clairetown.com/press.html>.
Robertson, Claire. A Month of Softies. 28 May 2007. 21 Aug. 2007 .
Robertson, Claire. “Recent Client List.” Claire Robertson Illustration. 2000-2004. 28 May 2007 <http://www.clairetown.com/clients.html>.
Rose, Gillian.
“Family Photographs and Domestic Spacings: A Case Study.” Transactions of the Institute of British Geographers NS 28.1 (2003): 5-18.
Smith, Gilly. Jamie Oliver: Turning Up the Heat. Sydney: Macmillan, 2006.
Thorne, Scott, and Gordon C. Bruner. “An Exploratory Investigation of the Characteristics of Consumer Fanaticism.” Qualitative Market Research: An International Journal 9.1 (2006): 51-72.
Turow, Joseph, and Andrea Kavanaugh, eds. The Wired Homestead: An MIT Press Sourcebook on the Internet and the Family. Cambridge, MA: MIT Press, 2003.

Citation reference for this article

MLA Style: Brien, Donna Lee, Leonie Rutherford, and Rosemary Williamson. "Hearth and Hotmail: The Domestic Sphere as Commodity and Community in Cyberspace." M/C Journal 10.4 (2007). <http://journal.media-culture.org.au/0708/10-brien.php>.

APA Style: Brien, D., L. Rutherford, and R. Williamson. (Aug. 2007) "Hearth and Hotmail: The Domestic Sphere as Commodity and Community in Cyberspace," M/C Journal, 10(4). Retrieved from <http://journal.media-culture.org.au/0708/10-brien.php>.