Journal articles on the topic 'Price-setting technologies'

Consult the top 50 journal articles for your research on the topic 'Price-setting technologies.'

1

Germeshausen, Robert, and Nikolas Wölfing. "How marginal is lignite? Two simple approaches to determine price-setting technologies in power markets." Energy Policy 142 (July 2020): 111482. http://dx.doi.org/10.1016/j.enpol.2020.111482.

2

Snyder, Brian F. "Beyond the social cost of carbon: Negative emission technologies as a means for biophysically setting the price of carbon." Ambio 49, no. 9 (December 9, 2019): 1567–80. http://dx.doi.org/10.1007/s13280-019-01301-y.

3

Pérez-Pons, María E., Ricardo S. Alonso, Oscar García, Goreti Marreiros, and Juan Manuel Corchado. "Deep Q-Learning and Preference Based Multi-Agent System for Sustainable Agricultural Market." Sensors 21, no. 16 (August 4, 2021): 5276. http://dx.doi.org/10.3390/s21165276.

Abstract:
Yearly population growth will lead to a significant increase in agricultural production in the coming years. Twenty-first century agricultural producers will be facing the challenge of achieving food security and efficiency. This must be achieved while ensuring sustainable agricultural systems and overcoming the problems posed by climate change, depletion of water resources, and the potential for increased erosion and loss of productivity due to extreme weather conditions. Those environmental consequences will directly affect the price setting process. In view of the price oscillations and the lack of transparent information for buyers, a multi-agent system (MAS) is presented in this article. It supports the making of decisions in the purchase of sustainable agricultural products. The proposed MAS consists of a system that supports decision-making when choosing a supplier on the basis of certain preference-based parameters aimed at measuring the sustainability of a supplier and a deep Q-learning agent for agricultural future market price forecast. Therefore, different agri-environmental indicators (AEIs) have been considered, as well as the use of edge computing technologies to reduce costs of data transfer to the cloud. The presented MAS combines price setting optimizations and user preferences in regards to accessing, filtering, and integrating information. The agents filter and fuse information relevant to a user according to supplier attributes and a dynamic environment. The results presented in this paper allow a user to choose the supplier that best suits their preferences as well as to gain insight on agricultural future markets price oscillations through a deep Q-learning agent.
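To make the learning component of this abstract concrete, the following is a minimal, illustrative sketch of the Q-learning value update that underlies a deep Q-learning forecasting agent such as the one described above. The state and action discretisation, the stand-in environment and the reward are hypothetical placeholders, not details taken from the paper, which uses a neural-network (deep) Q-function rather than a table.

```python
import numpy as np

# Tabular Q-learning sketch: states could be discretised price levels,
# actions buy/hold/sell signals for a futures-market agent.
n_states, n_actions = 10, 3              # hypothetical discretisation
alpha, gamma, epsilon = 0.1, 0.95, 0.1   # learning rate, discount, exploration
Q = np.zeros((n_states, n_actions))

def step(state, action, rng):
    """Stand-in environment: returns (reward, next_state). Replace with real market data."""
    reward = rng.normal()                # e.g. trading profit or negative forecast error
    next_state = int(rng.integers(n_states))
    return reward, next_state

rng = np.random.default_rng(0)
state = int(rng.integers(n_states))
for _ in range(10_000):
    # epsilon-greedy action selection
    action = int(rng.integers(n_actions)) if rng.random() < epsilon else int(np.argmax(Q[state]))
    reward, next_state = step(state, action, rng)
    # Q-learning temporal-difference update
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state
```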
4

Verhoef, E. T., J. C. J. M. van den Bergh, and K. J. Button. "Transport, Spatial Economy, and the Global Environment." Environment and Planning A: Economy and Space 29, no. 7 (July 1997): 1195–213. http://dx.doi.org/10.1068/a291195.

Abstract:
In this paper we investigate interdependencies between transport, spatial economy, and the environment in the context of policies aimed at a global environmental target. A small-scale spatial price equilibrium model is formulated and used to perform a number of numerical simulations, and to investigate market-based versus environmentally sound spatioeconomic configurations with first-best and second-best policies, and with endogenous environmental technologies. We thus present a modelling framework capable of dealing with complexities associated with the simultaneous regulation, first-best and second-best, of multiple interdependent sectors in a spatial setting.
5

Ejaz, Muhammad Rahim. "The Future of Flexible Product Manufacturing by Using Industry 4.0 Technologies in Regard to Consumer Preferences." Marketing & Menedzsment 55, no. 3 (December 23, 2021): 7–17. http://dx.doi.org/10.15170/mm.2021.55.03.01.

Abstract:
THE AIMS OF THE PAPER The core objectives of this paper are to understand constantly changing consumer choices over time and to examine the manufacturing of flexible products as an answer to that problem. Flexible products with multiple utility choices can help consumers from every segment fulfil their needs. The study sheds light on the flexible product manufacturing process and discusses market launch strategies, taking various market factors into consideration. METHODOLOGY A rigorous analysis of the literature was conducted to understand why flexible products should be preferred over standard products. Literature related to flexible product strategy is examined, and its dimensions of price setting and product utility are explored. MOST IMPORTANT RESULTS This study provides a road map for companies to shift their focus towards new manufacturing processes that develop flexible products addressing dynamic consumer preferences. It also shows that flexible products might be more profitable for the company. RECOMMENDATIONS The findings show that flexible products provide a larger range of utility choices and, at the right price, can be more profitable than a standard product. Flexible products also lend themselves to mass customisation, which can enlarge the utility choices available to consumers almost without limit.
6

Vasilchenko, Lidiya, Sergey Pepchuk, and Anjelika Bokovnya. "CONTINUUM OF ADVERTISING TECHNOLOGIES AND MARKETING PRICING IN THE FORMATION OF CONSUMER BEHAVIOR." HERALD OF KHMELNYTSKYI NATIONAL UNIVERSITY 300, no. 6 (December 3, 2021): 201–7. http://dx.doi.org/10.31891/2307-5740-2021-300-6-32.

Abstract:
The article considers the necessity of studying consumer motivation at the moment of purchase as one of the main factors ensuring growth in demand for goods and services. The consumer behavior models homo economicus and homo psychologicus and their postulates are reviewed. The review of these models reveals key differences between rational and emotional consumer behavior, which matter for the formation of advertising appeals. The factors that determine the priority between rational and emotional impact, in light of modern research, are considered. The consumer's purchase decision-making process is studied using components of psychology. Characteristics of buyer behavior that are relevant from a marketing point of view, and that directly affect the buyer's attitude to the product price level and reaction to price changes, are presented and considered. It is necessary to take into account the risks to which the consumer is exposed when making a choice, which the manufacturer should bear in mind when setting prices and developing advertising appeals. It was found that studying patterns of consumer behavior allows manufacturers not only to identify consumer motives in the process of choosing and making a purchase decision, but also to form ways to influence them. The analysis of consumer behavior based on the considered models and risks showed that most consumers are not ready, and some do not want, to be responsible for the choice and decision-making, while still striving to “have a choice”. By studying consumer behavior, producers can significantly improve the pricing process and calculate more accurately a “fair” price that is acceptable to the consumer. The need for accurate market segmentation is justified. It is proposed to conduct segmentation based on ABC analysis to ensure segments of a size sufficient for the implementation of flexible pricing strategies. The article reveals that in modern conditions of market saturation, the factors that have a special impact on buying behavior require a new quality. It is determined that the interaction of advertising technologies and pricing is based on a continuous comparison of consumer benefits with the perception of the price offer, which is manifested in the degree of consumer involvement in the purchase process. The necessity of close interaction between advertising technologies and pricing, as the marketing tools most influential on consumer behavior in modern conditions, is substantiated. Achieving such interaction is possible through the study of consumer involvement in brands, the motivating factors of consumer behavior and the perception of advertising.
7

IVAMA-BRUMMELL, Adriana M., Pilar PINILLA-DOMINGUEZ, and Aline N. BIZ. "The regulatory, evaluation, pricing and reimbursement pathway for medicines in the UK: combining innovation and access." Revista Brasileira de Farmácia Hospitalar e Serviços de Saúde 13, no. 2 (June 18, 2022): 804. http://dx.doi.org/10.30968/rbfhss.2022.132.0804.

Abstract:
The United Kingdom has universal healthcare systems, the National Health System (NHS), in its four nations, with healthcare services provided free of charge at the point of delivery. Approximately 10.5% of the UK population has voluntary supplementary private health insurance. While the provision of inpatient medicines is free of charge, medicines provided in the outpatient setting have a dispensing fee in some of the nations, such as the case of England (co-payment). The UK marketing authorisation process is called product licensing and is overseen by the Medicines and Healthcare Products Regulatory Agency (MHRA). There are different licensing routes based on the intended market for launch. MHRA also offers early access schemes and pathways for products targeting unmet medical needs and promising technologies, that aim to accelerate and facilitate market and patient access to products in the UK. These schemes include the option for companies to engage early with regulators and other system partners such as health technology assessment (HTA) agencies. As soon as the technology is authorised, it is available at a list price. Prices for medicines are regulated in legislation and in schemes agreed between the industry association and the Department of Health and Social Care (DHSC). The prices for the NHS are negotiated between the government and the companies. Routine funding decisions in the NHS are guided by HTA evaluations informed by agencies such as the National Institute for Health and Care Excellence (NICE) in England, the Scottish Medicines Consortium (SMC) in Scotland, and the All Wales Medicines Strategy Group (AWMSG) in Wales. Many medicines and other technologies are subject to price negotiations in the NHS, sometimes with confidential price agreements. The NHS in England is legally mandated to routinely fund technologies recommended by NICE that have been evaluated by some of its programmes. The other UK nations have similar arrangements or recognise decisions made in England. The role and contribution of NICE and other HTA agencies in ensuring value for money and evidence-based decision making is well recognised worldwide.
8

McKenzie, Andrew, Daniel Schlauch, Yasha Sharma, David R. Spigel, Howard A. Burris, Suzanne Fields Jones, and Holli Hutcheson Dilks. "Adoption and utilization of NGS-based molecular profiling in community-based oncology practices: Insights from Sarah Cannon." Journal of Clinical Oncology 37, no. 15_suppl (May 20, 2019): e18064-e18064. http://dx.doi.org/10.1200/jco.2019.37.15_suppl.e18064.

Abstract:
e18064 Background: As the price of NGS-based sequencing technologies falls, adoption of those technologies is predicted to increase. To date, a survey of the adoption and utilization of NGS-based technologies as a part of routine oncology clinical care has not been performed. Thus, a comprehensive analysis of physician adoption and utilization of commercial NGS testing in the community setting between 2012 and 2018 was performed. Methods: Medical oncologists in the Sarah Cannon Research Institute network ordered commercially available NGS-based molecular profiling for their patients as standard of care. Data use agreements were initiated between SCRI, affiliated medical oncology practices, and commercial NGS testing providers, and patient NGS data were subsequently analyzed starting in 2012. Results: Community-based NGS testing rates within the Sarah Cannon network were 5.75/month in 2012 and 440/month in 2018. Plasma-based NGS testing began in 2014 and comprised 4.9% of total testing, compared to 40.1% in 2018. The number of oncologists ordering molecular profiles increased from 11 in 2012 to 269 in 2018. Physician test utilization grew from an average of 6 tests/physician in 2012 to 22 tests/physician in 2018. NGS tests were performed on 34 different tumor types, and biopsies were taken from both primary (~40%) and metastatic (~60%) sites. Tissue-based tests averaged 14 mutations/sample while plasma-based tests averaged 4 mutations/sample. There was a 74% decrease in median time between biopsy collection and NGS test results between 2012 and 2018 (131 and 34 days, respectively), indicating a shift toward the use of fresh, non-archival tissue in recent years. Conclusions: These data establish NGS-based testing trends in community oncology practices and show that NGS-based tumor testing utilization increased in the community setting between 2012 and 2018. NGS testing is performed on a wide array of tumor types, and oncologists utilize fresher biopsies.
9

Zgalat-Lоzynska, Liubov. "STATE INNOVATION POLICY FOR GREEN TECHNOLOGIES SUPPORT IN CONSTRUCTION." Green, Blue and Digital Economy Journal 1, no. 2 (December 3, 2020): 8–13. http://dx.doi.org/10.30525/2661-5169/2020-2-2.

Abstract:
The purpose of the paper is to determine the status of, and areas for improvement in, state policy supporting the development of green technologies in construction. Methodology. This review is based on the characteristics of individual areas of implementation of green technologies in construction. The possibilities of using renewable sources in the construction and operation of buildings are considered. It is established that energy-efficient houses use different methods of energy production, including photovoltaic solar panels, heat pumps, photothermal collectors, geothermal waters and mini hydroelectric power plants. It is emphasized that the energy consumption of such buildings should also be reduced; to this end, special architectural design solutions and energy-efficient materials with high thermal insulation properties are used. The peculiarities of the policy of stimulating increased energy efficiency of buildings in Ukraine are considered, the conclusion is drawn that it is insufficiently effective, and the reason for the ineffectiveness of the incentive policy is identified. Directions for reducing air and water pollution through filtration and the use of rain and melt water for household needs are considered. It is established that the active implementation of innovation is hindered mainly by obsolete housing and worn-out utilities. Eco-design is also used to increase energy efficiency in construction, and architects must actively use passive and active methods of designing houses for different climatic conditions. Green construction also involves the recycling of construction waste. Improving the environmental efficiency of buildings and structures involves the use of modern insulation materials and coatings. Currently, nanomaterials with unique properties are becoming widespread in construction: they have improved physical characteristics, in particular they accumulate thermal radiation, providing significant energy savings in winter and summer. The results of the study showed that state regulatory policy (innovation policy, support of science and R&D, technology transfer, price policy regulation, updating of technological regulations) plays an extremely important role in stimulating the spread of green technologies in construction. Practical implications. The most important areas of public policy in construction are: the implementation of international environmental, energy and quality standards for construction products; stimulating consumer demand for environmentally friendly innovative solutions, including through “green” public procurement, setting reasonable prices for energy resources and stimulating the implementation of a holistic product life-cycle concept; and the development of financial mechanisms to support demand for cleaner technologies and innovations.
10

Gaire, Sarthak, and Shridhika Dahal. "Assessment of vegetable production by adopting climate SMART agriculture technologies in Chormara, Nawalparasi district, Nepal." Archives of Agriculture and Environmental Science 6, no. 4 (December 25, 2021): 535–41. http://dx.doi.org/10.26832/24566632.2021.0604016.

Abstract:
Vegetable production is an economic booster, contributing around 9.71% of total Agricultural Gross Domestic Production. The research study “Assessment of vegetable production adopting climate-smart agriculture technologies in Chormara, Nawalparasi district” was therefore performed from March to April 2021 to assess the production of selected vegetables, i.e. cucumber, tomato, bitter gourd, sponge gourd and chili, under climate-smart agriculture technology among 100 households selected by simple random sampling. The study revealed that 96% of the respondents were directly affected by ongoing climate change, and to tackle this 88% of the respondents were adopting climate-smart agricultural technologies, including mulching, drip irrigation, cultivation of vegetables in semi-protected houses and quality seeds, to mitigate the negative impacts of climate change while increasing crop production. To enhance the productivity of vegetables and meet the food security needs of the increasing global population, farmers were integrating organic and synthetic fertilizers to sustain soil health. It was found that 76% of the surveyed farmers faced market hindrances such as the lack of a proper market, fluctuations in price structure and poor marketing channels, suggesting an immediate need for a proper marketing system in the study area. The highest net return of USD 17,588.53 per hectare and a benefit-cost (B:C) ratio of 5.88 for tomatoes illustrated the economic viability of vegetable production. Although vegetable production and marketing in Chormara appear to be a profitable business, the study suggests an immediate need for the adoption and scaling up of successful CSA practices, their extension and proper implementation, along with the provision of effective marketing channels and the setting of minimum prices for vegetable products based on the cost of cultivation, which may overcome the farmers' problems.
11

Burris III, Howard A., Daniel Schlauch, Andrew McKenzie, Yasha Sharma, David R. Spigel, Suzanne Fields Jones, and Holli Hutcheson Dilks. "Adoption and utilization of NGS-based molecular profiling in community-based oncology practices: Insights from Sarah Cannon." Journal of Global Oncology 5, suppl (October 7, 2019): 34. http://dx.doi.org/10.1200/jgo.2019.5.suppl.34.

Abstract:
34 Background: The price of NGS-based sequencing technologies is falling, and the adoption of NGS-based testing is increasing in oncology practices. To date, a survey of the adoption and utilization of NGS-based technologies as a part of routine oncology clinical care has not been performed. Thus, a comprehensive analysis of physician adoption and utilization of commercial NGS testing in the non-academic, community setting between 2012 and 2018 was performed. Methods: Medical oncologists in the Sarah Cannon Research Institute network ordered commercially available NGS-based molecular profiling for their patients as standard of care. Data use agreements were initiated between SCRI, affiliated medical oncology practices, and commercial NGS testing providers, and patient NGS data were subsequently analyzed starting in 2012. Results: Community-based NGS testing rates within the Sarah Cannon network were 5.75/month in 2012 and 440/month in 2018. Plasma-based NGS testing began in 2014 and comprised 4.9% of total testing, compared to 40.1% in 2018. The number of oncologists ordering molecular profiles increased from 11 in 2012 to 269 in 2018. Physician test utilization grew from an average of 6 tests/physician in 2012 to 22 tests/physician in 2018. NGS tests were performed on 34 different tumor types, and biopsies were taken from both primary tumors (~40%) and metastatic sites (~60%). Tissue-based tests averaged 14 mutations/sample while plasma-based tests averaged 4 mutations/sample. There was a 74% decrease in median time between biopsy collection and NGS test results between 2012 and 2018 (131 and 34 days, respectively), indicating a shift toward the use of fresh, non-archival tissue in recent years. Conclusions: These data establish NGS-based testing trends in community oncology practices and show that NGS-based tumor testing utilization increased in the community setting between 2012 and 2018. NGS testing is performed on a wide array of tumor types, and oncologists order tests earlier and utilize fresher biopsies.
12

Sime, Getachew, and Jens Aune. "Sustainability of Improved Crop Varieties and Agricultural Practices: A Case Study in the Central Rift Valley of Ethiopia." Agriculture 8, no. 11 (November 9, 2018): 177. http://dx.doi.org/10.3390/agriculture8110177.

Abstract:
Technological change has been the major driving force for increasing agricultural productivity and promoting agriculture development in developing countries. To improve the agricultural productivity and farmers’ livelihoods, several agricultural technologies (improved crop varieties and related agricultural practices) were introduced by various agencies to the farmers in the Rift Valley of Ethiopia. Thus, the objective of this study is to identify these technologies, and evaluate their characteristics and sustainability. The data were collected from farmers, agricultural extension workers, and agricultural experts, through a series of focus group discussions, key informant interviews, and farm observations, selected through purposive and random sampling techniques. Results showed that extension systems, social networks, or research projects were the agencies that introduced the technologies to the farmers. Haricot beans (Phaseolus vulgaris L.) and early and mid-maturing maize (Zea mays L.), as well as agricultural practices like row-sowing, banding fertilizer application, intercropping, and traditional rainwater-harvesting, were found to be in continuous use by the farmers. In contrast, the use of extra-early-maturing maize, sorghum (Sorghum bicolor L.) and finger millet (Eleusine coracana L.), as well as the use of related practices, including harvesting maize at physiological maturity, seed priming and fertilizer microdosing, were the technologies that were discontinued at the time of pursuing this study. Most of the continuing technologies had a high potential for reducing the vulnerability of the rain-fed agriculture to rainfall variability. Regardless of sources, the national extension system supported technologies that were integrated into the system only. Most of the discontinued technologies were found to be introduced by the research projects. These technologies were not brought into the attention of policy-makers for their integration into the extension system. The farmers also disliked a few of them for unfitting the existing socioeconomic setting. Whereas, the technologies that were introduced by the social networks were found to be widely used by the farmers, though they were not supported by the extension system. This is because most such technologies offer better yield and income. For instance, social networks have popularized haricot beans and hybrid maize because of their higher benefits to farmers. Farmers consider both socioeconomic and agroecological conditions for selecting and using technologies, whereas the extension system centers on existing agroecological conditions for recommending and supporting agricultural technologies. Consideration of both socioeconomic and agroecological settings would increase the prospect of a technology for sustainable adoption. Overall, rainfall variability, high price and poor access to improved seeds, farmers’ poor economic conditions, and the inadequate linkage between extension systems, social networks and research projects, remain critical factors influencing the sustainable use of agricultural technologies. It is, thus, commendable that policymakers should consider local socioeconomic and agroecological settings in recommending and supporting agricultural technologies besides instituting a strong consortium of extension systems, research institutes, research projects, social networks and farmers for improved agricultural technology development, extension system and sustainable adoption.
13

Zhyliakova, Olena, and Halyna Zhyliakova. "Innovative Technology in Life Insurance." Modern Economics 22, no. 1 (August 27, 2020): 13–17. http://dx.doi.org/10.31521/modecon.v22(2020)-02.

Abstract:
Introduction. The negative impact of macroeconomic factors on the development of the life insurance market during the quarantine period caused by the spread of coronavirus disease has been exacerbated by declining incomes, rising unemployment, and the need to adhere to strict quarantine measures. The combination of these factors has led to a decrease in demand for long-term insurance and reduced the efficiency of traditional sales channels for insurance products. At the same time, foreign experience shows that an insurer's level of innovation is one of the main factors in increasing its competitiveness. The spread of information technology allows insurers, through the implementation of innovative solutions, to increase the quality and speed of service, expand their share of the insurance market, and reduce both the cost of conducting insurance business and the cost of insurance services. Purpose. The purpose of the article is to determine the features of innovative technologies in life insurance and to study the impact of their introduction into insurers' activities on the development of the life insurance market. Results. The article describes the factors influencing the introduction of innovative technologies in life insurance. Characteristics of the social, economic, market and technological groups of factors are given, taking into account the specifics of life insurance and the current state of development of the life insurance market in Ukraine. Based on the experience of foreign insurance companies, the efficiency and necessity of introducing innovative technologies in life insurance are demonstrated; these technologies allow insurers to create competitive advantages, reduce costs and the cost of insurance services, and improve the quality of service for policyholders. Conclusions. For the insurance industry, the crisis of 2020 brought both obvious negative consequences and hidden opportunities, as the introduction of quarantine measures requires accelerating the adoption of digital tools and channels in response to the widespread use of remote work, price pressure and changes in customer behavior. Thus, the crisis has accelerated the pace of introduction of innovative technologies in life insurance. Achieving successful phased change involves a number of coordinated steps: setting dynamic goals for the introduction of innovative technologies at all stages of servicing policyholders, the implementation of which in the near future will provide competitive advantages and reduce costs for insurers; using analytical reports and databases as a source of competitive advantage while abandoning outdated IT; and radically reducing costs through automation and self-service, while improving employee skills.
14

Ciborowski, Rafał. "Innovation Process Adjusting in Peripheral Regions. The Case of Podlaskie Voivodship." Equilibrium 9, no. 2 (June 30, 2014): 57–72. http://dx.doi.org/10.12775/equil.2014.011.

Abstract:
Comparative studies indicate that innovation activity in Poland is, in general, significantly below that of the EU, and in the North-east it is below that of other parts of Poland. It is a peripheral region with a traditional structure of production and employment. The North-east remains one of the least innovative regions of Poland and the enlarged EU. This is probably the result of an obsolete institutional setting, which does not reflect the requirements of modern international competitiveness. It is the heritage of past times, when the creation of an innovation system was not considered a priority factor of economic development. Additionally, Poland is still undergoing intensive modernization of its technological capabilities. The capital and production structures are outdated and do not meet the demands of international trade competition, above all non-price competition. That is why creating the conditions for an innovation system may become a crucial factor determining the nature and dynamics of development processes, as well as influencing the North-east's future innovation capability. Creating a new regional innovation structure and transferring technologies should support modernization processes in companies and create development opportunities for the national economy as a whole, and especially for peripheral regions. These processes will accelerate the technological convergence of regions and the economy.
15

Rubinstein, Adolfo, Andrés Pichon-Riviere, and Federico Augustovski. "Development and implementation of health technology assessment in Argentina: Two steps forward and one step back." International Journal of Technology Assessment in Health Care 25, S1 (July 2009): 260–69. http://dx.doi.org/10.1017/s0266462309090734.

Abstract:
Objectives: The objectives of this study are to review the financing and organization of the Argentine healthcare system, the licensing and drug price setting mechanisms, the benefit packages and coverage policies for pharmaceuticals and other medical technologies, as well as the development of HTA in Argentina and the role of the Institute of Clinical Effectiveness and Health Policy (IECS) as an HTA agency. Finally, the perspectives and future of HTA as a tool for resource-allocation decisions and priority setting in Argentina are discussed. Methods: The study is a discussion/review based largely on the experiences of the authors, but supported by the available literature. Results: Argentina is an upper-middle income country with major healthcare problems related to both equity and efficiency. Its healthcare system is a multitier system divided into three large sectors, public, social security, and private, where the federal Ministry of Health has a rather limited role in national health policy stewardship. Many of Argentina's shortcomings are due in part to its pluralistic and fragmented healthcare system. In the past decade, Argentina, like many other Latin American countries, has undergone a profound reform of its healthcare system. Whereas some of the objectives of the reforms were specific to each country, a common issue among all of them was to establish a mechanism that ensured a more efficient allocation of scarce resources and guaranteed a wider provision of healthcare services on the basis of local population needs and equity. Although some signals from the national government and congress show that there are plans to formally incorporate HTA to inform reimbursement policies, these signals are still very weak. Paradoxically, even though Argentina was the first country in the region to require formal health economic evidence for the adoption of technologies into the mandatory benefit package of the social security system, this “fourth hurdle” is no longer required. Nevertheless, there is an increasing interest in, and demand for, a more explicit and transparent resource-allocation process that includes HTA as a formal tool to inform decision making among most Argentine healthcare stakeholders. Conclusions: What is needed in Argentina is a clear political will to push for a national HTA agency that, as in other developed countries, advances the regulation of the adoption of new health technologies to improve not only technical and allocative efficiency, but also health equity. Until this milestone is accomplished, the production and use of HTA to inform healthcare coverage policies will continue to mirror the current fragmented healthcare system.
16

Kipkoech, Rogers, Mohammed Takase, and Ernest Kofi Amankwa Afrifa. "Renewable Energies in Ghana in Relation to Market Condition, the Environment, and Food Security." Journal of Renewable Energy 2022 (March 29, 2022): 1–8. http://dx.doi.org/10.1155/2022/8243904.

Abstract:
Energy is essential to the development of a country, and several studies have been carried out on the production and use of energy by industrialised countries. However, little research and development has been carried out in developing countries on renewable energy. Also, the importance of traditional fuels such as biomass has not been emphasised in developing countries like Ghana, which rely on fossil fuels. Ghana relies heavily on imported petroleum fuel obtained from fossil fuels. However, fossil fuels are faced with many limitations including environmental pollution and an escalating price. Hydropower, biomass, biofuel, wind, and solar energy are the major renewable energy resources expected to be fully exploited in the future. This study, therefore, assesses the sources of the main renewable energy in relation to policy, the conditions of the market and food security. The government of Ghana has put in place a favourable business environment for the renewable energy sector by setting explicit feed-in tariffs (FITs). In addition, various acts and legislation have been passed and formulated by the relevant institutions (Renewable Energy Act (832) of 2011). The study revealed that there is an increase in the exploitation and use of energy from renewable resources when compared with the past decades. However, this exploitation is still limited due to barriers such as the cost of technologies, financing issues, and scientific and technical barriers.
17

Alkoby, Shani, Zihe Wang, David Sarne, and Pingzhong Tang. "Making Money from What You Know - How to Sell Information?" Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 2421–28. http://dx.doi.org/10.1609/aaai.v33i01.33012421.

Abstract:
Information plays a key role in many decision situations. The rapid advancement in communication technologies makes information providers more accessible, and various information providing platforms can be found nowadays, most of which are strategic in the sense that their goal is to maximize the providers’ expected profit. In this paper, we consider the common problem of a strategic information provider offering prospective buyers information which can disambiguate uncertainties the buyers have, which can be valuable for their decision making. Unlike prior work, we do not limit the information provider’s strategy to price setting but rather enable her flexibility over the way information is sold, specifically enabling querying about specific outcomes and the elimination of a subset of non-true world states alongside the traditional approach of disclosing the true world state. We prove that for the case where the buyer is self-interested (and the information provider does not know the true world state beforehand) all three methods (i.e., disclosing the true worldstate value, offering to check a specific value, and eliminating a random value) are equivalent, yielding the same expected profit to the information provider. For the case where buyers are human subjects, using an extensive set of experiments we show that the methods result in substantially different outcomes. Furthermore, using standard machine learning techniques the information provider can rather accurately predict the performance of the different methods for new problem settings, hence substantially increase profit.
18

Mitsis, Giorgos, Pavlos Athanasios Apostolopoulos, Eirini Eleni Tsiropoulou, and Symeon Papavassiliou. "Intelligent Dynamic Data Offloading in a Competitive Mobile Edge Computing Market." Future Internet 11, no. 5 (May 21, 2019): 118. http://dx.doi.org/10.3390/fi11050118.

Abstract:
Software Defined Networks (SDN) and Mobile Edge Computing (MEC), capable of dynamically managing and satisfying the end-users computing demands, have emerged as key enabling technologies of 5G networks. In this paper, the joint problem of MEC server selection by the end-users and their optimal data offloading, as well as the optimal price setting by the MEC servers is studied in a multiple MEC servers and multiple end-users environment. The flexibility and programmability offered by the SDN technology enables the realistic implementation of the proposed framework. Initially, an SDN controller executes a reinforcement learning framework based on the theory of stochastic learning automata towards enabling the end-users to select a MEC server to offload their data. The discount offered by the MEC server, its congestion and its penetration in terms of serving end-users’ computing tasks, and its announced pricing for its computing services are considered in the overall MEC selection process. To determine the end-users’ data offloading portion to the selected MEC server, a non-cooperative game among the end-users of each server is formulated and the existence and uniqueness of the corresponding Nash Equilibrium is shown. An optimization problem of maximizing the MEC servers’ profit is formulated and solved to determine the MEC servers’ optimal pricing with respect to their offered computing services and the received offloaded data. To realize the proposed framework, an iterative and low-complexity algorithm is introduced and designed. The performance of the proposed approach was evaluated through modeling and simulation under several scenarios, with both homogeneous and heterogeneous end-users.
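As a rough illustration of the reinforcement-learning step in this framework, the sketch below implements a stochastic learning automaton with a linear reward-inaction update for MEC server selection. The reward function and its weights are assumptions made for the example, not the utility model of the paper, which additionally couples server selection with a data-offloading game and server-side price optimisation.

```python
import numpy as np

n_servers = 4
probs = np.full(n_servers, 1.0 / n_servers)   # one end-user's server-selection probabilities
learning_rate = 0.2
rng = np.random.default_rng(1)

def reward(server: int) -> float:
    """Hypothetical normalised reward in [0, 1] mixing discount, congestion and price."""
    discount, congestion, price = rng.random(3)
    return float(np.clip(0.5 * discount + 0.3 * (1 - congestion) + 0.2 * (1 - price), 0.0, 1.0))

for _ in range(500):
    chosen = rng.choice(n_servers, p=probs)
    r = reward(chosen)
    # Linear reward-inaction: reinforce the chosen server in proportion to the reward,
    # and decrease the other probabilities so that they still sum to one.
    probs[chosen] += learning_rate * r * (1.0 - probs[chosen])
    others = np.arange(n_servers) != chosen
    probs[others] -= learning_rate * r * probs[others]
    probs /= probs.sum()                      # guard against floating-point drift

print("selection probabilities:", np.round(probs, 3))
```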
19

Gallacher, Daniel, Peter Kimani, and Nigel Stallard. "Extrapolating Parametric Survival Models in Health Technology Assessment Using Model Averaging: A Simulation Study." Medical Decision Making 41, no. 4 (February 25, 2021): 476–84. http://dx.doi.org/10.1177/0272989x21992297.

Abstract:
Previous work examined the suitability of relying on routine methods of model selection when extrapolating survival data in a health technology appraisal setting. Here we explore solutions to improve reliability of restricted mean survival time (RMST) estimates from trial data by assessing model plausibility and implementing model averaging. We compare our previous methods of selecting a model for extrapolation using the Akaike information criterion (AIC) and Bayesian information criterion (BIC). Our methods of model averaging include using equal weighting across models falling within established threshold ranges for AIC and BIC and using BIC-based weighted averages. We apply our plausibility assessment and implement model averaging to the output of our previous simulations, where 10,000 runs of 12 trial-based scenarios were examined. We demonstrate that removing implausible models from consideration reduces the mean squared error associated with the restricted mean survival time (RMST) estimate from each selection method and increases the percentage of RMST estimates that were within 10% of the RMST from the parameters of the sampling distribution. The methods of averaging were superior to selecting a single optimal extrapolation, aside from some of the exponential scenarios where BIC already selected the exponential model. The averaging methods with wide criterion-based thresholds outperformed BIC-weighted averaging in the majority of scenarios. We conclude that model averaging approaches should feature more widely in the appraisal of health technologies where extrapolation is influential and considerable uncertainty is present. Where data demonstrate complicated underlying hazard rates, funders should account for the additional uncertainty associated with these extrapolations in their decision making. Extended follow-up from trials should be encouraged and used to review prices of therapies to ensure a fair price is paid.
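The following is a small sketch of one of the averaging flavours compared above: weighting candidate parametric survival models by exp(-0.5 * ΔBIC) and averaging their restricted mean survival time (RMST) estimates. The candidate models, BIC values and RMST estimates are made-up numbers for illustration only.

```python
import numpy as np

candidates = {
    # model name: (BIC of the fit, RMST estimate in months from that fit)
    "exponential": (1012.4, 18.2),
    "weibull":     (1008.1, 20.5),
    "log-normal":  (1009.7, 22.9),
    "gompertz":    (1015.3, 17.4),
}

bic = np.array([v[0] for v in candidates.values()])
rmst = np.array([v[1] for v in candidates.values()])

# BIC weights: exp(-0.5 * delta BIC), normalised to sum to one.
delta = bic - bic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()

rmst_averaged = float(np.dot(weights, rmst))
for (name, _), w in zip(candidates.items(), weights):
    print(f"{name:12s} weight = {w:.3f}")
print(f"model-averaged RMST = {rmst_averaged:.2f} months")
```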
20

Mantel, Jessica. "Spending Medicare’s Dollars Wisely: Taking Aim at Hospitals’ Cultures of Overtreatment." University of Michigan Journal of Law Reform, no. 49.1 (2015): 121. http://dx.doi.org/10.36646/mjlr.49.1.spending.

Abstract:
With Medicare’s rising costs threatening the country’s fiscal health, policymakers have focused their attention on a primary cause of Medicare’s high price tag: the overtreatment of patients. Guided by professional norms that demand they do “everything possible” for their patients, physicians frequently order additional diagnostic tests, perform more procedures, utilize costly technologies, and provide more inpatient care. Much of this care, however, does not improve Medicare patients’ health, but only increases Medicare spending. Reducing the overtreatment of patients requires aligning physicians’ interests with the government’s goal of spending Medicare’s dollars wisely. Toward that end, recent Medicare payment reforms establish a range of financial incentives that encourage more efficient practices among physicians. Physicians, however, do not practice medicine in a vacuum. Rather, they are profoundly influenced by the organizational cultures of hospitals. Far too often hospitals’ cultures lead physicians to provide Medicare patients care of questionable value. If Medicare is to successfully contain costs, it must prod hospitals to move from cultures of overtreatment to cultures of efficiency. Current Medicare reform proposals, however, do too little to address hospitals’ cultures of overtreatment. That is unfortunate, as regulators will have limited success in constraining Medicare’s growth if hospitals’ cultures continue to foster the overtreatment of Medicare patients. This Article therefore sets forth a more robust proposal for reforming Medicare payment policy, one that would facilitate hospitals fully embracing a culture of efficiency. Specifically, federal regulators should reform the Medicare Hospital Value-Based Purchasing Program so that a hospital’s Medicare payment rates are tied to the hospital’s success in lowering the cost of treating patients both inside and outside the hospital setting. Regulators could accomplish this goal by incorporating into the program efficiency measures based on broadly defined episodes of care.
21

Радіонова, Наталія Йосипівна. "АНАЛІЗ ПРОБЛЕМ УПРАВЛІННЯ ВИТРАТАМИ НА ВІТЧИЗНЯНИХ ШВЕЙНИХ ПІДПРИЄМСТВАХ ТА ШЛЯХИ ЇХ ПОДОЛАННЯ." Bulletin of the Kyiv National University of Technologies and Design. Series: Economic sciences 123, no. 3 (January 13, 2019): 37–46. http://dx.doi.org/10.30857/2413-0117.2018.3.3.

Abstract:
The paper offers a systems approach to cost analysis in garment manufacturing. The content of cost analysis and its major elements are disclosed: the purpose, objectives, object, subject, principles, information and methodological support, unit of measurement and results. The major problems that affect the expenditure level in domestic garment manufacturing are revealed. It is shown that factors of both the internal and external environment negatively affect cost management, creating problems at three levels: the macrolevel (Ukrainian industry), the mesolevel (the sector of textiles, apparel, leather, leather goods and products from other materials) and the microlevel (the garment manufacturer). The key macrolevel problems include inflation, the tax burden, imperfect legislation, tariffs on public utilities and energy, political challenges, and the low solvency of the population. At the mesolevel, the major challenges are the low level of cooperation between domestic enterprises and supporting industries, high import dependency and the high price of imported raw materials, equipment and technologies, the small share of medium-sized enterprises or small business associations, intense competition, and the lack of active government support. Among the main problems at the microlevel are the moral and physical depreciation of equipment with depleted resources, low-skilled young employees and high staff turnover, low competitiveness and high production costs, high toll rates, underdeveloped logistics infrastructure which hampers finished product sales, insufficient protection of product intellectual property rights, and inadequate national brand promotion policies. The research findings provide an assessment of how the above challenges affect business costs. The problems have been classified depending on the extent to which they can be managed through regulation. A matrix has been constructed that allows the problems to be differentiated and their priority to be set. Applying this matrix will help enhance decision-making and cost management efficiency in garment manufacturing.
22

Jan, Semjon, Grexa Ján, and Mako Peter. "DESIGN OF DOCKING SYSTEM FOR MOBILE ROBOTICS PLATFORM TYPE AGV." TECHNICAL SCIENCES AND TECHNOLOGIES, no. 4 (14) (2018): 210–15. http://dx.doi.org/10.25140/2411-5363-2018-4(14)-210-215.

Abstract:
Urgency of the research. Automatic battery charging of AGV platforms allows their potential to be used to the full. Positioning an AGV safely and quickly in a charging station equipped with appropriate contacts reduces the charging time as well as the purchase price of the device. Target setting. The aim of the solution is to design an automatic docking and charging station based on an existing hand-operated charging station. In the design, it was necessary to ensure the correct position of the AGV platform relative to the docking station. Actual scientific researches and issues analysis. The issue of fast and reliable charging of mobile service robots is highly topical. The reason for this is the growing deployment of AGV platforms in various industrial and service sectors. Uninvestigated parts of general matters defining. This article focuses on a specific solution for the provision of transport services. The transport services arise from the need to transport medical supplies and medications in a multi-storey hospital building. The movement of the robot between floors is solved by using the lifts used by hospital personnel. The research objective. The aim of the research was to design a docking and charging station utilizing an already purchased charger. The design was aimed at creating an appropriate power transmission system between the charger and the AGV platform batteries. The price ceiling for the whole facility was €2,000. The statement of basic materials. The use of docking and charging stations for mobile service robots depends on a number of parameters. In particular, the parameters depend on the area of use, the size of the battery to be charged, and the number of robots being recharged at the station. Last but not least, charging time and purchase price are also important. Conclusions. The task of the solution was to design a docking station for the AGV platform. At the beginning, three variants were created, from which the most appropriate solution was chosen using a scoring method. Before designing the docking station, however, it was necessary to modify the existing AGV platform construction so that it could be connected to the docking station's charging mechanism. The design of the docking station itself consisted of the design of the guiding and charging mechanisms. These mechanisms provide charging and guidance of the AGV platform into the docking station. The mechanisms do not depend on each other, since the charging mechanism is activated later than the drive mechanism. Subsequently, a design of the docking station was created that can be anchored to the floor or to the wall. The docking station houses a charger from Hoppecke, which charges the AGV platform. The dimensions of the docking station were strongly influenced by the size of this charger. It was found that new and better technologies will not be needed at docking stations in the future, as AGV platforms can be guided without their help. The development of new, higher-quality systems will bring new guidance options to AGV platforms and docking stations.
23

Riabchyn, Oleksiy, Nadiia Novytska, and Inna Khliebnikova. "Conceptual approaches to improving carbon dioxide taxation in Ukraine." Economy and forecasting 2021, no. 4 (December 30, 2021): 44–61. http://dx.doi.org/10.15407/econforecast2021.04.044.

Abstract:
The domestic carbon tax needs to improve tax administration to ensure its fiscal efficiency and reduce transaction costs for tax compliance. Despite the fact that in the Tax Code of Ukraine the calculation of such a tax is based on the actual indicators of CO2 emissions, in practice it is based on the amount of resources consumed and the characteristics of the production process. Accordingly, the difficulties in administering this tax are the complexity of tax audits and the need to involve environmental experts. All this does not allow to adhere to the principle of cost-effectiveness of taxation and highlights the need to find opportunities to simplify the process of tax administration on the basis of world best practices. The purpose of the article is to outline conceptual approaches to improving carbon taxation, which will allow Ukraine to simplify tax administration and together with the EU to effectively combat the effects of climate change in order to increase security and create new opportunities for Ukrainian business under the European Green Deal. The methodological basis of the study was the use of a set of general and special methods: generalizations and scientific abstraction, historical and logical, extrapolations, spatial and graphical and tabular methods of visualization. The application of the SWOT analysis method and the systematization of European practice revealed that the most acceptable for Ukraine is the use of tax on CO2 emissions in the form of an indirect tax on energy consumption. Coefficients of carbon content in fuel, calorific value of fuel and its oxidation factor were used to convert the emission base carbon tax into the fuel base carbon tax. The implementation of these proposals will help increase the efficiency of administration of such a tax, as it will: 1) reduce the number of taxpayers through the introduction of the institution of tax agents while increasing the amount of tax paid by one taxpayer; 2) simplify the procedure for calculating the tax base by taxpayers and employees of tax authorities; 3) increase the fiscal efficiency of the environmental tax on carbon dioxide emissions from stationary sources by 50% in the case of setting the CO2 price at UAH 10 per ton (5-fold when setting the CO2 price at UAH 30 per ton in accordance with the proposals of the bill No 5600) and to attract potential revenues from the transport sector in the amount of 0.06% of GDP. The use of practical proposals and recommendations obtained in the article will increase the effectiveness of Ukraine's tax policy by forming a set of measures which will reduce the energy dependence of the national economy, including through incentives for energy-saving and climate-neutral technologies, reduce the burden on the environment, and will help simplify the administration of environmental taxes while increasing their fiscal efficiency. Research materials can be used in the preparation of draft regulations and policy documents in the field of environmental and excise taxation, which is within the competence of the Ministry of Finance of Ukraine, as well as in the formation of proposals, reservations and recommendations to other regulations on improving environmental and excise taxation initiated both by the authorities of the executive power of Ukraine, and the Verkhovna Rada of Ukraine on improving environmental and excise taxation. The theoretical results are the development of a general theory of fiscal administration for environmental and excise taxation.
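A minimal sketch of the fuel-based calculation implied by this abstract follows: CO2 emissions are derived from the quantity of fuel via its calorific value, carbon content and oxidation factor (an IPCC-style Tier 1 approach), and the tax is then the emissions multiplied by the rate per tonne. All coefficient values below are illustrative placeholders, not official Ukrainian or IPCC figures.

```python
CO2_PER_C = 44.0 / 12.0  # mass conversion from carbon to carbon dioxide

def co2_tax(fuel_tonnes: float,
            net_calorific_value_tj_per_t: float,
            carbon_content_t_c_per_tj: float,
            oxidation_factor: float,
            tax_rate_uah_per_t_co2: float) -> float:
    """Tax due (UAH) computed from the fuel base rather than from measured emissions."""
    emissions_t_co2 = (fuel_tonnes
                       * net_calorific_value_tj_per_t
                       * carbon_content_t_c_per_tj
                       * oxidation_factor
                       * CO2_PER_C)
    return emissions_t_co2 * tax_rate_uah_per_t_co2

# Example: 1,000 t of a coal-like fuel taxed at the UAH 30 per tonne of CO2
# rate mentioned in the abstract (all fuel parameters assumed).
print(co2_tax(fuel_tonnes=1_000,
              net_calorific_value_tj_per_t=0.025,
              carbon_content_t_c_per_tj=26.0,
              oxidation_factor=1.0,
              tax_rate_uah_per_t_co2=30.0))
```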
24

Riabchyn, Oleksiy, Nadiia Novytska, and Inna Khliebnikova. "Conceptual approaches to improving carbon dioxide taxation in Ukraine." Ekonomìka ì prognozuvannâ 2021, no. 4 (December 20, 2021): 53–73. http://dx.doi.org/10.15407/eip2021.04.053.

Abstract:
The domestic carbon tax needs to improve tax administration to ensure its fiscal efficiency and reduce transaction costs for tax compliance. Despite the fact that in the Tax Code of Ukraine the calculation of such a tax is based on the actual indicators of CO2 emissions, in practice it is based on the amount of resources consumed and the characteristics of the production process. Accordingly, the difficulties in administering this tax are the complexity of tax audits and the need to involve environmental experts. All this does not allow to adhere to the principle of cost-effectiveness of taxation and highlights the need to find opportunities to simplify the process of tax administration on the basis of world best practices. The purpose of the article is to outline conceptual approaches to improving carbon taxation, which will allow Ukraine to simplify tax administration and together with the EU to effectively combat the effects of climate change in order to increase security and create new opportunities for Ukrainian business under the European Green Deal. The methodological basis of the study was the use of a set of general and special methods: generalizations and scientific abstraction, historical and logical, extrapolations, spatial and graphical and tabular methods of visualization. The application of the SWOT analysis method and the systematization of European practice revealed that the most acceptable for Ukraine is the use of tax on CO2 emissions in the form of an indirect tax on energy consumption. Coefficients of carbon content in fuel, calorific value of fuel and its oxidation factor were used to convert the emission base carbon tax into the fuel base carbon tax. The implementation of these proposals will help increase the efficiency of administration of such a tax, as it will: 1) reduce the number of taxpayers through the introduction of the institution of tax agents while increasing the amount of tax paid by one taxpayer; 2) simplify the procedure for calculating the tax base by taxpayers and employees of tax authorities; 3) increase the fiscal efficiency of the environmental tax on carbon dioxide emissions from stationary sources by 50% in the case of setting the CO2 price at UAH 10 per ton (5-fold when setting the CO2 price at UAH 30 per ton in accordance with the proposals of the bill No 5600) and to attract potential revenues from the transport sector in the amount of 0.06% of GDP. The use of practical proposals and recommendations obtained in the article will increase the effectiveness of Ukraine's tax policy by forming a set of measures which will reduce the energy dependence of the national economy, including through incentives for energy-saving and climate-neutral technologies, reduce the burden on the environment, and will help simplify the administration of environmental taxes while increasing their fiscal efficiency. Research materials can be used in the preparation of draft regulations and policy documents in the field of environmental and excise taxation, which is within the competence of the Ministry of Finance of Ukraine, as well as in the formation of proposals, reservations and recommendations to other regulations on improving environmental and excise taxation initiated both by the authorities of the executive power of Ukraine, and the Verkhovna Rada of Ukraine on improving environmental and excise taxation. The theoretical results are the development of a general theory of fiscal administration for environmental and excise taxation.
APA, Harvard, Vancouver, ISO, and other styles
25

Haleta, Yaroslav, and Volodymyr Kozlenko. "TECHNOLOGY OF EDUCATIONAL MARKETING AS DIRECTION OF ADMINISTRATIVE ACTIVITY." Academic Notes Series Pedagogical Science 1, no. 204 (October 2022): 27–31. http://dx.doi.org/10.36550/2415-7988-2022-1-205-27-31.

Full text
Abstract:
The article notes that the development and effective functioning of an educational institution in the market of educational services are impossible without marketing technologies, which make it possible to forecast the dynamics of the institution's key performance indicators depending on economic conditions; the main marketing instruments for managing the competitiveness of an educational institution are analysed. In a highly competitive marketing environment, effective management decisions about the functioning of an educational institution cannot be based on intuition or simple reasoning, so successful implementation of marketing programmes requires marketing research. The authors stress that research into the market of educational services and the labour market, taken in isolation from the rest of the marketing complex, will not secure an institution's competitive status; such research is only the first stage of raising competitiveness, because it identifies the basic instruments that can form the institution's competitive advantage. The positioning of an institution's educational services is defined as the attractive presentation of the institution and its services, aimed at winning leadership in a particular segment of the educational services market; its stages (choosing a target group of consumers, developing a positioning strategy, introducing that strategy into the marketing programme) and strategies are described. The significance of such elements of marketing communications as advertising, public relations, exhibition activity and the institution's presence on the Internet is examined, with particular attention to the Internet presence as the most popular element. The article concludes that the competitiveness of an educational institution can be increased only by using all elements of the marketing mix, which requires creating a separate subdivision within the institution, or at least appointing responsible persons, to study the market of educational services and the labour market and, on the basis of the research results, to propose the institution's product, sales, price and communication policies.
APA, Harvard, Vancouver, ISO, and other styles
26

Grymak, A. V., L. V. Kurylas, and T. Ye. Seneshyna. "EXPERIENCE OF FORMATION OF STRATEGY OF PRIORITIES IN ACTIVITIES OF VETMEDICINE ENTERPRISES." Scientific and Technical Bulletin of State Scientific Research Control Institute of Veterinary Medical Products and Fodder Additives and Institute of Animal Biology 22, no. 2 (October 7, 2021): 110–17. http://dx.doi.org/10.36359/scivp.2021-22-2.12.

Full text
Abstract:
Practice confirms that no veterinary medicine enterprise, like enterprises in other branches of the economy, can operate effectively in market conditions without a developed strategy declaring effective, cost-recovering priorities. This matters because, in market conditions, enterprises face ever-increasing technological requirements, a changing range of competitive products, strong competition and growing consumer demands. Today the market for veterinary products is oriented mainly towards manufacturers with adequate, economically sound market motivation. It is therefore important for enterprises to develop and continuously improve the priority directions of their development, which in turn supports a model of effective management for solving important production problems, including the introduction of new technologies and the formation of promising directions of development in market conditions. Strategic priorities should guide veterinary medicine enterprises in both the short and the long term. Their development should build on positive experience and on the qualifications of staff, who must clearly understand the goals of the enterprise and be directly involved in optimizing all available resources and in their efficient allocation and use (Dikan et al., 2013). When setting priorities, it should be borne in mind that the markets for veterinary products coexist with other markets; in such markets there are more consumers and a wider diversity of demands for the different types of products offered. This strengthens competition, which producers should take into account when forming their priorities, focusing on products of high consumer quality at an appropriate price that satisfies consumers. One of the key tasks of veterinary enterprises is therefore the ability to assess their capabilities in a competitive environment. Experience confirms that the enterprises able to participate in the market and compete successfully are those that organize their activities with development prospects in mind, see their priorities and achieve them. The purpose of the study is to summarize the experience of leading veterinary medicine enterprises of Ukraine in forming and practically implementing priorities for their development.
APA, Harvard, Vancouver, ISO, and other styles
27

Arditi, David. "Music Everywhere: Setting a Digital Music Trap." Critical Sociology 45, no. 4-5 (August 31, 2017): 617–30. http://dx.doi.org/10.1177/0896920517729192.

Full text
Abstract:
Before the digital era, music consumption was limited to purchasing LPs, tapes and CDs, or attending concerts. With digitization and mobile technologies in tow, the consumption of music exploded. Music is now literally everywhere—but none of it is actually free. Our consumption of it on television and cable, through games on our computers and our phones, through subscriptions or sites with built-in never-ending streams of advertising always has a price. Music is everywhere, but how did this happen? How has digital distribution and production changed the recording industry? What are the consequences of ubiquitous music? In this article, I argue that the digital music trap is an outgrowth of digital capitalism that commodifies our everyday existence.
APA, Harvard, Vancouver, ISO, and other styles
28

Cochoy, Franck, Johan Hagberg, and Hans Kjellberg. "Price display technologies and price ceiling policies: governing prices in the WWII and postwar US economy (1940–1953)." Socio-Economic Review, October 19, 2019. http://dx.doi.org/10.1093/ser/mwz045.

Full text
Abstract:
Abstract This article explores the politics and technologies of price fixing and price display in US grocery stores in the mid-20th century. Drawing on the literature on market devices and policy instruments, it complements previous studies focused on price setting processes by stressing the importance of price display. Through a systematic reading of the trade journal The Progressive Grocer, the article shows how displaying prices during World War II and the postwar inflation period combined the mastery of Government authorities at the federal level, and the expertise of retail professionals at the shelf level. It demonstrates that the regulation of prices is linked to mundane policies, technologies and practices, in particular the technique of ‘stereoscopic prices’ aimed at linking a reference price (the ceiling price set by the government) and the selling price (the actual price set by the retailer). Such technologies proved able to reinvent prices and price competition through their ‘bifurcated agency’, i.e. their propensity to both enact the scripts delegated to them (conveying price ceilings) and produce major side effects, like generalizing the practice of price display and linking prices to new values and qualitative dimensions of grocery products.
APA, Harvard, Vancouver, ISO, and other styles
29

Germeshausen, Robert, and Nikolas M. Wölfing. "How Marginal is Lignite? Two Simple Approaches to Determine Price-Setting Technologies in Power Markets." SSRN Electronic Journal, 2019. http://dx.doi.org/10.2139/ssrn.3444825.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Blume-Werry, Eike, Thomas Faber, Lion Hirth, Claus Huber, and Martin Everts. "Eyes on the Price: Which Power Generation Technologies Set the Market Price? Price Setting in European Electricity Markets: An Application to the Proposed Dutch Carbon Price Floor." SSRN Electronic Journal, 2019. http://dx.doi.org/10.2139/ssrn.3313338.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Krysiak, Frank C. "Technological Diversity and Cost Uncertainty." B.E. Journal of Economic Analysis & Policy 9, no. 1 (December 1, 2009). http://dx.doi.org/10.2202/1935-1682.2168.

Full text
Abstract:
Abstract In many industries, different technologies are used simultaneously for the production of a homogeneous good. Such diversification is socially beneficial, because it reduces the transmission of factor price volatility, like oil-price shocks, to consumer prices. Therefore, many countries have implemented policies aimed at increasing technological diversification. The question is whether such policies are necessary. We use a two-stage investment model to address this question in the setting of perfect competition and of a monopoly. We show that factor price uncertainty leads to diversification, if capital is not too expensive, and that this diversification is due to each firm investing in a diversified technology portfolio. An important implication of this form of diversification is that technological diversity is socially optimal, even in the case of a monopoly. Thus policy intervention is unnecessary and might even be detrimental.
APA, Harvard, Vancouver, ISO, and other styles
32

Lambert, Joseph, and Callum Walker. "Because We’re Worth It." Translation Spaces, July 19, 2022. http://dx.doi.org/10.1075/ts.21030.lam.

Full text
Abstract:
Abstract Rate-setting is a problematic area for newcomers to translation and established practitioners alike. Survey data generally support the view that translators feel underpaid and that money matters remain a chief ethical and pragmatic concern, but appropriate guidance is almost entirely absent from introductory textbooks on the translation profession and documentation prepared by industry associations remains unsatisfactory. Focusing on the translation industry in the United Kingdom, this conceptual paper explores constraints that limit price formation practices, and argues that translators feel under threat from disruptive technologies, Uberisation, and non-professional translation, now more than ever. We explore the complex interaction between status, internal and external perceptions, and regulation, and illustrate their push-pull relationship with rate-setting within a range of industry ‘educators’, uncovering the ways in which translators themselves, translation associations, and academic institutions directly and indirectly impact upon rate-setting practices. The article concludes by considering potential channels to buoy status and improve rate-setting practices in the translation industry.
APA, Harvard, Vancouver, ISO, and other styles
33

Skryhun, Nataliia, Larysa Kapinus, and Dmytro Haidukov. "FEATURES AND PSYCHOLOGY OF BRANDS PRICING: MARKETING ASPECT." Eastern Europe: economy, business and management, no. 3(30) (2021). http://dx.doi.org/10.32782/easterneurope.30-9.

Full text
Abstract:
To ensure an enterprise's profitability, there is a pressing problem of finding effective marketing technologies that increase the value of a product for consumers. It is important to direct effort at brand capitalization using the whole set of marketing instruments, one of the basic ones being price. The article considers the essence of brand pricing and brand value as an important component of price policy, taking into account both marketing and cost approaches to pricing. It examines the essence of brand pricing, the main goals of brand-producing enterprises, their varieties and the methods of achieving them. A set of key factors that directly influence brand cost and its value for consumers is identified. The basic approaches to forming the price of a branded product, together with their strengths and weaknesses, are substantiated. The main differences between pricing brands and pricing ordinary trademarks are analysed, and the basic components of a successful brand, their structure, role and influence on consumers' decisions are established. The value of brand equity, its characteristics, the advantages of owning a brand and its influence on pricing as a whole are investigated, and the basic categories of brands by price and product quality, and their value for consumers, are determined. The process of setting the price of a branded product is described as consisting of several basic stages: determining the cost of the product, taking into account production expenses and a minimum mark-up; adjusting the price to the level of prices of analogous products from competing firms; determining the size of the premium for the unique selling proposition the product contains; determining the premium for the company's marketing capabilities and its degree of control over distribution channels; and adding a price bonus for the brand. The strategy types used in brand pricing, given the chosen method of market positioning and the pricing goals, are identified: skimming, penetration pricing, stable prices, price discrimination, price leadership and others, and their characteristics and distinctive features are considered. The influence of consumer loyalty on the formation of a branded product's price is shown. Psychological methods for forming and adjusting a brand's price are generalized; they make it possible to increase the profitability of the enterprise and to retain and strengthen the market positions of the enterprise and its brands.
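The staged price-setting process summarized above (base cost plus minimum mark-up, adjustment towards competitors' prices, premiums for the unique selling proposition and for marketing/distribution strength, and a final brand bonus) can be illustrated with a small sketch. The figures are purely hypothetical, and averaging with the competitor price is only one possible reading of the "co-adjustment" stage.

```python
def branded_price(unit_cost: float,
                  min_markup: float,
                  competitor_avg_price: float,
                  usp_premium: float,
                  marketing_premium: float,
                  brand_bonus: float) -> float:
    """Hypothetical staged build-up of a branded product's price."""
    price = unit_cost * (1 + min_markup)          # stage 1: cost plus minimum mark-up
    price = (price + competitor_avg_price) / 2    # stage 2: adjust towards competitor analogues
    price += usp_premium                          # stage 3: premium for the unique selling proposition
    price += marketing_premium                    # stage 4: premium for marketing/distribution strength
    price += brand_bonus                          # stage 5: price bonus for the brand itself
    return round(price, 2)

print(branded_price(unit_cost=50, min_markup=0.10,
                    competitor_avg_price=70,
                    usp_premium=5, marketing_premium=3, brand_bonus=7))
```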
APA, Harvard, Vancouver, ISO, and other styles
34

"IoT Based Smart Surveillance System for Healthcare Monitoring using Raspberry PI." International Journal of Innovative Technology and Exploring Engineering 8, no. 12 (October 10, 2019): 627–29. http://dx.doi.org/10.35940/ijitee.j9974.1081219.

Full text
Abstract:
IoT is a trending technology that can turn almost any device into a smart one, and many industries are starting to use it to increase capacity and improve efficiency. The system described here is designed to monitor people suffering from heart disease. It is built around a Raspberry Pi board with a controlled power supply and remote Internet connectivity through a USB modem, and it integrates several sensors: a pulse sensor that reads the value of each beat per minute, a temperature sensor that detects temperature variation, a blood pressure sensor that reads blood pressure and heart rate, and an ECG sensor that measures the electrical signal of the heart. The analog signals are converted to digital values using the SPI protocol. If an emergency occurs, the system raises an alert and sends it to the website and to a mobile device through software set up via NOOBS. Whenever any sensor parameter exceeds its prescribed value, the system also raises a beep sound.
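As a rough illustration of the threshold-based alerting the abstract describes, the sketch below polls sensor readings and raises an alert when a value exceeds its prescribed limit. The read and alert functions, the thresholds and the values are placeholders; the actual system reads its sensors over SPI on the Raspberry Pi and pushes alerts to its website and mobile app.

```python
import time

# Placeholder thresholds; the real limits would be set clinically.
THRESHOLDS = {"pulse_bpm": 120, "temperature_c": 38.5, "systolic_mmhg": 160}

def read_sensors() -> dict:
    """Stand-in for the SPI/ADC reads performed on the Raspberry Pi."""
    return {"pulse_bpm": 85, "temperature_c": 36.9, "systolic_mmhg": 118}

def send_alert(name: str, value: float) -> None:
    """Stand-in for pushing the alert to the website / mobile app."""
    print(f"ALERT: {name} = {value} exceeds {THRESHOLDS[name]}")

def monitor(poll_seconds: float = 5.0, cycles: int = 3) -> None:
    for _ in range(cycles):
        readings = read_sensors()
        for name, value in readings.items():
            if value > THRESHOLDS[name]:
                send_alert(name, value)
        time.sleep(poll_seconds)

monitor(poll_seconds=0.1)
```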
APA, Harvard, Vancouver, ISO, and other styles
35

Alaoui, I., C. Izambert, and A. Toullier. "Ensuring access to medicines at a fair price? Innovative contracting experiences in France." European Journal of Public Health 30, Supplement_5 (September 1, 2020). http://dx.doi.org/10.1093/eurpub/ckaa165.071.

Full text
Abstract:
Abstract Issue Innovative contracting models are developed to ease price-setting negotiations when an extremely expensive drug has not demonstrated sufficient efficacy in clinical trials. As disruptive HIV treatments are expected in the near future, French patient organizations evaluated the ability of these innovative contracts to ensure accessible medicines at a fair price. Description Performance-based schemes make the prices paid by the State conditional on the effectiveness of the medicine observed through real-world data. In France, thirteen performance-based contracts were concluded between 2008 and 2015. They are presented as a triple solution: innovative treatments are available to patients, manufacturers access markets, and states ensure healthcare within limited budgets. Establishing the added value of these models means determining whether they allow rapid access to treatments with substantial savings for payers, while ensuring rigorous price and cost transparency. Results Performance-based contracts do ensure patient access to treatments, but other mechanisms (such as temporary use authorizations) already serve this purpose. Regarding expenditure reduction, however, these schemes have not proven their worth: the Court of Auditors' evaluation showed they do not generate substantial savings, as final prices correspond to those that would have applied under the European price guarantee. Lastly, as the contracts are protected by business secrecy, the public can access neither the actual prices negotiated by payers nor the amount of public investment used for the research and development of the drug. Lessons The derogatory nature of performance contracts invites us to consider them on a case-by-case basis where ensuring access to a specific innovation is necessary. These contracts are certainly innovative, but they cannot be presented as technologies that provide access at a fair price, and their contractual and derogatory nature raises serious transparency issues. Key messages Performance-based contracts should be considered as alternatives to existing administrative channels provided that they lead to substantial savings and are drawn up in full transparency. Patient organizations need to assess innovative schemes such as performance-based contracting to ensure access to treatments without undermining historical struggles for fair and transparent pricing.
APA, Harvard, Vancouver, ISO, and other styles
36

Di Giovanni, Giulia, and Davide Bernardini. "Vibration Damping Performances of Buildings with Moving Façades Under Harmonic Excitation." Journal of Vibration Engineering & Technologies, September 14, 2020. http://dx.doi.org/10.1007/s42417-020-00247-w.

Full text
Abstract:
Abstract Background Façade technologies are in continuous evolution, and the idea of buildings equipped with cladding systems that can undergo significant displacements relative to the main structure has been considered by many authors as an opportunity to improve their vibration performance. Method From a structural dynamics viewpoint, a building with a monolithic Moving Façade is essentially equivalent to a building with a Tuned Mass Damper. However, in the presence of excitations acting directly on the external surface of the building, there may be significant differences of behavior. In this work, a first step towards a systematic comparison between the performances of buildings with Moving Façades and Tuned Mass Dampers is carried out in the simplest setting of 2-degree-of-freedom modeling and harmonic excitation. Results Despite the deceptive simplicity of the setting, some of the aspects related to the potential applicability of moving façades to vibration damping, and the corresponding limitations, are discussed and critically analyzed. The analyses show that, depending on the tuning of the system, monolithic Moving Façades could effectively act as vibration absorbers with potentially high efficiency. However, it turns out that good performance may come at the price of extremely large displacements of the façade. The possibility of pursuing practical applications of this type of system therefore seems to be subordinated to finding solutions that limit such displacements within functionally acceptable ranges.
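For orientation, the classic 2-degree-of-freedom model of a primary structure (mass m1) carrying an auxiliary mass m2 (a tuned mass damper, or here a moving façade) under harmonic forcing can be written as below. This is the textbook formulation, not necessarily the authors' exact model, which also allows the excitation to act directly on the façade.

```latex
\begin{aligned}
m_1 \ddot{x}_1 + c_1 \dot{x}_1 + k_1 x_1
  - c_2\,(\dot{x}_2 - \dot{x}_1) - k_2\,(x_2 - x_1) &= F_0 \sin(\Omega t),\\
m_2 \ddot{x}_2 + c_2\,(\dot{x}_2 - \dot{x}_1) + k_2\,(x_2 - x_1) &= 0 .
\end{aligned}
```

Under the classical Den Hartog tuning for an undamped primary structure, the optimal absorber-to-structure frequency ratio is approximately 1/(1 + mu), where mu = m2/m1 is the mass ratio; the small mass ratios typical of façades are one reason large façade displacements appear in the results.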
APA, Harvard, Vancouver, ISO, and other styles
37

Barbazzeni, Beatrice, Sultan Haider, and Michael Friebe. "Engaging Through Awareness: Purpose-Driven Framework Development to Evaluate and Develop Future Business Strategies With Exponential Technologies Toward Healthcare Democratization." Frontiers in Public Health 10 (May 25, 2022). http://dx.doi.org/10.3389/fpubh.2022.851380.

Full text
Abstract:
Industry 4.0 and digital transformation will likely come with an era of changes for most manufacturers and tech industries, and even healthcare delivery will likely be affected. A few trends are already foreseeable such as an increased number of patients, advanced technologies, different health-related business models, increased costs, revised ethics, and regulatory procedures. Moreover, cybersecurity, digital invoices, price transparency, improving patient experience, management of big data, and the need for a revised education are challenges in response to digital transformation. Indeed, forward-looking innovation about exponential technologies and their effect on healthcare is now gaining momentum. Thus, we developed a framework, followed by an online survey, to investigate key areas, analyze and visualize future-oriented developments concerning technologies and innovative business models while attempting to translate visions into a strategy toward healthcare democratization. When forecasting the future of health in a short and long-term perspective, results showed that digital healthcare, data management, electronics, and sensors were the most common predictions, followed by artificial intelligence in clinical diagnostic and in which hospitals and homes would be the places of primary care. Shifting from a reactive to a proactive digital ecosystem, the focus on prevention, quality, and faster care accessibility are the novel value propositions toward democratization and digitalization of patient-centered services. Longevity will translate into increased neurodegenerative, chronic diseases, and mental illnesses, becoming severe issues for a future healthcare setup. Besides, data privacy, big data management, and novel regulatory procedures were considered as potential problems resulting from digital transformation. However, a revised education is needed to address these issues while preparing future health professionals. The “P4 of health”, a novel business model that is outcome-based oriented, awareness and acceptance of technologies to support public health, a different mindset that is proactive and future-oriented, and an interdisciplinary setting to merge clinical and technological advances would be key to a novel healthcare ecosystem. Lastly, based on the developed framework, we aim to conduct regular surveys to capture up-to-date technological trends, sustainable health-related business models, and interdependencies. The engagement of stakeholders through awareness and participation is the key to recognizing and improving healthcare needs and services.
APA, Harvard, Vancouver, ISO, and other styles
38

Draief, Moez, Hoda Heidari, and Michael Kearns. "New Models for Competitive Contagion." Proceedings of the AAAI Conference on Artificial Intelligence 28, no. 1 (June 21, 2014). http://dx.doi.org/10.1609/aaai.v28i1.8809.

Full text
Abstract:
In this paper, we introduce and examine two new models for competitive contagion in networks, a game-theoretic generalization of the viral marketing problem. In our setting, firms compete to maximize their market share in a network of consumers whose adoption decisions are stochastically determined by the choices of their neighbors. Building on the switching-selecting framework introduced by Goyal and Kearns, we first introduce a new model in which the payoff to firms comprises not only the number of vertices who adopt their (competing) technologies, but also the network connectivity among those nodes. For a general class of stochastic dynamics driving the local adoption process, we derive upper bounds on (1) the (pure strategy) Price of Anarchy (PoA), which measures the inefficiency of resource use at equilibrium, and (2) the Budget Multiplier, which captures the extent to which the network amplifies the imbalances in the firms' initial budgets. These bounds depend on the firm budgets and the maximum degree of the network, but no other structural properties. In addition, we give general conditions under which the PoA and the Budget Multiplier can be unbounded. We also introduce a model in which budgeting decisions are endogenous, rather than externally given as is typical in the viral marketing problem. In this setting, the firms are allowed to choose the number of seeds to initially infect (at a fixed cost per seed), as well as which nodes to select as seeds. In sharp contrast to the results of Goyal and Kearns, we show that for almost any local adoption dynamics, there exists a family of graphs for which the PoA and Budget Multiplier are unbounded.
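For readers unfamiliar with these measures, the standard definitions are roughly as follows (the paper may formalize them somewhat differently): the pure-strategy Price of Anarchy compares the best achievable total welfare with the welfare of the worst pure Nash equilibrium, and the Budget Multiplier measures how much the network amplifies the firms' initial budget imbalance at equilibrium.

```latex
\mathrm{PoA}
  \;=\;
  \frac{\max_{s}\, W(s)}
       {\min_{s^{*}\in \mathcal{NE}}\, W(s^{*})}
  \;\ge\; 1,
\qquad
\mathrm{BM}
  \;=\;
  \max_{s^{*}\in \mathcal{NE}}\;
  \frac{\pi_{1}(s^{*})/\pi_{2}(s^{*})}{B_{1}/B_{2}},
```

where W is total welfare (e.g., total adoptions), NE is the set of pure equilibria, pi_i is firm i's payoff, and B_i its budget, with firm 1 taken as the larger-budget firm. These are sketch definitions in the spirit of Goyal and Kearns' framework, given here only to make the abstract's claims about upper bounds easier to parse.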
APA, Harvard, Vancouver, ISO, and other styles
39

Bhagyashree Pandurang Gadekar and Dr. Tryambak Hiwarkar. "A Conceptual Modeling Framework to Measure the Effectiveness using ML in Business Analytics." International Journal of Advanced Research in Science, Communication and Technology, December 11, 2022, 399–406. http://dx.doi.org/10.48175/ijarsct-7703.

Full text
Abstract:
The purpose of this article is to introduce price analytics as a business tool, using supervised machine learning to improve outcomes when determining appropriate prices for a variety of goods and when purchasing goods at the best possible price. Research on business analytics is important because many factors, dimensions and methods affect the productivity of business processes, managerial effectiveness and decision making, all of which help firms gain an edge in the market. Applying machine learning in the workplace can improve results by allowing prompt, informed judgements based on stored data and knowledge. Supervised learning methods help the entrepreneur achieve business success both qualitatively and quantitatively: once the optimal price is determined, it can be applied and distributed immediately, so the prices of items in stock are updated instantly. This can raise operational effectiveness and efficiency, maximize profit, speed up bookkeeping work and identify the best possible prices for reaching the goals set by business owners. In summary, the highly competitive corporate environment calls for cutting-edge research, in particular machine learning technologies, the rise of supervised learning, data mining methods, and price optimization in a corporate setting using analytics. Supervised learning is learning with a teacher: by supplying the system with examples of what to do and what not to do and the right values of the variables, it learns to produce the expected outcome. The article covers several facets of the corporate domain, including its domains, orientations and methodologies.
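A minimal sketch of the kind of supervised price model the abstract gestures at, using scikit-learn on a toy dataset; the feature names and data are invented for illustration and are not from the article.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Toy data: [competitor_price, stock_level, days_to_expiry] -> realized selling price
rng = np.random.default_rng(0)
X = rng.uniform([5, 0, 1], [50, 500, 90], size=(200, 3))
y = 0.9 * X[:, 0] - 0.01 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 1, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
# A suggested price for a new item (hypothetical feature values):
print("suggested price:", model.predict([[30.0, 120, 45]])[0])
```

In a live setting the same fitted model would be re-queried as inventory and competitor data change, which is the "instant price update" the abstract alludes to.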
APA, Harvard, Vancouver, ISO, and other styles
40

Prodan, Alexandra, Lucas Deimel, Johannes Ahlqvist, Strahil Birov, Rainer Thiel, Meeri Toivanen, Zoi Kolitsi, and Dipak Kalra. "Success Factors for Scaling Up the Adoption of Digital Therapeutics Towards the Realization of P5 Medicine." Frontiers in Medicine 9 (April 12, 2022). http://dx.doi.org/10.3389/fmed.2022.854665.

Full text
Abstract:
IntroductionDigital therapeutics (DTx) can be a valuable contribution to the successful scale up of P5 Medicine (personalized, participatory, predictive, preventive, precision medicine) as they offer powerful means of delivering personalization and active patient participation in disease self-management. We investigated how the approval and adoption of DTx within health systems have been approached in five selected European countries and regions, with a view to proposing success factors scaling up their adoption.MethodologyPreliminary research established best countries or region candidates as being Germany, UK, France, Belgium, and the Spanish Region of Catalonia. The research was informed by a literature review, interviews with public bodies and industry, and a multi-stakeholder workshop to validate the findings and fill in existing gaps.ResultsTo authorize the use of digital technologies, the countries and regions passed legislation and developed policy instruments, appointed bodies to assess and certify the products and formalized mechanisms for permitting reimbursement. While DTx is not a commonly used nomenclature, there are digital health technology types defined that have similar requirements as DTx. Assessment and certification frameworks are usually built around the Medical Device Regulation with additional criteria. Reimbursement considerations often observe reimbursement of therapeutic devices and/or medicines. To be integrated into reimbursement systems, countries require manufacturers to demonstrate clinical value and cost-effectiveness. As there are currently very few DTx approved in practice, there is resistance toward clinical acceptance and organizational change, and change management is highly needed to integrate DTx into healthcare systems. The integration and secondary use of DTx data is not encountered in daily practice. Although some enablers exist, there remain technical and legal barriers.DiscussionDTx strategies should be considered as an integral part of digital health strategies and legislation, and specific DTx pathways with clear and transparent assessment and guidelines that balance regulation and innovation should be defined. To help manufacturers, countries should recommend and list methods that are widely accepted and ensure scientific robustness, aligned to the MDR requirements to support transfer of relevant and comparable data across countries. To facilitate rapid uptake of innovation, countries should add flexibility to the framework by allowing temporary market authorization to enable data collection that can support the clinical and socio-economic evaluation and data gathering phase. Certification should trigger rapid price setting and reimbursement mechanisms, and dynamic ways to adjust price and reimbursement levels in time should be established. Relevant stakeholders should be approached on the potential impacts of DTx through transparent communication and change management strategies should be considered. These findings should be validated with a wider range of stakeholders.
APA, Harvard, Vancouver, ISO, and other styles
41

Karpinska, N. "Institutional and functional support of application of sanitary and phytosanitary measures in the light of WTO and EU requirements." Law and innovative society, no. 1 (14) (July 3, 2021). http://dx.doi.org/10.37772/2309-9275-2020-1(14)-13.

Full text
Abstract:
Problem setting. The research is devoted to the system of institutions that influence relations arising from the application of sanitary and phytosanitary measures. Target of research. The purpose of the study is to describe the main WTO institutions, in particular the International Plant Protection Convention, the World Organization for Animal Health and the Codex Alimentarius Commission. Analysis of recent researches and publications. These issues have been studied by V. Nosik, A. Stativka, A. Dukhnevych, H. Grigorieva, G. Mamyshov, S. Komendantnov, N. Chuiko, T. Gulyaeva, M. Popov, A. Popova and others. However, the influence of the main international-level institutions on the formation of national legislation in the field of SPS, and on the practice of its application, remains insufficiently covered. Article's main body. It is established that at the international level SPS relations are most comprehensively regulated within the WTO and accordingly fall under the competence of its structural bodies. The WTO SPS Committee is called upon to resolve differences related to the application of these measures, to monitor the harmonization of international standards, to cooperate with international organizations concerned with food safety, and to be responsible for developing and interpreting, collecting and disseminating information on the application of SPS measures by the Member States and for providing them with technical cooperation. Mediation is an extremely important function of the SPS Committee. It is established that in the WTO system, alongside the Committee on Sanitary and Phytosanitary Measures, there are three important institutions that exert significant influence on the formation of rules for the application of such measures and are characterized by an organic relationship, a common value orientation, common principles of approach and common objectives. Distinctive features of the EU are revealed, consisting in the creation of special institutions for the institutional and functional support of sanitary and phytosanitary measures. Conclusions and prospects for the development. Analysis of the activities of WTO and EU institutions shows that international (European) phytosanitary legislation develops through the elaboration and implementation in agricultural production of standard technologies that ensure phytosanitary well-being.
APA, Harvard, Vancouver, ISO, and other styles
42

Halland, Eva. "Carbon capture utilization and storage (CCUS) – it’s happening now! However, are there still any challenges?" Baltic Carbon Forum 1 (October 13, 2022). http://dx.doi.org/10.21595/bcf.2022.22890.

Full text
Abstract:
These days we see a growing interest and more concrete project plans for CCUS in many European countries, but on the pathway to "Net Zero" we are far from on track. This definitely implies a stronger push for CCUS in Europe. Although we can show 26 years of permanently stored CO2 in deep geological formations offshore Norway, heavily studied and monitored, there are still many questions about whether CCS is a safe and viable technology. Based on this experience and many years of research and development, we can conclude that it is a viable and safe technology. We know that we have large storage resources for CO2 on land and offshore in Europe, and we have large CO2 emissions that need to be captured. If CCUS is to achieve the economies of scale necessary to reduce costs and develop technology, cooperation is needed. Like other technologies that are expensive at the start, CO2 capture needs to become more efficient and thereby less expensive, and we need an effort to speed up the mapping and characterization of safe CO2 storage capacity. However, CCUS is the lowest-cost, or only, option for many industries to decarbonize, and these industries will be fully exposed to the carbon price by 2023, so CCUS is essential to deliver large-scale and permanent removal of CO2. To contribute to the development of technology for capture, transport and storage of CO2, with the ambition of achieving a cost-effective solution, the Norwegian government decided in 2020 to develop a full-scale carbon capture and storage project called Longship. As a result of this decision, we now see that the next phase for CCS is already underway, with growing interest in new areas for CO2 storage and more industrial demonstration projects for emission reductions. On the Norwegian continental shelf, three licenses for offshore storage of CO2 have been awarded in recent years, involving five companies, and new license applications and new companies are on the way. These companies have presented clear projects involving the entire business chain. We have the knowledge and the technology is ready, so why isn't CCUS flying? Perhaps it is about setting clear political goals, transporting CO2 across national borders, removing potential regulatory barriers and developing new business models. Easy? Let's talk about it and cooperate.
APA, Harvard, Vancouver, ISO, and other styles
43

Balachonova, Olesiy, and Liudmila Shyriaieva. "IMPROVEMENT OF USING THE METHOD OF TARGET COSTING IN THE DEVELOPMENT AND PRODUCTION OF NEW PRODUCTS." International Humanitarian University Herald. Economics and Management, no. 50 (2021). http://dx.doi.org/10.32841/2413-2675/2021-50-1.

Full text
Abstract:
The article proposes a methodological approach that, within the framework of the controlling system, uses the methodology of target costing. Target costing can be viewed as a special form of functional-cost analysis that allows not only optimizing adjustments to the structure of new products but also creating them according to given cost recommendations, which are based on an acceptable level of costs and on information about the usefulness of product properties for the consumer. One of the most important components of controlling is the philosophy of profitability, which presupposes a long-term orientation. Cost management in the short and long term is the most important factor in increasing the competitiveness of enterprises. The globalization of markets, the development of information technologies and telecommunications, the increasing dynamism of the external and internal environment of the enterprise, and the growing demands of consumers place enterprises in very tough conditions with respect to product profitability. Cost management today is one of the main means of achieving high economic results; it should be understood not only as cost reduction but also as analysis and optimization, the pursuit of a level of costs that increases the economic efficiency of activities and the competitiveness of products. The use of the target costing method is advisable, first of all, when implementing innovative projects. Depending on prevailing conditions, enterprises can develop various modifications of the method. When creating a new product, the data obtained from its use can serve as the basis for analysing and planning the costs of the products being created, as well as for controlling and analysing deviations. Since the expected price is taken as the starting point for using target costing to control product costs, the method can be considered not only a cost optimization method but also a way of setting prices. Pricing is also an extremely important aspect for an industrial enterprise. Various modifications of this approach are successfully used in European enterprises; BMW, for example, uses one of them. Although target costing faces a number of practical difficulties, the method is evolving, and attempts are being made to overcome the inconveniences associated with its use. For machine-building and instrument-making enterprises it is especially effective to incorporate a unit for the analysis of target costs into an automated design system, which helps speed up the receipt of cost recommendations and make the most of them. Applying target costing will make it possible to use those components of controlling that relate to innovation, which is extremely important for orienting the enterprise towards effective development in the long run.
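The core arithmetic of target costing, which the abstract treats as a price-led way of managing costs, is simple: the allowable (target) cost is the expected market price minus the required profit, and the gap to the currently estimated ("drifting") cost is what design and engineering must close. A minimal sketch with hypothetical figures:

```python
def target_costing(expected_price: float,
                   required_margin: float,
                   drifting_cost: float) -> dict:
    """Hypothetical target-costing calculation (all figures are illustrative)."""
    target_cost = expected_price * (1 - required_margin)  # allowable cost
    gap = drifting_cost - target_cost                      # cost reduction still needed
    return {"target_cost": round(target_cost, 2),
            "cost_gap": round(gap, 2),
            "feasible": gap <= 0}

print(target_costing(expected_price=120.0, required_margin=0.25, drifting_cost=95.0))
# -> target cost 90.0, gap 5.0: the design must shed 5.0 per unit to hit the target
```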
APA, Harvard, Vancouver, ISO, and other styles
44

"Technology Acceptance and Adoption of Security Surveillance Systems." International Journal of Innovative Technology and Exploring Engineering 8, no. 12 (October 10, 2019): 5829–35. http://dx.doi.org/10.35940/ijitee.l2720.1081219.

Full text
Abstract:
Purpose: The popularity of security cameras has increased in the last decade due to the advancement of technology, and the need for them is becoming more important given increasing crime and theft. The installation of surveillance systems gives home owners and business owners peace of mind. Although organisations have adopted ICT, very few have adopted security technologies to secure their infrastructure. It is crucial for manufacturers to understand real customer needs with respect to high-technology product offers, so this study focuses on customer buying behaviour towards security surveillance systems and on the factors, especially technological factors, that influence the purchase of a surveillance camera. The research is based on the Technology Acceptance Model proposed by Bagozzi, R.P., in 2007. The study covers various factors that influence the purchase decision and the adoption of the new technology, i.e. setting up security surveillance systems, across different types of customers in the city of Coimbatore in South India. The potential institutional buyers forming the population include schools, colleges, construction contractors, restaurants and showrooms. The researchers focus on the safety factor of CCTV cameras and the parameters influencing their purchase. To analyse the data, several techniques were used: percentage analysis, one-way ANOVA, the chi-square test, correlation and exploratory factor analysis, with the IBM SPSS software package employed for statistical analysis. Research Approach: The study is descriptive research that collected data on past events without manipulating or intervening in the study environment; the current status of use and awareness of security surveillance systems was studied with the identified parameters as it existed. The survey was done in randomly chosen areas of the Coimbatore–Tirupur composite districts, covering Pollachi, Tirupur, Mettupalayam and Coimbatore city, with institutional buyers selected through purposive sampling. The proprietors of the institutions, or those in charge of purchasing in organizations and institutions including schools, colleges, hospitals, construction firms, office buildings and shopping complexes, formed the population. The researchers devised a questionnaire based on the literature survey; the scales were framed and fine-tuned with the research objectives in mind and were checked through a pilot survey before the main survey for reliability. The Cronbach alpha value is 0.712, so the scales developed are reliable. Around 300 respondents were chosen: a purposive sample of 300 institutions, builders, restaurants and showrooms in Coimbatore district, from whom primary data were collected through direct completion of the questionnaire. The sampling method was non-probability purposive sampling.
For this study, data collection was done with the intention that the respondent be a prospective buyer. The elements in the sample included educational institutions, hotels, showrooms and builders in and around Coimbatore and Tirupur. A structured questionnaire was designed and used to collect data from the respondents. Findings: Hikvision was the brand most used by the customers, with CP Plus and Doha in second and third position. The analysis shows that camera resolution is the most influential parameter when purchasing a surveillance camera; after brand, price was the next most influential parameter. Ratings of the usefulness of the AHD system are normally distributed, with an average customer rating of 3, while the IP system is rated more useful than the AHD system, with most customers rating it 4 or 5. The influencing parameters and the expectation towards the memory capacity of the camera depend on the sector in which the company operates, as do the expectations towards the life cycle and the price of the camera. Correlation analysis revealed that the preference for power saving is not associated with the preference for product innovativeness, and that there is a positive correlation between the safety factor and the crime-reduction factor: customers who feel safe and protected also feel that crime can be reduced.
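The reliability check reported above (a Cronbach alpha of 0.712) uses the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A small sketch of that calculation on made-up response data (not the survey's actual data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x questionnaire-items matrix of scale responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Made-up 5-point Likert responses from 6 respondents to 4 items.
responses = np.array([[4, 5, 4, 4],
                      [3, 3, 2, 3],
                      [5, 5, 4, 5],
                      [2, 2, 3, 2],
                      [4, 4, 4, 5],
                      [3, 2, 3, 3]])
print(round(cronbach_alpha(responses), 3))
```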
APA, Harvard, Vancouver, ISO, and other styles
45

Polishchuk, Yu. "Concepts of state policy for the development of “green” economy: evolution, essence, goals and principles." Efficiency of public administration, no. 65 (March 17, 2021). http://dx.doi.org/10.33990/2070-4011.65.2020.226476.

Full text
Abstract:
Problem setting. The concepts of economic development of the early third millennium have largely exhausted themselves and require immediate revision and updating. At certain intervals, the modern economic development of our civilization produces more and more "side effects", which are a kind of price for increasing people's welfare. Among them are two main vectors: first, the depletion of the planet's limited resources, and second, the growth of waste. These two vectors are the most important for the development of Ukraine and of the world as a whole; implementing a state policy for the "green" economy will help preserve our land and allow us to live in harmony with nature. Recent research and publications analysis. Theoretical aspects of the "green economy" concept are studied by Bublyk, Galushkina, Ivanyuta, Chmir, Yakovenko, Yarmolenko and others. Today many countries, given their experience of the shortcomings of globalization, are actively looking for new models of further development. Highlighting previously unsettled parts of the general problem. The concept of the "green" economy is actively discussed at the level of international organizations, national governments and among scientists, yet there are few such studies at the scientific level in Ukraine, and most research on the state policy of the "green" economy is carried out by foreign scholars. There is therefore a need to systematize scientific and practical knowledge about the formation and implementation of a state policy of the "green" economy and to define the basic concepts of the "green" economy for Ukraine. Paper main body. The aim is to generalize the basic concepts of "green" economy development as an object of state policy and to reveal the essence, goals and principles of the "green" economy. The concept of the "green" economy adopted by the world community is designed to harmonize the three components of sustainable development: economic, social and environmental. The "green" economy is not just a possibility but a compulsory path of development for all countries and nations that want to ensure a secure future for their citizens. The transition to a green economy is a long and difficult path, but a number of advanced and developing economies already demonstrate leadership by adopting "green" economic strategies. The paper offers a comparative characterization of "green economy" concepts and defines its goals and principles, emphasizing that the "green" economy model normalizes the interaction between human activity and nature. Summarizing all the concepts of the "green economy" considered in the paper, it embodies three areas: ecology, economics and social development. The paper also highlights the problems of society's transition to the priority innovation areas of the "green" economy. State policy should give priority to the introduction of innovative, environmentally sound technologies that will ensure the sustainable development of society in a strategic perspective. The author analyses the concept as an important component of the socio-economic development of the state and shows its importance for the national economy. Strategic priorities for drawing on the "green" economy potential of Ukrainian regions, as a basis for forming "growth points" and "economic development areas" at the local and regional level, are substantiated.
The paper argues that the development of the "green" economy helps solve several systemic problems of Ukraine's regions: it reduces the energy intensity of production and stimulates the transition to alternative energy, which creates jobs and improves the quality of life in the regions. Conclusions of the research and prospects for further studies. Thus, the state policy for developing the "green" economy is a new strategic direction for many countries of the world that has taken shape over the last twenty years. It is believed that developing a public policy for the "green" economy will support the development of most countries and protect them from financial and environmental crises. In view of the above, it is necessary to promote the ideology of a public "green" economy policy as the main paradigm for transforming social change. Even under very conservative assumptions, the "green" investment scenario allows decisive measures to achieve higher annual growth rates within 5–10 years and increases the reserves of renewable resources needed for the development of the world economy.
APA, Harvard, Vancouver, ISO, and other styles
46

Cesarini, Paul. "‘Opening’ the Xbox." M/C Journal 7, no. 3 (July 1, 2004). http://dx.doi.org/10.5204/mcj.2371.

Full text
Abstract:
“As the old technologies become automatic and invisible, we find ourselves more concerned with fighting or embracing what’s new”—Dennis Baron, From Pencils to Pixels: The Stage of Literacy Technologies What constitutes a computer, as we have come to expect it? Are they necessarily monolithic “beige boxes”, connected to computer monitors, sitting on computer desks, located in computer rooms or computer labs? In order for a device to be considered a true computer, does it need to have a keyboard and mouse? If this were 1991 or earlier, our collective perception of what computers are and are not would largely be framed by this “beige box” model: computers are stationary, slab-like, and heavy, and their natural habitats must be in rooms specifically designated for that purpose. In 1992, when Apple introduced the first PowerBook, our perception began to change. Certainly there had been other portable computers prior to that, such as the Osborne 1, but these were more luggable than portable, weighing just slightly less than a typical sewing machine. The PowerBook and subsequent waves of laptops, personal digital assistants (PDAs), and so-called smart phones from numerous other companies have steadily forced us to rethink and redefine what a computer is and is not, how we interact with them, and the manner in which these tools might be used in the classroom. However, this reconceptualization of computers is far from over, and is in fact steadily evolving as new devices are introduced, adopted, and subsequently adapted for uses beyond of their original purpose. Pat Crowe’s Book Reader project, for example, has morphed Nintendo’s GameBoy and GameBoy Advance into a viable electronic book platform, complete with images, sound, and multi-language support. (Crowe, 2003) His goal was to take this existing technology previously framed only within the context of proprietary adolescent entertainment, and repurpose it for open, flexible uses typically associated with learning and literacy. Similar efforts are underway to repurpose Microsoft’s Xbox, perhaps the ultimate symbol of “closed” technology given Microsoft’s propensity for proprietary code, in order to make it a viable platform for Open Source Software (OSS). However, these efforts are not forgone conclusions, and are in fact typical of the ongoing battle over who controls the technology we own in our homes, and how open source solutions are often at odds with a largely proprietary world. In late 2001, Microsoft launched the Xbox with a multimillion dollar publicity drive featuring events, commercials, live models, and statements claiming this new console gaming platform would “change video games the way MTV changed music”. (Chan, 2001) The Xbox launched with the following technical specifications: 733mhz Pentium III 64mb RAM, 8 or 10gb internal hard disk drive CD/DVD ROM drive (speed unknown) Nvidia graphics processor, with HDTV support 4 USB 1.1 ports (adapter required), AC3 audio 10/100 ethernet port, Optional 56k modem (TechTV, 2001) While current computers dwarf these specifications in virtually all areas now, for 2001 these were roughly on par with many desktop systems. The retail price at the time was $299, but steadily dropped to nearly half that with additional price cuts anticipated. Based on these features, the preponderance of “off the shelf” parts and components used, and the relatively reasonable price, numerous programmers quickly became interested in seeing it if was possible to run Linux and additional OSS on the Xbox. 
In each case, the goal has been similar: exceed the original purpose of the Xbox, to determine if and how well it might be used for basic computing tasks. If these attempts prove to be successful, the Xbox could allow institutions to dramatically increase the student-to-computer ratio in select environments, or allow individuals who could not otherwise afford a computer to instead buy and Xbox, download and install Linux, and use this new device to write, create, and innovate . This drive to literally and metaphorically “open” the Xbox comes from many directions. Such efforts include Andrew Huang’s self-published “Hacking the Xbox” book in which, under the auspices of reverse engineering, Huang analyzes the architecture of the Xbox, detailing step-by-step instructions for flashing the ROM, upgrading the hard drive and/or RAM, and generally prepping the device for use as an information appliance. Additional initiatives include Lindows CEO Michael Robertson’s $200,000 prize to encourage Linux development on the Xbox, and the Xbox Linux Project at SourceForge. What is Linux? Linux is an alternative operating system initially developed in 1991 by Linus Benedict Torvalds. Linux was based off a derivative of the MINIX operating system, which in turn was a derivative of UNIX. (Hasan 2003) Linux is currently available for Intel-based systems that would normally run versions of Windows, PowerPC-based systems that would normally run Apple’s Mac OS, and a host of other handheld, cell phone, or so-called “embedded” systems. Linux distributions are based almost exclusively on open source software, graphic user interfaces, and middleware components. While there are commercial Linux distributions available, these mainly just package the freely available operating system with bundled technical support, manuals, some exclusive or proprietary commercial applications, and related services. Anyone can still download and install numerous Linux distributions at no cost, provided they do not need technical support beyond the community / enthusiast level. Typical Linux distributions come with open source web browsers, word processors and related productivity applications (such as those found in OpenOffice.org), and related tools for accessing email, organizing schedules and contacts, etc. Certain Linux distributions are more or less designed for network administrators, system engineers, and similar “power users” somewhat distanced from that of our students. However, several distributions including Lycoris, Mandrake, LindowsOS, and other are specifically tailored as regular, desktop operating systems, with regular, everyday computer users in mind. As Linux has no draconian “product activation key” method of authentication, or digital rights management-laden features associated with installation and implementation on typical desktop and laptop systems, Linux is becoming an ideal choice both individually and institutionally. It still faces an uphill battle in terms of achieving widespread acceptance as a desktop operating system. As Finnie points out in Desktop Linux Edges Into The Mainstream: “to attract users, you need ease of installation, ease of device configuration, and intuitive, full-featured desktop user controls. It’s all coming, but slowly. With each new version, desktop Linux comes closer to entering the mainstream. 
It’s anyone’s guess as to when critical mass will be reached, but you can feel the inevitability: There’s pent-up demand for something different.” (Finnie 2003) Linux is already spreading rapidly in numerous capacities, in numerous countries. Linux has “taken hold wherever computer users desire freedom, and wherever there is demand for inexpensive software.” Reports from technology research company IDG indicate that roughly a third of computers in Central and South America run Linux. Several countries, including Mexico, Brazil, and Argentina, have all but mandated that state-owned institutions adopt open source software whenever possible to “give their people the tools and education to compete with the rest of the world.” (Hills 2001) The Goal Less than a year after Microsoft introduced the Xbox, the Xbox Linux Project formed. The Xbox Linux Project has a goal of developing and distributing Linux for the Xbox gaming console, “so that it can be used for many tasks that Microsoft don’t want you to be able to do. ...as a desktop computer, for email and browsing the web from your TV, as a (web) server” (Xbox Linux Project 2002). Since the Linux operating system is open source, meaning it can freely be tinkered with and distributed, those who opt to download and install Linux on their Xbox can do so with relatively little overhead in terms of cost or time. Additionally, Linux itself looks very “windows-like”, making for a fairly low learning curve. To help increase overall awareness of this project and assist in diffusing it, the Xbox Linux Project offers step-by-step installation instructions, with the end result being a system capable of using common peripherals such as a keyboard and mouse, scanner, printer, a “webcam and a DVD burner, connected to a VGA monitor; 100% compatible with a standard Linux PC, all PC (USB) hardware and PC software that works with Linux.” (Xbox Linux Project 2002) Such a system could have tremendous potential for technology literacy. Pairing an Xbox with Linux and OpenOffice.org, for example, would provide our students essentially the same capability any of them would expect from a regular desktop computer. They could send and receive email, communicate using instant messaging, IRC, or newsgroup clients, and browse Internet sites just as they normally would. In fact, the overall browsing experience for Linux users is substantially better than that for most Windows users. Internet Explorer, the default browser on all systems running Windows-based operating systems, lacks basic features standard in virtually all competing browsers. Native blocking of “pop-up” advertisements is still not possible in Internet Explorer without the aid of a third-party utility. Tabbed browsing, which involves the ability to easily open and sort through multiple Web pages in the same window, often with a single mouse click, is also missing from Internet Explorer. The same can be said for a robust download manager, “find as you type”, and a variety of additional features. Mozilla, Netscape, Firefox, Konqueror, and essentially all other OSS browsers for Linux have these features. Of course, most of these browsers are also available for Windows, but Internet Explorer is still considered the standard browser for the platform. If the Xbox Linux Project becomes widely diffused, our students could edit and save Microsoft Word files in OpenOffice.org’s Writer program, and do the same with PowerPoint and Excel files in similar OpenOffice.org components. 
They could access instructor comments originally created in Microsoft Word documents, and in turn could add their own comments and send the documents back to their instructors. They could even perform many functions not yet possible in Microsoft Office, including saving files in PDF or Flash format without needing Adobe’s Acrobat product or Macromedia’s Flash Studio MX. Additionally, by way of this project, the Xbox can also serve as “a Linux server for HTTP/FTP/SMB/NFS, serving data such as MP3/MPEG4/DivX, or a router, or both; without a monitor or keyboard or mouse connected.” (Xbox Linux Project 2003) In a very real sense, our students could use these inexpensive systems, previously framed only within the context of entertainment, for educational purposes typically associated with computer-mediated learning. Problems: Control and Access The existing rhetoric of technological control surrounding current and emerging technologies appears to be stifling many of these efforts before they can even be brought to the public. This rhetoric of control is largely typified by overly restrictive digital rights management (DRM) schemes antithetical to education, and by the Digital Millennium Copyright Act (DMCA). Combined, both are currently being used as technical and legal clubs against these efforts. Microsoft, for example, has taken a dim view of any efforts to adapt the Xbox to Linux. Microsoft CEO Steve Ballmer, who has repeatedly referred to Linux as a cancer and has equated OSS with being un-American, stated, “Given the way the economic model works - and that is a subsidy followed, essentially, by fees for every piece of software sold - our license framework has to do that.” (Becker 2003) Since the Xbox is based on a subsidy model, meaning that Microsoft actually sells the hardware at a loss and instead generates revenue off software sales, Ballmer launched a series of concerted legal attacks against the Xbox Linux Project and similar efforts. In 2002, Nintendo, Sony, and Microsoft simultaneously sued Lik Sang, Inc., a Hong Kong-based company that produces programmable cartridges and “mod chips” for the PlayStation 2, Xbox, and GameCube. Nintendo states that its company alone loses over $650 million each year due to piracy of its console gaming titles, which typically originates in China, Paraguay, and Mexico. (GameIndustry.biz) Currently, many attempts to “mod” the Xbox require the use of such chips. As Lik Sang is one of the only suppliers, initial efforts to adapt the Xbox to Linux slowed considerably. Although such chips can still be ordered and shipped here by less conventional means, this does not change the fact that the chips themselves would be illegal in the U.S. due to the anticircumvention clause in the DMCA itself, which is designed specifically to protect any DRM-wrapped content, regardless of context. The Xbox Linux Project then attempted to get Microsoft to officially sanction their efforts. They were not only rebuffed, but Microsoft then opted to hire programmers specifically to create technological countermeasures for the Xbox, to defeat additional attempts at installing OSS on it. Undeterred, the Xbox Linux Project eventually arrived at a method of installing and booting Linux without the use of mod chips, and has taken a more defiant tone with Microsoft regarding these circumvention efforts. 
(Lettice 2002) They state that “Microsoft does not want you to use the Xbox as a Linux computer, therefore it has some anti-Linux-protection built in, but it can be circumvented easily, so that an Xbox can be used as what it is: an IBM PC.” (Xbox Linux Project 2003) Problems: Learning Curves and Usability In spite of the difficulties imposed by the combined technological and legal attacks on this project, it has succeeded at infiltrating this closed system with OSS. It has done so beyond the mere prototype level, too, as evidenced by the Xbox Linux Project now having both complete, step-by-step instructions available for users to modify their own Xbox systems, and an alternate plan catering to those who have the interest in modifying their systems, but not the time or technical inclination. Specifically, this option involves users mailing their Xbox systems to community volunteers within the Xbox Linux Project, and basically having these volunteers perform the necessary software preparation or actually do the full Linux installation for them, free of charge (presumably not including shipping). This particular aspect of the project, dubbed “Users Help Users”, appears to be fairly new. Yet, it already lists over sixty volunteers capable and willing to perform this service, since “Many users don’t have the possibility, expertise or hardware” to perform these modifications. Amazingly enough, in some cases these volunteers are barely out of junior high school. One such volunteer stipulates that those seeking his assistance keep in mind that he is “just 14” and that when performing these modifications he “...will not always be finished by the next day”. (Steil 2003) In addition to this interesting if somewhat unusual level of community-driven support, there are currently several Linux-based options available for the Xbox. The two that are perhaps the most developed are GentooX, which is based on the popular Gentoo Linux distribution, and Ed’s Debian, based on the Debian GNU/Linux distribution. Both Gentoo and Debian are “seasoned” distributions that have been available for some time now, though Daniel Robbins, Chief Architect of Gentoo, refers to the product as actually being a “metadistribution” of Linux, due to its high degree of adaptability and configurability. (Gentoo 2004) Specifically, Robbins asserts that Gentoo is capable of being “customized for just about any application or need. ...an ideal secure server, development workstation, professional desktop, gaming system, embedded solution or something else—whatever you need it to be.” (Robbins 2004) He further states that the whole point of Gentoo is to provide a better, more usable Linux experience than that found in many other distributions. Robbins states that: “The goal of Gentoo is to design tools and systems that allow a user to do their work as pleasantly and efficiently as possible, as they see fit. Our tools should be a joy to use, and should help the user to appreciate the richness of the Linux and free software community, and the flexibility of free software. ...Put another way, the Gentoo philosophy is to create better tools. When a tool is doing its job perfectly, you might not even be very aware of its presence, because it does not interfere and make its presence known, nor does it force you to interact with it when you don’t want it to. 
The tool serves the user rather than the user serving the tool.” (Robbins 2004) There is also a so-called “live CD” Linux distribution suitable for the Xbox, called dyne:bolic, as well as an in-progress release of Slackware Linux. According to the Xbox Linux Project, the only difference between the standard releases of these distributions and their Xbox counterparts is that “...the install process – and naturally the bootloader, the kernel and the kernel modules – are all customized for the Xbox.” (Xbox Linux Project, 2003) Of course, even if Gentoo is as user-friendly as Robbins purports, even if the Linux kernel itself has become significantly more robust and efficient, and even if Microsoft again drops the retail price of the Xbox, is this really a feasible solution in the classroom? Does the Xbox Linux Project have an army of 14-year-olds willing to modify dozens, perhaps hundreds of these systems for use in secondary schools and higher education? Of course not. If such an institutional rollout were to be undertaken, it would require significant support not only from faculty, but from Department Chairs, Deans, IT staff, and quite possibly Chief Information Officers. Disk images would need to be customized for each institution to reflect their respective needs, ranging from setting specific home pages on web browsers, to bookmarks, to custom back-up and/or disk re-imaging scripts, to network authentication. This would be no small task. Yet, the steps mentioned above are essentially no different than what would be required of any IT staff when creating a new disk image for a computer lab, be it one for a Windows-based system or a Mac OS X-based one. The primary difference would be Linux itself—nothing more, nothing less. The institutional difficulties in undertaking such an effort would likely be encountered prior to even purchasing a single Xbox, in that they would involve the same difficulties associated with any new hardware or software initiative: staffing, budget, and support. If the institution in question is either unwilling or unable to address these three factors, it would not matter if the Xbox itself were as free as Linux. An Open Future, or a Closed One? It is unclear how far the Xbox Linux Project will be allowed to go in its efforts to invade an essentially proprietary system with OSS. Unlike Sony, which has taken deliberate steps to commercialize similar efforts for its PlayStation 2 console, Microsoft appears resolute in fighting OSS on the Xbox by any means necessary. It will continue to crack down on any companies selling so-called mod chips, and will continue to employ technological protections to keep the Xbox “closed”. Despite clear evidence to the contrary, in all likelihood Microsoft will continue to equate any OSS efforts directed at the Xbox with piracy-related motivations. Additionally, Microsoft’s successor to the Xbox will likely incorporate additional anticircumvention technologies that could set the Xbox Linux Project back by months or years, or could stop it cold. Of course, it is difficult to say with any degree of certainty how this “Xbox 2” (perhaps a more appropriate name might be “Nextbox”) will impact this project. Regardless of how this device evolves, there can be little doubt of the value of Linux, OpenOffice.org, and other OSS to teaching and learning with technology. This value exists not only in terms of price, but in increased freedom from policies and technologies of control. 
New Linux distributions from Gentoo, Mandrake, Lycoris, Lindows, and other companies are just now starting to focus their efforts on Linux as a user-friendly, easy-to-use desktop operating system, rather than just a server or “techno-geek” environment suitable for advanced programmers and computer operators. While metaphorically opening the Xbox may not be for everyone, and may not be a suitable computing solution for all, I believe we as educators must promote and encourage such efforts whenever possible. I suggest this because I believe we need to exercise our professional influence and ultimately shape the future of technology literacy, both individually as faculty and collectively as departments, colleges, or institutions. Moran and Fitzsimmons-Hunter argue this very point in Writing Teachers, Schools, Access, and Change. One of the fundamental provisions they use to define “access” asserts that there must be a willingness for teachers and students to “fight for the technologies that they need to pursue their goals for their own teaching and learning.” (Taylor / Ward 160) Regardless of whether or not this debate is grounded in the “beige boxes” of the past, or the Xboxes of the present, much is at stake. Private corporations should not be in a position to control the manner in which we use legally purchased technologies, regardless of whether or not these technologies are then repurposed for literacy uses. I believe the exigency associated with this control, and the ongoing evolution of what is and is not a computer, dictates that we assert ourselves more actively into this discussion. We must take steps to provide our students with the best possible computer-mediated learning experience, however seemingly unorthodox the technological means might be, so that they may think critically, communicate effectively, and participate actively in society and in their future careers. About the Author Paul Cesarini is an Assistant Professor in the Department of Visual Communication & Technology Education, Bowling Green State University, Ohio. Email: pcesari@bgnet.bgsu.edu Works Cited http://xbox-linux.sourceforge.net/docs/debian.php>. Baron, Dennis. “From Pencils to Pixels: The Stages of Literacy Technologies.” Passions, Pedagogies, and 21st Century Technologies. Hawisher, Gail E., and Cynthia L. Selfe, Eds. Utah: Utah State University Press, 1999. 15–33. Becker, David. “Ballmer: Mod Chips Threaten Xbox”. News.com. 21 Oct 2002. http://news.com.com/2100-1040-962797.php>. http://news.com.com/2100-1040-978957.html?tag=nl>. http://archive.infoworld.com/articles/hn/xml/02/08/13/020813hnchina.xml>. http://www.neoseeker.com/news/story/1062/>. http://www.bookreader.co.uk>. Finnie, Scott. “Desktop Linux Edges Into The Mainstream”. TechWeb. 8 Apr 2003. http://www.techweb.com/tech/software/20030408_software. http://www.theregister.co.uk/content/archive/29439.html http://gentoox.shallax.com/. http://ragib.hypermart.net/linux/. http://www.itworld.com/Comp/2362/LWD010424latinlinux/pfindex.html. http://www.xbox-linux.sourceforge.net. http://www.theregister.co.uk/content/archive/27487.html. http://www.theregister.co.uk/content/archive/26078.html. http://www.us.playstation.com/peripherals.aspx?id=SCPH-97047. http://www.techtv.com/extendedplay/reviews/story/0,24330,3356862,00.html. http://www.wired.com/news/business/0,1367,61984,00.html. http://www.gentoo.org/main/en/about.xml http://www.gentoo.org/main/en/philosophy.xml http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2869075,00.html. 
http://xbox-linux.sourceforge.net/docs/usershelpusers.html http://www.cnn.com/2002/TECH/fun.games/12/16/gamers.liksang/. Citation reference for this article MLA Style Cesarini, Paul. "“Opening” the Xbox." M/C: A Journal of Media and Culture <http://www.media-culture.org.au/0406/08_Cesarini.php>. APA Style Cesarini, P. (2004, Jul. 1). “Opening” the Xbox. M/C: A Journal of Media and Culture, 7, <http://www.media-culture.org.au/0406/08_Cesarini.php>.
APA, Harvard, Vancouver, ISO, and other styles
47

Toutant, Ligia. "Can Stage Directors Make Opera and Popular Culture ‘Equal’?" M/C Journal 11, no. 2 (June 1, 2008). http://dx.doi.org/10.5204/mcj.34.

Full text
Abstract:
Cultural sociologists (Bourdieu; DiMaggio, “Cultural Capital”, “Classification”; Gans; Lamont & Fournier; Halle; Erickson) wrote about high culture and popular culture in an attempt to explain the growing social and economic inequalities, to find consensus on culture hierarchies, and to analyze cultural complexities. Halle states that this categorisation of culture into “high culture” and “popular culture” underlined most of the debate on culture in the last fifty years. Gans contends that both high culture and popular culture are stereotypes, public forms of culture or taste cultures, each sharing “common aesthetic values and standards of tastes” (8). However, this article is not concerned with these categorisations, or macro analysis. Rather, it is a reflection piece that inquires if opera, which is usually considered high culture, has become more equal to popular culture, and why some directors change the time and place of opera plots, whereas others will stay true to the original setting of the story. I do not consider these productions “adaptations,” but “post-modern morphologies,” and I will refer to this later in the paper. In other words, the paper is seeking to explain a social phenomenon and explore the underlying motives by quoting interviews with directors. The word ‘opera’ is defined in Elson’s Music Dictionary as: “a form of musical composition evolved shortly before 1600, by some enthusiastic Florentine amateurs who sought to bring back the Greek plays to the modern stage” (189). Hence, it was an experiment to revive Greek music and drama, believed to be the ideal way to express emotions (Grout 186). It is difficult to pinpoint the exact moment when stage directors started changing the time and place of the original settings of operas. The practice became more common after World War II, and Peter Brook’s Covent Garden productions of Boris Godunov (1948) and Salome (1949) are considered the prototypes of this practice (Sutcliffe 19-20). Richard Wagner’s grandsons, the brothers Wieland and Wolfgang Wagner, are cited in the music literature as using technology and modern innovations in staging and design beginning in the early 1950s. Brief Background into the History of Opera Grout contends that opera began as an attempt to heighten the dramatic expression of language by intensifying the natural accents of speech through melody supported by simple harmony. In the late 1590s, the Italian composer Jacopo Peri wrote what is considered to be the first opera, but most of it has been lost. The first surviving complete opera is Euridice, a version of the Orpheus myth that Peri and Giulio Caccini jointly set to music in 1600. The first composer to understand the possibilities inherent in this new musical form was Claudio Monteverdi, who in 1607 wrote Orfeo. Although it was based on the same story as Euridice, it was expanded to a full five acts. Early opera was meant for small, private audiences, usually at court; hence it began as an elitist genre. After thirty years of being private, in 1637, opera went public with the opening of the first public opera house, Teatro di San Cassiano, in Venice, and the genre quickly became popular. Indeed, Monteverdi wrote his last two operas, Il ritorno d’Ulisse in patria and L’incoronazione di Poppea for the Venetian public, thereby leading the transition from the Italian courts to the ‘public’. Both operas are still performed today. Poppea was the first opera to be based on a historical rather than a mythological or allegorical subject. 
Sutcliffe argues that opera became popular because it was a new mixture of means: new words, new music, new methods of performance. He states, “operatic fashion through history may be a desire for novelty, new formulas displacing old” (65). By the end of the 17th century, Venice alone had ten opera houses that had produced more than 350 operas. Wealthy families purchased season boxes, but inexpensive tickets made the genre available to persons of lesser means. The genre spread quickly, and various styles of opera developed. In Naples, for example, music rather than the libretto dominated opera. The genre spread to Germany and France, each developing the genre to suit the demands of its audiences. For example, ballet became an essential component of French opera. Eventually, “opera became the profligate art as large casts and lavish settings made it the most expensive public entertainment. It was the only art that without embarrassment called itself ‘grand’” (Boorstin 467). Contemporary Opera Productions Opera continues to be popular. According to a 2002 report released by the National Endowment for the Arts, 6.6 million adults attended at least one live opera performance in 2002, and 37.6 million experienced opera on television, video, radio, audio recording or via the Internet. Some think that it is a dying art form, while others think to the contrary, that it is a living art form because of its complexity and “ability to probe deeper into the human experience than any other art form” (Berger 3). Some directors change the setting of operas with perhaps the most famous contemporary proponent of this approach being Peter Sellars, who made drastic changes to three of Mozart’s most famous operas. Le Nozze di Figaro, originally set in 18th-century Seville, was set by Sellars in a luxury apartment in the Trump Tower in New York City; Sellars set Don Giovanni in contemporary Spanish Harlem rather than 17th century Seville; and for Cosi Fan Tutte, Sellars chose a diner on Cape Cod rather than 18th century Naples. As one of the more than six million Americans who attend live opera each year, I have experienced several updated productions, which made me reflect on the convergence or cross-over between high culture and popular culture. In 2000, I attended a production of Don Giovanni at the Estates Theatre in Prague, the very theatre where Mozart conducted the world premiere in 1787. In this production, Don Giovanni was a fashion designer known as “Don G” and drove a BMW. During the 1999-2000 season, Los Angeles Opera engaged film director Bruce Beresford to direct Verdi’s Rigoletto. Beresford updated the original setting of 16th century Mantua to 20th century Hollywood. The lead tenor, rather than being the Duke of Mantua, was a Hollywood agent known as “Duke Mantua.” In the first act, just before Marullo announces to the Duke’s guests that the jester Rigoletto has taken a mistress, he gets the news via his cell phone. Director Ian Judge set the 2004 production of Le Nozze di Figaro in the 1950s. In one of the opening productions of the 2006-07 LA opera season, Vincent Patterson also chose the 1950s for Massenet’s Manon rather than France in the 1720s. This allowed the title character to appear in the fourth act dressed as Marilyn Monroe. Excerpts from the dress rehearsal can be seen on YouTube. Most recently, I attended a production of Ariane et Barbe-Bleu at the Paris Opera. The original setting of the Maeterlinck play is in Duke Bluebeard’s castle, but the time period is unclear. 
However, it is doubtful that the 1907 opera based on an 1899 play was meant to be set in what appeared to be a mental institution equipped with surveillance cameras whose screens were visible to the audience. The critical and audience consensus seemed to be that the opera was a musical success but a failure as a production. James Shore summed up the audience reaction: “the production team was vociferously booed and jeered by much of the house, and the enthusiastic applause that had greeted the singers and conductor, immediately went nearly silent when they came on stage”. It seems to me that a new class-related taste has emerged; the opera genre has shot out a subdivision which I shall call “post-modern morphologies,” that may appeal to a larger pool of people. Hence, class, age, gender, and race are becoming more important factors in conceptualising opera productions today than in the past. I do not consider these productions as new adaptations because the libretto and the music are originals. What changes is the fact that both text and sound are taken to a higher dimension by adding iconographic images that stimulate people’s brains. When asked in an interview why he often changes the setting of an opera, Ian Judge commented, “I try to find the best world for the story and characters to operate in, and I think you have to find a balance between the period the author set it in, the period he conceived it in and the nature of theatre and audiences at that time, and the world we live in.” Hence, the world today is complex, interconnected, borderless and timeless because of advanced technologies, and updated opera productions play with symbols that offer multiple meanings that reflect the world we live in. It may be that television and film have influenced opera production. Character tenor Graham Clark recently observed in an interview, “Now the situation has changed enormously. Television and film have made a lot of things totally accessible which they were not before and in an entirely different perception.” Director Ian Judge believes that television and film have affected audience expectations in opera. “I think audiences who are brought up on television, which is bad acting, and movies, which is not that good acting, perhaps require more of opera than stand and deliver, and I have never really been happy with someone who just stands and sings.” Sociologist Wendy Griswold states that culture reflects social reality and the meaning of a particular cultural object (such as opera), originates “in the social structures and social patterns it reflects” (22). Screens of various technologies are embedded in our lives and normalised as extensions of our bodies. In those opera productions in which directors change the time and place of opera plots, use technology, and are less concerned with what the composer or librettist intended (which we can only guess), the iconographic images create multivalences, a textuality similar to Mikhail Bakhtin’s notion of a multiplicity of voices. Hence, a plurality of meanings. Plàcido Domingo, the Eli and Edythe Broad General Director of Los Angeles Opera, seeks to take advantage of the company’s proximity to the film industry. This is evidenced by his having engaged Bruce Beresford to direct Rigoletto and William Friedkin to direct Ariadne auf Naxos, Duke Bluebeard’s Castle and Gianni Schicchi. 
Perhaps the most daring example of Domingo’s approach was convincing Garry Marshall, creator of the television sitcom Happy Days and who directed the films Pretty Woman and The Princess Diaries, to direct Jacques Offenbach’s The Grand Duchess of Gerolstein to open the company’s 20th anniversary season. When asked how Domingo convinced him to direct an opera for the first time, Marshall responded, “he was insistent that one, people think that opera is pretty elitist, and he knew without insulting me that I was not one of the elitists; two, he said that you gotta make a funny opera; we need more comedy in the operetta and opera world.” Marshall rewrote most of the dialogue and performed it in English, but left the “songs” untouched and in the original French. He also developed numerous sight gags and added characters including a dog named Morrie and the composer Jacques Offenbach himself. Did it work? Christie Grimstad wrote, “if you want an evening filled with witty music, kaleidoscopic colors and hilariously good singing, seek out The Grand Duchess. You will not be disappointed.” The FanFaire Website commented on Domingo’s approach of using television and film directors to direct opera: You’ve got to hand it to Plàcido Domingo for having the vision to draw on Hollywood’s vast pool of directorial talent. Certainly something can be gained from the cross-fertilization that could ensue from this sort of interaction between opera and the movies, two forms of entertainment (elitist and perennially struggling for funds vs. popular and, it seems, eternally rich) that in Los Angeles have traditionally lived separate lives on opposite sides of the tracks. A wider audience, for example, never a problem for the movies, can only mean good news for the future of opera. So, did the Marshall Plan work? Purists of course will always want their operas and operettas ‘pure and unadulterated’. But with an audience that seemed to have as much fun as the stellar cast on stage, it sure did. Critic Alan Rich disagrees, calling Marshall “a representative from an alien industry taking on an artistic product, not to create something innovative and interesting, but merely to insult.” Nevertheless, the combination of Hollywood and opera seems to work. The Los Angeles Opera reported that the 2005-2006 season was its best ever: “ticket revenues from the season, which ended in June, exceeded projected figures by nearly US$900,000. Seasonal attendance at the Dorothy Chandler Pavilion stood at more than 86% of the house’s capacity, the largest percentage in the opera’s history.” Domingo continues with the Hollywood connection in the upcoming 2008-2009 season. He has reengaged William Friedkin to direct two of Puccini’s three operas titled collectively as Il Trittico. Friedkin will direct the two tragedies, Il Tabarro and Suor Angelica. Although Friedkin has already directed a production of the third opera in Il Trittico for Los Angeles, the comedy Gianni Schicchi, Domingo convinced Woody Allen to make his operatic directorial debut with this work. This can be viewed as another example of the desire to make opera and popular culture more equal. However, some, like Alan Rich, may see this attempt as merely insulting rather than interesting and innovative. With a top ticket price in Los Angeles of US$238 per seat, opera seems to continue to be elitist. 
Berger (2005) concurs with this idea and gives his rationale for elitism: there are rich people who support and attend the opera; it is an imported art from Europe that causes some marginalisation; opera is not associated with something being ‘moral,’ a concept engrained in American culture; it is expensive to produce and usually funded by kings, corporations, rich people; and the opera singers are rare –usually one in a million who will have the vocal quality to sing opera arias. Furthermore, Nicholas Kenyon commented in the early 1990s: “there is suspicion that audiences are now paying more and more money for their seats to see more and more money spent on stage” (Kenyon 3). Still, Garry Marshall commented that the budget for The Grand Duchess was US$2 million, while his budget for Runaway Bride was US$72 million. Kenyon warns, “Such popularity for opera may be illusory. The enjoyment of one striking aria does not guarantee the survival of an art form long regarded as over-elitist, over-recondite, and over-priced” (Kenyon 3). A recent development is the Metropolitan Opera’s decision to simulcast live opera performances from the Met stage to various cinemas around the world. These HD transmissions began with the 2006-2007 season when six performances were broadcast. In the 2007-2008 season, the schedule has expanded to eight live Saturday matinee broadcasts plus eight recorded encores broadcast the following day. According to The Los Angeles Times, “the Met’s experiment of merging film with live performance has created a new art form” (Aslup). Whether or not this is a “new art form,” it certainly makes world-class live opera available to countless persons who cannot travel to New York and pay the price for tickets, when they are available. In the US alone, more than 350 cinemas screen these live HD broadcasts from the Met. Top ticket price for these performances at the Met is US$375, while the lowest price is US$27 for seats with only a partial view. Top price for the HD transmissions in participating cinemas is US$22. This experiment with live simulcasts makes opera more affordable and may increase its popularity; combined with updated stagings, opera can engage a much larger audience and hope for even a mass consumption. Is opera moving closer and closer to popular culture? There still seems to be an aura of elitism and snobbery about opera. However, Plàcido Domingo’s attempt to join opera with Hollywood is meant to break the barriers between high and popular culture. The practice of updating opera settings is not confined to Los Angeles. As mentioned earlier, the idea can be traced to post World War II England, and is quite common in Europe. Examples include Erich Wonder’s approach to Wagner’s Ring, making Valhalla, the mythological home of the gods and typically a mountaintop, into the spaceship Valhalla, as well as my own experience with Don Giovanni in Prague and Ariane et Barbe-Bleu in Paris. Indeed, Sutcliffe maintains, “Great classics in all branches of the arts are repeatedly being repackaged for a consumerist world that is increasingly and neurotically self-obsessed” (61). Although new operas are being written and performed, most contemporary performances are of operas by Verdi, Mozart, and Puccini (www.operabase.com). This means that audiences see the same works repeated many times, but in different interpretations. Perhaps this is why Sutcliffe contends, “since the 1970s it is the actual productions that have had the novelty value grabbed by the headlines. 
Singing no longer predominates” (Sutcliffe 57). If then, as Sutcliffe argues, “operatic fashion through history may be a desire for novelty, new formulas displacing old” (Sutcliffe 65), then the contemporary practice of changing the original settings is simply the latest “new formula” that is replacing the old ones. If there are no new words or new music, then what remains are new methods of performance, hence the practice of changing time and place. Opera is a complex art form that has evolved over the past 400 years and continues to evolve, but will it survive? The underlying motives for directors changing the time and place of opera performances are at least three: aesthetic/artistic purposes, financial purposes, and reaching an audience from many cultures, who speak different languages, and who have varied tastes. These three reasons are interrelated. In 1996, Sutcliffe wrote that there has been one constant in all the arguments about opera productions during the preceding two decades: “the producer’s wish to relate the works being staged to contemporary circumstances and passions.” Although that sounds like a purely aesthetic reason, making opera relevant to new, multicultural audiences and thereby increasing the bottom line seems very much a part of that aesthetic. It is as true today as it was when Sutcliffe made the observation twelve years ago (60-61). My own speculation is that opera needs to attract various audiences, and it can only do so by appealing to popular culture and engaging new forms of media and technology. Erickson concludes that the number of upper status people who are exclusively faithful to fine arts is declining; high status people consume a variety of culture while the lower status people are limited to what they like. Research in North America, Europe, and Australia, states Erickson, attests to these trends. My answer to the question of whether stage directors can make opera and popular culture “equal” is yes, and they can do it successfully. Perhaps Stanley Sharpless summed it up best: After his Eden triumph, / When the Devil played his ace, / He wondered what he could do next / To irk the human race, / So he invented Opera, / With many a fiendish grin, / To mystify the lowbrows, / And take the highbrows in. References The Grand Duchess. 2005. 3 Feb. 2008 < http://www.ffaire.com/Duchess/index.htm >. Aslup, Glenn. “Puccini’s La Boheme: A Live HD Broadcast from the Met.” Central City Blog Opera 7 Apr. 2008. 24 Apr. 2008 < http://www.centralcityopera.org/blog/2008/04/07/puccini%E2%80%99s- la-boheme-a-live-hd-broadcast-from-the-met/ >. Berger, William. Puccini without Excuses. New York: Vintage, 2005. Boorstin, Daniel. The Creators: A History of Heroes of the Imagination. New York: Random House, 1992. Bourdieu, Pierre. Distinction: A Social Critique of the Judgment of Taste. Cambridge: Harvard UP, 1984. Clark, Graham. “Interview with Graham Clark.” The KCSN Opera House, 88.5 FM. 11 Aug. 2006. DiMaggio, Paul. “Cultural Capital and School Success.” American Sociological Review 47 (1982): 189-201. DiMaggio, Paul. “Classification in Art.” American Sociological Review 52 (1987): 440-55. Elson, C. Louis. “Opera.” Elson’s Music Dictionary. Boston: Oliver Ditson, 1905. Erickson, H. Bonnie. “The Crisis in Culture and Inequality.” In W. Ivey and S. J. Tepper, eds. Engaging Art: The Next Great Transformation of America’s Cultural Life. New York: Routledge, 2007. Fanfaire.com. “At Its 20th Anniversary Celebration, the Los Angeles Opera Had a Ball with The Grand Duchess.” 24 Apr. 
2008 < http://www.fanfaire.com/Duchess/index.htm >. Gans, J. Herbert. Popular Culture and High Culture: An Analysis and Evaluation of Taste. New York: Basic Books, 1977. Grimstad, Christie. ConcertoNet.com. 2005. 12 Jan. 2008 < http://www.concertonet.com/scripts/review.php?ID_review=3091 >. Griswold, Wendy. Cultures and Societies in a Changing World. Thousand Oaks, CA: Pine Forge Press, 1994. Grout, D. Jay. A History of Western Music. Shorter ed. New York: W.W. Norton & Company, Inc, 1964. Halle, David. “High and Low Culture.” The Blackwell Encyclopedia of Sociology. London: Blackwell, 2006. Judge, Ian. “Interview with Ian Judge.” The KCSN Opera House, 88.5 FM. 22 Mar. 2006. Harper, Douglas. Online Etymology Dictionary. 2001. 19 Nov. 2006 < http://www.etymonline.com/index.php?search=opera&searchmode=none >. Kenyon, Nicholas. “Introduction.” In A. Holden, N. Kenyon and S. Walsh, eds. The Viking Opera Guide. New York: Penguin, 1993. Lamont, Michele, and Marcel Fournier. Cultivating Differences: Symbolic Boundaries and the Making of Inequality. Chicago: U of Chicago P, 1992. Lord, M.G. “Shlemiel! Shlemozzle! And Cue the Soprano.” The New York Times 4 Sep. 2005. Los Angeles Opera. “LA Opera General Director Placido Domingo Announces Results of Record-Breaking 20th Anniversary Season.” News release. 2006. Marshall, Garry. “Interview with Garry Marshall.” The KCSN Opera House, 88.5 FM. 31 Aug. 2005. National Endowment for the Arts. 2002 Survey of Public Participation in the Arts. Research Division Report #45. 5 Feb. 2008 < http://www.nea.gov/pub/NEASurvey2004.pdf >. NCM Fanthom. “The Metropolitan Opera HD Live.” 2 Feb. 2008 < http://fathomevents.com/details.aspx?seriesid=622&gclid= CLa59NGuspECFQU6awodjiOafA >. Opera Today. James Sobre: Ariane et Barbe-Bleue and Capriccio in Paris – Name This Stage Piece If You Can. 5 Feb. 2008 < http://www.operatoday.com/content/2007/09/ariane_et_barbe_1.php >. Rich, Alan. “High Notes, and Low.” LA Weekly 15 Sep. 2005. 6 May 2008 < http://www.laweekly.com/stage/a-lot-of-night-music/high-notes-and-low/8160/ >. Sharpless, Stanley. “A Song against Opera.” In E. O. Parrott, ed. How to Be Tremendously Tuned in to Opera. New York: Penguin, 1990. Shore, James. Opera Today. 2007. 4 Feb. 2008 < http://www.operatoday.com/content/2007/09/ariane_et_barbe_1.php >. Sutcliffe, Tom. Believing in Opera. Princeton, New Jersey: Princeton UP, 1996. YouTube. “Manon Sex and the Opera.” 24 Apr. 2008 < http://www.youtube.com/watch?v=YiBQhr2Sy0k >.
APA, Harvard, Vancouver, ISO, and other styles
48

Potter, Emily. "Calculating Interests: Climate Change and the Politics of Life." M/C Journal 12, no. 4 (October 13, 2009). http://dx.doi.org/10.5204/mcj.182.

Full text
Abstract:
There is a moment in Al Gore’s 2006 documentary An Inconvenient Truth devised to expose the sheer audacity of fossil fuel lobby groups in the United States. In their attempts to address significant scientific consensus and growing public concern over climate change, these groups are resorting to what Gore’s film suggests are grotesque distortions of fact. A particular example highlighted in the film is the Competitive Enterprise Institute’s (CPE—a lobby group funded by ExxonMobil) “pro” energy industry advertisement: “Carbon dioxide”, the ad states. “They call it pollution, we call it life.” While on the one hand employing rhetoric against the “inconvenient truth” that carbon dioxide emissions are ratcheting up the Earth’s temperature, these advertisements also pose a question – though perhaps unintended – that is worth addressing. Where does life reside? This is not an issue of essentialism, but relates to the claims, materials and technologies through which life as a political object emerges. The danger of entertaining the vested interests of polluting industry in a discussion of climate change and its biopolitics is countered by an imperative to acknowledge the ways in which multiple positions in the climate change debate invoke and appeal to ‘life’ as the bottom line, or inviolable interest, of their political, social or economic work. In doing so, other questions come to the fore that a politics of climate change framed in terms of moral positions or competing values will tend to overlook. These questions concern the manifold practices of life that constitute the contemporary terrain of the political, and the actors and instruments put in this employ. Who speaks for life? And who or what produces it? Climate change as a matter of concern (Latour) has gathered and generated a host of experts, communities, narratives and technical devices all invested in the administration of life. It is, as Malcom Bull argues, “the paradigmatic issue of the new politics,” a politics which “draws people towards the public realm and makes life itself subject to the caprices of state and market” (2). This paper seeks to highlight the politics of life that have emerged around climate change as a public issue. It will argue that these politics appear in incremental and multiple ways that situate an array of actors and interests as active in both contesting and generating the terms of life: what life is and how we come to know it. This way of thinking about climate change debates opposes a prevalent moralistic framework that reads the practices and discourses of debate in terms of oppositional positions alone. While sympathies may flow in varying directions, especially when it comes to such a highly charged and massively consequential issue as climate change, there is little insight to be had from charging the CPE (for example) with manipulating consumers, or misrepresenting well-known facts. Where new and more productive understandings open up is in relation to the fields through which these gathering actors play out their claims to the project of life. These fields, from the state, to the corporation, to the domestic sphere, reveal a complex network of strategies and devices that seek to secure life in constantly renovated terms. Life Politics Biopolitical scholarship in the wake of Foucault has challenged life as a pre-given uncritical category, and sought to highlight the means through which it is put under question and constituted through varying and composing assemblages of practitioners and practices. 
Such work regards the project of human well-being as highly complex and technical, and has undertaken to document this empirically through close attention to the everyday ecologies in which humans are enmeshed. This is a political and theoretical project in itself, situating political processes in micro, as well as macro, registers, including daily life as a site of (self) management and governance. Rabinow and Rose refer to biopolitical circuits that draw together and inter-relate the multiple sites and scales operative in the administration of life. These involve not just technologies, rationalities and regimes of authority and control, but also politics “from below” in the form of rights claims and community formation and agitation (198). Active in these circuits, too, are corporate and non-state interests for whom the pursuit of maximising life’s qualities and capabilities has become a concern through which “market relations and shareholder value” are negotiated (Rabinow and Rose 211). As many biopolitical scholars argue, biopower—the strategies through which biopolitics are enacted—is characteristic of the “disciplinary neo-liberalism” that has come to define the modern state, and through which the conduct of conduct is practiced (Di Muzio 305). Foucault’s concept of governmentality describes the devolution of state-based disciplinarity and sovereignty to a host of non-state actors, rationalities and strategies of governing, including the self-managing subject, not in opposition to the state, but contributing to its form. According to Bratich, Packer and McCarthy, everyday life is thus “saturated with governmental techniques” (18) in which we are all enrolled. Unlike regimes of biopolitics identified with what Agamben terms “thanopolitics”—the exercise of biopower “which ultimately rests on the power of some to threaten the death of others” (Rabinow and Rose 198), such as the Nazi’s National Socialism and other eugenic campaigns—governmental arts in the service of “vitalist” biopolitics (Rose 1) are increasingly diffused amongst all those with an “interest” in sustaining life, from organisations to individuals. The integration of techniques of self-governance which ask the individual to work on themselves and their own dispositions with State functions has broadened the base by which life is governed, and foregrounded an unsettled terrain of life claims. Rose argues that medical science is at the forefront of these contemporary biopolitics, and to this effect “has […] been fully engaged in the ethical questions of how we should live—of what kinds of creatures we are, of the kinds of obligations that we have to ourselves and to others, of the kinds of techniques we can and should use to improve ourselves” (20). Asking individuals to self-identify through their medical histories and bodily specificities, medical cultures are also shaping new political arrangements, as communities connected by shared genetics or physical conditions, for instance, emerge, evolve and agitate according to the latest medical knowledge. Yet it is not just medicine that provokes ethical work and new political forms. The environment is a key site for life politics that entails a multi-faceted discourse of obligations and entitlements, across fields and scales of engagement. 
Calculating Environments In line with neo-liberal logic, environmental discourse concerned with ameliorating climate change has increasingly focused upon the individual as an agent of self-monitoring, to both facilitate government agendas at a distance, and to “self-fashion” in the mode of the autonomous subject, securing against external risks (Ong 501). Climate change is commonly represented as such a risk, to both human and non-human life. A recent letter published by the Royal Australasian College of Physicians in two leading British medical journals, named climate change as the “biggest global health threat of the twenty-first century” (Morton). As I have argued elsewhere (Potter), security is central to dominant cultures of environmental governance in the West; these cultures tie sustainability goals to various and interrelated regimes of monitoring which attach to concepts of what Clark and Stevenson call “the good ecological citizen” (238). Citizenship is thus practiced through strategies of governmentality which call on individuals to invest not just in their own well-being, but in the broader project of life. Calculation is a primary technique through which modern environmental governance is enacted; calculative strategies are seen to mediate risk, according to Foucault, and consequently to “assure living” (Elden 575). Rationalised schemes for self-monitoring are proliferating under climate change and the project of environmentalism more broadly, something which critics of neo-liberalism have identified as symptomatic of the privatisation of politics that liberal governmentality has fostered. As we have seen in Australia, an evolving policy emphasis on individual practices and the domestic sphere as crucial sites of environmental action – for instance, the introduction of domestic water restrictions, and the phasing out of energy-inefficient light bulbs in the home—provides a leading discourse of ethico-political responsibility. The rise of carbon dioxide counting is symptomatic of this culture, and indicates the distributed fields of life management in contemporary governmentality. Carbon dioxide, as the CPE is keen to point out, is crucial to life, but it is also—in too large an amount—a force of destruction. Its management, in vitalist terms, is thus established as an effort to protect life in the face of death. The concept of “carbon footprinting” has been promoted by governments, NGOs, industry and individuals as a way of securing this goal, and a host of calculative techniques and strategies are employed to this end, across a spectrum of activities and contexts all framed in the interests of life. The footprinting measure seeks to secure living via self-policed limits, which also—in classic biopolitical form—shift previously private practices into a public realm of count-ability and accountability. The carbon footprint, like its associates the ecological footprint and the water footprint, has developed as a multi-faceted tool of citizenship beyond the traditional boundaries of the state. Suggesting an ecological conception of territory and of our relationships and responsibilities to this, the footprint, as a measure of resource use and emissions relative to the Earth’s capacities to absorb these, calculates and visualises the “specific qualities” (Elden 575) that, in a spatialised understanding of security, constitute and define this territory. 
The carbon footprint’s relatively simple remit of measuring carbon emissions per unit of assessment—be that the individual, the corporation, or the nation—belies the ways in which life is formatted and produced through its calculations. A tangled set of devices, practices and discourses is employed to make carbon and thus life calculable and manageable. Treading Lightly The old environmental adage to “tread lightly upon the Earth” has been literalised in the metaphor of the footprint, which attempts both to symbolise environmental practice and to directly translate data in order to meaningfully communicate necessary boundaries for our living. The World Wildlife Fund’s Living Planet Report 2008 exemplifies the growing popularity of the footprint as a political and poetic hook: speaking in terms of our “ecological overshoot,” and the move from “ecological credit to ecological deficit”, the report urges an attendance to our “global footprint” which “now exceeds the world’s capacity to regenerate by about 30 per cent” (1). Angela Crombie’s A Lighter Footprint, an instruction manual for sustainable living, is one of a host of media through which individuals are educated in modes of footprint calculation and management. She presents a range of techniques, including carbon offsetting, shifting to sustainable modes of transport, eating and buying differently, recycling and conserving water, to mediate our carbon dioxide output, and to “show […] politicians how easy it is” (13). Governments however, need no persuading from citizens that carbon calculation is an exercise to be harnessed. As governments around the world move (slowly) to address climate change, policies that instrumentalise carbon dioxide emission and reduction via an auditing of credits and deficits have come to the fore—for example, the European Union Emissions Trading Scheme and the Chicago Climate Exchange. In Australia, we have the currently-under-debate Carbon Pollution Reduction Scheme, a part of which is the Australian Emissions Trading Scheme (AETS) that will introduce a system of “carbon credits” and trading in a market-based model of supply and demand. This initiative will put a price on carbon dioxide emissions, and cap the amount of emissions any one polluter can produce without purchasing further credits. In readiness for the scheme, business initiatives are forming to take advantage of this new carbon market. Industries in carbon auditing and off-setting services are consolidating; hectares of trees, already active in the carbon sequestration market, are being cultivated as “carbon sinks” and key sites of compliance for polluters under the AETS. Governments are also planning to turn their tracts of forested public land into carbon credits worth billions of dollars (Arup 7). The attachment of emission measures to goods and services requires a range of calculative experts, and the implementation of new marketing and branding strategies, aimed at conveying the carbon “health” of a product. The introduction of “food mile” labelling (the amount of carbon dioxide emitted in the transportation of the food from source to consumer) in certain supermarkets in the United Kingdom is an example of this. Carbon risk analysis and management programs are being introduced across businesses in readiness for the forthcoming “carbon economy”. 
As one flyer selling “a suite of carbon related services” explains, “early action will give you the edge in understanding and mitigating the risks, and puts you in a prime position to capitalise on the rewards” (MGI Business Solutions Worldwide). In addition, lobby groups are working to ensure exclusions from or the free allocation of permits within the proposed AETS, with degrees of compulsion applied to different industries – the Federal Government, for instance, will provide a $3.9 billion compensation package for the electric power sector when the AETS commences, to enable their “adjustment” to this carbon regime. Performing Life Noortje Mares provides a further means of thinking through the politics of life in the context of climate change by complicating the distinction between public and private interest. Her study of “green living experiments” describes the rise of carbon calculation in the home in recent years, and the implementation of technologies such as the smart electricity meter that provides a constantly updating display of data relating to amounts and cost of energy consumed and the carbon dioxide emitted in the routines of domestic life. Her research tracks the entry of these personal calculative regimes into public life via internet forums such as blogs, where individuals notate or discuss their experiences of pursuing low-carbon lifestyles. On the one hand, these calculative practices of living and their public representation can be read as evidencing the pervasive neo-liberal governmentality at work in contemporary environmental practice, where individuals are encouraged to scrupulously monitor their domestic cultures. The rise of auditing as a technology of self, and more broadly as a technique of public accountability, has come under fire for its “immunity-granting role” (Charkiewicz 79), where internal audits become substituted for external compliance and regulation. Mares challenges this reading, however, by demonstrating the ways in which green living experiments “transform everyday material practices into practices of public involvement” (118) that don’t resolve or pin down relations between the individual, the non-human environment, and the social, or reveal a mappable flow of actions and effects between the public realm and the home. The empirical modes of publicity that these individuals employ, “the careful recording of measurements and the reliable descriptions of sensory observation, so as to enable ‘virtual witnessing’ by wider audiences”, open up to much more complex understandings than one of calculative self-discipline at work. As “instrument[s] of public involvement” (120), the experiments that Mares describes locate the politics of life in the embodied socio-material entanglements of the domestic sphere, in arrangements of humans and non-human technologies. Such arrangements, she suggests, are ontologically productive in that they introduce “not only new knowledge, but also new entities […] to society” (119), and as such these experiments and the modes of calculation they employ become active in the composition of reality. Recent work in economic sociology and cultural studies has similarly contended that calculation, far from being either a naturalised or a thoroughly abstract process, relies upon a host of devices, relations, and techniques: that is, as Gay Hawkins explains, calculative processes “have to be enacted” (108). 
Environmental governmentality in the service of securing life is a networked practice that draws in a host of actors, not a top-down imposition. The institution of carbon economies and carbon emissions as a new register of public accountability brings alternative ways to calculate the world into being, and consequently re-calibrates life as it emerges from these heterogeneous arrangements. All That Gathers Latour writes that we come to know a matter of concern by all the things that gather around it (Latour). This includes the human, as well as the non-human actors, policies, practices and technologies that are put to work in the making of our realities. Climate change is routinely represented as a threat to life, with predicted (and occurring) species extinction, growing numbers of climate change refugees, dispossessed from uninhabitable lands, and the rise of diseases and extreme weather scenarios that put human life in peril. There is no doubt, of course, that climate change does mean death for some: indeed, there are thanatopolitical overtones in inequitable relations between the fall-out of impacts from major polluting nations on poorer countries, or those much more susceptible to rising sea levels. Biosocial equity, as Bull points out, is a “matter of being equally alive and equally dead” (2). Yet in the biopolitical project of assuring living, life is burgeoning around the problem of climate change. The critique of neo-liberalism as a blanketing system that subjects all aspects of life to market logic, and in which the cynical techniques of industry seek to appropriate ethico-political stances for their own material ends, is an insufficient response to what is actually unfolding in the messy terrain of climate change and its biopolitics. What this paper has attempted to show is that there is no particular purchase on life that can be had by any one actor who gathers around this concern. Varying interests, ambitions, and intentions, without moral hierarchy, stake their claim in life as a constantly constituting site in which they participate, and from this perspective, the ways in which we understand life to be both produced and managed expand. This is to refuse either an opposition or a conflation between the market and nature, or the market and life. It is also to argue that we cannot essentialise human-ness in the climate change debate. For while human relations with animals, plants and weathers may make us what we are, so too do our relations with (in a much less romantic view) non-human things, technologies, schemes, and even markets—from carbon auditing services, to the label on a tin on the supermarket shelf. As these intersect and entangle, the project of life, in the new politics of climate change, is far from straightforward. References An Inconvenient Truth. Dir. Davis Guggenheim. Village Roadshow, 2006. Arup, Tom. “Victoria Makes Enormous Carbon Stocktake in Bid for Offset Billions.” The Age 24 Sep. 2009: 7. Bratich, Jack Z., Jeremy Packer, and Cameron McCarthy. “Governing the Present.” Foucault, Cultural Studies and Governmentality. Ed. Bratich, Packer and McCarthy. Albany: State University of New York Press, 2003. 3-21. Bull, Malcolm. “Globalization and Biopolitics.” New Left Review 45 (2007). 12 May 2009 <http://newleftreview.org/?page=article&view=2675>. Charkiewicz, Ewa. “Corporations, the UN and Neo-liberal Bio-politics.” Development 48.1 (2005): 75-83. Clark, Nigel, and Nick Stevenson.
“Care in a Time of Catastrophe: Citizenship, Community and the Ecological Imagination.” Journal of Human Rights 2.2 (2003): 235-246. Crombie, Angela. A Lighter Footprint: A Practical Guide to Minimising Your Impact on the Planet. Carlton North, Vic.: Scribe, 2007. Di Muzio, Tim. “Governing Global Slums: The Biopolitics of Target 11.” Global Governance 14.3 (2008): 305-326. Elden, Stuart. “Governmentality, Calculation and Territory.” Environment and Planning D: Society and Space 25 (2007): 562-580. Hawkins, Gay. The Ethics of Waste: How We Relate to Rubbish. Sydney: University of New South Wales Press, 2006. Latour, Bruno. “Why Has Critique Run Out of Steam?: From Matters of Fact to Matters of Concern.” Critical Inquiry 30.2 (2004): 225-248. Mares, Noortje. “Testing Powers of Engagement: Green Living Experiments, the Ontological Turn and the Undoability of Involvement.” European Journal of Social Theory 12.1 (2009): 117-133. MGI Business Solutions Worldwide. “Carbon News.” Adelaide. 2 Aug. 2009. Ong, Aihwa. “Mutations in Citizenship.” Theory, Culture and Society 23.2-3 (2006): 499-505. Potter, Emily. “Footprints in the Mallee: Climate Change, Sustaining Communities, and the Nature of Place.” Landscapes and Learning: Place Studies in a Global World. Ed. Margaret Somerville, Kerith Power and Phoenix de Carteret. Sense Publishers. Forthcoming. Rabinow, Paul, and Nikolas Rose. “Biopower Today.” Biosocieties 1 (2006): 195-217. Rose, Nikolas. “The Politics of Life Itself.” Theory, Culture and Society 18.6 (2001): 1-30. World Wildlife Fund. Living Planet Report 2008. Switzerland, 2008.
APA, Harvard, Vancouver, ISO, and other styles
49

Paull, John. "Beyond Equal: From Same But Different to the Doctrine of Substantial Equivalence." M/C Journal 11, no. 2 (June 1, 2008). http://dx.doi.org/10.5204/mcj.36.

Full text
Abstract:
A same-but-different dichotomy has recently been encapsulated within the US Food and Drug Administration’s ill-defined concept of “substantial equivalence” (USFDA, FDA). By invoking this concept the genetically modified organism (GMO) industry has escaped the rigors of safety testing that might otherwise apply. The curious concept of “substantial equivalence” grants a presumption of safety to GMO food. This presumption has yet to be earned, and has been used to constrain labelling of both GMO and non-GMO food. It is an idea that well serves corporatism. It enables the claim of difference to secure patent protection, while upholding the contrary claim of sameness to avoid labelling and safety scrutiny. It offers the best of both worlds for corporate food entrepreneurs, and delivers the worst of both worlds to consumers. The term “substantial equivalence” has established its currency within the GMO discourse. As the opportunities for patenting food technologies expand, the GMO recruitment of this concept will likely be a dress rehearsal for the developing debates on the labelling and testing of other techno-foods – including nano-foods and clone-foods. “Substantial Equivalence” “Are the Seven Commandments the same as they used to be, Benjamin?” asks Clover in George Orwell’s “Animal Farm”. By way of response, Benjamin “read out to her what was written on the wall. There was nothing there now except a single Commandment. It ran: ALL ANIMALS ARE EQUAL BUT SOME ANIMALS ARE MORE EQUAL THAN OTHERS”. After this reductionist revelation, further novel and curious events at Manor Farm, “did not seem strange” (Orwell, ch. X). Equality is a concept at the very core of mathematics, but beyond the domain of logic, equality becomes a hotly contested notion – and the domain of food is no exception. A novel food has a regulatory advantage if it can claim to be the same as an established food – a food that has proven its worth over centuries, perhaps even millennia – and thus does not trigger new, perhaps costly and onerous, testing, compliance, and even new and burdensome regulations. On the other hand, such a novel food has an intellectual property (IP) advantage only in terms of its difference. And thus there is an entrenched dissonance for newly technologised foods, between claiming sameness, and claiming difference. The same/different dilemma is erased, so some would have it, by appeal to the curious new dualist doctrine of “substantial equivalence” whereby sameness and difference are claimed simultaneously, thereby creating a win/win for corporatism, and a loss/loss for consumerism. This ground has been pioneered, and to some extent conquered, by the GMO industry. The conquest has ramifications for other cryptic food technologies, that is technologies that are invisible to the consumer and that are not evident to the consumer other than via labelling. Cryptic technologies pertaining to food include GMOs, pesticides, hormone treatments, irradiation and, most recently, manufactured nano-particles introduced into the food production and delivery stream. Genetic modification of plants was reported as early as 1984 by Horsch et al. The case of Diamond v. Chakrabarty resulted in a US Supreme Court decision that upheld the prior decision of the US Court of Customs and Patent Appeal that “the fact that micro-organisms are alive is without legal significance for purposes of the patent law”, and ruled that the “respondent’s micro-organism plainly qualifies as patentable subject matter”. 
This was a majority decision of the nine judges, with four judges dissenting (Burger). It was this Chakrabarty judgement that has seriously opened the Pandora’s box of GMOs because patenting rights make GMOs an attractive corporate proposition by offering potentially unique monopoly rights over food. The rear guard action against GMOs has most often focussed on health repercussions (Smith, Genetic), food security issues, and also the potential for corporate malfeasance to hide behind a cloak of secrecy citing commercial confidentiality (Smith, Seeds). Others have tilted at the foundational plank on which the economics of the GMO industry sits: “I suggest that the main concern is that we do not want a single molecule of anything we eat to contribute to, or be patented and owned by, a reckless, ruthless chemical organisation” (Grist 22). The GMO industry exhibits bipolar behaviour, invoking the concept of “substantial difference” to claim patent rights by way of “novelty”, and then claiming “substantial equivalence” when dealing with other regulatory authorities including food, drug and pesticide agencies; a case of “having their cake and eating it too” (Engdahl 8). This is a clever sleight-of-rhetoric, laying claim to the best of both worlds for corporations, and the worst of both worlds for consumers. Corporations achieve patent protection and no concomitant specific regulatory oversight; while consumers pay the cost of patent monopolization, and are not necessarily apprised, by way of labelling or otherwise, that they are purchasing and eating GMOs, and thereby financing the GMO industry. The lemma of “substantial equivalence” does not bear close scrutiny. It is a fuzzy concept that lacks a tight testable definition. It is exactly this fuzziness that allows lots of wriggle room to keep GMOs out of rigorous testing regimes. Millstone et al. argue that “substantial equivalence is a pseudo-scientific concept because it is a commercial and political judgement masquerading as if it is scientific. It is moreover, inherently anti-scientific because it was created primarily to provide an excuse for not requiring biochemical or toxicological tests. It therefore serves to discourage and inhibit informative scientific research” (526). “Substantial equivalence” grants GMOs the benefit of the doubt regarding safety, and thereby leaves unexamined the ramifications for human consumer health, for farm labourer and food-processor health, for the welfare of farm animals fed a diet of GMO grain, and for the well-being of the ecosystem, both in general and in its particularities. “Substantial equivalence” was introduced into the food discourse by an Organisation for Economic Co-operation and Development (OECD) report: “safety evaluation of foods derived by modern biotechnology: concepts and principles”. It is from this document that the ongoing mantra of assumed safety of GMOs derives: “modern biotechnology … does not inherently lead to foods that are less safe … . Therefore evaluation of foods and food components obtained from organisms developed by the application of the newer techniques does not necessitate a fundamental change in established principles, nor does it require a different standard of safety” (OECD, “Safety” 10). This was at the time, and remains, an act of faith, a pro-corporatist and a post-cautionary approach. The OECD motto reveals where their priorities lean: “for a better world economy” (OECD, “Better”).
The term “substantial equivalence” was preceded by the 1992 USFDA concept of “substantial similarity” (Levidow, Murphy and Carr) and was adopted from a prior usage by the US Food and Drug Administration (USFDA) where it was used pertaining to medical devices (Miller). Even GMO proponents accept that “Substantial equivalence is not intended to be a scientific formulation; it is a conceptual tool for food producers and government regulators” (Miller 1043). And there’s the rub – there is no scientific definition of “substantial equivalence”, no scientific test of proof of concept, nor is there likely to be, since this is a ‘spinmeister’ term. And yet this is the cornerstone on which rests the presumption of safety of GMOs. Absence of evidence is taken to be evidence of absence. History suggests that this is a fraught presumption. By way of contrast, the patenting of GMOs depends on the antithesis of assumed ‘sameness’. Patenting rests on proven, scrutinised, challengeable and robust tests of difference and novelty. Lightfoot et al. report that transgenic plants exhibit “unexpected changes [that] challenge the usual assumptions of GMO equivalence and suggest genomic, proteomic and metanomic characterization of transgenics is advisable” (1). GMO Milk and Contested Labelling Pesticide company Monsanto markets the genetically engineered hormone rBST (recombinant Bovine Somatotropin; also known as: rbST; rBGH, recombinant Bovine Growth Hormone; and the brand name Posilac) to dairy farmers who inject it into their cows to increase milk production. This product is not approved for use in many jurisdictions, including Europe, Australia, New Zealand, Canada and Japan. Even Monsanto accepts that rBST leads to mastitis (inflammation and pus in the udder) and other “cow health problems”; however, it maintains that “these problems did not occur at rates that would prohibit the use of Posilac” (Monsanto). A European Union study identified an extensive list of health concerns of rBST use (European Commission). The US Dairy Export Council, however, entertain no doubt. In their background document they ask “is milk from cows treated with rBST safe?” and answer “Absolutely” (USDEC). Meanwhile, Monsanto’s website raises and answers the question: “Is the milk from cows treated with rbST any different from milk from untreated cows? No” (Monsanto). Injecting cows with genetically modified hormones to boost their milk production remains a contested practice, banned in many countries. It is the claimed equivalence that has kept consumers of US dairy products in the dark, shielded rBST dairy farmers from having to declare that their milk production is GMO-enhanced, and has inhibited non-GMO producers from declaring their milk as non-GMO, non rBST, or not hormone enhanced. This is a battle that has simmered, and sometimes raged, for a decade in the US. Finally there is a modest victory for consumers: the Pennsylvania Department of Agriculture (PDA) requires all labels used on milk products to be approved in advance by the department. The standard issued in October 2007 (PDA, “Standards”) signalled to producers that any milk labels claiming rBST-free status would be rejected. This advice was rescinded in January 2008 with new, specific, department-approved textual constructions allowed, ensuring that any “no rBST” style claim was paired with a PDA-prescribed disclaimer (PDA, “Revised Standards”).
However, parsimonious labelling is prohibited: No labeling may contain references such as ‘No Hormones’, ‘Hormone Free’, ‘Free of Hormones’, ‘No BST’, ‘Free of BST’, ‘BST Free’, ‘No added BST’, or any statement which indicates, implies or could be construed to mean that no natural bovine somatotropin (BST) or synthetic bovine somatotropin (rBST) are contained in or added to the product. (PDA, “Revised Standards” 3) Difference claims are prohibited: In no instance shall any label state or imply that milk from cows not treated with recombinant bovine somatotropin (rBST, rbST, RBST or rbst) differs in composition from milk or products made with milk from treated cows, or that rBST is not contained in or added to the product. If a product is represented as, or intended to be represented to consumers as, containing or produced from milk from cows not treated with rBST any labeling information must convey only a difference in farming practices or dairy herd management methods. (PDA, “Revised Standards” 3) The PDA-approved labelling text for non-GMO dairy farmers is specified as follows: ‘From cows not treated with rBST. No significant difference has been shown between milk derived from rBST-treated and non-rBST-treated cows’ or a substantial equivalent. Hereinafter, the first sentence shall be referred to as the ‘Claim’, and the second sentence shall be referred to as the ‘Disclaimer’. (PDA, “Revised Standards” 4) It is onto the non-GMO dairy farmer alone that the costs of compliance fall. These costs include label preparation and approval, proving non-usage of GMOs, and creating and maintaining an audit trail. In nearby Ohio a similar consumer versus corporatist pantomime is playing out, this time with the Ohio Department of Agriculture (ODA) calling the shots, and again serving the GMO industry. The ODA-prescribed text allowed to non-GMO dairy farmers is “from cows not supplemented with rbST” and this is to be conjoined with the mandatory disclaimer “no significant difference has been shown between milk derived from rbST-supplemented and non-rbST supplemented cows” (Curet). These are “emergency rules”: they apply for 90 days, and are proposed as permanent. Once again, the onus is on the non-GMO dairy farmers to document and prove their claims. GMO dairy farmers face no such governmental requirements, including no disclosure requirement, and thus an asymmetric regulatory impost is placed on the non-GMO farmer, which opens up new opportunities for administrative demands and technocratic harassment. Levidow et al. argue, somewhat Eurocentrically, that from its 1990s adoption “as the basis for a harmonized science-based approach to risk assessment” (26) the concept of “substantial equivalence” has “been recast in at least three ways” (58). It is true that the GMO debate has evolved differently in the US and Europe, with other jurisdictions usually adopting intermediate positions, yet the concept persists. Levidow et al. nominate their three recastings as: firstly an “implicit redefinition” by the appending of “extra phrases in official documents”; secondly, “it has been reinterpreted, as risk assessment processes have … required more evidence of safety than before, especially in Europe”; and thirdly, “it has been demoted in the European Union regulatory procedures so that it can no longer be used to justify the claim that a risk assessment is unnecessary” (58). Romeis et al. have proposed a decision tree approach to GMO risks based on cascading tiers of risk assessment.
However what remains is that the defects of the concept of “substantial equivalence” persist. Schauzu identified that: such decisions are a matter of “opinion”; that there is “no clear definition of the term ‘substantial’”; that because genetic modification “is aimed at introducing new traits into organisms, the result will always be a different combination of genes and proteins”; and that “there is no general checklist that could be followed by those who are responsible for allowing a product to be placed on the market” (2). Benchmark for Further Food Novelties? The discourse, contestation, and debate about “substantial equivalence” have largely focussed on the introduction of GMOs into food production processes. GM can best be regarded as the test case, and proof of concept, for establishing “substantial equivalence” as a benchmark for evaluating new and forthcoming food technologies. This is of concern, because the concept of “substantial equivalence” is scientific hokum, and yet its persistence, even entrenchment, within regulatory agencies may be a harbinger of forthcoming same-but-different debates for nanotechnology and other future bioengineering. The appeal of “substantial equivalence” has been a brake on the creation of GMO-specific regulations and on rigorous GMO testing. The food nanotechnology industry can be expected to look to the precedent of the GMO debate to head off specific nano-regulations and nano-testing. As cloning becomes economically viable, then this may be another wave of food innovation that muddies the regulatory waters with the confused – and ultimately self-contradictory – concept of “substantial equivalence”. Nanotechnology engineers particles in the size range 1 to 100 nanometres – a nanometre is one billionth of a metre. This is interesting for manufacturers because at this size chemicals behave differently, or as the Australian Office of Nanotechnology expresses it, “new functionalities are obtained” (AON). Globally, government expenditure on nanotechnology research reached US$4.6 billion in 2006 (Roco 3.12). While there are now many patents (ETC Group; Roco), regulation specific to nanoparticles is lacking (Bowman and Hodge; Miller and Senjen). The USFDA advises that nano-manufacturers “must show a reasonable assurance of safety … or substantial equivalence” (FDA). A recent inventory of nano-products already on the market identified 580 products. Of these 11.4% were categorised as “Food and Beverage” (WWICS). This is at a time when public confidence in regulatory bodies is declining (HRA). In an Australian consumer survey on nanotechnology, 65% of respondents indicated they were concerned about “unknown and long term side effects”, and 71% agreed that it is important “to know if products are made with nanotechnology” (MARS 22). Cloned animals are currently more expensive to produce than traditional animal progeny. In the course of 678 pages, the USFDA Animal Cloning: A Draft Risk Assessment has not a single mention of “substantial equivalence”. However the Federation of Animal Science Societies (FASS) in its single page “Statement in Support of USFDA’s Risk Assessment Conclusion That Food from Cloned Animals Is Safe for Human Consumption” states that “FASS endorses the use of this comparative evaluation process as the foundation of establishing substantial equivalence of any food being evaluated. 
It must be emphasized that it is the food product itself that should be the focus of the evaluation rather than the technology used to generate cloned animals” (FASS 1). Contrary to the FASS derogation of the importance of process in food production, for consumers both the process and provenance of production are important and integral aspects of a food product’s value and identity. Some consumers will legitimately insist that their Kalamata olives are from Greece, or their balsamic vinegar is from Modena. It was the British public’s growing awareness that their sugar was being produced by slave labour that enabled the boycotting of the product, and ultimately the outlawing of slavery (Hochschild). When consumers boycott Nestlé, because of past or present marketing practices, or boycott produce of the USA because of, for example, US foreign policy or animal welfare concerns, they are distinguishing the food based on the narrative of the food, the production process and/or production context, which are a part of the identity of the food. Consumers attribute value to food based on production process and provenance information (Paull). Products produced by slave labour, by child labour, by political prisoners, by means of torture, theft, immoral, unethical or unsustainable practices are different from their alternatives. The process of production is a part of the identity of a product and consumers are increasingly interested in food narrative. It requires vigilance to ensure that these narratives are delivered with the product to the consumer, and are neither lost nor suppressed. Throughout the GM debate, the organic sector has successfully skirted the “substantial equivalence” debate by excluding GMOs from the certified organic food production process. This GMO-exclusion from the organic food stream is the one reprieve available to consumers worldwide who are keen to avoid GMOs in their diet. The organic industry carries the expectation of providing food produced without artificial pesticides and fertilizers, and by extension, without GMOs. Most recently, the Soil Association, the leading organic certifier in the UK, claims to be the first organisation in the world to exclude manufactured nanoparticles from their products (Soil Association). There have been calls for engineered nanoparticles to be excluded from organic standards worldwide, given that there is no mandatory safety testing and no compulsory labelling in place (Paull and Lyons). The twisted rhetoric of oxymorons does not make the ideal foundation for policy. Setting food policy on the shifting sands of “substantial equivalence” seems foolhardy when we consider the potentially profound ramifications of globally mass marketing a dysfunctional food. If, within a 2×2 matrix of terms – “substantial equivalence”, substantial difference, insubstantial equivalence, insubstantial difference – only one corner is engaged for food policy, and the elements remain matters of opinion rather than being testable by science, or by some other regime, then the public is the dupe, and potentially the victim. “Substantial equivalence” has served the GMO corporates well and the public poorly, and this asymmetry is slated to escalate if nano-food and clone-food are also folded into the “substantial equivalence” paradigm. Only in Orwellian Newspeak is war peace, or is same different.
It is time to jettison the pseudo-scientific doctrine of “substantial equivalence”, as a convenient oxymoron, and embrace full disclosure of provenance, process and difference, so that consumers are not collateral in a continuing asymmetric knowledge war. References Australian Office of Nanotechnology (AON). Department of Industry, Tourism and Resources (DITR) 6 Aug. 2007. 24 Apr. 2008 <http://www.innovation.gov.au/Section/Innovation/Pages/AustralianOfficeofNanotechnology.aspx>. Bowman, Diana, and Graeme Hodge. “A Small Matter of Regulation: An International Review of Nanotechnology Regulation.” Columbia Science and Technology Law Review 8 (2007): 1-32. Burger, Warren. “Sidney A. Diamond, Commissioner of Patents and Trademarks v. Ananda M. Chakrabarty, et al.” Supreme Court of the United States, decided 16 June 1980. 24 Apr. 2008 <http://caselaw.lp.findlaw.com/cgi-bin/getcase.pl?court=US&vol=447&invol=303>. Curet, Monique. “New Rules Allow Dairy-Product Labels to Include Hormone Info.” The Columbus Dispatch 7 Feb. 2008. 24 Apr. 2008 <http://www.dispatch.com/live/content/business/stories/2008/02/07/dairy.html>. Engdahl, F. William. Seeds of Destruction. Montréal: Global Research, 2007. ETC Group. Down on the Farm: The Impact of Nano-Scale Technologies on Food and Agriculture. Ottawa: Action Group on Erosion, Technology and Conservation, November, 2004. European Commission. Report on Public Health Aspects of the Use of Bovine Somatotropin. Brussels: European Commission, 15-16 March 1999. Federation of Animal Science Societies (FASS). Statement in Support of FDA’s Risk Assessment Conclusion That Cloned Animals Are Safe for Human Consumption. 2007. 24 Apr. 2008 <http://www.fass.org/page.asp?pageID=191>. Grist, Stuart. “True Threats to Reason.” New Scientist 197.2643 (16 Feb. 2008): 22-23. Hochschild, Adam. Bury the Chains: The British Struggle to Abolish Slavery. London: Pan Books, 2006. Horsch, Robert, Robert Fraley, Stephen Rogers, Patricia Sanders, Alan Lloyd, and Nancy Hoffman. “Inheritance of Functional Foreign Genes in Plants.” Science 223 (1984): 496-498. HRA. Awareness of and Attitudes toward Nanotechnology and Federal Regulatory Agencies: A Report of Findings. Washington: Peter D. Hart Research Associates, 25 Sep. 2007. Levidow, Les, Joseph Murphy, and Susan Carr. “Recasting ‘Substantial Equivalence’: Transatlantic Governance of GM Food.” Science, Technology, and Human Values 32.1 (Jan. 2007): 26-64. Lightfoot, David, Rajsree Mungur, Rafiqa Ameziane, Anthony Glass, and Karen Berhard. “Transgenic Manipulation of C and N Metabolism: Stretching the GMO Equivalence.” American Society of Plant Biologists Conference: Plant Biology, 2000. MARS. “Final Report: Australian Community Attitudes Held about Nanotechnology – Trends 2005-2007.” Report prepared for Department of Industry, Tourism and Resources (DITR). Miranda, NSW: Market Attitude Research Services, 12 June 2007. Miller, Georgia, and Rye Senjen. “Out of the Laboratory and on to Our Plates: Nanotechnology in Food and Agriculture.” Friends of the Earth, 2008. 24 Apr. 2008 <http://nano.foe.org.au/node/220>. Miller, Henry. “Substantial Equivalence: Its Uses and Abuses.” Nature Biotechnology 17 (7 Nov. 1999): 1042-1043. Millstone, Erik, Eric Brunner, and Sue Mayer. “Beyond ‘Substantial Equivalence’.” Nature 401 (7 Oct. 1999): 525-526. Monsanto. “Posilac, Bovine Somatotropin by Monsanto: Questions and Answers about bST from the United States Food and Drug Administration.” 2007. 24 Apr.
2008 <http://www.monsantodairy.com/faqs/fda_safety.html>. Organisation for Economic Co-operation and Development (OECD). “For a Better World Economy.” Paris: OECD, 2008. 24 Apr. 2008 <http://www.oecd.org/>. ———. “Safety Evaluation of Foods Derived by Modern Biotechnology: Concepts and Principles.” Paris: OECD, 1993. Orwell, George. Animal Farm. Adelaide: ebooks@Adelaide, 2004 (1945). 30 Apr. 2008 <http://ebooks.adelaide.edu.au/o/orwell/george>. Paull, John. “Provenance, Purity and Price Premiums: Consumer Valuations of Organic and Place-of-Origin Food Labelling.” Research Masters thesis, University of Tasmania, Hobart, 2006. 24 Apr. 2008 <http://eprints.utas.edu.au/690/>. Paull, John, and Kristen Lyons. “Nanotechnology: The Next Challenge for Organics.” Journal of Organic Systems (in press). Pennsylvania Department of Agriculture (PDA). “Revised Standards and Procedure for Approval of Proposed Labeling of Fluid Milk.” Milk Labeling Standards (2.0.1.17.08). Bureau of Food Safety and Laboratory Services, Pennsylvania Department of Agriculture, 17 Jan. 2008. ———. “Standards and Procedure for Approval of Proposed Labeling of Fluid Milk, Milk Products and Manufactured Dairy Products.” Milk Labeling Standards (2.0.1.17.08). Bureau of Food Safety and Laboratory Services, Pennsylvania Department of Agriculture, 22 Oct. 2007. Roco, Mihail. “National Nanotechnology Initiative – Past, Present, Future.” In William Goddard, Donald Brenner, Sergy Lyshevski and Gerald Iafrate, eds. Handbook of Nanoscience, Engineering and Technology. 2nd ed. Boca Raton, FL: CRC Press, 2007. Romeis, Jorg, Detlef Bartsch, Franz Bigler, Marco Candolfi, Marco Gielkins, et al. “Assessment of Risk of Insect-Resistant Transgenic Crops to Nontarget Arthropods.” Nature Biotechnology 26.2 (Feb. 2008): 203-208. Schauzu, Marianna. “The Concept of Substantial Equivalence in Safety Assessment of Food Derived from Genetically Modified Organisms.” AgBiotechNet 2 (Apr. 2000): 1-4. Soil Association. “Soil Association First Organisation in the World to Ban Nanoparticles – Potentially Toxic Beauty Products That Get Right under Your Skin.” London: Soil Association, 17 Jan. 2008. 24 Apr. 2008 <http://www.soilassociation.org/web/sa/saweb.nsf/848d689047cb466780256a6b00298980/42308d944a3088a6802573d100351790!OpenDocument>. Smith, Jeffrey. Genetic Roulette: The Documented Health Risks of Genetically Engineered Foods. Fairfield, Iowa: Yes! Books, 2007. ———. Seeds of Deception. Melbourne: Scribe, 2004. U.S. Dairy Export Council (USDEC). Bovine Somatotropin (BST) Backgrounder. Arlington, VA: U.S. Dairy Export Council, 2006. U.S. Food and Drug Administration (USFDA). Animal Cloning: A Draft Risk Assessment. Rockville, MD: Center for Veterinary Medicine, U.S. Food and Drug Administration, 28 Dec. 2006. ———. FDA and Nanotechnology Products. U.S. Department of Health and Human Services, U.S. Food and Drug Administration, 2008. 24 Apr. 2008 <http://www.fda.gov/nanotechnology/faqs.html>. Woodrow Wilson International Center for Scholars (WWICS). “A Nanotechnology Consumer Products Inventory.” Data set as at Sep. 2007. Woodrow Wilson International Center for Scholars, Project on Emerging Technologies, Sep. 2007. 24 Apr. 2008 <http://www.nanotechproject.org/inventories/consumer>.
APA, Harvard, Vancouver, ISO, and other styles
50

Levine, Michael, and William Taylor. "The Upside of Down: Disaster and the Imagination 50 Years On." M/C Journal 16, no. 1 (March 18, 2013). http://dx.doi.org/10.5204/mcj.586.

Full text
Abstract:
Introduction It has been nearly half a century since the appearance of Susan Sontag’s landmark essay “The Imagination of Disaster.” The critic wrote of the public fascination with science fiction disaster films, claiming that, on the one hand “from a psychological point of view, the imagination of disaster does not greatly differ from one period in history to another [but, on the other hand] from a political and moral point of view, it does” (224). Even if Sontag is right about aspects of the imagination of disaster not changing, the types, frequency, and magnitude of disasters and their representation in media and popular culture suggest that dynamic conditions prevail on both counts. Disaster has become a significantly urban phenomenon, and highly publicised “worst case” scenarios such as Hurricane Katrina and the Haiti earthquake highlight multiple demographic, cultural, and environmental contexts for visualising cataclysm. The 1950s and 60s science fiction films that Sontag wrote about were filled with marauding aliens and freaks of disabused science. Since then, their visual and dramatic effects have been much enlarged by all kinds of disaster scenarios. Partly imagined, these scenarios have real-life counterparts with threats from terrorism and the war on terror, pan-epidemics, and global climate change. Sontag’s essay—like most, if not all of the films she mentions—overlooked the aftermath; that is, the rebuilding, following extra-terrestrial invasion. It ignored what was likely to happen when the monsters were gone. In contrast, the psychological as well as the practical, social, and economic aspects of reconstruction are integral to disaster discourse today. Writing about how architecture might creatively contribute to post-conflict (including war) and disaster recovery, for instance, Boano elaborates the psychological background for rebuilding, where the material destruction of dwellings and cities “carries a powerful symbolic erosion of security, social wellbeing and place attachment” (38); these are depicted as attributes of selfhood and identity that must be restored. Similarly, Hutchison and Bleiker (385) adopt a view evident in disaster studies, that disaster-struck communities experience “trauma” and require inspired responses that facilitate “healing and reconciliation” as well as material aid such as food, housing, and renewed infrastructure. This paper revisits Sontag’s “The Imagination of Disaster,” fifty years on in view of the changing face of disasters and their representation in film media, including more recent films. The paper then considers disaster recovery and outlines the difficult path that “creative industries” like architecture and urban planning must tread when promising a vision of rebuilding that provides for such intangible outcomes as “healing and reconciliation.” We find that hopes for the seemingly positive psychologically- and socially-recuperative outcomes accompanying the prospect of rebuilding risk a variety of generalisation akin to wish-fulfilment that Sontag finds in disaster films. The Psychology of Science Fiction and Disaster Films In “The Imagination of Disaster,” written at or close to the height of the Cold War, Sontag ruminates on what America’s interest in, if not preoccupation with, science fiction films tells us about ourselves.
Their popularity cannot be explained in terms of their entertainment value alone; or if it can, then why audiences found (and still find) such films entertaining is something that itself needs explanation. Depicted in media like photography and film, utopian and dystopian thought have at least one thing in common. Their visions of either perfected or socially alienated worlds are commonly prompted by criticism of the social/political status quo and point to its reform. For Sontag, science fiction films portrayed both people’s worst nightmares concerning disaster and catastrophe (e.g. the end of the world; chaos; enslavement; mutation), as well as their facile victories over the kinds of moral, political, and social dissolution the films imaginatively depicted. Sontag does not explicitly attribute such “happy endings” to wish-fulfilling phantasy and ego-protection. (“Phantasy” is to be distinguished from fantasy. It is a psychoanalytic term for states of mind, often symbolic in form, resulting from infantile wish-fulfilment, desires and instincts.) She does, however, describe the kinds of fears, existential concerns (like annihilation), and crises of meaning they are designed (purpose built) to allay. The fears are a product of the time—the down and dark side of technology (e.g. depersonalisation; ambivalence towards science, scientists, and technology) and changes wrought in our working and personal lives by urbanisation. In short, then as now, science fiction films were both expressions of deep and genuine worries and of the pressing need to inventively set them to rest. When Sontag claims that “the imagination of disaster does not greatly differ” (224) from one period to another, this is because, psychologically speaking, neither the precipitating concerns and fears (death, loss of love, meaninglessness, etc.), nor the ways in which people’s minds endeavour to assuage them, substantively differ. What is different is the way they are depicted. This is unsurprisingly a function of the political, social, and moral situations and milieus that provide the context in which the imagination of disaster unfolds. In contemporary society, the extent to which the media informs and constructs the context in which the imagination operates is unprecedented. Sontag claims that there is little if any criticism of the real social and political conditions that bring about the fears the films depict (223). Instead, fantasy operates so as to displace and project the actual causes away from their all too human origins into outer space and onto aliens. In a sense, this is the core and raison d’être for such films. By their very nature, science fiction films of the kind Sontag is discussing cannot concern themselves with genuine social or political criticism (even though the films are necessarily expressive of such criticism). Any serious questioning of the moral and political status quo—conditions that are responsible for the disasters befalling people—would hamper the operation of fantasy and its production of temporarily satisfying “solutions” to whatever catastrophe is being depicted. Sontag goes on to discuss various strategies science fiction employs to deal with such fears. For example, through positing a bifurcation between good and evil, and grossly oversimplifying the moral complexity of situations, it allows one to “give outlet to cruel or at least amoral feelings” (215) and to exercise feelings of superiority—moral and otherwise. Ambiguous feelings towards science and technology are repressed.
Quick and psychologically satisfying fixes are sought for these by means of phantasy and the imaginative construction of invulnerable heroes. Much of what Sontag says can straightforwardly be applied to catastrophe in general. “Alongside the hopeful fantasy of moral simplification and international unity embodied in the science fiction films lurk the deepest anxieties about contemporary existence” (220). Sontag writes: In the films it is by means of images and sounds […] that one can participate in the fantasy of living through one’s own death and more, the death of cities, the destruction of humanity itself. Science fiction films are not about science. They are about disaster, which is one of the oldest subjects in art. In science fiction films disaster is rarely viewed intensively; it is always extensive. It is a matter of quality and ingenuity […] the science fiction film […] is concerned with the aesthetics of disaster […] and it is in the imagery of destruction that the core of a good science fiction film lies. (212–13) In science fiction films, disaster, though widespread, is viewed intensively as well as extensively. The disturbances constitutive of the disaster are moral and emotional as well as material. People are left without the mental or physical abilities they need to cope. Government is absent or useless. We find ourselves in what amounts to what Naomi Zack (“Philosophy and Disaster”; Ethics for Disaster) describes as a Hobbesian second state of nature—where government is inoperative and chaos (moral, social, political, personal) reigns. Science fiction’s way out is to imaginatively construct scenarios emotionally satisfying enough to temporarily assuage the distress (anomie or chaos) experienced in the film. There is, however, a tremendous difference in the way in which people who face catastrophic occurrences in their lives, as opposed to science fiction, address the problems. For one thing, they must be far closer to complex and quickly changing realities and uncertain truths than are the phantastic, temporarily gratifying, and morally unproblematic resolutions to the catastrophic scenarios that science fiction envisions. Genuine catastrophe, for example war, undermines and dismantles the structures—material structures to be sure but also those of justice, human kindness, and affectivity—that give us the wherewithal to function and that are shown to be inimical to catastrophe as such. Disaster dispenses with civilization while catastrophe displaces it. Special Effects and Changing Storylines Science fiction and disaster film genres have been shaped by developments in visual simulation technologies providing opportunities for imaginatively mixing fact and fiction. Developments in filmmaking include computer or digital techniques for reproducing on the screen what can otherwise only be imagined as causal sequences of events and spectacles accompanying the wholesale destruction of buildings and cities—even entire planets. Indeed films are routinely promoted on the basis of how cinematographers and technicians have advanced the state of the art. The revival of 3-D movies with films such as Avatar (2009) and Prometheus (2012) is one of a number of developments augmenting the panoramas of 1950s classics featuring “melting tanks, flying bodies, crashing walls, awesome craters and fissures in the earth, plummeting spacecraft [and] colourful deadly rays” (Sontag 213).
An emphasis on the scale of destruction and the wholesale obliteration of recognisable sites emblematic of “the city” (mega-structures like the industrial plant in Aliens (1986) and vast space ships like the “Death Star” in two Star Wars sequels) connects older films with new ones and impresses the viewer with ever more extraordinary spectacle. Films that have been remade make for useful comparison. On the whole, these reinforce the continuation and predictability of some storylines (for instance, threats of extra-terrestrial invasion), but also the attenuation or disappearance of other narrative elements such as the monsters and anxieties released by mid-twentieth century atomic tests (Broderick). Remakes also highlight emerging themes requiring novel or updated critical frameworks. For example, environmental anxieties, largely absent in 1950s science fiction films (except for narratives involving colliding worlds or alien contacts), have appeared en masse in recent years, providing an updated view on the ethical issues posed by the fall of cities and communities (Taylor, “Urban”). In The Invasion of the Body Snatchers and its remakes (1956, 1978, 1993), for example, the organic and vegetal nature of the aliens draws the viewer’s attention to an environment formed by combative species, allowing for threats of infestation, growth and decay of the self and individuality—a longstanding theme. In the most recent version, The Invasion (2007), special effects and directorial spirit render the orifice-seeking tendrils of the pod creatures threateningly vigorous and disturbing (Lim). More sanctimonious than physically invasive, the aliens in the 1951 version of The Day the Earth Stood Still are fed up with humankind’s fixation with atomic self-destruction, and threaten global obliteration on the earth (Cox). In the 2008 remake, the suave alien ambassador, Keanu Reeves, targets the environmental negligence of humanity. Science, including science as fiction, enters into disaster narratives in a variety of ways. Some are less obvious but provocative nonetheless; for example, movies dramatising the arrival of aliens such as War of the Worlds (1953 and 2005) or Alien (1979). These more subtle approaches can be personally confronting even without the mutation of victims into vegetables or zombies. Special effects technologies have made it possible to illustrate the course of catastrophic floods and earthquakes in considerable scientific and visual detail and to represent the interaction of natural disasters, the built environment, and people, from the scale of buildings, homes, and domestic lives to entire cities and urban populations. For instance, the blockbuster film The Day After Tomorrow (2004) runs 118 minutes, but has an uncertain fictional time frame of either a few weeks or 72 hours (if the film’s title is to be taken literally). The movie shows the world as we know it being mostly destroyed. Tokyo is shattered by hailstones and Los Angeles is twisted by cyclones the likes of which Dorothy would never have seen. New York disappears beneath a mountainous tsunami. All of these events result from global climate change, though whether this is due to human (in)action or other causes is uncertain. Like their predecessors, the new wave of disaster movies like The Day After Tomorrow makes for questionable “art” (Annan). Nevertheless, their reception opens a window onto broader political and moral contexts for present anxieties.
Some critics have condemned The Day After Tomorrow for its scientific inaccuracies—questioning the scale or pace of climate change. Others acknowledge errors while commending efforts to raise environmental awareness (Monbiot). Coincident with the film and criticisms in both the scientific and political arena is a new class of environmental heretic—the climate change denier. This is a shadowy character commonly associated with the presidency of George W. Bush and the oil lobby that uses minor inconsistencies of science to claim that climate change does not exist. One thing underlying both twisting facts for the purposes of making science fiction films and ignoring evidence of climate change is an infantile orientation towards the unknown. In this regard, recent films do what science fiction disaster films have always done. While freely mixing truths and half-truths for the purpose of heightened dramatic effect, they fulfil psychological tasks such as orchestrating nightmare scenarios and all too easy victories on the screen. Uncertainty regarding the precise cause, scale, or duration of cataclysmic natural phenomena is mirrored by suspension of disbelief in the viability of some human responses to portrayals of urban disaster. Science fiction, in other words, invites us to accept as possible the flight of Americans and their values to Mexico (The Day After Tomorrow), the voyage into earth’s molten core (The Core 2003), or the disposal of lava in LA’s drainage system (Volcano 1997). Reinforcing Sontag’s point, here too there is a lack of criticism of the real social and political conditions that bring about the fears depicted in the films (223). Moreover, much like news coverage, images in recent natural disaster films (like their predecessors) typically finish at the point where survivors are obliged to pick up the pieces and start all over again—the latter is not regarded as newsworthy. Allowing for developments in science fiction films and the disaster genre, Sontag’s observation remains accurate. The films are primarily concerned “with the aesthetics of destruction, with the peculiar beauties to be found in wreaking havoc, in making a mess” (213) rather than rebuilding. The Imagination of Disaster Recovery Sontag’s essay contributes to an important critical perspective on science fiction film. Variations on her “psychological point of view” have been explored. (The two discourses—psychology and cinema—have parallel and in some cases intertwined histories.) Moreover, in the intervening years, psychological or psychoanalytical terms and narratives have themselves become even more a part of popular culture. They feature in recent disaster films and disaster recovery discourse in the “real” world. Today, arguably with greater frequency than in the 1950s and 60s films, representations of alien invasion or catastrophic global warming serve to background conflict resolutions of a more quotidian and personal nature. Hence, viewers are led to suspect that Tom Cruise will be more likely to survive the rapacious monsters in the latest The War of the Worlds if he can become less narcissistic and a better father. Similarly, Dennis Quaid’s character will be much better prepared to serve a newly glaciated America for having rescued his son (and marriage) from the watery deep-freezer that New York City becomes in The Day After Tomorrow. In these films the domestic and familial comprise a domain of inter-personal and communal relations from which victims and heroes appear.
Currents of thought from the broad literature of disaster studies and Western media also call upon this domain. The imagination of disaster recovery has come to partly resemble a set of problems organised around the needs of traumatised communities. These serve as an object of urban governance, planning, and design conceived in different ways, but largely envisioned as an organic unity that connects urban populations, their pasts, and settings in a meaningful, psychologically significant manner (Furedi; Hutchison and Bleiker; Boano). Terms like “place” or concepts like Boano’s “place-attachment” (38) feature in this discourse to describe this unity and its subjective dimensions. Consider one example. In August 2006, one year after Katrina, the highly respected Journal of Architectural Education dedicated a special issue to New Orleans and its reconstruction. Opening comments by editorialist Barbara Allen include claims presupposing enduring links between the New Orleans community conceived as an organic whole, its architectural heritage imagined as a mnemonic vehicle, and the city’s unique setting. Though largely unsupported (and arguably unsupportable), the following proposition would find agreement across a number of disaster studies and resonates in commonplace reasoning: The culture of New Orleans is unique. It is a mix of ancient heritage with layers and adaptations added by successive generations, resulting in a singularly beautiful cultural mosaic of elements. Hurricane Katrina destroyed buildings—though not in the city’s historic core—and displaced hundreds of thousands of people, but it cannot wipe out the memories and spirit of the citizens. (4) What is intriguing about the claim is an underlying intellectual project that subsumes psychological and sociological domains of reasoning within a distinctive experience of community, place, and memory. In other words, the common belief that memory is an intrinsic part of the human condition of shock and loss gives form to a theory of how urban communities experience disaster and how they might re-build—and justify rebuilding—themselves. This is problematic and invites anachronistic thinking. While communities are believed to be formed partly by memories of a place, “memory” is neither a collective faculty nor is it geographically bounded. Whose memories are included and which ones are not? Are these truly memories of one place or do they also draw on other real or imagined places? Moreover—and this is where additional circumspection is inspired by our reading of Sontag’s essay—does Allen’s editorial contribute to an aestheticised image of place, rather than criticism of the social and political conditions required for reconstruction to proceed with justice, compassionately and affectively? Allowing for civil liberties to enter the picture, Allen adds “it is necessary to enable every citizen to come back to this exceptional city if they so desire” (4). However, given that memories of places and desires for their recovery are not univocal, and often contain competing visions of what was and should be, it is not surprising they should result in competing expectations for reconstruction efforts. This has clearly proven the case for New Orleans (Verderber; Taylor, “Typologies”). Conclusion The comparison of films invites an extension of Sontag’s analysis of the imagination of disaster to include the psychology, politics, and morality of rebuilding.
Can a “psychological point of view” help us to understand not only the motives behind capturing so many scenes of destruction on screen and television, but also something of the creative impulses driving reconstruction? This invites a second question. How do some impulses, particularly those caricatured as the essence of an “enterprise culture” (Heap and Ross) associated with America’s “can-do”, or others valorised as positive outcomes of catastrophe in The Upside of Down (Homer-Dixon), highlight or possibly obscure criticism of the conditions which made cities like New Orleans vulnerable in the first place? The broad outline of an answer to the second question begins to appear only when consideration of the ethics of disaster and rebuilding is taken on board. If “the upside” of “the down” wrought by Hurricane Katrina, for example, is rebuilding of any kind, at any price, and for any person, then the equation works (i.e., there is a silver lining for every cloud). If, however, the range of positives is broadened to include issues of social justice, then the figures require more complex arithmetic. References Allen, Barbara. “New Orleans and Katrina: One Year Later.” Journal of Architectural Education 60.1 (2006): 4. Annan, David. Catastrophe: The End of the Cinema? London: Lorrimer, 1975. Boano, Camillo. “‘Violent Space’: Production and Reproduction of Security and Vulnerabilities.” The Journal of Architecture 16 (2011): 37–55. Broderick, Mick, ed. Hibakusha Cinema: Hiroshima, Nagasaki and the Nuclear Image in Japanese Film. London: Kegan Paul, 1996. Cox, David. “Get This, Aliens: We Just Don’t Care!” The Guardian 15 Dec. 2008 ‹http://www.guardian.co.uk/film/filmblog/2008/dec/15/the-day-the-earth-stood-still›. Furedi, Frank. “The Changing Meaning of Disaster.” Area 39.4 (2007): 482–89. Heap, Shaun H., and Angus Ross, eds. Understanding the Enterprise Culture: Themes in the Work of Mary Douglas. Edinburgh: Edinburgh University Press, 1992. Homer-Dixon, Thomas. The Upside of Down: Catastrophe, Creativity and the Renewal of Civilization. Washington, DC: Island Press, 2006. Hutchison, Emma, and Roland Bleiker. “Emotional Reconciliation: Reconstituting Identity and Community after Trauma.” European Journal of Social Theory 11 (2008): 385–403. Lim, Dennis. “Same Old Aliens, But New Neuroses.” New York Times 12 Aug. 2007: A17. Monbiot, George. “A Hard Rain's A-gonna Fall.” The Guardian 14 May 2004. Sontag, Susan. “The Imagination of Disaster” (1965). Against Interpretation and Other Essays. New York: Dell, 1979. 209–25. Taylor, William M. “Typologies of Katrina: Mnemotechnics in Post-Disaster New Orleans.” Interstices 13 (2012): 71–84. ———. “Urban Disasters: Visualising the Fall of Cities and the Forming of Human Values.” Journal of Architecture 11.5 (2006): 603–12. Verderber, Stephen. “Five Years After – Three New Orleans Neighborhoods.” Journal of Architectural Education 64.1 (2010): 107–20. Zack, Naomi. Ethics for Disaster. New York: Rowman and Littlefield, 2009. ———. “Philosophy and Disaster.” Homeland Security Affairs 2, article 5 (April 2006): ‹http://www.hsaj.org/?article=2.1.5›. Filmography Alien. Dir. Ridley Scott. Brandywine Productions, 1979. Aliens. Dir. James Cameron. Brandywine Productions, 1986. Avatar. Dir. James Cameron. Lightstorm Entertainment et al., 2009. The Core. Dir. Jon Amiel. Paramount Pictures, 2003. The Day after Tomorrow. Dir. Roland Emmerich. 20th Century Fox, 2004. The Invasion of the Body Snatchers. Dir. Don Siegel. Allied Artists, 1956; also 1978 and 1993. The Invasion. Dirs.
Oliver Hirschbiegel and James McTeigue. Village Roadshow et al., 2007. Prometheus. Dir. Ridley Scott. Scott Free and Brandywine Productions, 2012. Star Wars Episode IV: A New Hope. Dir. George Lucas. Lucasfilm, 1977. Star Wars Episode VI: Return of the Jedi. Dir. George Lucas. Lucasfilm, 1983. Volcano. Dir. Mick Jackson. 20th Century Fox, 1997. War of the Worlds. Dir. George Pal. Paramount, 1953; also Steven Spielberg. Paramount, 2005. Acknowledgments The authors are grateful to Oenone Rooksby and Joely-Kym Sobott for their assistance and advice when preparing this article. It was also made possible in part by a grant from the Australian Research Council.
APA, Harvard, Vancouver, ISO, and other styles
