Journal articles on the topic "Data Acquisition Console"

To see the other types of publications on this topic, follow the link: Data Acquisition Console.

Format your source in APA, MLA, Chicago, Harvard, and other citation styles

Choose a source type:

Browse the top 24 journal articles for your research on the topic "Data Acquisition Console".

Next to every entry in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the publication as a .pdf file and read its online abstract, if these are available in the metadata.

Browse journal articles across a wide range of disciplines and compile your bibliography correctly.

1

Thapliyal, Aditya, and CRS Kumar. "Development of Data Acquisition Console and Web Server Using Raspberry Pi for Marine Platforms." International Journal of Information Technology and Computer Science 8, no. 11 (November 8, 2016): 46–53. http://dx.doi.org/10.5815/ijitcs.2016.11.06.

2

Liu, Xin, Zheng Liu, Jia-Tuo Chen, Xin-Lou Yu, Bin-Jun Lai, Bo Zhan, and Sai-Fan Huang. "Ergonomic Reliability Assessment for Passenger Car Interface Design Based on EWM-MADM and Human Cognitive Reliability Experiments." Mathematical Problems in Engineering 2020 (November 16, 2020): 1–10. http://dx.doi.org/10.1155/2020/4757202.

Abstract:
Ergonomic reliability assessment of passenger-car interface design schemes is a practical way to improve driving safety and can significantly reduce the development cost and development cycle of a new car. To guide improvements to the interactive interface of the passenger-car central console, the assessment method should evaluate and predict human reliability both objectively and subjectively and feed the results back into the ergonomic interface design process. First, building on previous work, a questionnaire survey and a classification of ergonomic reliability analysis factors are proposed, defining the judgment-layer and index-layer factors of the human-machine interaction interface in the automobile central console. Second, the entropy weight method (EWM) and multiple attribute decision-making (MADM) are used for objective and subjective evaluation, respectively. Third, taking the central-console interaction interfaces of three different passenger cars as examples, an objective simulated experimental test based on the entropy weight method and a subjective scoring evaluation based on MADM are conducted. The objective and subjective evaluations are then coupled through fuzzy comprehensive evaluation. Finally, to verify the effectiveness and rationality of the ergonomic reliability assessment method, human cognitive reliability experiments are carried out using data acquired from eye-tracking experiments.
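For readers unfamiliar with the entropy weight method (EWM) named in this abstract, the sketch below shows how objective criterion weights are commonly derived from a decision matrix. It is a generic illustration under assumed inputs (the `entropy_weights` function, the criteria, and the scores are invented here), not code or data from the cited study.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method (EWM): derive objective weights for criteria
    from a decision matrix X (rows = alternatives, columns = criteria).
    All values are assumed to be positive 'benefit' scores."""
    X = np.asarray(X, dtype=float)
    m, n = X.shape
    # Normalize each column so it sums to 1 (proportions p_ij).
    P = X / X.sum(axis=0)
    # Shannon entropy of each criterion, scaled to [0, 1] by k = 1/ln(m).
    k = 1.0 / np.log(m)
    with np.errstate(divide="ignore", invalid="ignore"):
        E = -k * np.nansum(np.where(P > 0, P * np.log(P), 0.0), axis=0)
    # Degree of divergence: criteria with lower entropy discriminate more.
    d = 1.0 - E
    return d / d.sum()

# Hypothetical scores of three console interface designs on four criteria.
scores = [[0.82, 0.61, 0.74, 0.55],
          [0.78, 0.72, 0.69, 0.60],
          [0.90, 0.58, 0.81, 0.52]]
print(entropy_weights(scores))  # one objective weight per criterion
```

Criteria whose scores vary more across the alternatives receive larger weights, which is the core idea EWM contributes to the objective half of such an evaluation.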
3

Xu, Dong Ming, and Li Sheng Shu. "Research on General Remote Monitoring System for Industrial Equipment." Applied Mechanics and Materials 201-202 (October 2012): 678–81. http://dx.doi.org/10.4028/www.scientific.net/amm.201-202.678.

Abstract:
A general embedded remote monitoring system for industrial equipment has been designed based on field-bus and Internet technologies. The supported field buses include PROFIBUS, CAN, RS-485, and RS-232. Data from the devices controlled by the system can be detected and transmitted to a PC via the Internet. The upper monitor in the system is a remote PC, while a console computer, which communicates with the industrial equipment controller over the field bus, controls the field data acquisition unit. The system supports two working modes: in one, operating-status data are collected over the field bus before being transferred to the PC; in the other, operating-status data are detected directly by the field data acquisition unit of the remote monitoring system. Applying the remote monitoring system makes maintenance and management more convenient.
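In both working modes, operating-status data ultimately reach the remote PC over the Internet. The following minimal sketch illustrates that console-computer relay role in outline only; the address, polling interval, record format, and the placeholder `read_fieldbus_status` function are assumptions, not details of the cited system.

```python
import json
import socket
import time

REMOTE_PC = ("192.0.2.10", 5020)   # hypothetical address of the upper monitor (remote PC)

def read_fieldbus_status():
    """Placeholder for a field-bus read (e.g. RS-485/PROFIBUS via a gateway).
    Here it simply fabricates an operating-status record."""
    return {"device": "press-01", "temp_c": 71.4, "pressure_kpa": 312.0,
            "timestamp": time.time()}

def relay_loop(interval_s=1.0):
    """Console-computer role: poll the field side and push records to the remote PC."""
    with socket.create_connection(REMOTE_PC, timeout=5) as sock:
        while True:
            record = read_fieldbus_status()
            sock.sendall((json.dumps(record) + "\n").encode("utf-8"))
            time.sleep(interval_s)

if __name__ == "__main__":
    relay_loop()
```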
4

Hieu, Dang Quang, and Nguyen Xuan Truong. "Data Processing Method in Medium Range Coastal Radar Complex." Journal of the Russian Universities. Radioelectronics, no. 3 (July 19, 2018): 35–41. http://dx.doi.org/10.32603/1993-8985-2018-21-3-35-41.

Abstract:
The article presents the basic principles of design and development of an integrated medium-range Coastal Surveillance System (CSS) used for water-surface surveillance. It provides solutions for missions such as command and control of maritime forces, border monitoring and control, prevention of illegal activities (piracy, smuggling, illegal immigration, illegal fishing), support of search and rescue (SAR) operations, and the creation of a common situational awareness picture of the naval theatre. The system structure is designed to solve the computational overload problem that arises when processing large volumes of data received from radar stations. A measurement-level fusion algorithm is developed based on the JPDA framework, in which radar data received from a single radar or a group of radars and AIS data are aggregated in a processing center. The servers and workstations communicate over a local area network (LAN) using standard Gigabit Ethernet. Acquisition, analysis, storage, and distribution of target data are executed on the servers, and the data are then sent to automated operator stations (consoles), where the functions of managing, identifying, and displaying targets on a digital situational map are performed.
5

Wu, Sheng, Weiliang Wen, Yongjian Wang, Jiangchuan Fan, Chuanyu Wang, Wenbo Gou, and Xinyu Guo. "MVS-Pheno: A Portable and Low-Cost Phenotyping Platform for Maize Shoots Using Multiview Stereo 3D Reconstruction." Plant Phenomics 2020 (March 12, 2020): 1–17. http://dx.doi.org/10.34133/2020/1848437.

Abstract:
Plant phenotyping technologies play important roles in plant research and agriculture. Detailed phenotypes of individual plants can guide the optimization of shoot architecture for plant breeding and are useful for analyzing morphological differences in response to environments for crop cultivation. Accordingly, high-throughput phenotyping technologies for individual plants grown in field conditions are urgently needed, and MVS-Pheno, a portable and low-cost phenotyping platform for individual plants, was developed. The platform is composed of four major components: a semiautomatic multiview stereo (MVS) image acquisition device, a data acquisition console, data processing and phenotype extraction software for maize shoots, and a data management system. The platform's device is detachable and adjustable according to the size of the target shoot. Image sequences for each maize shoot can be captured within 60-120 seconds; 3D point clouds of shoots are then reconstructed using MVS-based commercial software, and the phenotypic traits at the organ and individual-plant levels are extracted by the software. The correlation coefficients (R2) between the extracted and manually measured plant height, leaf width, and leaf area values are 0.99, 0.87, and 0.93, respectively. A data management system has also been developed to store and manage the acquired raw data, reconstructed point clouds, agronomic information, and resulting phenotypic traits. The platform offers an optional solution for high-throughput phenotyping of field-grown plants, which is especially useful for large populations or experiments across many different ecological regions.
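The reported validation compares extracted traits against manual measurements with a coefficient of determination. A generic way to compute such an R² for any pair of measurement series is sketched below; the plant-height numbers are invented for illustration and are not the study's data.

```python
import numpy as np

def r_squared(manual, extracted):
    """Coefficient of determination between manual measurements and
    values extracted from the reconstructed point clouds."""
    manual = np.asarray(manual, dtype=float)
    extracted = np.asarray(extracted, dtype=float)
    ss_res = np.sum((manual - extracted) ** 2)
    ss_tot = np.sum((manual - manual.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical plant heights (cm): manual ruler measurements vs. point-cloud estimates.
manual_height = [182.0, 175.5, 190.2, 168.0, 201.3]
extracted_height = [180.9, 176.8, 188.5, 169.2, 199.7]
print(round(r_squared(manual_height, extracted_height), 3))
```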
6

Yi, Ding Rong. "A Mechatronic System for Interactive Scan Plan Prescription during Real-Time Cardiac Magnetic Resonance Imaging." Applied Mechanics and Materials 427-429 (September 2013): 2009–12. http://dx.doi.org/10.4028/www.scientific.net/amm.427-429.2009.

Abstract:
Real-time magnetic resonance imaging (MRI) has many advantages compared to traditional MRI and can be used to visualize dynamic, moving cardiac structures and functions without cardiac gating, as the fast data acquisition effectively freezes the motion caused by the heartbeat and breathing. Over the past decades, fast pulse sequences and image reconstruction algorithms have been developed to improve temporal resolution while maintaining acceptable spatial resolution. However, the bottleneck of current real-time MRI systems is the lack of a user-friendly prescription tool that allows an MRI technician to prescribe the six-degree-of-freedom imaging plane of the MRI system. To meet the needs of real-time MRI, a 3D input tool was developed that facilitates interactive user specification of the center position and orientation of the MRI imaging plane. This paper reports such a custom-designed, six-degree-of-freedom 3D input device, which allows the user to interactively and intuitively manipulate the scan plane and direct the real-time imaging capability to a target position based on the visual feedback provided on the MRI console in the form of real-time MRI images.
7

Wagnetz, Ute, Heidi C. Roberts, Taebong Chung, Demetris Patsios, Kenneth R. Chapman, and Narinder S. Paul. "Dynamic Airway Evaluation with Volume CT: Initial Experience." Canadian Association of Radiologists Journal 61, no. 2 (April 2010): 90–97. http://dx.doi.org/10.1016/j.carj.2009.11.007.

Abstract:
Purpose The purpose of the study was to prospectively establish the use of a novel multidetector computed tomography unit (MDCT) with 320 × 0.5 detector rows for the evaluation of tracheomalacia by using a dynamic expiratory low-dose technique. Methods Six adult patients (5 men, 1 woman; mean age, 53.7 years [37–70 years]) referred for a clinical suspicion of tracheomalacia were studied on a 320-row MDCT unit by using the following parameters: 120 kVp, 40–50 mA, 0.5-second gantry rotation, and z-axis coverage of 160 mm, sufficient to cover the thoracic trachea to the proximal bronchi. Image acquisition occurred during a forceful exhalation. The image data set was subjected to the following analyses: cross-sectional area of the airway lumen at 4 predefined locations (thoracic inlet, aortic arch, carina, and bronchus intermedius) and measurement of airway volume. Results All 6 patients had evidence of tracheomalacia; the proximal trachea collapsed at a later phase of expiration (3–4 seconds) than the distal trachea (2–3 seconds). The most common region of airway collapse was at the level of the aortic arch (5/6 [83%]). Three patients (50%) had diffuse segmental luminal narrowing involving the tracheobronchial tree. The radiation dose (estimated dose length product, computed tomography console) measured 293.9 mGy in 1 subject and 483.5 mGy in 5 patients. Conclusions Four-dimensional, true isophasic and isovolumetric imaging of the central airways by using 320-row MDCT is a viable technique for the diagnosis of tracheomalacia; it provides a comprehensive assessment of airway dynamics.
8

Chang, Allen, Armen Derboghossians, Jennifer Kaswick, Brian Kim, Howard Jung, Jeff Slezak, Melanie Wuerstle, Stephen G. Williams, and Gary W. Chien. "Achieving proficiency with robot-assisted radical prostatectomy: Laparoscopic-trained versus robotics-trained surgeons." Canadian Urological Association Journal 7, no. 11-12 (November 8, 2013): 711. http://dx.doi.org/10.5489/cuaj.360.

Abstract:
Background: Initiating a robotics program is complex, in regards to achieving favourable outcomes, effectively utilizing an expensive surgical tool, and granting console privileges to surgeons. We report the implementation of a community-based robotics program among minimally-invasive surgery (MIS) urologists with and without formal robotics training.Methods: From August 2008 to December 2010 at Kaiser Permanente Southern California, 2 groups of urologists performing robot-assisted radical prostatectomy (RARP) were followed since the time of robot acquisition at a single institution. The robotics group included 4 surgeons with formal robotics training and the laparoscopic group with another 4 surgeons who were robot-naïve, but skilled in laparoscopy. The laparoscopic group underwent an initial 7-day mentorship period. Surgical proficiency was measured by various operative and pathological outcome variables. Data were evaluated using comparative statistics and multivariate analysis.Results: A total of 420 and 549 RARPs were performed by the robotics and laparoscopic groups, respectively. Operative times were longer in the laparoscopic group (p = 0.002), but estimated blood loss was similar. The robotics group had a significantly better overall positive surgical margin rate of 19.9% compared to the laparoscopic group (27.8%) (p = 0.005). Both groups showed improvements in operative and pathological parameters as they accrued experience, and achieved similar results towards the end of the study.Conclusions: Robot-naïve laparoscopic surgeons may achieve similar outcomes to robotic surgeons relatively early after a graduated mentorship period. This study may apply to a community-based practice in which multiple urologists with varied training backgrounds are granted robot privileges.
9

Chen, Chen. "Design and Application of the Indoor Environment Monitoring & Control System at Tianjin First Central Hospital." Applied Mechanics and Materials 380-384 (August 2013): 635–38. http://dx.doi.org/10.4028/www.scientific.net/amm.380-384.635.

Abstract:
As science and technology advance and living standards rise, people place ever higher requirements on the environmental conditions in hospitals; traditional, non-intelligent monitoring devices are therefore gradually being replaced by automated, networked monitoring systems, and wireless sensor networks fit this need well. Focusing on the technical indicators required of the hospital indoor environment, this paper proposes an indoor environment monitoring and control system for Tianjin First Central Hospital based on distributed acquisition and execution with centralized management. During the design of the system, a universal design concept was put forward, and a non-standard communication protocol for the wireless sensor network was designed independently in combination with the OSI open standard. The paper discusses the implementation of the communication protocol among the nodes in embedded software and the operating mechanism of the nodes themselves, and a console panel was developed for the data center. Several software design algorithms are proposed with respect to the network layout. The paper also describes the test platform of the Tianjin First Central Hospital indoor environment monitoring and control system built from the designed network components and provides the test and verification results, including the monitored data for the various gases, the corresponding automatic control functions, and underlay BER analysis. The results show that the system can realize automatic monitoring of the hospital's indoor environment. At present, the monitored gases include CO, CO2, O2, NH3, and formaldehyde; the monitored environmental parameters are temperature, humidity, and light intensity; and the controlled targets are ventilation and lighting. The paper offers an optional solution for environment monitoring and has theoretical value and engineering significance.
10

Merkle, Frank, Dino Kurtovic, Christoph Starck, Cynthia Pawelke, Sina Gierig, and Volkmar Falk. "Evaluation of attention, perception, and stress levels of clinical cardiovascular perfusionists during cardiac operations: a pilot study." Perfusion 34, no. 7 (March 14, 2019): 544–51. http://dx.doi.org/10.1177/0267659119828563.

Abstract:
Background: Performing cardiopulmonary bypass is a complex task which involves evaluating visual input from patient monitors and technical parameters displayed at the heart-lung machine console as well as reacting to other sensory input. Only few studies are available concerning the competency requirements for clinical cardiovascular perfusionists, including attention, perception, and coping with mental stress. This study aims at evaluating attention, perception, and stress levels of clinical cardiovascular perfusionists during cardiopulmonary bypass. Methods: Nine clinical cardiovascular perfusionists voluntarily offered to participate in the study. Participants were asked to wear Tobii 2 eye-tracking glasses throughout the procedures. Specific time points were analyzed (cardiopulmonary bypass on, initial cardioplegia delivery, steady state, cross-clamp off, and weaning from cardiopulmonary bypass). Data acquisition was supplemented by participants’ self-evaluation regarding their stress levels and by National Aeronautics and Space Administration Task Load Index (NASA TLX) questionnaires. Results: Seven datasets were sufficient to be evaluated. The clinical cardiovascular perfusionists’ professional experience ranged from 0.5 to 24 years. Evaluation of eye-tracking data revealed large variations in areas of interest hits, fixation, and dwell times. Across all phases, the venous reservoir, mean arterial pressure, arterial pump display, cardioplegia control, and data management system received the highest levels of attention. Pupil diameter measurements increased at start of cardiopulmonary bypass, cardioplegia delivery, and weaning off, but returned to base level during steady state. Clinical cardiovascular perfusionists’ self-evaluation showed that subjective stress level was highest at the start and the end of the procedure. NASA TLX questionnaires revealed medium-to-high mental and temporal workloads, but low physical workloads. Performance, effort, and frustration indices showed medium workloads. Conclusion: During cardiopulmonary bypass, perfusionists are subjected to stress. Peak stress levels were highest during start and end of cardiopulmonary bypass. Furthermore, visual attention and perception varied between the operative phases. Further studies are indicated to evaluate the design of heart-lung machines and stress-coping strategies during cardiopulmonary bypass.
11

Furlani, Stefano, Valeria Vaccher, Vanja Macovaz, and Stefano Devoto. "A Cost-Effective Method to Reproduce the Morphology of the Nearshore and Intertidal Zone in Microtidal Environments." Remote Sensing 12, no. 11 (June 10, 2020): 1880. http://dx.doi.org/10.3390/rs12111880.

Abstract:
The photogrammetric method is widely used in coastal areas and in submerged environments. Time-lapse images collected with unmanned aerial vehicles are used to reproduce the emerged areas, while images taken by divers are used to reproduce submerged ones. Conversely, 3D models of natural or human-made objects lying at the water level are severely affected by the difference in refractive index between air and seawater. For this reason, the matching of 3D models of emergent and submerged coasts has been very rarely tested and never used in Earth Sciences. The availability of a large number of time-lapse images, collected at the intertidal zone during previous snorkel surveys, encouraged us to test the merging of 3D models of emerged and submerged environments. Considering the rapid and effective nature of the aforementioned program of swim surveys, photogrammetric targets were not used during image acquisition. This forced us to test the matching of the independent models by recognizing prominent landforms along the waterline. Here we present the approach used to test the method, the instrumentation used for the field tests, and the setting of cameras fixed to a specially built aluminum support console and discuss both its advantages and its limits compared to UAVs. 3D models of sea cliffs were generated by applying structure-from-motion (SfM) photogrammetry. Horizontal time-lapse images, collected with action cameras while swimming parallel to the coastline at nearly constant velocity, were used for the tests. Subsequently, prominent coastal landforms were used to couple the independent models obtained from the emergent and submerged cliffs. The method was pilot tested in two coastal sites in the north-eastern Adriatic (part of the Mediterranean basin). The first site was a 25 m sea wall of sandstone set within a small harbor, while the second site was a 150 m route below plunging limestone cliffs. The data show that inexpensive action cameras provide a sufficient resolution to support and integrate geomorphological field surveys along rocky coastlines.
12

Armstrong, Christopher, Diarmuid Kavanagh, Sara Lal, and Peter Rossiter. "Combining Popular Game Consoles and OSGi to Investigate Autonomous In-The-Field Biomedical Data Acquisition and Management." International Journal of Electronics and Telecommunications 56, no. 1 (March 1, 2010): 87–98. http://dx.doi.org/10.2478/v10177-010-0011-6.

Abstract:
The need and interest in conducting biomedical research outside the traditional laboratory is increasing. In-the-field testing, such as in the participant's home or work environment, is a growing consideration when undertaking biomedical investigation. This type of research requires, at a minimum, semi-autonomous computer systems that collect such data and send it back to the laboratory for processing and dissemination with the smallest amount of attendance by the participant or even the experimenter. A key aspect of supporting this type of research is the selection of appropriate software and hardware components. These supporting systems need to be reliable, allow considerable customizability, and be readily accessible but also able to be locked down. In this paper we report a set of requirements for the hardware and software of such a system. We then use these requirements to evaluate game consoles as a hardware platform in comparison to other hardware choices. We finish by outlining one particular aspect of the supporting software used to support the chosen hardware platform, based on the OSGi framework.
13

Курбатов, А. В., Д. А. Кондрашов, И. А. Драничников, and Ф. А. Попов. "INFORMATION MEASURING SYSTEM STRENGTH TESTS. SYNCHRONIZATION SUBSYSTEM OF MEASURING AND EXECUTIVE EQUIPMENT." Южно-Сибирский научный вестник, no. 1(41) (February 28, 2022): 17–22. http://dx.doi.org/10.25699/sssb.2022.41.1.010.

Abstract:
In industry and research there are fast, costly processes that cannot be repeated. Such processes take place, in particular, during bench tests of products made of high-energy materials. As an object of measurement they are characterized by several dozen types of measured parameters, a large number of measuring channels, and strict accuracy requirements both for the measurements themselves and for recording the time at which they are taken. In the information-measuring system (IMS) for strength tests developed at JSC "Federal Scientific and Practical Center 'Altai'", this problem is solved by the subsystem considered in the article, implemented as the hardware-software complex "Central console". Within the IMS it provides the following functions: synchronization of the measurement processes of the modules in a multichannel data acquisition system; monitoring of the processes occurring during product testing; control of the internal systems of the test bench according to specified scenarios; and preservation of technical data on the processes occurring during testing, together with their exact time characteristics. During a test, the subsystem synchronizes the starting and stopping of the multichannel data acquisition systems, primary transducers, and actuating and control systems. Monitoring of the processes occurring in the product and on the bench during testing makes it possible to track and record malfunctions of the tested product or of the bench. The test bench systems are controlled according to specified scenarios that implement the timing diagrams of the equipment operation (cyclograms). Data on the state of the processes taking place during testing, including their exact timing, are stored in a database. In operation as part of the IMS, the subsystem has been used in tests of various types of products with two hundred potentiometric sensors, hundreds of strain gauges, and hundreds of temperature sensors, ensuring high synchronization accuracy of the measurement processes in research on fast, costly processes.
14

Lepicovsky, J. "Laser Velocimeter Measurements in a Model Propeller Flowfield." Journal of Fluids Engineering 110, no. 4 (December 1, 1988): 350–54. http://dx.doi.org/10.1115/1.3243562.

Abstract:
The objective of this work was to demonstrate the usability of laser velocimeter data acquisition and reduction techniques for ensemble-averaged velocity measurements near and between rotating propeller or fan blades. A relatively simple experiment was set up to measure the flowfield of a two-bladed model propeller operating at static (non-flight) conditions to verify the data reduction procedures. The mean velocity and ensemble-averaged blade-to-blade velocity distributions were acquired. The experimental results, plotted in a novel concise form, showed separated and reversed flow regions on a rotating static propeller. Flowfield distortion along the blade height in the vicinity of the propeller disc was also observed.
15

Anderson, Michael Andrew. "Structure from Motion and Archaeological Excavation: Experiences of the Via Consolare Project in Pompeii." Studies in Digital Heritage 4, no. 2 (April 10, 2021): 78–107. http://dx.doi.org/10.14434/sdh.v4i2.27260.

Abstract:
The last decade of advances in Image-Based Modeling (IBM) data acquisition based on Structure from Motion (SfM) has made it possible as never before to record excavated archaeological deposits, historical architectural remains, artifacts, and geographical surroundings in the field. Armed only with digital cameras and low-cost or open-source software, researchers can now produce accurate point clouds of millions of points, capturing archaeological information in high-resolution detail. But what changes will IBM really bring to the standards, requirements, and expectations of practical field methodology for projects operating on shoestring budgets? Since 2010, the Via Consolare Project, a small archaeological research project from a state-level university, has employed an entirely open-source and "free for academic use" IBM pipeline to record a variety of archaeological features in Insula VII 6 and the "Villa delle Colonne a mosaico" in Pompeii. Ranging from surviving architecture, to rubble fill layers, to the interiors of inaccessible cisterns and drains, this work has been carried out in preparation for the eventual coordination of these data into a 3D GIS of all recorded stratigraphy. Rarely were sufficient resources available for dedicated equipment or personnel to be devoted to this task. While practical implementation, even in a low-budget excavation environment, has confirmed that this technology can indeed augment archaeological field documentation and provide investigation opportunities that would otherwise be impossible, it failed to replace traditional hand-drafted recording techniques and was found to present significant challenges and a number of hidden costs. This emphasizes a need for appropriate and cautious planning in implementation, especially in projects with limited means.
16

Quiroz Tascón, Stephen, Julian Zapata Jiménez, and Hector Fernando Vargas Montoya. "Predicción de ciberataques en sistemas industriales SCADA a través de la implementación del filtro Kalman." TecnoLógicas 23, no. 48 (May 15, 2020): 249–67. http://dx.doi.org/10.22430/22565337.1586.

Abstract:
In industrial SCADA (Supervisory Control And Data Acquisition) systems, knowing the state of each device makes it possible to obtain information about its behaviour, from which actions can be deduced and different strategies formed to help reduce cyber risk. This applied research article presents a model for predicting possible cyberattacks on a SCADA system. The prediction is made with a Kalman filter, which processes the cybersecurity events captured by an intrusion detection system (applied to a SCADA simulation environment) and generates a forward projection of the probability that an attack will be consolidated. With this information, system administrators can decide how to act against imminent cyberattacks. Various technological components were installed and three cyberattacks were executed against the SCADA system: (i) possible scans, (ii) information theft, and (iii) overwriting of commands and data generating a Denial of Service (DoS). The security events were detected by an intrusion detection system and sent to software configured with the characteristics of the Kalman filter, which outputs the possible attack predictions. As a result, it can be seen how, from these inputs, the probability that a cyberattack will succeed can be estimated on the basis of the historical events and the applied filter equations.
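As a rough illustration of the idea described above (feeding intrusion-detection events into a Kalman filter to project whether an attack is consolidating), the sketch below runs a one-dimensional Kalman filter over per-interval alert counts. The state model, noise variances, and alert counts are assumptions made for the example, not the authors' configuration.

```python
def kalman_1d(observations, q=0.05, r=2.0, x0=0.0, p0=1.0):
    """One-dimensional Kalman filter with a constant-level state model.
    observations: IDS alert counts per time interval (illustrative input).
    q: process noise variance, r: measurement noise variance."""
    x, p = x0, p0
    estimates = []
    for z in observations:
        # Predict: the state is assumed to persist, uncertainty grows by q.
        p = p + q
        # Update: blend the prediction with the new observation.
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Hypothetical alert counts from the intrusion detection system, one per minute.
alerts = [0, 0, 1, 0, 2, 3, 5, 8, 7, 9]
filtered = kalman_1d(alerts)
print([round(v, 2) for v in filtered])  # a rising filtered level suggests an attack consolidating
```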
17

Chandana, S., B. R. Purnima, and Prabhu Ravikala Vittal. "Classification of Individuals Based on Autonomic Response to Virtual Gaming." Journal of Computational and Theoretical Nanoscience 17, no. 9 (July 1, 2020): 4385–93. http://dx.doi.org/10.1166/jctn.2020.9082.

Abstract:
Modern games consist of digital gaming consoles that involve interaction with a user and an interface that generates visual feedback on a 2D/3D monitor. These games have several psychological side effects, such as loss of spatial awareness, back pain, insomnia, addiction, aggression, stress, and hypertension. Virtual reality (VR) gaming is one of the newest and most rapidly emerging technologies in the field of entertainment, and evaluation of this new technology has become important in order to analyze its effects in comparison with its predecessors (2D and 3D gaming). The main focus of this paper is the detection of stress levels in individuals due to VR gaming and their classification according to sympathetic and parasympathetic dominance. This is done by acquiring electrocardiogram (ECG) and photoplethysmograph (PPG) signals and extracting their time-domain and frequency-domain features before, during, and after gaming (Uysal and Tokmakçi, 2018; da Silva et al., 2018). The physiological signal variation is analyzed by performing heart rate variability (HRV) analysis on the ECG signals, one of the fastest-emerging non-invasive research and clinical tools for assessing autonomic nervous system function (Sztajzel, 2004). Pulse transmission time variability (PTTV), which is also extracted, has high coherence with heart rate variability and is likewise used as an objective measure of stress. In this paper we obtain the response of an individual during VR gaming and correlate it with the HRV/PTT parameters. The game chosen for data acquisition was 'VR city view rope crossing-360 android VR,' during which data recording was performed. It was found that there was a quantitative increase in physiological stress when individuals were exposed to virtual heights compared with unaltered viewing. Mean heart rate showed a significant increase during gaming for both boys and girls, indicating that the body is under sympathetic influence similar to that of physical exercise.
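Time-domain HRV features such as mean heart rate, SDNN, and RMSSD, of the kind extracted in this study, can be computed from a series of RR intervals as in the minimal sketch below; the interval values are invented for illustration and are not the study's recordings.

```python
import math

def hrv_time_domain(rr_ms):
    """Basic time-domain HRV features from RR intervals in milliseconds."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    mean_hr = 60000.0 / mean_rr                      # beats per minute
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (n - 1))
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))
    return {"mean_hr_bpm": mean_hr, "sdnn_ms": sdnn, "rmssd_ms": rmssd}

# Hypothetical RR intervals (ms) recorded before and during VR gaming.
baseline = [812, 798, 825, 840, 805, 818, 832]
gaming   = [640, 655, 628, 672, 661, 645, 650]
print("baseline:", hrv_time_domain(baseline))
print("gaming:  ", hrv_time_domain(gaming))   # higher heart rate, typically lower variability
```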
18

Schram, Alison M., Xing Xu, Outi Kilpivaara, Semanti Mukherjee, Aaron D. Viny, Olga Guryanova, Robert J. Klein, and Ross L. Levine. "Genetic and Functional Investigation of Germline JAK2 Alleles That Predispose to Myeloproliferative Neoplasms." Blood 118, no. 21 (November 18, 2011): 124. http://dx.doi.org/10.1182/blood.v118.21.124.124.

Abstract:
Abstract Abstract 124 A somatic activating mutation in the pseudokinase domain of JAK2 (JAK2V617F) is found in the majority of patients with myeloproliferative neoplams (MPN). Using a genome-wide approach, we and others identified a germline haplotype in the JAK2 locus (rs10974944) that predisposes to the development of JAK2V617F-positive MPN. Importantly, this haplotype is associated with in cis acquisition of the somatic JAK2 mutation. An extended linkage disequilibrium block of 300kb is observed at this locus and others have reported an association between single nucleotide polymorphisms (SNPs) within this haplotype and risk of inflammatory bowel disease consistent with increased JAK-STAT signaling in patients who carry this risk haplotype. The mechanism by which this germline locus contributes to MPN pathogenesis has not been delineated. We hypothesized that the identified allele heightens the risk of developing MPN by either a) increasing the mutational rate at the JAK2 locus, or b) imparting a selective advantage on cells that acquire the somatic mutation through increased JAK2 expression. To address the mutational hypothesis, we performed targeted, high coverage, next-generation sequencing of the entire haplotype and of the entire JAK2 locus in 12 patients homozygous for the risk allele, and in 12 patients without the risk allele. Importantly we did not note an increased rate of somatic mutations (coding or noncoding) in patients homozygous for the risk haplotype. In addition, we expanded our GWAS to include 200 additional cases genotyped using the Illumina 1,000,000 SNP genotyping array. The number of SNPs did not significantly differ between the risk haplotype and non-risk haplotype, further suggesting that there is no increase in mutability attributable to the risk genotype. By constructing a phylogenetic tree, we found that the risk haplotype is ancestral to modern humans and demonstrates evidence of ancestral positive selection, although there was no evidence of recent selection at this locus. Taken together these data suggest that the JAK2 MPN risk hapolotype does not increase the mutational rate at this locus. We next investigated whether the risk allele affects JAK2 expression in hematopoietic cells. We compared the relative abundance of an exonic SNP within the haplotype using matched genomic DNA and cDNA from 8 MPN patients heterozygous for the risk allele. In each case we found that the risk allele was more highly expressed in cDNA compared to the non-risk allele despite similar allelic ratios in genomic DNA. The results suggest an increase in allele-specific expression of JAK2 associated with the JAK2 risk haplotype. We annotated all germline variants in cis with the JAK2 risk haplotype using next generation sequencing data of the entire JAK2 haplotype from MPN patients and from the 1000 Genomes project. We then used Encode ChIP-seq data and the ConSite web-based transcription factor binding prediction model to identify SNPs within the JAK2 haplotype that affect transcription factor binding. We identified a SNP within the JAK2 promoter region, rs1887428, as a potential causative allele because it is significantly associated with MPN (p=9.11E-11) and c-Fos/c-Jun is predicted to preferentially bind to the risk allele. In order to determine if this preferential transcription factor binding leads to a haplotype-specific increase in expression of JAK2, we performed luciferase assays in cells expressing reporter constructs with the two different alleles at rs1887428. 
Importantly, this demonstrated increased transcriptional activity in cells containing the risk allele at rs1887428, suggesting that enhanced transcription factor binding at rs1887428 may lead to increased JAK2 expression and confer a selective advantage on cells containing the risk haplotype. The effects of allelic variation at rs1887428 on JAK2 expression in hematopoietic cells will be presented. Taken together, our data suggests that the JAK2 MPN risk haplotype contributes to MPN pathogenesis through allele-specific transcription factor binding and JAK2 expression, which increases the selective advantage of JAK2 mutations arising on the risk haplotype. This study provides insight into how predisposing loci increase the predisposition to MPN and to other hematopoietic malignancies. Disclosures: No relevant conflicts of interest to declare.
19

D’Ettorre, Claudia, Agostino Stilli, George Dwyer, Maxine Tran, and Danail Stoyanov. "Autonomous pick-and-place using the dVRK." International Journal of Computer Assisted Radiology and Surgery, May 15, 2021. http://dx.doi.org/10.1007/s11548-021-02397-y.

Abstract:
Purpose Robotic-assisted partial nephrectomy (RAPN) is a tissue-preserving approach to treating renal cancer, where ultrasound (US) imaging is used for intra-operative identification of tumour margins and localisation of blood vessels. With the da Vinci Surgical System (Sunnyvale, CA), the US probe is inserted through an auxiliary access port, grasped by the robotic tool and moved over the surface of the kidney. Images from the US probe are displayed separately from the surgical site video within the surgical console, leaving the surgeon to interpret and co-register the information, which is challenging and complicates the procedural workflow. Methods We introduce a novel software architecture to support a hardware soft robotic rail designed to automate intra-operative US acquisition. As a preliminary step towards complete task automation, we automatically grasp the rail and position it on the tissue surface so that the surgeon is then able to manipulate the US probe along it manually. Results A preliminary clinical study, involving five surgeons, was carried out to evaluate the potential performance of the system. Results indicate that the proposed semi-autonomous approach reduced the time needed to complete a US scan compared to manual tele-operation. Conclusion Procedural automation can be an important workflow-enhancing functionality in future robotic surgery systems. We have presented a preliminary study on semi-autonomous US imaging, and this could support more efficient data acquisition.
20

Pednekar, Amol S., Benjamin Y. C. Cheong, and Raja Muthupillai. "Ultrafast Computation of Left Ventricular Ejection Fraction by Using Temporal Intensity Variation in Cine Cardiac Magnetic Resonance." Texas Heart Institute Journal 48, no. 4 (September 1, 2021). http://dx.doi.org/10.14503/thij-20-7238.

Abstract:
Cardiac magnetic resonance enables comprehensive cardiac evaluation; however, intense time and labor requirements for data acquisition and processing have discouraged many clinicians from using it. We have developed an alternative image-processing algorithm that requires minimal user interaction: an ultrafast algorithm that computes left ventricular ejection fraction (LVEF) by using temporal intensity variation in cine balanced steady-state free precession (bSSFP) short-axis images, with or without contrast medium. We evaluated the algorithm's performance against an expert observer's analysis for segmenting the LV cavity in 65 study participants (LVEF range, 12%–70%). In 12 instances, contrast medium was administered before cine imaging. Bland-Altman analysis revealed quantitative effects of LV basal, midcavity, and apical morphologic variation on the algorithm's accuracy. Total computation time for the LV stack was <2.5 seconds. The algorithm accurately delineated endocardial boundaries in 1,132 of 1,216 slices (93%). When contours in the extreme basal and apical slices were not adequate, they were replaced with manually drawn contours. The Bland-Altman mean differences were <1.2 mL (0.8%) for end-diastolic volume, <5 mL (6%) for end-systolic volume, and <3% for LVEF. Standard deviation of the difference was ≤4.1% of LV volume for all sections except the midcavity in end-systole (8.3% of end-systolic volume). We conclude that temporal intensity variation–based ultrafast LVEF computation is clinically accurate across a range of LV shapes and wall motions and is suitable for postcontrast cine SSFP imaging. Our algorithm enables real-time processing of cine bSSFP images on a commercial scanner console within 3 seconds in an unobtrusive automated process.
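The ejection fraction itself is a simple ratio of the end-diastolic and end-systolic volumes that the algorithm segments, and the reported agreement figures come from Bland-Altman analysis. Both are sketched below with invented values; this is an illustration of the definitions, not the authors' implementation.

```python
def ejection_fraction(edv_ml, esv_ml):
    """LVEF (%) from end-diastolic and end-systolic volumes."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

def bland_altman(reference, test):
    """Mean difference (bias) and its standard deviation between two methods."""
    diffs = [t - r for r, t in zip(reference, test)]
    mean_diff = sum(diffs) / len(diffs)
    sd = (sum((d - mean_diff) ** 2 for d in diffs) / (len(diffs) - 1)) ** 0.5
    return mean_diff, sd

# Hypothetical EF values (%): expert observer vs. the automated algorithm.
expert    = [55.0, 38.2, 62.5, 24.8, 47.1]
automated = [54.1, 39.0, 61.8, 26.0, 47.9]
print(round(ejection_fraction(edv_ml=142.0, esv_ml=61.0), 1))   # about 57.0 %
print(bland_altman(expert, automated))                          # (bias, SD of differences)
```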
21

Armstrong, Christopher, Diarmuid Kavanagh, Sara Lal, and Peter Rossiter. "Exploration of Game Consoles as a legitimate computing platform for in-the-field biomedical data acquisition and management." African Journal of Information & Communication Technology 7, no. 1 (February 5, 2013). http://dx.doi.org/10.5130/ajict.v7i1.1287.

Abstract:
Biomedical research increasingly requires testing to be conducted outside the lab, in the field, such as in the participant's home or work environment. This type of research requires semi-autonomous computer systems that collect such data and send it back to the lab for processing and dissemination. A key aspect of this type of research is the selection of the required software and hardware components. These systems need to be reliable, allow considerable customizability, and be readily accessible but also able to be locked down. In this paper we report a set of requirements for the hardware and software of such a system. We then use these requirements to evaluate game consoles as a hardware platform in comparison to other hardware choices.
22

Deac, Gicu-Călin, Crina-Narcisa Georgescu, Cicerone Laurentiu Popa, and Costel Emil Cotet. "Virtual Reality Digital Twin for a Smart Factory." International Journal of Modeling and Optimization, December 2020, 190–95. http://dx.doi.org/10.7763/ijmo.2020.v10.769.

Abstract:
This paper describes the authors' research into developing collaborative virtual reality applications as an interface for monitoring big data by creating a digital twin of the factory and synchronizing the movement of virtual machines with the real ones. The platform allows interactive reading of sensor telemetry and process data, maintenance information, and access to a large technical library. For data acquisition and reporting, a novel image-data method was used. The data values are encoded as pixel colors of images, using a different encoding method for each data type (text, integer, float, Boolean); they are also encrypted using an image as a symmetric encryption key and are stored in the cloud in a time-based folder structure, providing better data compression, security, and speed compared with existing solutions based on JSON and NoSQL databases. The platform allows remote access from the VR environment to the machine consoles and supports parametrization and remote commands.
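A toy version of the two ideas mentioned above, packing a value into pixel bytes and encrypting it against an image used as a symmetric key, is sketched below; the byte layout, the XOR scheme, and the sample readings are assumptions for illustration, not the authors' format.

```python
import struct
import numpy as np

def float_to_pixel(value):
    """Pack a float32 into one RGBA pixel (4 bytes)."""
    return np.frombuffer(struct.pack("<f", value), dtype=np.uint8)

def pixel_to_float(pixel):
    """Recover the float32 from an RGBA pixel."""
    return struct.unpack("<f", bytes(pixel))[0]

def xor_with_key(pixels, key_image):
    """Symmetric 'encryption' by XOR-ing pixel bytes with key-image bytes."""
    key = np.resize(key_image.astype(np.uint8).ravel(), pixels.size)
    return (pixels.ravel() ^ key).reshape(pixels.shape)

# Hypothetical telemetry frame: three sensor readings encoded as pixels.
readings = [21.5, 0.83, 1013.25]
frame = np.vstack([float_to_pixel(v) for v in readings])     # shape (3, 4), dtype uint8
key_image = np.random.default_rng(0).integers(0, 256, size=(3, 4), dtype=np.uint8)

cipher = xor_with_key(frame, key_image)
plain = xor_with_key(cipher, key_image)                      # XOR twice restores the data
print([round(pixel_to_float(px), 2) for px in plain])        # [21.5, 0.83, 1013.25]
```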
23

Moore, Christopher Luke. "Digital Games Distribution: The Presence of the Past and the Future of Obsolescence." M/C Journal 12, no. 3 (July 15, 2009). http://dx.doi.org/10.5204/mcj.166.

Abstract:
A common criticism of the rhythm video games genre — including series like Guitar Hero and Rock Band, is that playing musical simulation games is a waste of time when you could be playing an actual guitar and learning a real skill. A more serious criticism of games cultures draws attention to the degree of e-waste they produce. E-waste or electronic waste includes mobiles phones, computers, televisions and other electronic devices, containing toxic chemicals and metals whose landfill, recycling and salvaging all produce distinct environmental and social problems. The e-waste produced by games like Guitar Hero is obvious in the regular flow of merchandise transforming computer and video games stores into simulation music stores, filled with replica guitars, drum kits, microphones and other products whose half-lives are short and whose obsolescence is anticipated in the annual cycles of consumption and disposal. This paper explores the connection between e-waste and obsolescence in the games industry, and argues for the further consideration of consumers as part of the solution to the problem of e-waste. It uses a case study of the PC digital distribution software platform, Steam, to suggest that the digital distribution of games may offer an alternative model to market driven software and hardware obsolescence, and more generally, that such software platforms might be a place to support cultures of consumption that delay rather than promote hardware obsolescence and its inevitability as e-waste. The question is whether there exists a potential for digital distribution to be a means of not only eliminating the need to physically transport commodities (its current 'green' benefit), but also for supporting consumer practices that further reduce e-waste. The games industry relies on a rapid production and innovation cycle, one that actively enforces hardware obsolescence. Current video game consoles, including the PlayStation 3, the Xbox 360 and Nintendo Wii, are the seventh generation of home gaming consoles to appear within forty years, and each generation is accompanied by an immense international transportation of games hardware, software (in various storage formats) and peripherals. Obsolescence also occurs at the software or content level and is significant because the games industry as a creative industry is dependent on the extensive management of multiple intellectual properties. The computing and video games software industry operates a close partnership with the hardware industry, and as such, software obsolescence directly contributes to hardware obsolescence. The obsolescence of content and the redundancy of the methods of policing its scarcity in the marketplace has been accelerated and altered by the processes of disintermediation with a range of outcomes (Flew). The music industry is perhaps the most advanced in terms of disintermediation with digital distribution at the center of the conflict between the legitimate and unauthorised access to intellectual property. This points to one issue with the hypothesis that digital distribution can lead to a reduction in hardware obsolescence, as the marketplace leader and key online distributor of music, Apple, is also the major producer of new media technologies and devices that are the paragon of stylistic obsolescence. Stylistic obsolescence, in which fashion changes products across seasons of consumption, has long been observed as the dominant form of scaled industrial innovation (Slade). 
Stylistic obsolescence is differentiated from mechanical or technological obsolescence as the deliberate supersedence of products by more advanced designs, better production techniques and other minor innovations. The line between the stylistic and technological obsolescence is not always clear, especially as reduced durability has become a powerful market strategy (Fitzpatrick). This occurs where the design of technologies is subsumed within the discourses of manufacturing, consumption and the logic of planned obsolescence in which the product or parts are intended to fail, degrade or under perform over time. It is especially the case with signature new media technologies such as laptop computers, mobile phones and portable games devices. Gamers are as guilty as other consumer groups in contributing to e-waste as participants in the industry's cycles of planned obsolescence, but some of them complicate discussions over the future of obsolescence and e-waste. Many gamers actively work to forestall the obsolescence of their games: they invest time in the play of older games (“retrogaming”) they donate labor and creative energy to the production of user-generated content as a means of sustaining involvement in gaming communities; and they produce entirely new game experiences for other users, based on existing software and hardware modifications known as 'mods'. With Guitar Hero and other 'rhythm' games it would be easy to argue that the hardware components of this genre have only one future: as waste. Alternatively, we could consider the actual lifespan of these objects (including their impact as e-waste) and the roles they play in the performances and practices of communities of gamers. For example, the Elmo Guitar Hero controller mod, the Tesla coil Guitar Hero controller interface, the Rock Band Speak n' Spellbinder mashup, the multiple and almost sacrilegious Fender guitar hero mods, the Guitar Hero Portable Turntable Mod and MAKE magazine's Trumpet Hero all indicate a significant diversity of user innovation, community formation and individual investment in the post-retail life of computer and video game hardware. Obsolescence is not just a problem for the games industry but for the computing and electronics industries more broadly as direct contributors to the social and environmental cost of electrical waste and obsolete electrical equipment. Planned obsolescence has long been the experience of gamers and computer users, as the basis of a utopian mythology of upgrades (Dovey and Kennedy). For PC users the upgrade pathway is traversed by the consumption of further hardware and software post initial purchase in a cycle of endless consumption, acquisition and waste (as older parts are replaced and eventually discarded). The accumulation and disposal of these cultural artefacts does not devalue or accrue in space or time at the same rate (Straw) and many users will persist for years, gradually upgrading and delaying obsolescence and even perpetuate the circulation of older cultural commodities. Flea markets and secondhand fairs are popular sites for the purchase of new, recent, old, and recycled computer hardware, and peripherals. Such practices and parallel markets support the strategies of 'making do' described by De Certeau, but they also continue the cycle of upgrade and obsolescence, and they are still consumed as part of the promise of the 'new', and the desire of a purchase that will finally 'fix' the users' computer in a state of completion (29). 
The planned obsolescence of new media technologies is common, but its success is mixed; for example, support for Microsoft's operating system Windows XP was officially withdrawn in April 2009 (Robinson), but due to the popularity in low cost PC 'netbooks' outfitted with an optimised XP operating system and a less than enthusiastic response to the 'next generation' Windows Vista, XP continues to be popular. Digital Distribution: A Solution? Gamers may be able to reduce the accumulation of e-waste by supporting the disintermediation of the games retail sector by means of online distribution. Disintermediation is the establishment of a direct relationship between the creators of content and their consumers through products and services offered by content producers (Flew 201). The move to digital distribution has already begun to reduce the need to physically handle commodities, but this currently signals only further support of planned, stylistic and technological obsolescence, increasing the rate at which the commodities for recording, storing, distributing and exhibiting digital content become e-waste. Digital distribution is sometimes overlooked as a potential means for promoting communities of user practice dedicated to e-waste reduction, at the same time it is actively employed to reduce the potential for the unregulated appropriation of content and restrict post-purchase sales through Digital Rights Management (DRM) technologies. Distributors like Amazon.com continue to pursue commercial opportunities in linking the user to digital distribution of content via exclusive hardware and software technologies. The Amazon e-book reader, the Kindle, operates via a proprietary mobile network using a commercially run version of the wireless 3G protocols. The e-book reader is heavily encrypted with Digital Rights Management (DRM) technologies and exclusive digital book formats designed to enforce current copyright restrictions and eliminate second-hand sales, lending, and further post-purchase distribution. The success of this mode of distribution is connected to Amazon's ability to tap both the mainstream market and the consumer demand for the less-than-popular; those books, movies, music and television series that may not have been 'hits' at the time of release. The desire to revisit forgotten niches, such as B-sides, comics, books, and older video games, suggests Chris Anderson, linked with so-called “long tail” economics. Recently Webb has queried the economic impact of the Long Tail as a business strategy, but does not deny the underlying dynamics, which suggest that content does not obsolesce in any straightforward way. Niche markets for older content are nourished by participatory cultures and Web 2.0 style online services. A good example of the Long Tail phenomenon is the recent case of the 1971 book A Lion Called Christian, by Anthony Burke and John Rendall, republished after the author's film of a visit to a resettled Christian in Africa was popularised on YouTube in 2008. Anderson's Long Tail theory suggests that over time a large number of items, each with unique rather than mass histories, will be subsumed as part of a larger community of consumers, including fans, collectors and everyday users with a long term interest in their use and preservation. 
If digital distribution platforms can reduce e-waste, they can perhaps be fostered by ensuring not only that digital consumers are able to make morally and ethically aware consumer decisions, but also that they enjoy traditional consumer freedoms, such as the right to sell on and change or modify their property. For it is not only the fixation on the 'next generation' that contributes to obsolescence, but also technologies like DRM systems that discourage second-hand sales and restrict modification. The legislative upgrades, patches and amendments to copyright law that have attempted to maintain the law's effectiveness in competing with peer-to-peer networks have supported DRM and other intellectual property enforcement technologies, despite the difficulties that owners of intellectual property have encountered with the effectiveness of DRM systems (Moore, Creative). The games industry continues to experiment with DRM; however, this industry also stands out as one of the few to have significantly incorporated the user within the official modes of production (Moore, Commonising). Is the games industry capable of supporting (or willing to support) a digital delivery system that attempts to minimise or even reverse software and hardware obsolescence? We can try to answer this question by looking in detail at the biggest digital distributor of PC games, Steam. Steam [Figure 1: The Steam Application user interface, retail section] Steam is a digital distribution system designed for the Microsoft Windows operating system and operated by the American video game development company and publisher, Valve Corporation. Steam combines online games retail, DRM technologies and internet-based distribution services with social networking and multiplayer features (in-game voice and text chat, user profiles, etc.) and direct support for major games publishers, independent producers, and communities of user-contributors (modders). Steam, like the iTunes games store, Xbox Live and other digital distributors, provides consumers with direct digital downloads of new, recent and classic titles that can be accessed remotely by the user from any (internet-equipped) location. Steam was first packaged with the physical distribution of Half-Life 2 in 2004, and the platform's eventual popularity is tied to the success of that game franchise. Steam was not an optional component of the game's installation and many gamers protested in various online forums, while the platform was treated with suspicion by the global PC games press. It did not help that Steam was at launch everything that gamers object to: a persistent and initially 'buggy' piece of software that sits in the PC's operating system and occupies limited memory resources at the cost of hardware performance. Regular updates to the Steam software platform introduced social network features just as mainstream sites like MySpace and Facebook were emerging, and its popularity has undergone rapid subsequent growth. Steam now eclipses competitors with more than 20 million user accounts (Leahy) and Valve Corporation makes it publicly known that Steam collects large amounts of data about its users. This information is available via the public player profile in the community section of the Steam application. It includes the average number of hours the user plays per week, and can even indicate the difficulty the user has in navigating game obstacles. 
Valve reports on the number of users on Steam every two hours via its web site, with a population on average between one and two million simultaneous users (Valve, Steam). We know these users’ hardware profiles because Valve Corporation makes the results of its surveillance public knowledge via the Steam Hardware Survey. Valve’s hardware survey itself conceptualises obsolescence in two ways. First, it uses the results to define the 'cutting edge' of PC technologies and publishes the standards of its own high-end production hardware on the company's blog. Second, the effect of the Survey is to subsequently define obsolescent hardware: for example, in the Survey results for April 2009, we can see that a slight majority of users maintain computers with two central processing units while a significant proportion (almost one third) of users still maintain much older PCs with a single CPU. Both effects of the Survey appear to be well understood by Valve: “the Steam Hardware Survey automatically collects information about the community's computer hardware configurations and presents an aggregate picture of the stats on our web site. The survey helps us make better engineering and gameplay decisions, because it makes sure we're targeting machines our customers actually use, rather than measuring only against the hardware we've got in the office. We often get asked about the configuration of the machines we build around the office to do both game and Steam development. We also tend to turn over machines in the office pretty rapidly, at roughly every 18 months” (Valve, Team Fortress). Valve’s support of older hardware might counter perceptions that older PCs have no use and begins to reverse decades of opinion regarding planned and stylistic obsolescence in the PC hardware and software industries. Equally significant to the extension of the lives of older PCs is Steam's support for mods and its promotion of user-generated content. By providing software for mod creation and distribution, Steam maximises what Postigo calls the development potential of fan-programmers. One of the 'payoffs' in the information/access exchange for the user with Steam is the degree to which Valve's End-User Licence Agreement (EULA) permits individuals and communities of 'modders' to appropriate its proprietary game content for use in the creation of new games and games materials for redistribution via Steam. These mods extend the play of the older games by requiring their purchase via Steam in order for the individual user to participate in the modded experience. If Steam is able to encourage this kind of appropriation and community support for older content, then the potential exists for it to support cultures of consumption and practices of use that collaboratively maintain, extend, and prolong the life and use of games. Further, Steam incorporates the insights of “long tail” economics in a purely digital distribution model, in which the obsolescence of 'non-hit' game titles can be dramatically overturned. Published in November 2007, Unreal Tournament 3 (UT3) by Epic Games was unappreciated in a market saturated with games in the first-person shooter genre. Epic republished UT3 on Steam 18 months later, making the game available to play for free for one weekend, followed by discounted access to new content. 
The 2000 per cent increase in players over the game's 'free' trial weekend has translated into enough sales of the game for Epic to no longer consider the release a commercial failure: “It’s an incredible precedent to set: making a game a success almost 18 months after a poor launch. It’s something that could only have happened now, and with a system like Steam... Something that silently updates a purchase with patches and extra content automatically, so you don’t have to make the decision to seek out some exciting new feature: it’s just there anyway. Something that, if you don’t already own it, advertises that game to you at an agreeably reduced price whenever it loads. Something that enjoys a vast community who are in turn plugged into a sea of smaller relevant communities. It’s incredibly sinister. It’s also incredibly exciting...” (Meer). Clearly concerns exist about Steam's user privacy policy, but this also invites us to think about the economic relationship between gamers and games companies as it is reconfigured through the private contractual relationship established by the EULA which accompanies the digital distribution model. The games industry has established contractual and licensing arrangements with its consumer base in order to support and reincorporate emerging trends in user-generated cultures and other cultural formations within its official modes of production (Moore, "Commonising"). When we consider that Valve gets to tax sales of its virtual goods and can further sell the information farmed from its users to hardware manufacturers, it is reasonable to consider the relationship between the corporation and its gamers as exploitative. Gabe Newell, the Valve co-founder and managing director, conversely believes that people are willing to give up personal information if they feel it is being used to get better services (Leahy). If that sentiment is correct, then consumers may be willing to further trade for services that can reduce obsolescence and begin to address the problems of e-waste from the ground up. Conclusion Clearly, there is a potential for digital distribution to be a means of not only eliminating the need to physically transport commodities but also supporting consumer practices that further reduce e-waste. For an industry where only a small proportion of the games made break even, the successful relaunch of older games content indicates Steam's capacity to ameliorate software obsolescence. Digital distribution extends the use of commercially released games by providing disintermediated access to older and user-generated content. For Valve, this occurs within a network of exchange as access to user-generated content, social networking services, and support for the organisation and coordination of communities of gamers is traded for user information and repeat business. Evidence for whether this will actively translate to an equivalent decrease in the obsolescence of game hardware might be observed with indicators like the Steam Hardware Survey in the future. The degree of potential offered by digital distribution is disrupted by a range of technical, commercial and legal hurdles, chief among which is the deployment of DRM as part of a range of techniques designed to limit consumer behaviour post purchase. 
While intervention in the form of legislation and radical change to the insidious nature of electronics production is crucial in order to achieve long-term reduction in e-waste, the user is currently considered only in terms of 'ethical' consumption and ultimately divested of responsibility through participation in corporate, state and civil recycling and e-waste management operations. The message is either 'careful what you purchase' or 'careful how you throw it away' and, like DRM, ignores the connections between product, producer and user and the consumer support for environmentally, ethically and socially positive production, distribution, disposal and recycling. This article has adopted a different strategy, one that sees digital distribution platforms like Steam as capable, if not currently active, in supporting community practices that should be seriously considered in conjunction with a range of approaches to the challenge of obsolescence and e-waste. References Anderson, Chris. "The Long Tail." Wired Magazine 12.10 (2004). 20 Apr. 2009 ‹http://www.wired.com/wired/archive/12.10/tail.html›. De Certeau, Michel. The Practice of Everyday Life. Berkeley: U of California P, 1984. Dovey, Jon, and Helen Kennedy. Game Cultures: Computer Games as New Media. London: Open University Press, 2006. Fitzpatrick, Kathleen. The Anxiety of Obsolescence. Nashville: Vanderbilt UP, 2008. Flew, Terry. New Media: An Introduction. South Melbourne: Oxford UP, 2008. Leahy, Brian. "Live Blog: DICE 2009 Keynote - Gabe Newell, Valve Software." The Feed. G4TV 18 Feb. 2009. 16 Apr. 2009 ‹http://g4tv.com/thefeed/blog/post/693342/Live-Blog-DICE-2009-Keynote-–-Gabe-Newell-Valve-Software.html›. Meer, Alec. "Unreal Tournament 3 and the New Lazarus Effect." Rock, Paper, Shotgun 16 Mar. 2009. 24 Apr. 2009 ‹http://www.rockpapershotgun.com/2009/03/16/unreal-tournament-3-and-the-new-lazarus-effect/›. Moore, Christopher. "Commonising the Enclosure: Online Games and Reforming Intellectual Property Regimes." Australian Journal of Emerging Technologies and Society 3.2 (2005). 12 Apr. 2009 ‹http://www.swin.edu.au/sbs/ajets/journal/issue5-V3N2/abstract_moore.htm›. Moore, Christopher. "Creative Choices: Changes to Australian Copyright Law and the Future of the Public Domain." Media International Australia 114 (Feb. 2005): 71–83. Postigo, Hector. "Of Mods and Modders: Chasing Down the Value of Fan-Based Digital Game Modification." Games and Culture 2 (2007): 300-13. Robinson, Daniel. "Windows XP Support Runs Out Next Week." PC Business Authority 8 Apr. 2009. 16 Apr. 2009 ‹http://www.pcauthority.com.au/News/142013,windows-xp-support-runs-out-next-week.aspx›. Slade, Giles. Made to Break: Technology and Obsolescence in America. Cambridge: Harvard UP, 2006. Straw, Will. "Exhausted Commodities: The Material Culture of Music." Canadian Journal of Communication 25.1 (2000): 175. Valve. "Steam and Game Stats." 26 Apr. 2009 ‹http://store.steampowered.com/stats/›. Valve. "Team Fortress 2: The Scout Update." Steam Marketing Message 20 Feb. 2009. 12 Apr. 2009 ‹http://storefront.steampowered.com/Steam/Marketing/message/2269/›. Webb, Richard. "Online Shopping and the Harry Potter Effect." New Scientist 2687 (2008): 52-55. 16 Apr. 2009 ‹http://www.newscientist.com/article/mg20026873.300-online-shopping-and-the-harry-potter-effect.html?page=2›. With thanks to Dr Nicola Evans and Dr Frances Steel for their feedback and comments on drafts of this paper.
APA, Harvard, Vancouver, ISO, and other citation styles
24

Newman, James. "Save the Videogame! The National Videogame Archive: Preservation, Supersession and Obsolescence." M/C Journal 12, no. 3 (July 15, 2009). http://dx.doi.org/10.5204/mcj.167.

Full text source
Abstract:
Introduction In October 2008, the UK’s National Videogame Archive became a reality and after years of negotiation, preparation and planning, this partnership between Nottingham Trent University’s Centre for Contemporary Play research group and The National Media Museum accepted its first public donations to the collection. These first donations came from Sony Computer Entertainment Europe’s London Studios, who presented the original, pre-production PlayStation 2 EyeToy camera (complete with its hand-written #1 sticker), and Harmonix, who crossed the Atlantic to deliver prototypes of the Rock Band drum kit and guitar controllers along with a slew of games. Since then, we have been inundated with donations, enquiries and volunteers offering their services and it is clear that we have exciting and challenging times ahead of us at the NVA as we seek to continue our collecting programme and preserve, conserve, display and interpret these vital parts of popular culture. This essay, however, is not so much a document of these possible futures for our research or the challenges we face in moving forward as it is a discussion of some of the issues that make game preservation a vital and timely undertaking. In briefly telling the story of the genesis of the NVA, I hope to draw attention to some of the peculiarities (in both senses) of the situation in which videogames currently exist. While considerable attention has been paid to the preservation and curation of new media arts (e.g. Cook et al.), comparatively little work has been undertaken in relation to games. Surprisingly, the games industry has been similarly neglectful of the histories of gameplay and gamemaking. Throughout our research, it has become abundantly clear that even those individuals and companies most intimately associated with the development of this form do not hold their corporate and personal histories in the high esteem we expected (see also Lowood et al.). And so, despite the well-worn bluster of an industry that proclaims itself as culturally significant as Hollywood, it is surprisingly difficult to find a definitive copy of the boxart of the final release of a Triple-A title, let alone any of the pre-production materials. Through our journeys in the past couple of years, we have encountered shoeboxes under CEOs’ desks and proud parents’ collections of tapes and press cuttings. These are the closest things to a formalised archive that we currently have for many of the biggest British game development and publishing companies. Not only is this problematic in and of itself as we run the risk of losing titles and documents forever as well as the stories locked up in the memories of key individuals who grow ever older, but also it is symptomatic of an industry that, despite its public proclamations, neither places a high value on its products as popular culture nor truly recognises their impact on that culture. While a few valorised, still-ongoing franchises like the Super Mario and Legend of Zelda series are repackaged and (digitally) re-released so as to provide continuity with current releases, a huge number of games simply disappear from view once their short period of retail limelight passes. 
Indeed, my argument in this essay rests to some extent on the admittedly polemical, and maybe even antagonistic, assertion that the past business and marketing practices of the videogames industry are partly to blame for the comparatively underdeveloped state of game preservation and the seemingly low cultural value placed on old games within the mainstream marketplace. Small wonder, then, that archives and formalised collections are not widespread. However antagonistic this point may seem, this essay does not set out merely to criticise the games industry. Indeed, it is important to recognise that the success and viability of projects such as the NVA is derived partly from close collaboration with industry partners. As such, it is my hope that in addition to contributing to the conversation about the importance and need for formalised strategies of game preservation, this essay goes some way to demonstrating the necessity of universities, museums, developers, publishers, advertisers and retailers tackling these issues in partnership. The Best Game Is the Next Game As will be clear from these opening paragraphs, this essay is primarily concerned with ‘old’ games. Perhaps surprisingly, however, we shall see that ‘old’ games are frequently not that old at all as even the shiniest, and newest of interactive experiences soon slip from view under the pressure of a relentless industrial and institutional push towards the forthcoming release and the ‘next generation’. More surprising still is that ‘old’ games are often difficult to come by as they occupy, at best, a marginalised position in the contemporary marketplace, assuming they are even visible at all. This is an odd situation. Videogames are, as any introductory primer on game studies will surely reveal, big business (see Kerr, for instance, as well as trade bodies such as ELSPA and The ESA for up-to-date sales figures). Given the videogame industry seems dedicated to growing its business and broadening its audiences (see Radd on Sony’s ‘Game 3.0’ strategy, for instance), it seems strange, from a commercial perspective if no other, that publishers’ and developers’ back catalogues are not being mercilessly plundered to wring the last pennies of profit from their IPs. Despite being cherished by players and fans, some of whom are actively engaged in their own private collecting and curation regimes (sometimes to apparently obsessive excess as Jones, among others, has noted), videogames have, nonetheless, been undervalued as part of our national popular cultural heritage by institutions of memory such as museums and archives which, I would suggest, have largely ignored and sometimes misunderstood or misrepresented them. Most of all, however, I wish to draw attention to the harm caused by the videogames industry itself. Consumers’ attentions are focused on ‘products’, on audiovisual (but mainly visual) technicalities and high-definition video specs rather than on the experiences of play and performance, or on games as artworks or artefact. Most damagingly, however, by constructing and contributing to an advertising, marketing and popular critical discourse that trades almost exclusively in the language of instant obsolescence, videogames have been robbed of their historical value and old platforms and titles are reduced to redundant, legacy systems and easily-marginalised ‘retro’ curiosities. 
The vision of inevitable technological progress that the videogames industry trades in reminds us of Paul Duguid’s concept of ‘supersession’ (see also Giddings and Kennedy on the ‘technological imaginary’). Duguid identifies supersession as one of the key tropes in discussions of new media. The reductive idea that each new form subsumes and replaces its predecessor means that videogames are, to some extent, bound up in the same set of tensions that undermine the longevity of all new media. Chun rightly notes that, in contrast with more open terms like multimedia, ‘new media’ has always been somewhat problematic. Unaccommodating, ‘it portrayed other media as old or dead; it converged rather than multiplied; it did not efface itself in favor of a happy if redundant plurality’ (1). The very newness of new media and of videogames as the apotheosis of the interactivity and multimodality they promise (Newman, "In Search"), their gleam and shine, is quickly tarnished as they are replaced by ever-newer, ever more exciting, capable and ‘revolutionary’ technologies whose promise and moment in the limelight is, in turn, equally fleeting. As Franzen has noted, obsolescence and the trail of abandoned, superseded systems are a natural, even planned-for, product of an infatuation with the newness of new media. For Kline et al., the obsession with obsolescence leads to the characterisation of the videogames industry as a ‘perpetual innovation economy’ whose institutions ‘devote a growing share of their resources to the continual alteration and upgrading of their products’. However, it is my contention here that the supersessionary tendency exerts a more serious impact on videogames than some other media, partly because the apparently natural logic of obsolescence and technological progress goes largely unchecked and partly because there remain few institutions dedicated to considering and acting upon game preservation. The simple fact, as Lowood et al. have noted, is that material damage is being done as a result of this manufactured sense of continual progress and immediate, irrefutable obsolescence. By focusing on the upcoming new release and the preview of what is yet to come, and by exciting gamers about what is in development, this discourse demonstrates the manifest ways in which the sheen of the new inevitably tarnishes the old: that which is replaced is fit only for the bargain bin or the budget-priced collection download, and as such, it is my position that we are systematically undermining and perhaps even eradicating the possibility of a thorough and well-documented history for videogames. This is a situation that we at the National Videogame Archive, along with colleagues in the emerging field of game preservation (e.g. the International Game Developers Association Game Preservation Special Interest Group, and the Keeping Emulation Environments Portable project) are, naturally, keen to address. Chief amongst our concerns is better understanding how it has come to be that, in 2009, game studies scholars and colleagues from across the memory and heritage sectors are still only at the beginning of the process of considering game preservation. The IGDA Game Preservation SIG was founded only five years ago and its ‘White Paper’ (Lowood et al.) has just been published. 
Surprisingly, despite the importance of videogames within popular culture and the emergence and consolidation of the industry as a potent creative force, there remains comparatively little academic commentary or investigation into the specific situation and life-cycles of games or the demands that they place upon archivists and scholars of digital histories and cultural heritage. As I hope to demonstrate in this essay, one of the key tasks of the project of game preservation is to draw attention to the consequences of the concentration on, even fetishisation of, the next generation, the new and the forthcoming. The focus on what I have termed ‘the lure of the imminent’ (e.g. Newman, Playing), the fixation on not only the present but also the as-yet-unreleased next generation, has contributed to the normalisation of the discourses of technological advancement and the inevitability and finality of obsolescence. The conflation of gameplay pleasure and cultural import with technological – and indeed, usually visual – sophistication gives rise to a context of endless newness, within which there appears to be little space for the ‘outdated’, the ‘superseded’ or the ‘old’. In a commercial and cultural space in which so little value is placed upon anything but the next game, we risk losing touch with the continuities of development and the practices of play while simultaneously robbing players and scholars of the critical tools and resources necessary for contextualised appreciation and analysis of game form and aesthetics, for instance (see Monnens, "Why", for more on the value of preserving ‘old’ games for analysis and scholarship). Moreover, we risk losing specific games, platforms, artefacts and products as they disappear into the bargain bucket or crumble to dust as media decay, deterioration and ‘bit rot’ (Monnens, "Losing") set in. Space does not here permit a discussion of the scope and extent of the preservation work required (for instance, the NVA sets its sights on preserving, documenting, interpreting and exhibiting ‘videogame culture’ in its broadest sense and recognises the importance of videogames as more than just code and as enmeshed within complex networks of productive, consumptive and performative practices). Neither is it my intention to discuss here the specific challenges and numerous issues associated with archival and exhibition tools such as emulation, which seek to rebirth code on up-to-date, manageable, well-supported hardware platforms but which are frequently insensitive to the specificities and nuances of the played experience (see Newman, "On Emulation", for some further notes on videogame emulation, archiving and exhibition, and Takeshita’s comments in Nutt on the technologies and aesthetics of glitches, for instance). Each of these issues is vitally important and will doubtless become a part of the forthcoming research agenda for game preservation scholars. My focus here, however, is rather more straightforward and foundational and, though it is deliberately controversial, it is my hope that it casts some light on some ingrained assumptions about videogames and the magnitude and urgency of the game preservation project. Videogames Are Disappearing? 
At a time when retailers’ shelves struggle under the weight of newly-released titles and digital distribution systems such as Steam, the PlayStation Network, Xbox Live Marketplace, WiiWare, DSiWare et al. bring new ways to purchase and consume playable content, it might seem strange to suggest that videogames are disappearing. In addition to what we have perhaps come to think of as the ‘usual suspects’ in the hardware and software publishing marketplace, over the past year or so Apple have, unexpectedly and perhaps even surprising themselves, carved out a new gaming platform with the iPhone/iPod Touch and have dramatically simplified the notoriously difficult process of distributing mobile content with the iTunes App Store. In the face of this apparent glut of games and the emergence and (re)discovery of new markets with the iPhone, Wii and Nintendo DS, videogames seem an ever more vital and visible part of popular culture. Yet, for all their commercial success and seeming penetration, the simple fact is that they are disappearing. And at an alarming rate. Addressing the IGDA community of game developers and producers, Henry Lowood makes the point with admirable clarity (see also Ruggill and McAllister): If we fail to address the problems of game preservation, the games you are making will disappear, perhaps within a few decades. You will lose access to your own intellectual property, you will be unable to show new developers the games you designed or that inspired you, and you may even find it necessary to re-invent a bunch of wheels. (Lowood et al. 1) For me, this point hit home most persuasively a few years ago when, along with Iain Simons, I was invited by the British Film Institute to contribute a book to their ‘Screen Guides’ series. 100 Videogames (Newman and Simons) was an intriguing prospect that provided us with the challenge and opportunity to explore some of the key moments in videogaming’s forty-year history. However, although the research and writing processes proved to be an immensely pleasurable and rewarding experience that we hope culminated in an accessible, informative volume offering insight into some well-known (and some less well-known) games, the project was ultimately tinged with more than a little disappointment and frustration. Assuming our book had successfully piqued the interest of our readers into rediscovering games previously played or perhaps investigating games for the first time, what could they then do? Where could they go to find these games in order to experience their delights (or their flaws and problems) at first hand? Had our volume been concerned with television or film, as most of the Screen Guides are, then online and offline retailers, libraries, and even archives for less widely-available materials, would have been obvious ports of call. For the student of videogames, however, the choices are not so much limited as practically non-existent. It is only comparatively recently that videogame retailers have shifted away from an almost exclusive focus on new releases and the zeitgeist platforms towards a recognition of old games and systems through the creation of the ‘pre-owned’ marketplace. The ‘pre-owned’ transaction is one in which old titles may be traded in for cash or against the purchase of new releases of hardware or software. Surely, then, this represents the commercial viability of classic games and is a recognition on the part of retail that the new release is not the only game in town. 
Yet, if we consider more carefully the ‘pre-owned’ model, we find a few telling points. First, there is cold economic sense to the pre-owned business model. In their financial statements for FY08, GAME revealed that the service ‘isn’t just a key part of its offer to consumers, but it also represents an “attractive” gross margin of 39 per cent’ (French). Second, and most important, the premise of the pre-owned business as it is communicated to consumers still offers nothing but primacy to the new release. That one would trade in one’s old games in order to consume these putatively better new ones speaks eloquently in the language of obsolescence and what Dovey and Kennedy have called the ‘technological imaginary’. The wire mesh buckets of old, pre-owned games are not displayed or coded as treasure troves for the discerning or completist collector but rather are nothing more than bargain bins. These are not classic games. These are cheap games. Cheap because they are old. Cheap because they have had their day. This is a curious situation that affects videogames most unfairly. Of course, my caricature of the videogame retailer is still incomplete as a good deal of the instantly visible shopfloor space is dedicated neither to pre-owned nor new releases but rather to displays of empty boxes often sporting unfinalised, sometimes mocked-up, boxart flaunting titles available for pre-order. Titles you cannot even buy yet. In the videogames marketplace, even the present is not exciting enough. The best game is always the next game. Importantly, retail is not alone in manufacturing this sense of dissatisfaction with the past and even the present. The specialist videogames press plays at least as important a role in reinforcing and normalising the supersessionary discourse of instant obsolescence by fixing readers’ attentions and expectations on the just-visible horizon. Examining the pages of specialist gaming publications reveals them to be something akin to Futurist paeans, dedicating anything from 70 to 90% of their non-advertising pages to previews and interviews with developers about still-in-development titles (see Newman, Playing, for more on the specialist gaming press’ love affair with the next generation and the NDA scoop). Though a small number of publications specifically address retro titles (e.g. Imagine Publishing’s Retro Gamer), most titles are essentially vehicles to promote current and future product lines, with many magazines essentially operating as delivery devices for cover-mounted CDs/DVDs offering teaser videos or playable demos of forthcoming titles to further whet the appetite. Manufacturing a sense of excitement might seem wholly natural and perhaps even desirable in helping to maintain a keen interest in gaming culture, but the imbalance of popular coverage has a potentially deleterious effect on the status of superseded titles. Xbox World 360’s magnificently-titled ‘Anticip–O–Meter’ ™ does more than simply build anticipation. Like regular features that run under headings such as ‘The Next Best Game in The World Ever is…’, it seeks to author not so much excitement about the imminent release as a dissatisfaction with the present, with which unfavourable comparisons are inevitably drawn. The current or previous crop of (once new, let us not forget) titles are not simply superseded but rather are reinvented as yardsticks to judge the prowess of the even newer and unarguably ‘better’. 
As Ashton has noted, the continual promotion of the impressiveness of the next generation requires a delicate balancing act and a selective, institutionalised system of recall and forgetting that recovers the past as a suite of (often technical) benchmarks (twice as many polygons, higher resolution etc.) In the absence of formalised and systematic collecting, these obsoleted titles run the risk of being forgotten forever once they no longer serve the purpose of demonstrating the comparative advancement of the successors. The Future of Videogaming’s Past Even if we accept the myriad claims of game studies scholars that videogames are worthy of serious interrogation in and of themselves and as part of a multifaceted, transmedial supersystem, we might be tempted to think that the lack of formalised collections, archival resources and readily available ‘old/classic’ titles at retail is of no great significance. After all, as Jones has observed, the videogame player is almost primed to undertake this kind of activity as gaming can, at least partly, be understood as the act and art of collecting. Games such as Animal Crossing make this tendency most manifest by challenging their players to collect objects and artefacts – from natural history through to works of visual art – so as to fill the initially-empty in-game Museum’s cases. While almost all videogames from The Sims to Katamari Damacy can be considered to engage their players in collecting and collection management work to some extent, Animal Crossing is perhaps the most pertinent example of the indivisibility of the gamer/archivist. Moreover, the permeability of the boundary between the fan’s collection of toys, dolls, posters and the other treasured objects of merchandising and the manipulation of inventories, acquisitions and equipment lists that we see in the menus and gameplay imperatives of videogames ensures an extensiveness and scope of fan collecting and archival work. Similarly, the sociality of fan collecting and the value placed on private hoarding, public sharing and the processes of research ‘…bridges to new levels of the game’ (Jones 48). Perhaps we should be as unsurprised that their focus on collecting makes videogames similar to eBay as we are to the realisation that eBay with its competitiveness, its winning and losing states, and its inexorable countdown timer, is nothing if not a game? We should be mindful, however, of overstating the positive effects of fandom on the fate of old games. Alongside eBay’s veneration of the original object, p2p and bittorrent sites reduce the videogame to its barest. Quite apart from the (il)legality of emulation and videogame ripping and sharing (see Conley et al.), the existence of ‘ROMs’ and the technicalities of their distribution reveals much about the peculiar tension between the interest in old games and their putative cultural and economic value. (St)ripped down to the barest of code, ROMs deny the gamer the paratextuality of the instruction manual or boxart. In fact, divorced from its context and robbed of its materiality, ROMs perhaps serve to make the original game even more distant. More tellingly, ROMs are typically distributed by the thousand in zipped files. And so, in just a few minutes, entire console back-catalogues – every game released in every territory – are available for browsing and playing on a PC or Mac. 
The completism of the collections allows detailed scrutiny of differences in Japanese versus European releases, for instance, and can be seen as a vital investigative resource. However, that these ROMs are packaged into collections of many thousands speaks implicitly of these games’ perceived value. In a similar vein, the budget-priced retro re-release collection helps to diminish the value of each constituent game and serves to simultaneously manufacture and highlight the manifestly unfair comparison between these intriguingly retro curios and the legitimately full-priced games of now and next. Customer comments at Amazon.co.uk demonstrate the way in which historical and technological comparisons are now solidly embedded within the popular discourse (see also Newman 2009b). Leaving feedback on Sega’s PS3/Xbox 360 Sega MegaDrive Ultimate Collection, customers berate the publisher for the apparently meagre selection of titles on offer. Interestingly, this charge seems based less on the quality, variety or range of the collection than on jarring technological schisms and a clear sense of these titles being of necessarily and inevitably diminished monetary value. Comments range from outraged consternation, ‘Wtf, only 40 games?’, ‘I wont be getting this as one disc could hold the entire arsenal of consoles and games from commodore to sega saturn(Maybe even Dreamcast’ through to more detailed analyses that draw attention to the number of bits and bytes but that notably neglect any consideration of gameplay, experientiality, cultural significance or, heaven forbid, fun. “Ultimate” Collection? 32Mb of games on a Blu-ray disc?…here are 40 Megadrive games at a total of 31 Megabytes of data. This was taking the Michael on a DVD release for the PS2 (or even on a UMD for the PSP), but for a format that can store 50 Gigabytes of data, it’s an insult. Sega’s entire back catalogue of Megadrive games only comes to around 800 Megabytes - they could fit that several times over on a DVD. The ultimate consequence of these different but complementary attitudes to games that fix attentions on the future and package up decontextualised ROMs by the thousand or even collections of 40 titles on a single disc (selling for less than half the price of one of the original cartridges) is a disregard – perhaps even a disrespect – for ‘old’ games. Indeed, it is this tendency, this dominant discourse of inevitable, natural and unimpeachable obsolescence and supersession, that provided one of the prime motivators for establishing the NVA. As Lowood et al. note in the title of the IGDA Game Preservation SIG’s White Paper, we need to act to preserve and conserve videogames ‘before it’s too late’. References Ashton, D. ‘Digital Gaming Upgrade and Recovery: Enrolling Memories and Technologies as a Strategy for the Future.’ M/C Journal 11.6 (2008). 13 Jun 2009 ‹http://journal.media-culture.org.au/index.php/mcjournal/article/viewArticle/86›. Buffa, C. ‘How to Fix Videogame Journalism.’ GameDaily 20 July 2006. 13 Jun 2009 ‹http://www.gamedaily.com/articles/features/how-to-fix-videogame-journalism/69202/?biz=1›. ———. ‘Opinion: How to Become a Better Videogame Journalist.’ GameDaily 28 July 2006. 13 Jun 2009 ‹http://www.gamedaily.com/articles/features/opinion-how-to-become-a-better-videogame-journalist/69236/?biz=1›. ———. ‘Opinion: The Videogame Review – Problems and Solutions.’ GameDaily 2 Aug. 2006. 
13 Jun 2009 ‹http://www.gamedaily.com/articles/features/opinion-the-videogame-review-problems-and-solutions/69257/?biz=1›. ———. ‘Opinion: Why Videogame Journalism Sucks.’ GameDaily 14 July 2006. 13 Jun 2009 ‹http://www.gamedaily.com/articles/features/opinion-why-videogame-journalism-sucks/69180/?biz=1›. Cook, Sarah, Beryl Graham, and Sarah Martin eds. Curating New Media, Gateshead: BALTIC, 2002. Duguid, Paul. ‘Material Matters: The Past and Futurology of the Book.’ In Gary Nunberg, ed. The Future of the Book. Berkeley, CA: University of California Press, 1996. 63–101. French, Michael. 'GAME Reveals Pre-Owned Trading Is 18% of Business.’ MCV 22 Apr. 2009. 13 Jun 2009 ‹http://www.mcvuk.com/news/34019/GAME-reveals-pre-owned-trading-is-18-per-cent-of-business›. Giddings, Seth, and Helen Kennedy. ‘Digital Games as New Media.’ In J. Rutter and J. Bryce, eds. Understanding Digital Games. London: Sage. 129–147. Gillen, Kieron. ‘The New Games Journalism.’ Kieron Gillen’s Workblog 2004. 13 June 2009 ‹http://gillen.cream.org/wordpress_html/?page_id=3›. Jones, S. The Meaning of Video Games: Gaming and Textual Strategies, New York: Routledge, 2008. Kerr, A. The Business and Culture of Digital Games. London: Sage, 2006. Lister, Martin, John Dovey, Seth Giddings, Ian Grant and Kevin Kelly. New Media: A Critical Introduction. London and New York: Routledge, 2003. Lowood, Henry, Andrew Armstrong, Devin Monnens, Zach Vowell, Judd Ruggill, Ken McAllister, and Rachel Donahue. Before It's Too Late: A Digital Game Preservation White Paper. IGDA, 2009. 13 June 2009 ‹http://www.igda.org/wiki/images/8/83/IGDA_Game_Preservation_SIG_-_Before_It%27s_Too_Late_-_A_Digital_Game_Preservation_White_Paper.pdf›. Monnens, Devin. ‘Why Are Games Worth Preserving?’ In Before It's Too Late: A Digital Game Preservation White Paper. IGDA, 2009. 13 June 2009 ‹http://www.igda.org/wiki/images/8/83/IGDA_Game_Preservation_SIG_-_Before_It%27s_Too_Late_-_A_Digital_Game_Preservation_White_Paper.pdf›. ———. ‘Losing Digital Game History: Bit by Bit.’ In Before It's Too Late: A Digital Game Preservation White Paper. IGDA, 2009. 13 June 2009 ‹http://www.igda.org/wiki/images/8/83/IGDA_Game_Preservation_SIG_-_Before_It%27s_Too_Late_-_A_Digital_Game_Preservation_White_Paper.pdf›. Newman, J. ‘In Search of the Videogame Player: The Lives of Mario.’ New Media and Society 4.3 (2002): 407-425.———. ‘On Emulation.’ The National Videogame Archive Research Diary, 2009. 13 June 2009 ‹http://www.nationalvideogamearchive.org/index.php/2009/04/on-emulation/›. ———. ‘Our Cultural Heritage – Available by the Bucketload.’ The National Videogame Archive Research Diary, 2009. 10 Apr. 2009 ‹http://www.nationalvideogamearchive.org/index.php/2009/04/our-cultural-heritage-available-by-the-bucketload/›. ———. Playing with Videogames, London: Routledge, 2008. ———, and I. Simons. 100 Videogames. London: BFI Publishing, 2007. Nutt, C. ‘He Is 8-Bit: Capcom's Hironobu Takeshita Speaks.’ Gamasutra 2008. 13 June 2009 ‹http://www.gamasutra.com/view/feature/3752/›. Radd, D. ‘Gaming 3.0. Sony’s Phil Harrison Explains the PS3 Virtual Community, Home.’ Business Week 9 Mar. 2007. 13 June 2009 ‹http://www.businessweek.com/innovate/content/mar2007/id20070309_764852.htm?chan=innovation_game+room_top+stories›. Ruggill, Judd, and Ken McAllister. ‘What If We Do Nothing?’ Before It's Too Late: A Digital Game Preservation White Paper. IGDA, 2009. 13 June 2009. 
‹http://www.igda.org/wiki/images/8/83/IGDA_Game_Preservation_SIG_-_Before_It%27s_Too_Late_-_A_Digital_Game_Preservation_White_Paper.pdf›. 16-19.
APA, Harvard, Vancouver, ISO, and other citation styles
