Journal articles on the topic "Microsoft System Center Operations Manager"

Listed below are the top 22 journal articles on the research topic "Microsoft System Center Operations Manager".


1

Pacuszka, Marta, Sebastian Bukowiec, Esteban Puentes, and Guillaume Metral. "Chopin Management System: improving Windows infrastructure monitoring and management". EPJ Web of Conferences 245 (2020): 05002. http://dx.doi.org/10.1051/epjconf/202024505002.

Abstract:
CERN Windows server infrastructure consists of about 650 servers. Management and maintenance are often challenging, as the data to be monitored are disparate and have to be collected from various sources. Currently, alarms are collected from Microsoft System Center Operations Manager (SCOM) and many administrative actions are triggered through e-mails sent by various systems or scripts. The objective of the Chopin Management System project is to maximize automation and facilitate the management of the infrastructure. The current status of the infrastructure, including essential health checks, is centralized and presented through a dashboard. The system collects information necessary for managing the infrastructure in real time, such as hardware configuration or Windows updates, and reacts to any change or failure instantly. As part of the system design, big data streaming technologies are employed in order to assure the scalability and fault tolerance of the service, should the number of servers grow drastically. Server events are aggregated and processed in real time through the use of these technologies, ensuring quick response to possible failures. This paper presents details of the architecture and design decisions taken in order to achieve a modern, maintainable and extensible system for Windows Server Infrastructure management at CERN.
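The abstract notes that server events are aggregated and processed in real time via streaming technologies, without naming them here. Purely to make the aggregation idea concrete (this is not the authors' actual stack; the event kinds, window size and class names are invented), a minimal Python sketch of sliding-window event aggregation:

    from collections import defaultdict, deque
    from dataclasses import dataclass
    import time

    @dataclass
    class ServerEvent:
        server: str       # hostname, e.g. "winsrv-042" (invented)
        kind: str         # e.g. "heartbeat", "update_failed" (invented)
        timestamp: float  # POSIX seconds

    class SlidingWindowAggregator:
        """Counts events per (server, kind) over the last `window` seconds."""
        def __init__(self, window=300.0):
            self.window = window
            self.events = deque()

        def add(self, event):
            self.events.append(event)
            self._evict(event.timestamp)

        def _evict(self, now):
            # Drop events that have fallen out of the time window.
            while self.events and now - self.events[0].timestamp > self.window:
                self.events.popleft()

        def counts(self):
            out = defaultdict(int)
            for e in self.events:
                out[(e.server, e.kind)] += 1
            return dict(out)

    agg = SlidingWindowAggregator(window=300.0)
    agg.add(ServerEvent("winsrv-042", "update_failed", time.time()))
    if any(kind == "update_failed" for (_, kind) in agg.counts()):
        print("raise dashboard alarm")  # stand-in for the real reaction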
2

Li, Lin Lin, and Liang Xu Sun. "Online Examination System for Microsoft Office Software Operations". Advanced Materials Research 756-759 (September 2013): 911–15. http://dx.doi.org/10.4028/www.scientific.net/amr.756-759.911.

Abstract:
Online examination is an effective solution to the problem of evaluating proficiency in basic computer operations. This paper proposes an online examination system for basic computer operations, in particular Microsoft Office software operations. The system mainly implements functions for generating exams intelligently and for automatically collecting and marking the documents submitted during the exam, using database, socket, ADO and VBA programming methods. The system supports several kinds of Microsoft Office software, including Word, Excel, PowerPoint and Access. Actual operation showed that the system could help teachers improve their work efficiency and help students improve their software skills through hands-on practice on computers. The system has been running in the USTL university computer lab center for some time, which has proved that it is a valid solution to the proficiency evaluation problem for Microsoft Office software operations.
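The abstract lists database, socket, ADO and VBA techniques for automatically collecting submitted documents, without further detail. As a loose stand-in for the socket collection step only (the paper's implementation uses VBA/ADO; the port and the filename-then-bytes protocol below are invented for illustration):

    import pathlib
    import socketserver

    SUBMISSION_DIR = pathlib.Path("submissions")  # invented location

    class SubmissionHandler(socketserver.StreamRequestHandler):
        """One exam document per connection: a filename line, then the raw bytes."""
        def handle(self):
            # Strip any directory components the client sends.
            name = pathlib.Path(self.rfile.readline().strip().decode("utf-8")).name
            data = self.rfile.read()
            SUBMISSION_DIR.mkdir(exist_ok=True)
            (SUBMISSION_DIR / name).write_bytes(data)  # queued here for automated marking

    if __name__ == "__main__":
        with socketserver.TCPServer(("0.0.0.0", 5050), SubmissionHandler) as srv:
            srv.serve_forever()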
3

Armi, Vella. "Implementasi Metode Average Akuntansi Dalam Membangun Sistem Informasi Inventory Menggunakan Framework Codeigniter". Journal of Vocational Education and Information Technology (JVEIT) 2, no. 2 (December 30, 2021): 54–63. http://dx.doi.org/10.56667/jveit.v2i2.33.

Abstract:
The purpose of this study is the application of the average accounting method in building an inventory information system using the CodeIgniter framework, to help warehouse managers of the Bank Nagari Koto Baru Branch complete inventory reports quickly and accurately. This research addressed a variety of tasks, such as recording available goods, supplier data, incoming goods, users, and goods used; the data were then processed into inventory reports using Microsoft Excel software. During the data collection process the warehouse manager must be careful to avoid data redundancy (double data), because the number of items to be recorded is very large. The system design in this study uses the Information Systems Flow (ASI) approach. The result of this study is an inventory information system implementing the average accounting method using the CodeIgniter framework, a product that is very useful for the warehouse manager of the Bank Nagari Koto Baru Branch, facilitating warehouse operations quickly, precisely, easily and accurately, and producing quality inventory reports to help in decision making. This system is expected to be a very useful tool for managers in optimal warehouse operations.
4

Faiz, Muhammad Nur, Rusydi Umar, and Anton Yudhana. "Implementasi Live Forensics untuk Perbandingan Browser pada Keamanan Email". JISKA (Jurnal Informatika Sunan Kalijaga) 1, no. 3 (March 30, 2017): 108. http://dx.doi.org/10.14421/jiska.2017.13-02.

Abstract:
Digital forensics has become a popular term because of the many current violations of cybercrime. Digital forensic techniques are performed by analyzing digital devices, whether the device belongs to a perpetrator or a victim. Digital forensic analysis is divided into two kinds, traditional (dead) and live. Traditional forensic analysis involves digital data stored permanently on a disk, while live forensic analysis involves temporary data in Random Access Memory or in transit on the network. This paper proposes live forensic analysis on the latest operating system, Windows 10. The study focuses on email security cases in several browsers: Google Chrome, Mozilla Firefox, and Microsoft Edge. In addition, although many digital forensics software applications are not free, the research objective is to compare browser security information so as to minimize email abuse.
5

Werkmeister, Raymond F., Becky L. Luscher, and Donn E. Hancher. "Kentucky Contract Time Determination System". Transportation Research Record: Journal of the Transportation Research Board 1712, no. 1 (January 2000): 185–95. http://dx.doi.org/10.3141/1712-22.

Abstract:
The results of research funded by the Kentucky Transportation Cabinet (KyTC) to develop a new method for determining construction contract time for its highway construction contracts are reported. The current system in Kentucky was analyzed to determine how a new system could provide better-estimated durations. Current methods of other departments of transportation were analyzed to provide insight for a new system. It was predetermined that a computer system was best suited; however, the development of the system needed input from KyTC engineers with construction experience. A task force of the study advisory committee worked with Kentucky Transportation Center research engineers to develop the basis for the new contract time determination system, called the Kentucky Contract Time Determination System (KY-CTDS). The KY-CTDS program was developed to provide a conceptual estimating tool for predicting construction contract time for the Kentucky Department of Highways. It makes use of predetermined project classifications and lists major activities that are believed to control the project schedule. Production rates and activity relationships have also been determined and are embedded in the program. Experienced KyTC engineers can make final adjustments using the model. The program uses Microsoft Project 98 and Microsoft Excel, version 7.0, and operates on a personal computer. It outputs a graphical bar chart schedule to be used only for estimating contract time for bidding purposes. Program output should also help in resolving construction disputes. The program is not suitable for detailed scheduling of construction operations.
6

Mykhalska, O. "CONCEPTUAL PRINCIPLES OF CONSTRUCTION OF MANAGEMENT ACCOUNTING OF CENTERS OF RESPONSIBILITY IN BUDGETING AT OIL AND FAT ENTERPRISES OF UKRAINE". Financial and credit activity: problems of theory and practice 1, no. 36 (February 17, 2021): 84–91. http://dx.doi.org/10.18371/fcaptp.v1i36.227626.

Abstract:
The article analyzes the management accounting of responsibility centers that form specific objects of cost accounting, namely budget responsibility centers. The presented system of post-operational management cost accounting meets the requirements of budgeting, becomes a necessary condition for monitoring the level of compliance with standards, and provides a reasonable formation of production costs for technological stages at oil and fat enterprises. The information basis of the budgeting process is management accounting. Moreover, management accounting provides the necessary information for budget planning, control and analysis of budget execution, and is one of the components of the budgeting process. Assessment of responsibility centers is useful for understanding the cycles of economic activity, economic processes and structural units within the framework of budgeting. The operations-based management accounting system largely meets the requirements of budgeting, which becomes a necessary condition for control over the standard conditions of accounting. In order to obtain more detailed information for analysis and control, it is proposed to organize cost accounting by the operations-functions of technological processes, which allows analysts to find bottlenecks in technology by identifying redundant technological functions and operations, and to strengthen control over compliance with approved budgets for each "responsibility center". Accounting for responsibility centers aims to evaluate the management activities of a particular responsibility center and to provide a responsibility report and an information report for the evaluation of senior management. It is a method of management accounting that measures the results of each responsibility center, allowing each manager to realize their responsibility in their field of cost accounting and budgeting.
7

Ardmare, Pitak, Anyamanee Nakarakaw, Nurulhuda Doolah, Arseeyah Lateh, Fakutdeen Tapohtoh, Zunuri Sedeh, Habilla Chapakiya, Ameen Mahmad, Nifarid Radenamad, and Winai Dahlan. "The development of a multi-dimensional reporting system for monitoring operations and the decision of the administrators. study case of Halal Science Center Chulalongkorn University, Pattani Office". Proceedings of The International Halal Science and Technology Conference 14, no. 1 (March 10, 2022): 202–7. http://dx.doi.org/10.31098/ihsatec.v14i1.502.

Abstract:
The Halal Science Center Chulalongkorn University, Pattani Office was established in 2009 with the primary mission of developing areas under the Indonesia-Malaysia-Thailand Growth Triangle (IMT-GT), through a project to increase the potential of personnel from the border provinces of the southern region that continues to the present. In addition, the office must report on the performance of various activities (finance, supplies, content, personnel, public relations, etc.) through weekly reports and the monthly report of the Executive Committee of The Halal Science Center Chulalongkorn University. Reporting revealed a problem: past operations could not be viewed as an overall picture of the whole operation. Therefore, this research was conducted to develop a system for monitoring and reporting performance in a multi-dimensional format and to test user satisfaction. The results of the study showed that the multi-dimensional performance tracking and reporting system, developed with Microsoft Excel, can reduce operating time by about 70.00%, while the 15 users of the monitoring and reporting system reported an outstanding level of satisfaction.
8

Masadeh, Ali. "Application of Using the Activity-Based Costing System on Product Development in Jordan's Manufacturing Listed Manufacturing Firms". International Journal of Professional Business Review 8, no. 6 (June 23, 2023): e02458. http://dx.doi.org/10.26668/businessreview/2023.v8i6.2458.

Abstract:
Purpose: The aim of this research was to examine how the Activity-Based Costing (ABC) system affected product development in Jordanian manufacturing public shareholding companies. Design/methodology/approach: According to the monthly statistical bulletin of the Amman Stock Exchange and the Securities Depository Center, the study population for the year 2022 included 56 Jordanian industrial firms. The questionnaire was distributed to the financial manager, production manager, sales manager, and accountant; of the total number of circulated questionnaires, 132 were authorized for statistical analysis. Regression analysis and correlation were used to analyze the data and extract results, using descriptive statistics, Cronbach's alpha, Pearson's correlation coefficient, and linear regression analysis. Findings: The study revealed that there is a connection between the ABC system and product development in Jordanian industrial companies: this approach contributes to the improvement and development of products by tracking the stages of production from their inception, concentrating on activities that add value to the company, and eliminating activities that do not add any value, naturally improving the competitive position of the company. Research/Practical implications: One of this study's main findings suggests developing strategies for methodically and practically gathering comprehensive data on the company's operations. The study advised that Jordan's industrial public shareholding companies keep up with recent advancements in the field of activity-based costing. Originality/value: This study recommends that, for the company to keep pace with rapid advancement in activity costing and to employ latent capabilities to the company's advantage, it should train its employees to develop their capabilities in service of its strategies. The findings also reveal that most manufacturing firms in Jordan have the infrastructure to adopt and implement the ABC system.
9

Odarchenko, Roman, Serhii Dakov, and Larisa Dakova. "RESEARCH OF CYBER SECURITY MECHANISMS IN MODERN 5G CELLULAR NETWORKS". Information systems and technologies security, no. 1 (5) (2021): 27–36. http://dx.doi.org/10.17721/ists.2021.1.25-34.

Abstract:
The main feature of the 5G network is network slicing. This concept enables network resource efficiency, deployment flexibility, and support for rapid growth in over-the-top (OTT) applications and services. Network slicing involves splitting the 5G physical architecture into multiple virtual networks or layers. Each network layer (slice) includes control-layer functions, user-traffic-level functions, and a radio access network. Slice isolation is an important requirement that allows the basic concept of network slicing to be applied to the simultaneous coexistence of multiple fragments in a single infrastructure; this property means that the performance of each slice must not affect the performance of the others. The architecture of network fragments expands in two main aspects: slice protection (cyber attacks or malfunctions affect only the target slice and have a limited impact on the life cycle of other existing slices) and slice privacy (private information about each slice, such as user statistics, is not exchanged between slices). In 5G, the interaction of the user's equipment with the data networks is established using PDU sessions. Multiple PDU sessions can be active at the same time to connect to different networks; in this case, different sessions can be created using different network functions, following the concept of network slicing. The concept of a "network architecture" built on hardware solutions is losing its relevance; it is more appropriate to call 5G a system, or a platform, because it is implemented using software solutions. 5G functions are implemented as virtual software functions (VNFs) running in the network virtualization infrastructure, which, in turn, is implemented in the physical infrastructure of data centers, based on standard commercial (COTS) equipment comprising only three types of standard devices: servers, switches and storage systems. For the correct operation of a network, it is necessary to provide constant monitoring of the parameters described above. Monitoring is a specially organized, periodic observation of the state of objects, phenomena and processes for their assessment, control, or forecasting. The monitoring system collects and processes information that can be used to improve the work process, as well as to report deviations. There is a lot of network monitoring software available today, but given that 5G is implemented on virtual elements, it is advisable to use the System Center Operations Manager component to monitor network settings and performance and to resolve deviations on time. Operations Manager reports which objects are out of order, sends alerts when problems are detected, and provides information to help determine the cause of a problem and possible solutions. So, for the 5G network it is extremely important to constantly monitor its parameters for the timely elimination of deviations, as deviations can impair the performance and interaction of smart devices, as well as the quality of communication and services provided; System Center Operations Manager provides many opportunities for this. Purpose and objectives of the work: to analyze the main mechanisms of cybersecurity in 5G cellular networks.
10

Zlatanović-Marković, Valentina, Biljana Nikolić, Mirjana Stakić, and Milovan Mitić. "Public relations in the educational system". Megatrend revija 19, no. 1 (2022): 267–82. http://dx.doi.org/10.5937/megrev2201267z.

Abstract:
The aim of this paper is to determine the effects of the implementation of the program "School and the public" on the organization of the school system during the program implementation period from 2009 to 2014, and the level of communication between the education system and the public. Methods: 694 respondents filled in the questionnaire for training participants used in the research, prepared by the Institute for the Advancement of Education and Upbringing and intended exclusively for the Center for Professional Development of Employees. The collected data were processed by descriptive statistics in Microsoft Excel, and the results obtained are presented in tables and graphs. Results: The fact that a large percentage of participants (more than 90%) spoke positively about the program supports the claim that employees in education are aware of the need for personal training and engagement in communication processes between the educational institution and the public. Conclusion: Participants' answers clearly indicated an obvious improvement of communication between the educational institution and the public through the affirmation of the work of the institution itself, as well as the establishment of a positive image and the solving of important issues and problems in the functioning and work of the institution.
11

Kusumadiarti, Rini Suwartika, and Rendra Ripandi. "RANCANG BANGUN SISTEM INFORMASI PELAYANAN PENUNJANG MEDIS LABORATORIUM DI PUSKESMAS KOPO BANDUNG". JURNAL PETIK 5, no. 1 (April 2, 2019): 48–54. http://dx.doi.org/10.31980/jpetik.v5i1.441.

Abstract:
Progress in the field of health is developing very rapidly, so many findings are obtained with the assistance of information technology, in the organization of hospitals and health centers, in treatment, and in development research in the health sciences themselves. The combination of information technology with the activities of the people who use that technology, to support operations and management, is an information system. This study aims to design and build an information system for laboratory medical support services at the UPT Kopo Bandung Health Center. Data were collected through observation, interviews and literature study. Software development used the waterfall method, with Data Flow Diagrams (DFD) for the software design. The suggestions given to overcome problems in the laboratory medical support system are: the system needs to be developed further, and training is needed for the staff who manage this system regarding the use of the application. Keywords: Information System, Laboratory Service, Waterfall, Microsoft Visual Studio 2010
12

Lim, J. R., S. R. M. Kutty, A. A. S. Ghaleb, N. M. Y. Almahbashi, and A. M. Al-Sabaeei. "Development of dead-end system calculation model for water reticulation design using Microsoft Excel with optimized algorithm: a case study at regional operations center (ROC) Melaka, Malaysia". IOP Conference Series: Materials Science and Engineering 849 (May 30, 2020): 012094. http://dx.doi.org/10.1088/1757-899x/849/1/012094.
13

HakemZadeh, Farimah, and Vishwanath V. Baba. "Toward a theory of collaboration for evidence-based management". Management Decision 54, no. 10 (November 21, 2016): 2587–616. http://dx.doi.org/10.1108/md-06-2015-0243.

Abstract:
Purpose: The purpose of this paper is to address the research-practice gap in management and advocate the need for an independent organization, called the evidence-based management (EBMgt) collaboration, to facilitate generation and dissemination of knowledge that is rigorous, relevant, and actionable. Design/methodology/approach: The authors use a theory-building approach to collaboration. They identify existing challenges in the research-practice gap literature and argue that EBMgt offers the most viable alternative to narrow this gap. They offer a theory of collaboration with supporting propositions that engages the generators, disseminators, and users of management knowledge in an ongoing sustainable collaboration toward EBMgt. Findings: The authors envision evidence at the center of the EBMgt collaboration. They offer a process model of EBMgt incorporating a collaboration that ensures the fusion of rigor, relevance, and actionability of management knowledge toward the production of strong evidence that is of value to a decision maker. They suggest that the collaboration generate evidence in the form of a systematic review (SR) using a standard template and make it available online to management decision makers around the world in real time. They outline the parameters of the SR and offer details on the design of the Template. Research limitations/implications: The theory of collaboration brings together various competing ideas and recommendations made over the past few decades to close the research-practice gap in management. The theory can be used as a guideline to establish and maintain the operation of an EBMgt collaboration. Practical implications: The authors offer details on the format and content of a standardized SR along with a template to execute it. They believe it would appeal to a practicing manager to know the state-of-the-art knowledge that applies to a decision that he or she is about to make in real time. Originality/value: The work provides a theoretical platform for the idea of EBMgt collaboration that was not available before. The authors add value to the research-practice gap literature by addressing critical concerns including the identification of relevant research questions, evaluating and grading evidence, fostering communication between researchers and practitioners, and translating research to practicing managers. The integration of research and organizational knowledge in the form of an SR that provides decision support to a practicing manager is of significant value to the profession. The conceptualization of the collaboration, not as a research method but as a separate social system that links key management knowledge stakeholders together, adds originality to collaboration research.
14

Hanson, Laura Nelle, Jennifer Weis, Sasa Andrijasevic, Sharon Elcombe, Rachel Hardtke, Andrea Kukla, and Linda Sanders. "4575 Implementing a Workflow Management Tool for Clinical Trials". Journal of Clinical and Translational Science 4, s1 (June 2020): 32. http://dx.doi.org/10.1017/cts.2020.132.

Abstract:
OBJECTIVES/GOALS: A workflow management tool is essential in order to help support consistent processes with transparency in the next steps of the study process. Prior to this tool, staff had relied upon extensive training and coaching on the study process. While resources and guidelines exist, it takes additional time for staff to identify these resources, which allows for confusion and rework. Implementation of a systematic workflow management tool was identified as a critical need in order to support streamlined processes, improve transparency, support business continuity, and accelerate the study process. METHODS/STUDY POPULATION: This effort was undertaken as part of the Protocol Lifecycle Management effort to implement a comprehensive clinical trial management system for clinical research studies. Mayo Clinic has designed a workflow management tool within the Velos eResearch system. The workflow manager is dynamic and presents specific activities based on the study design and responses to data entered on the ad hoc forms. A Workflow Build group contributed to the design of the workflow in order to reflect appropriate, current operational processes. The workflow was vetted and validated with research teams. In addition to designing activities, planned dates and target timelines were established for relevant workflows to help promote transparency in the study start-up timelines and allow study staff to identify overdue activities. Study status controls were designed in the workflow to protect study staff from inadvertently changing the status until appropriate activities are complete. RESULTS/ANTICIPATED RESULTS: A dynamic workflow has been designed and implemented in the Velos eResearch system to support Mayo Clinic research sites. This system will be implemented on February 24, 2020, for all consenting studies. DISCUSSION/SIGNIFICANCE OF IMPACT: The implementation of this workflow management tool is critical to help support research operations in a large academic medical center. Benefits of implementation are expected to include improved transparency in the study status and next steps, reductions in rework due to confusion about next steps, better understanding from new staff of the appropriate study process, and improved timelines for study start-up. As we prepare for the implementation of the Velos eResearch system at Mayo Clinic, the workflow management tool has been identified in training sessions as a positive benefit.
15

Schmidt, Karsten, Marian Graumann, and Joachim Bobrich. "Rapid Mapping for German Federal Authorities". Abstracts of the ICA 1 (July 15, 2019): 1–2. http://dx.doi.org/10.5194/ica-abs-1-322-2019.

Abstract:
The Federal Agency for Cartography and Geodesy (BKG) in Germany is a technical agency under the Federal Ministry of the Interior, Building and Community (BMI). The organization consists of three departments, namely Geoinformation, Geodesy and a Business Operations/Support department. Each department has multiple units which carry out their own respective duties in their field of subject. Tasks related to providing and homogenizing spatial data are located inside the Geoinformation department; the main focus lies on uniting the data collected by the federal states to form different types of nationwide datasets. The Geodesy department focuses on real-time satellite navigation systems, very long baseline interferometry and geodetic reference systems.

A recent development in the federal authorities has been the demand for quick maps and spatial data to support fast decision-making processes. The BKG has established a complete workflow which allows for rapid data processing and distribution to the respective customers. It is split into six sub-processes with several branching options, processing and backpropagation steps. This allows for a systematic approach to covering the customer's demand for spatial data and for the internal production chain. The workflow is backed up by a project management system, a quality management system and a production management system, each covered in a handbook specially prepared to ensure consistency along the workflow.

The workflow is started by the customer posing a request either via the service center (SC) unit, via mail or phone, or through the controlling institution as a decree or an executive order. The first contact is crucial, as important information is exchanged and a rough concept of the demand is shaped. Once the actual task has reached the BKG, a ticket number and an internal processing identifier are assigned by the SC and the resource manager (RM) respectively. These unique keys are used for communication with the customer and additionally serve as a tool for the administrative actions carried out in the background. This involves querying the license status of the customer, which in addition determines the maximum cost of the task and the authorization of the data being used. In parallel the task, now referred to as a ticket, is implemented in the project management system. Certain additional information is assigned to the ticket, such as the project manager (PM) and the project processor. Once this step has been completed, the working directory for the ticket is created using a predefined folder structure, composed of a data, export, communications and documentation folder. Depending on the nature of the ticket, various preconfigured templates are at hand and are used to ensure uniformity of the final products. If needed, further communication with the customer is carried out. Data research is the last step in this part of the workflow (see figure 1).

The processing of the actual data makes up most of the workflow in terms of time. Aspects like a uniform style guide and quality controls by colleagues are handled alongside the work progress. If any non-conforming aspect is found, feedback is given directly to the person in charge of the ticket while also being mirrored in the project management system. Once the product is ready for deployment, final communication with the customer is executed. Additional changes requested by the customer can be implemented, whereby an extra iteration of the quality process is necessary. The data deployment is carried out as a print product, a DVD, an OGC service or via a download option (see figure 2).

The final steps of the workflow involve the report of delivery to the service center, the creation of the cost report and the closure of the ticket in the project management system. The cost report is created in a semi-automatic process from data stored in the project management system, combined with the material cost report. It is then sent via e-mail to the service center, which carries out any necessary invoice collections. Furthermore, the internal data folder structure is cleaned up and a screenshot of the product is taken for use as a subsidiary product in the dedicated online gallery of created products. Finally the closing information is relayed to the service center and the ticket is closed (see figure 3). This whole workflow can be executed within several hours, days or weeks, depending on the deadline imposed by the customer or the event. Actual rapid mapping activations are done by multiple people simultaneously working on different steps of the previously explained workflow, which allows for fast response times and the accomplishment of the activation in time.
16

Rassenfoss, Stephen. "Drilling Automation - A Robot Takes Over the Drilling Floor". Journal of Petroleum Technology 73, no. 12 (December 1, 2021): 18–22. http://dx.doi.org/10.2118/1221-0018-jpt.

Abstract:
The work on the drilling floor of the PaceR801 rig revolves around a stout robot methodically picking up sections of pipe and moving them precisely over the drilling center to rapidly connect the pipe. While it is one of many technological advances on the rig Nabors bills as "the world's first fully automated land rig," the robot is the one "that gets most people interested and excited," said Travis Purvis, senior vice president, global operations for Nabors Industries Ltd. As of 18 October, the PaceR801 had completed the first well on an ExxonMobil pad and was drilling the lateral on the second well of the three-well pad. After the third is done, the extended test will move to the next pad. It is risky to announce who came in first in a competitive race in a secretive business. But Nabors stands out because the PaceR801 has an automated drilling floor, a range of other automated functions above and below ground, and, most significantly, it is the only one using its rig to drill producing wells for a customer. Jason Gahr, operations manager for unconventional drilling at ExxonMobil, said the research collaboration "demonstrates the ability to optimize drilling using the combined power of robotics, automation, computing, and data." Since the announcement, Nabors has heard from other oil companies. "There is strong interest in the rig in many markets," Purvis said. The companies' interests range from automating more drilling functions by retrofitting rigs, to wanting to hire the rig, whose name is frequently shortened in conversations to R801. There is only one PaceR801 and it is going to be tied up for a while. "We expect to drill multiple test wells on multiple pads and continue to work on the technology" with ExxonMobil, Purvis said. It was created to show off the fruits of a 5-year drive to create a totally automated version of its Pace high-specification rig. Nabors likens it to the concept cars built by automakers to show off their vision of the future and to promote innovation within the company. In this case, it is a vision of the near future. While the automated drilling floor is new, much of the rest is recently proven technology. Two of the drilling automation programs used—Nabors' SmartSLIDE and SmartNAV—are already on 30% of the Nabors fleet, said Austin Groover, director of operations for smart products at Nabors. Those applications, which manage drilling of the curve and directional drilling, plus a third that automates drilling a stand of pipe, SmartDRILL, have been used by ExxonMobil for 2 years. Maximizing the performance of a rig with multiple proven technologies plus a new one such as a robotic drilling floor required developing a system that coordinates the movements of those apps and the rig hardware while drilling. When Nabors experts describe that process, they rely on musical metaphors, from the robot doing its little dance to a conductor leading a symphony.
17

Soldatova, Kateryna, and Svitlana Nosok. "Construction of Proactive Monitoring Model using Forecasting Techniques in the SCOM Software Complex". Theoretical and Applied Cybersecurity 4, no. 1 (February 17, 2023). http://dx.doi.org/10.20535/tacs.2664-29132022.1.266612.

Abstract:
The majority of companies depend on their information systems, the stability of infrastructure operations and the failover of computing resources. Various monitoring tools are mostly used to automate the benchmarking process of a company. A company with a large distributed infrastructure should pay close attention to this process, as such scale makes the state of operations difficult to maintain and increases the probability of loss of functionality due to errors, or even the shutdown of some servers. One solution is reactive monitoring, a technique in which system administrators use monitoring tools to continuously collect data that determine the active and current status of the information system environment. Measurements obtained from real-time monitoring tools illustrate the performance of the current information environment. However, for the main metrics of system resources, such as the level of processor load, RAM or disk usage, change can be quite fast, and for servers responsible for critical functions the problem of full resource usage is important. This problem can be solved with proactive monitoring: a set of monitoring tools that not only collect real-time information, but also predict possible failures before they impact end users [1]. The purpose of this article is to choose methods of time-series forecasting for resource load that can be combined into a single hybrid method. The final solution will be used in a management pack for the System Center Operations Manager (SCOM) software complex, which is widely used by companies with large infrastructure [2]. Forecasting methods such as least squares, SMA and EMA were considered in this work.
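The abstract names least squares, SMA (simple moving average) and EMA (exponential moving average) as candidate forecasting techniques for resource load. As a rough illustration of the latter two (a minimal sketch; the window size, smoothing factor and sample data below are invented, not taken from the paper):

    def sma_forecast(series, window=5):
        """Forecast the next value as the mean of the last `window` samples."""
        if len(series) < window:
            raise ValueError("not enough samples")
        return sum(series[-window:]) / window

    def ema_forecast(series, alpha=0.3):
        """Exponentially weighted average; recent samples dominate."""
        ema = series[0]
        for x in series[1:]:
            ema = alpha * x + (1 - alpha) * ema
        return ema

    cpu_load = [41.0, 45.2, 48.1, 52.7, 55.3, 58.9, 61.4]  # % utilisation, hypothetical
    print(f"SMA forecast: {sma_forecast(cpu_load):.1f}%")
    print(f"EMA forecast: {ema_forecast(cpu_load):.1f}%")

A hybrid method of the kind the paper aims at would combine such estimators, e.g. by weighting their outputs, before raising a proactive alert when the forecast crosses a threshold.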
18

Fussner, Julie M., Kelly Montgomery, Tinatin Gumberidze, and Erin Supan. "Abstract TP414: Support from a Comprehensive Stroke Center Improves Door to tPA Times at Primary Stroke Centers". Stroke 47, suppl_1 (February 2016). http://dx.doi.org/10.1161/str.47.suppl_1.tp414.

Abstract:
Target Stroke, a national quality improvement initiative of the American Heart Association/American Stroke Association (AHA/ASA) to improve the timeliness of administration of intravenous (IV) tissue plasminogen activator (tPA) to eligible stroke patients, was launched in 2010. The door-to-needle time goal is 60 minutes (mins) from hospital arrival. Earlier administration of IV tPA is associated with greater functional recovery. Since 2009 University Hospitals Comprehensive Stroke and Cerebrovascular Center (UHCSCC) has met quarterly with its 7 system community hospitals to share stroke core measure data, review clinical practice guidelines and address new system initiatives for the care of the stroke patient. The purpose of this project is to demonstrate how a comprehensive stroke center (CSC) can assist a primary stroke center (PSC) in improving its door to tPA treatment times. In 2010, to support the primary stroke centers, the UHCSCC developed standardized stroke education for nurses, including an online course for tPA. In 2014 an additional online interactive module was created to assist nurses in programming the Alaris IV pump to improve their speed. In 2013 the quarterly system meetings started to include door to CT and door to tPA data, with discussions about best practices and challenges. The AHA Target Stroke campaign recommendations and evidence-based strategies were reviewed, and a gap analysis at each hospital was completed to identify opportunities. Throughout 2012-2013 the stroke coordinator at UHCSCC led monthly conference calls with the community stroke coordinators. Since 2014 the stroke operations manager has visited each community hospital monthly to work with the stroke coordinator and their teams reviewing tPA cases. Finally, a formal feedback tool was developed and is sent to the PSC to provide patient outcomes and opportunities for all tPA cases that are transferred to the CSC. The AHA Get With The Guidelines stroke registry is used to monitor compliance. In 2012 the University Hospitals Health System average for door to tPA in 60 mins was only 41%; from January to June 2015, the system average improved to 86%. Community primary stroke centers benefit from the comprehensive stroke center's interventions and support to improve door to tPA in 60 mins.
19

Atassi, Reem. "Geological Landslide Disaster Monitoring Based on Wireless Network Technology". International Journal of Wireless and Ad Hoc Communication, 2021, 21–32. http://dx.doi.org/10.54216/ijwac.020102.

Abstract:
With the combined influence of natural evolution and human activities, the degree of damage from geological disasters is increasing, and how to provide effective early warning of geological disasters has become a widespread concern. This research mainly discusses geological landslide disaster monitoring based on wireless network technology. First, two important early-warning indicators are established: rainfall and landslide displacement. The monitoring system is powered by a rechargeable 12 V lithium battery combined with solar panels, which charge when the sun is full to ensure stable operation of the system. An AT45DB161B chip with 16-Mbit storage capacity is selected to store data such as landslide displacement and rainfall. The Microsoft SQL Server 2008 database management system is used to complete database query, insertion, modification and deletion operations. A TLP521-2 photocoupler isolates the GPIO interface of the STM32 from the external unit to improve anti-interference capability. Communication between the field data collector and the monitoring center data server uses GPRS packet data transmission based on the TCP/IP protocol; currently, the PDU in the network is an IP data packet. The implementation of the TCP/IP protocol at the field data collector is completed entirely in the master single-chip microcomputer. A Siemens MC35 GSM/GPRS module is used as the data transmission terminal. The monitoring results show that the absolute error of the test data does not exceed 6 mm in horizontal distance and 9 mm in vertical height difference. The results show that geological landslide monitoring based on wireless network technology improves the accuracy of distance estimation and reduces positioning error, and can provide scientific guidance for the planning, monitoring and early warning of landslide areas.
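The abstract describes field collectors pushing displacement and rainfall readings to the monitoring center over GPRS using TCP/IP. A loose sketch of that transport step only, assuming a newline-delimited JSON message layout (the host, port, station id and field names are invented; the real collector runs on an STM32 microcontroller rather than Python):

    import json
    import socket
    import time

    MONITOR_HOST = "192.0.2.10"  # hypothetical monitoring-center address
    MONITOR_PORT = 9000          # hypothetical listener port

    def send_reading(displacement_mm, rainfall_mm):
        """Package one sensor reading as JSON and push it over a TCP connection."""
        payload = json.dumps({
            "station": "slope-01",  # invented station id
            "ts": time.time(),
            "displacement_mm": displacement_mm,
            "rainfall_mm": rainfall_mm,
        }).encode("utf-8")
        with socket.create_connection((MONITOR_HOST, MONITOR_PORT), timeout=10) as sock:
            sock.sendall(payload + b"\n")  # newline-delimited messages

    send_reading(displacement_mm=3.2, rainfall_mm=12.5)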
20

"Implementing Mckinsey 7S Model of Organizational Diagnosis and Planned Change, Best Western Italy Case Analysis". Journal of International Business and Management, 29 ottobre 2021. http://dx.doi.org/10.37227/jibm-2021-09-1438.

Testo completo
Abstract (sommario):
Best Western Italy operates under BW international Inc. a leading hotel and resort brand that provides high quality accommodation services in numerous countries across the globe. All of Best Western hotels are owned and managed independently. BW Italy is the brand’s center for operations and reservation in the European market. It has gained recognition from its unique reservations services involving strategic partners to ensure their customers receive the best services at standardized rates. To reinforce its vision and mission, BW Italy management came up with a yearlong “Make a Difference” program to enhance employee commitment by aligning their goals to those of the company. This would entail organizational change which involves a system-wide transfer and application of behavioral science knowledge to the strategic development, enhancement, and reinforcement of the plans, process, and structures that result in organizational efficiency and effectiveness. The results included rearrangement of top management to establish a flatter organizational structure with improved distribution of leadership. A year after the program had been concluded, the General Manager was aware that its implementation had created colossal excitement and interaction among the employees who had taken part in it. However, four employees had refused to participate in the program while the company had recruited ten new employees after the program had ended. BW Italy needs to formulate a strategy to ensure that the “Make a Difference “program changes are internalized by its employees and institutionalized within the organization. The case analysis utilized secondary data acquired through a case study performed on BW Italy during the implementation of the program. Data was analyzed through the McKinsey 7s Model an effective tool used in analyzing organizational change. The McKinsey 7s Model is an effective tool aimed at depicting how effectiveness can be achieved within an organization through the interaction of 7 different organizational elements namely structure, strategy, skill, system, shared values, style, and staff. Keywords: Make a difference, Organizational change, Change program, Organizational goals, Mission and values, Employee commitment, Top management, Organizational diagnostic models, Performance outcomes
Gli stili APA, Harvard, Vancouver, ISO e altri
21

Livingstone, Randall M. "Let’s Leave the Bias to the Mainstream Media: A Wikipedia Community Fighting for Information Neutrality". M/C Journal 13, no. 6 (November 23, 2010). http://dx.doi.org/10.5204/mcj.315.

Abstract:
Although I'm a rich white guy, I'm also a feminist anti-racism activist who fights for the rights of the poor and oppressed. (Carl Kenner)Systemic bias is a scourge to the pillar of neutrality. (Cerejota)Count me in. Let's leave the bias to the mainstream media. (Orcar967)Because this is so important. (CuttingEdge)These are a handful of comments posted by online editors who have banded together in a virtual coalition to combat Western bias on the world’s largest digital encyclopedia, Wikipedia. This collective action by Wikipedians both acknowledges the inherent inequalities of a user-controlled information project like Wikpedia and highlights the potential for progressive change within that same project. These community members are taking the responsibility of social change into their own hands (or more aptly, their own keyboards).In recent years much research has emerged on Wikipedia from varying fields, ranging from computer science, to business and information systems, to the social sciences. While critical at times of Wikipedia’s growth, governance, and influence, most of this work observes with optimism that barriers to improvement are not firmly structural, but rather they are socially constructed, leaving open the possibility of important and lasting change for the better.WikiProject: Countering Systemic Bias (WP:CSB) considers one such collective effort. Close to 350 editors have signed on to the project, which began in 2004 and itself emerged from a similar project named CROSSBOW, or the “Committee Regarding Overcoming Serious Systemic Bias on Wikipedia.” As a WikiProject, the term used for a loose group of editors who collaborate around a particular topic, these editors work within the Wikipedia site and collectively create a social network that is unified around one central aim—representing the un- and underrepresented—and yet they are bound by no particular unified set of interests. The first stage of a multi-method study, this paper looks at a snapshot of WP:CSB’s activity from both content analysis and social network perspectives to discover “who” geographically this coalition of the unrepresented is inserting into the digital annals of Wikipedia.Wikipedia and WikipediansDeveloped in 2001 by Internet entrepreneur Jimmy Wales and academic Larry Sanger, Wikipedia is an online collaborative encyclopedia hosting articles in nearly 250 languages (Cohen). The English-language Wikipedia contains over 3.2 million articles, each of which is created, edited, and updated solely by users (Wikipedia “Welcome”). At the time of this study, Alexa, a website tracking organisation, ranked Wikipedia as the 6th most accessed site on the Internet. Unlike the five sites ahead of it though—Google, Facebook, Yahoo, YouTube (owned by Google), and live.com (owned by Microsoft)—all of which are multibillion-dollar businesses that deal more with information aggregation than information production, Wikipedia is a non-profit that operates on less than $500,000 a year and staffs only a dozen paid employees (Lih). Wikipedia is financed and supported by the WikiMedia Foundation, a charitable umbrella organisation with an annual budget of $4.6 million, mainly funded by donations (Middleton).Wikipedia editors and contributors have the option of creating a user profile and participating via a username, or they may participate anonymously, with only an IP address representing their actions. 
Despite the option for total anonymity, many Wikipedians have chosen to visibly engage in this online community (Ayers, Matthews, and Yates; Bruns; Lih), and researchers across disciplines are studying the motivations of these new online collectives (Kane, Majchrzak, Johnson, and Chenisern; Oreg and Nov). The motivations of open source software contributors, such as UNIX programmers and programming groups, have been shown to be complex and tied to both extrinsic and intrinsic rewards, including online reputation, self-satisfaction and enjoyment, and obligation to a greater common good (Hertel, Niedner, and Herrmann; Osterloh and Rota). Investigation into why Wikipedians edit has indicated multiple motivations as well, with community engagement, task enjoyment, and information sharing among the most significant (Schroer and Hertel). Additionally, Wikipedians seem to be taking up the cause of generativity (a concern for the ongoing health and openness of the Internet’s infrastructures) that Jonathan Zittrain notably called for in The Future of the Internet and How to Stop It. Governance and ControlAlthough the technical infrastructure of Wikipedia is built to support and perhaps encourage an equal distribution of power on the site, Wikipedia is not a land of “anything goes.” The popular press has covered recent efforts by the site to reduce vandalism through a layer of editorial review (Cohen), a tightening of control cited as a possible reason for the recent dip in the number of active editors (Edwards). A number of regulations are already in place that prevent the open editing of certain articles and pages, such as the site’s disclaimers and pages that have suffered large amounts of vandalism. Editing wars can also cause temporary restrictions to editing, and Ayers, Matthews, and Yates point out that these wars can happen anywhere, even to Burt Reynold’s page.Academic studies have begun to explore the governance and control that has developed in the Wikipedia community, generally highlighting how order is maintained not through particular actors, but through established procedures and norms. Konieczny tested whether Wikipedia’s evolution can be defined by Michels’ Iron Law of Oligopoly, which predicts that the everyday operations of any organisation cannot be run by a mass of members, and ultimately control falls into the hands of the few. Through exploring a particular WikiProject on information validation, he concludes:There are few indicators of an oligarchy having power on Wikipedia, and few trends of a change in this situation. The high level of empowerment of individual Wikipedia editors with regard to policy making, the ease of communication, and the high dedication to ideals of contributors succeed in making Wikipedia an atypical organization, quite resilient to the Iron Law. (189)Butler, Joyce, and Pike support this assertion, though they emphasise that instead of oligarchy, control becomes encapsulated in a wide variety of structures, policies, and procedures that guide involvement with the site. A virtual “bureaucracy” emerges, but one that should not be viewed with the negative connotation often associated with the term.Other work considers control on Wikipedia through the framework of commons governance, where “peer production depends on individual action that is self-selected and decentralized rather than hierarchically assigned. Individuals make their own choices with regard to resources managed as a commons” (Viegas, Wattenberg and McKeon). 
The need for quality standards and quality control largely dictate this commons governance, though interviewing Wikipedians with various levels of responsibility revealed that policies and procedures are only as good as those who maintain them. Forte, Larco, and Bruckman argue “the Wikipedia community has remained healthy in large part due to the continued presence of ‘old-timers’ who carry a set of social norms and organizational ideals with them into every WikiProject, committee, and local process in which they take part” (71). Thus governance on Wikipedia is a strong representation of a democratic ideal, where actors and policies are closely tied in their evolution. Transparency, Content, and BiasThe issue of transparency has proved to be a double-edged sword for Wikipedia and Wikipedians. The goal of a collective body of knowledge created by all—the “expert” and the “amateur”—can only be upheld if equal access to page creation and development is allotted to everyone, including those who prefer anonymity. And yet this very option for anonymity, or even worse, false identities, has been a sore subject for some in the Wikipedia community as well as a source of concern for some scholars (Santana and Wood). The case of a 24-year old college dropout who represented himself as a multiple Ph.D.-holding theology scholar and edited over 16,000 articles brought these issues into the public spotlight in 2007 (Doran; Elsworth). Wikipedia itself has set up standards for content that include expectations of a neutral point of view, verifiability of information, and the publishing of no original research, but Santana and Wood argue that self-policing of these policies is not adequate:The principle of managerial discretion requires that every actor act from a sense of duty to exercise moral autonomy and choice in responsible ways. When Wikipedia’s editors and administrators remain anonymous, this criterion is simply not met. It is assumed that everyone is behaving responsibly within the Wikipedia system, but there are no monitoring or control mechanisms to make sure that this is so, and there is ample evidence that it is not so. (141) At the theoretical level, some downplay these concerns of transparency and autonomy as logistical issues in lieu of the potential for information systems to support rational discourse and emancipatory forms of communication (Hansen, Berente, and Lyytinen), but others worry that the questionable “realities” created on Wikipedia will become truths once circulated to all areas of the Web (Langlois and Elmer). With the number of articles on the English-language version of Wikipedia reaching well into the millions, the task of mapping and assessing content has become a tremendous endeavour, one mostly taken on by information systems experts. Kittur, Chi, and Suh have used Wikipedia’s existing hierarchical categorisation structure to map change in the site’s content over the past few years. Their work revealed that in early 2008 “Culture and the arts” was the most dominant category of content on Wikipedia, representing nearly 30% of total content. People (15%) and geographical locations (14%) represent the next largest categories, while the natural and physical sciences showed the greatest increase in volume between 2006 and 2008 (+213%D, with “Culture and the arts” close behind at +210%D). 
This data may indicate that contributing to Wikipedia, and thus spreading knowledge, is growing amongst the academic community while maintaining its importance to the greater popular-culture-minded community. Further work by Kittur and Kraut has explored the collaborative process of content creation, finding that too many editors on a particular page can reduce the quality of content, even when a project is well coordinated.

Bias in Wikipedia content is a generally acknowledged and somewhat conflicted subject (Giles; Johnson; McHenry). The Wikipedia community has created numerous articles and pages within the site to define and discuss the problem. Citing a survey conducted by the University of Würzburg, Germany, the "Wikipedia:Systemic bias" page describes the average Wikipedian as:

- Male
- Technically inclined
- Formally educated
- An English speaker
- White
- Aged 15-49
- From a majority Christian country
- From a developed nation
- From the Northern Hemisphere
- Likely a white-collar worker or student

Bias in content is thought to be perpetuated by this demographic of contributor, and the "founder effect," a concept from genetics that links the original contributors to this same demographic, has been used to explain the origins of certain biases. Wikipedia's "About" page discusses the issue as well, in the context of the open platform's strengths and weaknesses:

in practice editing will be performed by a certain demographic (younger rather than older, male rather than female, rich enough to afford a computer rather than poor, etc.) and may, therefore, show some bias. Some topics may not be covered well, while others may be covered in great depth. No educated arguments against this inherent bias have been advanced.

Royal and Kapila's study of Wikipedia content tested some of these assertions, finding identifiable bias in both their purposive and random sampling. They conclude that bias favoring larger countries is positively correlated with the size of the country's Internet population, and corporations with larger revenues work in much the same way, garnering more coverage on the site. The researchers remind us that Wikipedia is "more a socially produced document than a value-free information source" (Royal and Kapila).

WikiProject: Countering Systemic Bias

As a coalition of current Wikipedia editors, the WikiProject: Countering Systemic Bias (WP:CSB) attempts to counter trends in content production and points of view deemed harmful to the democratic ideals of a value-free, open online encyclopedia. WP:CSB's mission is not one of policing the site, but rather deepening it:

Generally, this project concentrates upon remedying omissions (entire topics, or particular sub-topics in extant articles) rather than on either (1) protesting inappropriate inclusions, or (2) trying to remedy issues of how material is presented. Thus, the first question is "What haven't we covered yet?", rather than "how should we change the existing coverage?" (Wikipedia, "Countering")

The project lays out a number of content areas lacking adequate representation, geographically highlighting the dearth in coverage of Africa, Latin America, Asia, and parts of Eastern Europe. WP:CSB also includes a "members" page that editors can sign to show their support, along with space to voice their opinions on the problem of bias on Wikipedia (the quotations at the beginning of this paper are taken from this "members" page).
At the time of this study, 329 editors had self-selected and self-identified as members of WP:CSB, and this group constitutes the population sample for the current study. To explore the extent to which WP:CSB addressed these self-identified areas for improvement, each editor's last 50 edits were coded for their primary geographical country of interest, as well as the conceptual category of the page itself ("P" for person/people, "L" for location, "I" for idea/concept, "T" for object/thing, or "NA" for indeterminate). For example, edits to the Wikipedia page for a single person like Tony Abbott (Australian federal opposition leader) were coded "Australia, P", while an edit for a group of people like the Manchester United football team would be coded "England, P". Coding was based on information obtained from the header paragraphs of each article's Wikipedia page. After coding was completed, corresponding information on each country's associated continent was added to the dataset, based on the United Nations Statistics Division listing.

A total of 15,616 edits were coded for the study. Nearly 32% (n = 4,962) of these edits were on articles for persons or people (see Table A for complete coding results). From within this sub-sample of edits, a majority of the people (68.67%) represented are associated with North America and Europe (Figure A). If we break these statistics down further, nearly half of WP:CSB's edits concerning people were associated with the United States (36.11%) and England (10.16%), with India (3.65%) and Australia (3.35%) following at a distance. These figures make sense for the English-language Wikipedia; over 95% of the population in the three Westernised countries speak English, and while India is still often regarded as a developing nation, its colonial British roots and the emergence of a market economy with large, technology-driven cities are logical explanations for its representation here (and some estimates make India the largest English-speaking nation by population on the globe today).

Table A: Coding Results

Total edits: 15,616
(I) Ideas: 2,881 (18.45%)
(L) Locations: 2,240 (14.34%)
(NA) Indeterminate: 333 (2.13%)
(T) Things: 5,200 (33.30%)
(P) People: 4,962 (31.78%)

People by continent:
Africa: 315 (6.35%)
Asia: 827 (16.67%)
Australia: 175 (3.53%)
Europe: 1,411 (28.44%)
NA: 110 (2.22%)
North America: 1,996 (40.23%)
South America: 128 (2.58%)

The areas of the globe of main concern to WP:CSB proved to be much less represented by the coalition itself. Asia, far and away the most populous continent with more than 60% of the globe's people (GeoHive), was represented in only 16.67% of edits. Africa (6.35%) and South America (2.58%) were equally underrepresented compared to both their real-world populations (15% and 9% of the globe's population respectively) and the aforementioned dominance of the advanced Westernised areas. However, while these percentages may seem low, in aggregate they do meet the quota set on the WP:CSB Project Page calling for one out of every twenty edits to be on "a subject that is systematically biased against the pages of your natural interests." By this standard, the coalition is indeed making headway in adding content that strategically counterbalances the natural biases of Wikipedia's average editor.
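To make the tallying behind Table A concrete, a minimal sketch of the aggregation step is given below. The (country, category) records and the abridged continent lookup are invented placeholders standing in for the study's hand-coded dataset of 15,616 edits:

```python
# A sketch of the Table A aggregation; records and lookup are illustrative.
from collections import Counter

# Each coded edit: (country of interest, category), where
# P = person/people, L = location, I = idea/concept, T = object/thing,
# NA = indeterminate.
coded_edits = [
    ("United States", "P"), ("England", "P"), ("India", "P"),
    ("Australia", "L"), ("Gabon", "I"), ("Laos", "T"),
]

# Abridged continent mapping based on the UN Statistics Division listing.
continent = {
    "United States": "North America", "England": "Europe", "India": "Asia",
    "Australia": "Australia", "Gabon": "Africa", "Laos": "Asia",
}

total = len(coded_edits)
by_category = Counter(category for _, category in coded_edits)
people_by_continent = Counter(
    continent[country] for country, category in coded_edits if category == "P"
)

for category, n in by_category.most_common():
    print(f"({category}): {n} edits, {n / total:.2%}")
for region, n in people_by_continent.most_common():
    print(f"People, {region}: {n} edits")
```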
Figure A

Social network analysis allows us to visualise multifaceted data in order to identify relationships between actors and content (Vega-Redondo; Watts). Similar to Davis's well-known sociological study of Southern American socialites in the 1930s (Scott), our Wikipedia coalition can be conceptualised as individual actors united by common interests, and a network of relations can be constructed with software such as UCINET. A mapping algorithm that considers both the relationship between all sets of actors and each actor's relation to the overall collective structure produces an image of our network. This initial network is bimodal, as both our Wikipedia editors and their edits (again, coded for country of interest) are displayed as nodes (Figure B). Edge-lines between nodes represent a relationship, and here that relationship is the act of editing a Wikipedia article. We see from our network that the "U.S." and "England" hold central positions in the network, with a mass of editors crowding around them. A perimeter of nations is then held in place by their ties to editors through the U.S. and England, with a second layer of editors and poorly represented nations (Gabon, Laos, Uzbekistan, etc.) around the boundaries of the network.

Figure B

We are reminded from this visualisation both of the centrality of the two Western powers even among WP:CSB editors, and of the peripheral nature of most other nations in the world. But we also learn which editors in the project are contributing most to underrepresented areas, and which are less "tied" to the Western core. Here we see "Wizzy" and "Warofdreams" among the second layer of editors who act as a bridge between the core and the periphery; these are editors with interests in both the Western and marginalised nations. Located along the outer edge, "Gallador" and "Gerrit" have no direct ties to the U.S. or England, concentrating all of their edits on less represented areas of the globe. Identifying editors at these key positions in the network will help with future research, informing interview questions that will investigate their interests further, but more significantly, probing motives for participation and action within the coalition.

Additionally, we can break the network down further to discover editors who appear to have similar interests in underrepresented areas. Figure C strips down the network to only editors and edits dealing with Africa and South America, the least represented continents. From this we can easily find three types of editors again: those who have singular interests in particular nations (the outermost layer of editors), those who have interests in a particular region (the second layer moving inward), and those who have interests in both of these underrepresented regions (the center layer in the figure). This last group of editors may prove to be the most crucial to understand, as they are carrying the full load of WP:CSB's mission.

Figure C
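The bimodal structure described above is straightforward to reproduce with open tools. The sketch below uses Python's networkx in place of UCINET; the editor names come from the discussion above, but their country ties here are illustrative assumptions, not the study's coded data, and "EditorX"/"EditorY" are invented:

```python
# A sketch of the bimodal editor-country network; edges are invented.
import networkx as nx

edits = [
    ("Wizzy", "U.S."), ("Wizzy", "South Africa"),
    ("Warofdreams", "England"), ("Warofdreams", "Laos"),
    ("Gallador", "Gabon"), ("Gerrit", "Uzbekistan"),
    ("EditorX", "U.S."), ("EditorX", "England"),
    ("EditorY", "U.S."), ("EditorY", "Gabon"),
]

G = nx.Graph()
for editor, country in edits:
    G.add_node(editor, kind="editor")
    G.add_node(country, kind="country")
    G.add_edge(editor, country)  # the relationship is the act of editing

# Degree identifies the network's core: countries tied to many editors.
core = sorted(
    (n for n, d in G.nodes(data=True) if d["kind"] == "country"),
    key=G.degree, reverse=True,
)

# Betweenness centrality surfaces "bridge" nodes linking core to periphery.
bridges = sorted(nx.betweenness_centrality(G).items(),
                 key=lambda kv: kv[1], reverse=True)

print("Core countries:", core[:3])
print("Bridging nodes:", bridges[:3])
```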
The End of Geography, or the Reclamation?

In The Internet Galaxy, Manuel Castells writes that "the Internet Age has been hailed as the end of geography," a bold suggestion, but one that has gained traction over the last 15 years as the excitement for the possibilities offered by information communication technologies has often overshadowed structural barriers to participation like the Digital Divide (207). Castells goes on to amend the "end of geography" thesis by showing how global information flows and regional Internet access rates, while creating a new "map" of the world in many ways, are still closely tied to power structures in the analog world. The Internet Age "redefines distance but does not cancel geography" (207).

The work of WikiProject: Countering Systemic Bias emphasises the importance of place and representation in the information environment that continues to be constructed in the online world. This study looked at only a small portion of this coalition's efforts (~16,000 edits), a snapshot of their labor frozen in time, which itself is only a minute portion of the information being dispatched through Wikipedia on a daily basis (~125,000 edits). Further analysis of WP:CSB's work over time, as well as qualitative research into the identities, interests and motivations of this collective, is needed to understand more fully how information bias is understood and challenged in the Internet galaxy. The data here indicate that this is a fight worth fighting, at least for a growing few.

References

Alexa. "Top Sites." Alexa.com, n.d. 10 Mar. 2010 ‹http://www.alexa.com/topsites›.
Ayers, Phoebe, Charles Matthews, and Ben Yates. How Wikipedia Works: And How You Can Be a Part of It. San Francisco, CA: No Starch, 2008.
Bruns, Axel. Blogs, Wikipedia, Second Life, and Beyond: From Production to Produsage. New York: Peter Lang, 2008.
Butler, Brian, Elisabeth Joyce, and Jacqueline Pike. Don't Look Now, But We've Created a Bureaucracy: The Nature and Roles of Policies and Rules in Wikipedia. Paper presented at the 2008 CHI Annual Conference, Florence.
Castells, Manuel. The Internet Galaxy: Reflections on the Internet, Business, and Society. Oxford: Oxford UP, 2001.
Cohen, Noam. "Wikipedia." New York Times, n.d. 12 Mar. 2010 ‹http://www.nytimes.com/info/wikipedia/›.
Doran, James. "Wikipedia Chief Promises Change after 'Expert' Exposed as Fraud." The Times, 6 Mar. 2007 ‹http://technology.timesonline.co.uk/tol/news/tech_and_web/article1480012.ece›.
Edwards, Lin. "Report Claims Wikipedia Losing Editors in Droves." Physorg.com, 30 Nov. 2009. 12 Feb. 2010 ‹http://www.physorg.com/news178787309.html›.
Elsworth, Catherine. "Fake Wikipedia Prof Altered 20,000 Entries." London Telegraph, 6 Mar. 2007 ‹http://www.telegraph.co.uk/news/1544737/Fake-Wikipedia-prof-altered-20000-entries.html›.
Forte, Andrea, Vanessa Larco, and Amy Bruckman. "Decentralization in Wikipedia Governance." Journal of Management Information Systems 26 (2009): 49-72.
Giles, Jim. "Internet Encyclopedias Go Head to Head." Nature 438 (2005): 900-901.
Hansen, Sean, Nicholas Berente, and Kalle Lyytinen. "Wikipedia, Critical Social Theory, and the Possibility of Rational Discourse." The Information Society 25 (2009): 38-59.
Hertel, Guido, Sven Niedner, and Stefanie Herrmann. "Motivation of Software Developers in Open Source Projects: An Internet-Based Survey of Contributors to the Linux Kernel." Research Policy 32 (2003): 1159-1177.
Johnson, Bobbie. "Rightwing Website Challenges 'Liberal Bias' of Wikipedia." The Guardian, 1 Mar. 2007. 8 Mar. 2010 ‹http://www.guardian.co.uk/technology/2007/mar/01/wikipedia.news›.
Kane, Gerald C., Ann Majchrzak, Jeremiah Johnson, and Lily Chenisern. A Longitudinal Model of Perspective Making and Perspective Taking within Fluid Online Collectives. Paper presented at the 2009 International Conference on Information Systems, Phoenix, AZ, 2009.
Kittur, Aniket, Ed H. Chi, and Bongwon Suh. What's in Wikipedia? Mapping Topics and Conflict Using Socially Annotated Category Structure. Paper presented at the 2009 CHI Annual Conference, Boston, MA.
———, and Robert E. Kraut. Harnessing the Wisdom of Crowds in Wikipedia: Quality through Collaboration. Paper presented at the 2008 Association for Computing Machinery's Computer Supported Cooperative Work Annual Conference, San Diego, CA.
Konieczny, Piotr. "Governance, Organization, and Democracy on the Internet: The Iron Law and the Evolution of Wikipedia." Sociological Forum 24 (2009): 162-191.
———. "Wikipedia: Community or Social Movement?" Interface: A Journal for and about Social Movements 1 (2009): 212-232.
Langlois, Ganaele, and Greg Elmer. "Wikipedia Leeches? The Promotion of Traffic through a Collaborative Web Format." New Media & Society 11 (2009): 773-794.
Lih, Andrew. The Wikipedia Revolution. New York, NY: Hyperion, 2009.
McHenry, Robert. "The Real Bias in Wikipedia: A Response to David Shariatmadari." OpenDemocracy.com 2006. 8 Mar. 2010 ‹http://www.opendemocracy.net/media-edemocracy/wikipedia_bias_3621.jsp›.
Middleton, Chris. "The World of Wikinomics." Computer Weekly, 20 Jan. 2009: 22-26.
Oreg, Shaul, and Oded Nov. "Exploring Motivations for Contributing to Open Source Initiatives: The Roles of Contribution, Context and Personal Values." Computers in Human Behavior 24 (2008): 2055-2073.
Osterloh, Margit, and Sandra Rota. "Trust and Community in Open Source Software Production." Analyse & Kritik 26 (2004): 279-301.
Royal, Cindy, and Deepina Kapila. "What's on Wikipedia, and What's Not…?: Assessing Completeness of Information." Social Science Computer Review 27 (2008): 138-148.
Santana, Adele, and Donna J. Wood. "Transparency and Social Responsibility Issues for Wikipedia." Ethics of Information Technology 11 (2009): 133-144.
Schroer, Joachim, and Guido Hertel. "Voluntary Engagement in an Open Web-Based Encyclopedia: Wikipedians and Why They Do It." Media Psychology 12 (2009): 96-120.
Scott, John. Social Network Analysis. London: Sage, 1991.
Vega-Redondo, Fernando. Complex Social Networks. Cambridge: Cambridge UP, 2007.
Viegas, Fernanda B., Martin Wattenberg, and Matthew M. McKeon. "The Hidden Order of Wikipedia." Online Communities and Social Computing (2007): 445-454.
Watts, Duncan. Six Degrees: The Science of a Connected Age. New York, NY: W. W. Norton & Company, 2003.
Wikipedia. "About." n.d. 8 Mar. 2010 ‹http://en.wikipedia.org/wiki/Wikipedia:About›.
———. "Welcome to Wikipedia." n.d. 8 Mar. 2010 ‹http://en.wikipedia.org/wiki/Main_Page›.
———. "WikiProject: Countering Systemic Bias." n.d. 12 Feb. 2010 ‹http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Countering_systemic_bias#Members›.
Zittrain, Jonathan. The Future of the Internet and How to Stop It. New Haven, CT: Yale UP, 2008.
APA, Harvard, Vancouver, ISO and other styles
22

Moore, Christopher Luke. "Digital Games Distribution: The Presence of the Past and the Future of Obsolescence". M/C Journal 12, no. 3 (15 July 2009). http://dx.doi.org/10.5204/mcj.166.

Full text
Abstract (summary):
A common criticism of the rhythm video game genre, including series like Guitar Hero and Rock Band, is that playing musical simulation games is a waste of time when you could be playing an actual guitar and learning a real skill. A more serious criticism of games cultures draws attention to the degree of e-waste they produce. E-waste, or electronic waste, includes mobile phones, computers, televisions and other electronic devices containing toxic chemicals and metals, whose landfill, recycling and salvaging all produce distinct environmental and social problems. The e-waste produced by games like Guitar Hero is obvious in the regular flow of merchandise transforming computer and video games stores into simulation music stores, filled with replica guitars, drum kits, microphones and other products whose half-lives are short and whose obsolescence is anticipated in the annual cycles of consumption and disposal.

This paper explores the connection between e-waste and obsolescence in the games industry, and argues for the further consideration of consumers as part of the solution to the problem of e-waste. It uses a case study of the PC digital distribution software platform, Steam, to suggest that the digital distribution of games may offer an alternative model to market-driven software and hardware obsolescence, and more generally, that such software platforms might be a place to support cultures of consumption that delay rather than promote hardware obsolescence and its inevitability as e-waste. The question is whether there exists a potential for digital distribution to be a means of not only eliminating the need to physically transport commodities (its current 'green' benefit), but also of supporting consumer practices that further reduce e-waste.

The games industry relies on a rapid production and innovation cycle, one that actively enforces hardware obsolescence. Current video game consoles, including the PlayStation 3, the Xbox 360 and the Nintendo Wii, are the seventh generation of home gaming consoles to appear within forty years, and each generation is accompanied by an immense international transportation of games hardware, software (in various storage formats) and peripherals. Obsolescence also occurs at the software or content level and is significant because the games industry, as a creative industry, is dependent on the extensive management of multiple intellectual properties. The computing and video games software industry operates in close partnership with the hardware industry, and as such, software obsolescence directly contributes to hardware obsolescence. The obsolescence of content, and the redundancy of the methods of policing its scarcity in the marketplace, has been accelerated and altered by the processes of disintermediation, with a range of outcomes (Flew). The music industry is perhaps the most advanced in terms of disintermediation, with digital distribution at the center of the conflict between legitimate and unauthorised access to intellectual property. This points to one issue with the hypothesis that digital distribution can lead to a reduction in hardware obsolescence: the marketplace leader and key online distributor of music, Apple, is also the major producer of new media technologies and devices that are the paragon of stylistic obsolescence. Stylistic obsolescence, in which fashion changes products across seasons of consumption, has long been observed as the dominant form of scaled industrial innovation (Slade).
Stylistic obsolescence is differentiated from mechanical or technological obsolescence, the deliberate supersedence of products by more advanced designs, better production techniques and other minor innovations. The line between stylistic and technological obsolescence is not always clear, especially as reduced durability has become a powerful market strategy (Fitzpatrick). This occurs where the design of technologies is subsumed within the discourses of manufacturing, consumption and the logic of planned obsolescence, in which the product or parts are intended to fail, degrade or underperform over time. It is especially the case with signature new media technologies such as laptop computers, mobile phones and portable games devices.

Gamers are as guilty as other consumer groups in contributing to e-waste as participants in the industry's cycles of planned obsolescence, but some of them complicate discussions over the future of obsolescence and e-waste. Many gamers actively work to forestall the obsolescence of their games: they invest time in the play of older games ("retrogaming"); they donate labor and creative energy to the production of user-generated content as a means of sustaining involvement in gaming communities; and they produce entirely new game experiences for other users, based on existing software and hardware modifications known as 'mods'. With Guitar Hero and other 'rhythm' games it would be easy to argue that the hardware components of this genre have only one future: as waste. Alternatively, we could consider the actual lifespan of these objects (including their impact as e-waste) and the roles they play in the performances and practices of communities of gamers. For example, the Elmo Guitar Hero controller mod, the Tesla coil Guitar Hero controller interface, the Rock Band Speak n' Spellbinder mashup, the multiple and almost sacrilegious Fender Guitar Hero mods, the Guitar Hero Portable Turntable Mod and MAKE magazine's Trumpet Hero all indicate a significant diversity of user innovation, community formation and individual investment in the post-retail life of computer and video game hardware.

Obsolescence is not just a problem for the games industry but for the computing and electronics industries more broadly, as direct contributors to the social and environmental cost of electrical waste and obsolete electrical equipment. Planned obsolescence has long been the experience of gamers and computer users, as the basis of a utopian mythology of upgrades (Dovey and Kennedy). For PC users the upgrade pathway is traversed by the consumption of further hardware and software after the initial purchase, in a cycle of endless consumption, acquisition and waste (as older parts are replaced and eventually discarded). The accumulation and disposal of these cultural artefacts do not devalue or accrue in space or time at the same rate (Straw), and many users will persist for years, gradually upgrading, delaying obsolescence and even perpetuating the circulation of older cultural commodities. Flea markets and secondhand fairs are popular sites for the purchase of new, recent, old, and recycled computer hardware and peripherals. Such practices and parallel markets support the strategies of 'making do' described by De Certeau, but they also continue the cycle of upgrade and obsolescence, and they are still consumed as part of the promise of the 'new' and the desire for a purchase that will finally 'fix' the user's computer in a state of completion (29).
The planned obsolescence of new media technologies is common, but its success is mixed; for example, support for Microsoft's operating system Windows XP was officially withdrawn in April 2009 (Robinson), but due to the popularity of low-cost PC 'netbooks' outfitted with an optimised XP operating system, and a less than enthusiastic response to the 'next generation' Windows Vista, XP continues to be popular.

Digital Distribution: A Solution?

Gamers may be able to reduce the accumulation of e-waste by supporting the disintermediation of the games retail sector by means of online distribution. Disintermediation is the establishment of a direct relationship between the creators of content and their consumers through products and services offered by content producers (Flew 201). The move to digital distribution has already begun to reduce the need to physically handle commodities, but this currently signals only further support of planned, stylistic and technological obsolescence, increasing the rate at which the commodities for recording, storing, distributing and exhibiting digital content become e-waste. Digital distribution is sometimes overlooked as a potential means for promoting communities of user practice dedicated to e-waste reduction; at the same time, it is actively employed to reduce the potential for the unregulated appropriation of content and to restrict post-purchase sales through Digital Rights Management (DRM) technologies. Distributors like Amazon.com continue to pursue commercial opportunities in linking the user to digital distribution of content via exclusive hardware and software technologies. The Amazon e-book reader, the Kindle, operates via a proprietary mobile network using a commercially run version of the wireless 3G protocols. The e-book reader is heavily encrypted with Digital Rights Management (DRM) technologies and exclusive digital book formats designed to enforce current copyright restrictions and eliminate second-hand sales, lending, and further post-purchase distribution.

The success of this mode of distribution is connected to Amazon's ability to tap both the mainstream market and the consumer demand for the less-than-popular: those books, movies, music and television series that may not have been 'hits' at the time of release. The desire to revisit forgotten niches, such as B-sides, comics, books, and older video games, is, Chris Anderson suggests, linked with so-called "long tail" economics. Recently Webb has queried the economic impact of the Long Tail as a business strategy, but does not deny the underlying dynamics, which suggest that content does not obsolesce in any straightforward way. Niche markets for older content are nourished by participatory cultures and Web 2.0 style online services. A good example of the Long Tail phenomenon is the recent case of the 1971 book A Lion Called Christian, by Anthony Burke and John Rendall, republished after the authors' film of a visit to a resettled Christian in Africa was popularised on YouTube in 2008. Anderson's Long Tail theory suggests that over time a large number of items, each with unique rather than mass histories, will be subsumed as part of a larger community of consumers, including fans, collectors and everyday users with a long term interest in their use and preservation.
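The quantitative intuition behind the Long Tail can be illustrated with a toy calculation. The Zipf-style demand curve below is an assumption for illustration only, not Anderson's or Webb's data:

```python
# A rough numeric illustration of the long-tail claim under a Zipf-like
# demand distribution (an assumed model, not empirical sales figures).
import numpy as np

ranks = np.arange(1, 100_001)   # a 100,000-item digital catalogue
demand = 1.0 / ranks            # Zipf exponent of 1

share_head = demand[:100].sum() / demand.sum()   # the top 100 "hits"
share_tail = demand[1000:].sum() / demand.sum()  # everything past rank 1,000

print(f"Top 100 items: {share_head:.1%} of demand")
print(f"Items ranked beyond 1,000: {share_tail:.1%} of demand")
# Under this assumption the aggregated niche catalogue rivals the hits,
# which is why 'non-hit' content need not obsolesce in a digital store.
```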
If digital distribution platforms can reduce e-waste, they can perhaps be fostered by ensuring that digital consumers are able to make morally and ethically aware consumption decisions, but also that they enjoy traditional consumer freedoms, such as the right to sell on and change or modify their property. For it is not only the fixation on the 'next generation' that contributes to obsolescence, but also technologies like DRM systems that discourage second-hand sales and restrict modification. The legislative upgrades, patches and amendments to copyright law that have attempted to maintain the law's effectiveness in competing with peer-to-peer networks have supported DRM and other intellectual property enforcement technologies, despite the difficulties that owners of intellectual property have encountered with the effectiveness of DRM systems (Moore, "Creative"). The games industry continues to experiment with DRM; however, this industry also stands out as one of the few to have significantly incorporated the user within its official modes of production (Moore, "Commonising"). Is the games industry capable of, or willing to support, a digital delivery system that attempts to minimise or even reverse software and hardware obsolescence? We can try to answer this question by looking in detail at the biggest digital distributor of PC games, Steam.

Steam

Figure 1: The Steam application user interface, retail section

Steam is a digital distribution system designed for the Microsoft Windows operating system and operated by the American video game developer and publisher Valve Corporation. Steam combines online games retail, DRM technologies and internet-based distribution services with social networking and multiplayer features (in-game voice and text chat, user profiles, etc.) and direct support for major games publishers, independent producers, and communities of user-contributors (modders). Steam, like the iTunes games store, Xbox Live and other digital distributors, provides consumers with direct digital downloads of new, recent and classic titles that can be accessed remotely by the user from any (internet-equipped) location. Steam was first packaged with the physical distribution of Half-Life 2 in 2004, and the platform's eventual popularity is tied to the success of that game franchise. Steam was not an optional component of the game's installation, and many gamers protested in various online forums, while the platform was treated with suspicion by the global PC games press. It did not help that Steam was at launch everything that gamers object to: a persistent and initially 'buggy' piece of software that sits in the PC's operating system and occupies limited memory resources at the cost of hardware performance. Regular updates to the Steam software platform introduced social network features just as mainstream sites like MySpace and Facebook were emerging, and its popularity has undergone rapid subsequent growth. Steam now eclipses competitors with more than 20 million user accounts (Leahy), and Valve Corporation makes it publicly known that Steam collects large amounts of data about its users. This information is available via the public player profile in the community section of the Steam application. It includes the average number of hours the user plays per week, and can even indicate the difficulty the user has in navigating game obstacles.
Valve reports on the number of users on Steam every two hours via its web site, with a population on average between one and two million simultaneous users (Valve, "Steam"). We know these users' hardware profiles because Valve Corporation makes the results of its surveillance public knowledge via the Steam Hardware Survey. Valve's hardware survey itself conceptualises obsolescence in two ways. First, it uses the results to define the 'cutting edge' of PC technologies, publishing the standards of its own high-end production hardware on the company's blog. Second, the effect of the Survey is to subsequently define obsolescent hardware: for example, in the Survey results for April 2009, we can see that a slight majority of users maintain computers with two central processing units, while a significant proportion (almost one third) of users still maintain much older PCs with a single CPU. Both effects of the Survey appear to be well understood by Valve:

the Steam Hardware Survey automatically collects information about the community's computer hardware configurations and presents an aggregate picture of the stats on our web site. The survey helps us make better engineering and gameplay decisions, because it makes sure we're targeting machines our customers actually use, rather than measuring only against the hardware we've got in the office. We often get asked about the configuration of the machines we build around the office to do both game and Steam development. We also tend to turn over machines in the office pretty rapidly, at roughly every 18 months. (Valve, "Team Fortress")

Valve's support of older hardware might counter perceptions that older PCs have no use and begins to reverse decades of opinion regarding planned and stylistic obsolescence in the PC hardware and software industries.
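The aggregation the survey performs is simple in principle; the sketch below mimics it over invented opt-in records. The field names and figures are placeholders, not Valve's actual schema or published results:

```python
# A sketch of hardware-survey style aggregation over hypothetical records.
from collections import Counter

# Hypothetical opt-in survey records: one dict per surveyed machine.
survey = [
    {"cpus": 2, "os": "Windows Vista"},
    {"cpus": 1, "os": "Windows XP"},
    {"cpus": 2, "os": "Windows XP"},
    {"cpus": 1, "os": "Windows XP"},
]

total = len(survey)
cpu_share = Counter(record["cpus"] for record in survey)

for cpus, n in sorted(cpu_share.items()):
    print(f"{cpus} CPU(s): {n / total:.0%} of surveyed machines")
# An aggregate picture like this is what lets a distributor target the
# hardware its customers actually use, and what defines the 'cutting edge'
# (and, by omission, the obsolescent) at any given moment.
```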
Equally significant to the extension of the lives of older PCs is Steam's support for mods and its promotion of user-generated content. By providing software for mod creation and distribution, Steam maximises what Postigo calls the development potential of fan-programmers. One of the 'payoffs' in the information/access exchange for the user with Steam is the degree to which Valve's End-User Licence Agreement (EULA) permits individuals and communities of 'modders' to appropriate its proprietary game content for use in the creation of new games and games materials for redistribution via Steam. These mods extend the play of the older games by requiring their purchase via Steam in order for the individual user to participate in the modded experience. If Steam is able to encourage this kind of appropriation and community support for older content, then the potential exists for it to support cultures of consumption and practices of use that collaboratively maintain, extend, and prolong the life and use of games. Further, Steam incorporates the insights of "long tail" economics in a purely digital distribution model, in which the obsolescence of 'non-hit' game titles can be dramatically overturned. Published in November 2007, Unreal Tournament 3 (UT3) by Epic Games was unappreciated in a market saturated with games in the first-person shooter genre. Epic republished UT3 on Steam 18 months later, making the game available to play for free for one weekend, followed by discounted access to new content. The 2,000 per cent increase in players over the game's 'free' trial weekend has translated into enough sales of the game for Epic to no longer consider the release a commercial failure:

It's an incredible precedent to set: making a game a success almost 18 months after a poor launch. It's something that could only have happened now, and with a system like Steam... Something that silently updates a purchase with patches and extra content automatically, so you don't have to make the decision to seek out some exciting new feature: it's just there anyway. Something that, if you don't already own it, advertises that game to you at an agreeably reduced price whenever it loads. Something that enjoys a vast community who are in turn plugged into a sea of smaller relevant communities. It's incredibly sinister. It's also incredibly exciting... (Meer)

Clearly concerns exist about Steam's user privacy policy, but this also invites us to think about the economic relationship between gamers and games companies as it is reconfigured through the private contractual relationship established by the EULA which accompanies the digital distribution model. The games industry has established contractual and licensing arrangements with its consumer base in order to support and reincorporate emerging trends in user-generated cultures and other cultural formations within its official modes of production (Moore, "Commonising"). When we consider that Valve gets to tax sales of its virtual goods and can further sell the information farmed from its users to hardware manufacturers, it is reasonable to consider the relationship between the corporation and its gamers as exploitative. Gabe Newell, the Valve co-founder and managing director, conversely believes that people are willing to give up personal information if they feel it is being used to get better services (Leahy). If that sentiment is correct, then consumers may be willing to trade further for services that can reduce obsolescence and begin to address the problems of e-waste from the ground up.

Conclusion

Clearly, there is a potential for digital distribution to be a means of not only eliminating the need to physically transport commodities but also supporting consumer practices that further reduce e-waste. For an industry where only a small proportion of the games made break even, the successful relaunch of older games content indicates Steam's capacity to ameliorate software obsolescence. Digital distribution extends the use of commercially released games by providing disintermediated access to older and user-generated content. For Valve, this occurs within a network of exchange, as access to user-generated content, social networking services, and support for the organisation and coordination of communities of gamers is traded for user information and repeat business. Evidence for whether this will actively translate to an equivalent decrease in the obsolescence of game hardware might be observed with indicators like the Steam Hardware Survey in the future. The degree of potential offered by digital distribution is disrupted by a range of technical, commercial and legal hurdles, chief among which is the deployment of DRM as part of a range of techniques designed to limit consumer behaviour post-purchase.
While intervention in the form of legislation and radical change to the insidious nature of electronics production is crucial in order to achieve long-term reduction in e-waste, the user is currently considered only in terms of 'ethical' consumption and is ultimately divested of responsibility through participation in corporate, state and civil recycling and e-waste management operations. The message is either 'careful what you purchase' or 'careful how you throw it away' and, like DRM, it ignores the connections between product, producer and user, and the consumer support for environmentally, ethically and socially positive production, distribution, disposal and recycling. This article has adopted a different strategy, one that sees digital distribution platforms like Steam as capable, if not currently active, in supporting community practices that should be seriously considered in conjunction with a range of approaches to the challenge of obsolescence and e-waste.

References

Anderson, Chris. "The Long Tail." Wired Magazine 12.10 (2004). 20 Apr. 2009 ‹http://www.wired.com/wired/archive/12.10/tail.html›.
De Certeau, Michel. The Practice of Everyday Life. Berkeley: U of California P, 1984.
Dovey, Jon, and Helen Kennedy. Game Cultures: Computer Games as New Media. London: Open University Press, 2006.
Fitzpatrick, Kathleen. The Anxiety of Obsolescence. Nashville: Vanderbilt UP, 2008.
Flew, Terry. New Media: An Introduction. South Melbourne: Oxford UP, 2008.
Leahy, Brian. "Live Blog: DICE 2009 Keynote - Gabe Newell, Valve Software." The Feed. G4TV 18 Feb. 2009. 16 Apr. 2009 ‹http://g4tv.com/thefeed/blog/post/693342/Live-Blog-DICE-2009-Keynote-–-Gabe-Newell-Valve-Software.html›.
Meer, Alec. "Unreal Tournament 3 and the New Lazarus Effect." Rock, Paper, Shotgun 16 Mar. 2009. 24 Apr. 2009 ‹http://www.rockpapershotgun.com/2009/03/16/unreal-tournament-3-and-the-new-lazarus-effect/›.
Moore, Christopher. "Commonising the Enclosure: Online Games and Reforming Intellectual Property Regimes." Australian Journal of Emerging Technologies and Society 3.2 (2005). 12 Apr. 2009 ‹http://www.swin.edu.au/sbs/ajets/journal/issue5-V3N2/abstract_moore.htm›.
———. "Creative Choices: Changes to Australian Copyright Law and the Future of the Public Domain." Media International Australia 114 (Feb. 2005): 71-83.
Postigo, Hector. "Of Mods and Modders: Chasing Down the Value of Fan-Based Digital Game Modification." Games and Culture 2 (2007): 300-13.
Robinson, Daniel. "Windows XP Support Runs Out Next Week." PC Business Authority 8 Apr. 2009. 16 Apr. 2009 ‹http://www.pcauthority.com.au/News/142013,windows-xp-support-runs-out-next-week.aspx›.
Slade, Giles. Made to Break: Technology and Obsolescence in America. Cambridge: Harvard UP, 2006.
Straw, Will. "Exhausted Commodities: The Material Culture of Music." Canadian Journal of Communication 25.1 (2000): 175.
Valve. "Steam and Game Stats." 26 Apr. 2009 ‹http://store.steampowered.com/stats/›.
———. "Team Fortress 2: The Scout Update." Steam Marketing Message 20 Feb. 2009. 12 Apr. 2009 ‹http://storefront.steampowered.com/Steam/Marketing/message/2269/›.
Webb, Richard. "Online Shopping and the Harry Potter Effect." New Scientist 2687 (2008): 52-55. 16 Apr. 2009 ‹http://www.newscientist.com/article/mg20026873.300-online-shopping-and-the-harry-potter-effect.html?page=2›.

With thanks to Dr Nicola Evans and Dr Frances Steel for their feedback and comments on drafts of this paper.
APA, Harvard, Vancouver, ISO and other styles