Academic literature on the topic 'Information update system'

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Information update system.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Information update system"

1

Doncel, Josu. "Age of information of a server with energy requirements." PeerJ Computer Science 7 (March 1, 2021): e354. http://dx.doi.org/10.7717/peerj-cs.354.

Full text
Abstract:
We investigate a system with Poisson arrivals to two queues. One queue stores the status updates of the process of interest (or data packets) and the other handles the energy that is required to deliver the updates to the monitor. We consider that the energy is represented by packets of discrete unit. When an update ends service, it is sent to the energy queue and, if the energy queue has one packet, the update is delivered successfully and the energy packet disappears; however, in case the energy queue is empty, the update is lost. Both queues can handle, at most, one packet and the service time of updates is exponentially distributed. Using the Stochastic Hybrid System method, we characterize the average Age of Information of this system. Due to the difficulty of the derived expression, we also explore approximations of the average Age of Information of this system.
APA, Harvard, Vancouver, ISO, and other styles
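Doncel's model is concrete enough to sanity-check numerically. The sketch below is a Monte Carlo reading of the abstract (unit-capacity update and energy queues, exponential service, a delivery consuming one energy packet), not the paper's Stochastic Hybrid System derivation; the parameter names `lam`, `eta`, `mu` and the drop-when-busy arrival rule are assumptions.

```python
import random

def simulate_aoi(lam=1.0, eta=1.0, mu=1.0, horizon=20000.0, seed=1):
    """Estimate the time-average Age of Information of a single-server
    status-update system gated by a unit-capacity energy queue."""
    rng = random.Random(seed)
    t = 0.0
    gen_in_service = None      # generation time of the update in service
    energy = False             # whether the energy queue holds a packet
    last_delivered_gen = 0.0   # generation time of the freshest delivered update
    area = 0.0                 # integral of the age curve

    while t < horizon:
        rates = (lam, eta, mu if gen_in_service is not None else 0.0)
        total = rates[0] + rates[1] + rates[2]
        dt = rng.expovariate(total)
        age = t - last_delivered_gen
        area += age * dt + 0.5 * dt * dt   # age grows linearly between events
        t += dt
        u = rng.random() * total
        if u < rates[0]:                   # update arrival (dropped if busy)
            if gen_in_service is None:
                gen_in_service = t
        elif u < rates[0] + rates[1]:      # energy packet arrival
            energy = True
        else:                              # service completion
            if energy:                     # delivered: age resets
                last_delivered_gen = gen_in_service
                energy = False
            gen_in_service = None          # lost if the battery was empty
    return area / t

avg_aoi = simulate_aoi()
```

Starving the energy queue (small `eta`) or the update stream (small `lam`) visibly inflates the estimate, which matches the abstract's coupling of freshness and energy.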
2

Chen, Guang Ming, and Xiao Wu Li. "Update Semantics in Communicated Information System." Advanced Materials Research 403-408 (November 2011): 1460–65. http://dx.doi.org/10.4028/www.scientific.net/amr.403-408.1460.

Full text
Abstract:
An approach called Communicated Information Systems is introduced to describe the information available to a number of agents and to specify information communication among them. The systems extend classical propositional logic to a multi-agent context, providing a way in which not only an agent's own information but also information from other agents may be applied to the agent's reasoning. Communication rules, defined in their most essential form, can be regarded as the basis for characterizing some interesting cognitive properties of agents. Since appropriate communication rules can be chosen for different applications, the approach is a general-purpose one. The paper also proves the soundness and completeness of Communicated Information Systems with respect to the update semantics.
APA, Harvard, Vancouver, ISO, and other styles
3

LIU, REY-LONG. "ADAPTIVE AGENTS FOR EFFECTIVE INFORMATION MONITORING." International Journal of Cooperative Information Systems 12, no. 01 (March 2003): 37–60. http://dx.doi.org/10.1142/s0218843003000693.

Full text
Abstract:
Information monitoring is an essential basis for management and decision making in various domains. Once an information update is detected, suitable procedures may be triggered to identify potential problems and opportunities. In this paper, we define effective information monitoring (EIM) and explore how adaptive agents may cooperate with each other to achieve their collective goal of EIM. Since each information item may be updated by various entities (e.g., information servers) at any time, EIM calls for a multiagent system that detects more information updates in a timely manner using a controlled amount of system resources (e.g., the loading of related information servers and the intranet). To achieve EIM, the agents should adapt themselves to the dynamically changing behaviors of the information items being monitored. They learn to issue requests for system resources and to concede to those agents that are more likely to detect updates at the time of negotiation. The framework is theoretically and empirically evaluated, and its potential applications to management by exception are identified and discussed.
APA, Harvard, Vancouver, ISO, and other styles
4

Doncel, Josu. "Age of Information of Parallel Server Systems with Energy Harvesting." Entropy 23, no. 11 (November 21, 2021): 1549. http://dx.doi.org/10.3390/e23111549.

Full text
Abstract:
Motivated by current communication networks in which users can choose different transmission channels to operate and also by the recent growth of renewable energy sources, we study the average Age of Information of a status update system that is formed by two parallel homogeneous servers and such that there is an energy source that feeds the system following a random process. An update, after getting service, is delivered to the monitor if there is energy in a battery. However, if the battery is empty, the status update is lost. We allow preemption of updates in service and we assume Poisson generation times of status updates and exponential service times. We show that the average Age of Information can be characterized by solving a system with eight linear equations. Then, we show that, when the arrival rate to both servers is large, the average Age of Information is one divided by the sum of the service rates of the servers. We also perform a numerical analysis to compare the performance of our model with that of a single server with energy harvesting and to study in detail the aforementioned convergence result.
APA, Harvard, Vancouver, ISO, and other styles
5

TAKAMA, Yasufumi, and Takeshi KUROSAWA. "Visualization System for Monitoring Bug Update Information." IEICE Transactions on Information and Systems E97.D, no. 4 (2014): 654–62. http://dx.doi.org/10.1587/transinf.e97.d.654.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Gardner, Reed M., T. Allan Pryor, and Homer R. Warner. "The HELP hospital information system: update 1998." International Journal of Medical Informatics 54, no. 3 (June 1999): 169–82. http://dx.doi.org/10.1016/s1386-5056(99)00013-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Yüksel, B., and A. Yilmaz. "GEOSPATIAL DATABASE UPDATING SYSTEM with WMS and DIRECT CONNECTION METHOD." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-4 (September 19, 2018): 737–43. http://dx.doi.org/10.5194/isprs-archives-xlii-4-737-2018.

Full text
Abstract:
The demand for up-to-date geospatial data is on an upward trend, which requires continuous and sustainable review and updating of both the geospatial data itself and the database design. The Turkish Topographic Vector Database (TOPOVT) is the main database of Turkey, consisting of 1:25,000 and larger-scale 3D topographic vector data covering the whole country; it also contains contours and geographical names. TOPOVT has been, and is still being, produced by the General Command of Mapping of Turkey, mainly from 30 cm resolution stereo aerial photos, and is completed in the field. TOPOVT is a seamless, topological database, and the objective is a five-year update cycle. The first vector production cycle of TOPOVT was recently completed. Before the update cycle began, a system was designed to update TOPOVT efficiently and without losing any information content; it also keeps the history of updated features.

With this topographic data update system, geospatial data can be updated rapidly and served to users. This both speeds up the map-printing process and meets the need for up-to-date data in the TOPOVT database. The data to be updated can be displayed on a personal computer through a TOPOVT database connection, and users can add, update, and delete data according to their authorization. All updates executed in the field can be monitored in the TOPOVT database in real time via an Internet connection.
APA, Harvard, Vancouver, ISO, and other styles
8

MASUD, MD MEHEDI, ILUJU KIRINGA, and HASAN URAL. "UPDATE PROCESSING IN INSTANCE-MAPPED P2P DATA SHARING SYSTEMS." International Journal of Cooperative Information Systems 18, no. 03n04 (September 2009): 339–79. http://dx.doi.org/10.1142/s021884300900204x.

Full text
Abstract:
We consider the problem of update processing in a peer-to-peer (P2P) database network where each peer consists of an independently created relational database. We assume that peers store related data, but that the data are heterogeneous with respect to instances and schemas. The differences in schema and data vocabulary are bridged by value correspondences called mapping tables. Peers build an overlay network, called an acquaintance network, in which each peer may become acquainted with any other peer that stores related data. In this setting, updates may be initiated at any peer and are executed over other peers that are acquainted, directly or indirectly, with the update's initiator. An update is executed by translating it, through mapping tables, into a set of updates that are executed against the acquainted peers. We consider both the soundness and the completeness of update translation. When updates are generated and propagated through the network from an initiating peer, a tree called an Update Dependency Tree (UDT) is built dynamically; the UDT depicts the relationships among the component updates generated from the initial update. We also discuss update propagation when a peer is temporarily unavailable or offline. Our propagation mechanism keeps track of a peer when it is unavailable for a certain period of time, and once the peer comes back online, the system propagates the updates destined for the returning peer to keep its database synchronized. Moreover, conflict detection and resolution strategies are proposed for such a dynamic P2P database network. We have implemented and experimentally tested a prototype of our update processing mechanism on a small P2P database network and report the results of our experiments.
APA, Harvard, Vancouver, ISO, and other styles
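The abstract's mapping-table translation step is easy to illustrate. The sketch below is a hypothetical two-peer example (the table contents and the tuple-shaped update are invented for illustration); real mapping tables relate database values, and an untranslatable update is simply not propagated to that peer.

```python
# Peer A vocabulary -> peer B vocabulary (hypothetical correspondences).
mapping_table = {
    "NYC": "New York City",
    "LA": "Los Angeles",
}

def translate_update(update, mapping):
    """Translate a (column, value) update through a mapping table.
    Returns None when no correspondence exists, i.e. the update
    cannot be soundly executed at the acquainted peer."""
    column, value = update
    if value not in mapping:
        return None
    return (column, mapping[value])

assert translate_update(("city", "NYC"), mapping_table) == ("city", "New York City")
assert translate_update(("city", "SF"), mapping_table) is None
```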
9

Zhu, Jin Song, and Jian Hui Wu. "Study on System Reliability Updating through Inspection Information for Existing Cable-Stayed Bridges." Advanced Materials Research 250-253 (May 2011): 2011–15. http://dx.doi.org/10.4028/www.scientific.net/amr.250-253.2011.

Full text
Abstract:
In order to accurately evaluate the reliability of an existing cable-stayed bridge, a method based on inspection information is proposed to update the system reliability. Using Bayesian methods and inspection information, a modified model of the cable-stayed bridge's random variables is established, and the failure probabilities of the bridge's components are updated. The β-T curves, which describe how inspection information affects the system reliability index over the service life, are obtained. The method has been applied to a cable-stayed bridge; the results show that the proposed method is effective for updating the system reliability and can predict the residual life of existing cable-stayed bridges.
APA, Harvard, Vancouver, ISO, and other styles
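The Bayesian updating step the abstract describes can be illustrated with the standard conjugate Beta-Binomial calculation (an illustrative stand-in; the paper's random-variable models for bridge components are more involved):

```python
def posterior_failure_probability(a, b, failures, inspections):
    """Posterior mean failure probability after observing `failures`
    in `inspections` trials, starting from a Beta(a, b) prior."""
    a_post = a + failures
    b_post = b + inspections - failures
    return a_post / (a_post + b_post)

# Prior Beta(1, 9): mean failure probability 0.10.
# Twenty inspections with no observed failures pull the estimate down.
updated = posterior_failure_probability(1, 9, failures=0, inspections=20)
assert abs(updated - 1 / 30) < 1e-12
```

The same shrinking (or inflating) of component failure probabilities as inspection data arrive is what drives the updated system reliability index in the paper.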
10

Chen, Junqi, Yong Wang, Miao Ye, Qinghao Zhang, and Wenlong Ke. "A Load-Aware Multistripe Concurrent Update Scheme in Erasure-Coded Storage System." Wireless Communications and Mobile Computing 2022 (May 19, 2022): 1–15. http://dx.doi.org/10.1155/2022/5392474.

Full text
Abstract:
Erasure coding has been widely deployed in today's data centers because it significantly reduces extra storage costs while providing high storage reliability. However, erasure coding introduces more network traffic and computational overhead into the data update process, and improving efficiency while mitigating system imbalance during updates remains a challenging problem. Most existing update schemes for erasure codes focus only on the single-stripe update scenario and ignore the heterogeneity of node and network status, so they cannot adequately address the low update efficiency and load imbalance caused by multistripe concurrent updates. To solve this problem, this paper proposes a Load-Aware Multistripe concurrent Update (LAMU) scheme for erasure-coded storage systems. Notably, LAMU introduces a Software-Defined Network (SDN) mechanism to measure node loads and network status in real time. It selects non-duplicated nodes with better performance in terms of CPU utilization, remaining memory, and I/O load as the computing nodes for multiple update stripes. A multi-attribute decision-making method is then used to schedule the network traffic generated during the update process. This mechanism improves the transmission efficiency of update traffic and lets LAMU adapt to multistripe concurrent update scenarios in heterogeneous network environments. Finally, we designed a prototype system for multistripe concurrent updates. Extensive experimental results show that LAMU improves update efficiency and provides better system load-balancing performance.
APA, Harvard, Vancouver, ISO, and other styles
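The core saving that makes erasure-code updates cheaper than re-encoding can be shown with simple XOR parity (LAMU itself targets general erasure codes and multistripe scheduling; this shows only the single-stripe delta idea):

```python
def xor_bytes(x, y):
    return bytes(a ^ b for a, b in zip(x, y))

def update_parity(old_parity, old_block, new_block):
    """Incremental parity update: new_parity = old_parity XOR
    (old_block XOR new_block), so unchanged blocks need not be read."""
    return xor_bytes(old_parity, xor_bytes(old_block, new_block))

d1, d2 = b"\x01\x02\x03", b"\x10\x20\x30"
parity = xor_bytes(d1, d2)               # full encode of the stripe
d1_new = b"\xaa\x02\x03"                 # one data block changes
parity_new = update_parity(parity, d1, d1_new)
assert parity_new == xor_bytes(d1_new, d2)   # matches a full recompute
```

The delta traffic (old block, new block, parity) is the kind of update traffic that schemes like LAMU route and schedule across nodes.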

Dissertations / Theses on the topic "Information update system"

1

Abrahamsson, David. "Security Enhanced Firmware Update Procedures in Embedded Systems." Thesis, Linköping University, Department of Computer and Information Science, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-16914.

Full text
Abstract:

Many embedded systems are complex, and it is often required that the firmware in these systems be updatable by the end user. For economic and confidentiality reasons, it is important that these systems accept only firmware approved by the firmware producer.

This thesis work focuses on creating a security-enhanced firmware update procedure suitable for use in embedded systems. The common elements of embedded systems are described, and various algorithms are compared as candidates for firmware verification. Patents are used as a basis for the proposed security-enhanced update procedure. We also use attack trees to perform a threat analysis of an update procedure.

The results are a threat analysis of a home office router and a proposed update procedure. The update procedure accepts only approved firmware and prevents reversion to old, vulnerable firmware versions. Firmware verification is performed using the hash function SHA-224 and the digital signature algorithm RSA with a key length of 2048 bits. The selection of algorithms and key lengths mitigates the threat of brute-force and cryptanalysis attacks on the verification algorithms and is believed to be secure through 2030.

APA, Harvard, Vancouver, ISO, and other styles
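The acceptance logic can be sketched with Python's standard library. This is a minimal sketch: a real procedure verifies an RSA-2048 signature over the digest with the producer's public key rather than comparing against a stored hash, and the version counter stands in for the rollback protection the thesis describes.

```python
import hashlib

def accept_firmware(blob, approved_sha224_hex, new_version, current_version):
    """Accept an image only if its SHA-224 digest matches the approved
    digest and its version is strictly newer than the installed one."""
    if hashlib.sha224(blob).hexdigest() != approved_sha224_hex:
        return False   # corrupted or unapproved image
    if new_version <= current_version:
        return False   # reversion to old, vulnerable firmware refused
    return True

image = b"firmware-image-v2"
approved = hashlib.sha224(image).hexdigest()
assert accept_firmware(image, approved, new_version=2, current_version=1)
assert not accept_firmware(image, approved, new_version=1, current_version=1)
```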
2

Law, King Yiu. "Two routing strategies with cost update in integrated automated storage and retrieval system /." View abstract or full-text, 2007. http://library.ust.hk/cgi/db/thesis.pl?IELM%202007%20LAW.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Kuhn, Olivier. "Methodology for knowledge-based engineering template update : focus on decision support and instances update." Phd thesis, Université Claude Bernard - Lyon I, 2010. http://tel.archives-ouvertes.fr/tel-00713174.

Full text
Abstract:
The present Ph.D. thesis addresses the problem of updating knowledge-based engineering templates in product design. The reuse of design knowledge has become a key asset for a company's competitiveness. Knowledge-based engineering templates store best practices and know-how via formulas, rules, scripts, etc. This design knowledge can then be reused by instantiating the template, which creates an instance of the template in a specified context. For complex and large products, such as cars or aircraft, maintaining knowledge-based engineering templates is a challenging task. Several engineers from various disciplines work together and evolve the templates to extend their capabilities or to fix bugs. Furthermore, in some cases, the modifications applied to templates should be forwarded to their instances so that the instances benefit from the changes. These issues slow the large-scale adoption of template technologies within companies. The objective of this work is to propose an approach that supports engineers in template-update tasks. To address these issues, a process supporting template-update tasks is defined, and a framework is proposed that helps design engineers during the update process by providing a decision support system and a strategy for updating template instances. The former is designed to ease collaboration between experts in solving template-related problems. The latter provides a sequence of updates to follow in order to forward the templates' modifications to their instances. This sequence is computed from data extracted from models and templates, which are stored in an ontology designed for this purpose. The ontology represents, and supports inference over, knowledge about templates, products, and their relations, which facilitates the construction of update sequences by providing an efficient overview of relationships, even implicit ones.
APA, Harvard, Vancouver, ISO, and other styles
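One plausible way to realize the "sequence of updates" such a framework computes is a topological order over the dependency relations stored in the ontology. The sketch below uses Kahn's algorithm on a hypothetical graph (the node names and dict encoding are invented; the thesis derives its relations from an ontology, not a dict):

```python
from collections import deque

def update_sequence(depends_on):
    """Order nodes so each appears after everything it depends on.
    `depends_on` maps a node to the list of nodes it depends on."""
    nodes = set(depends_on)
    for deps in depends_on.values():
        nodes.update(deps)
    indegree = {n: len(depends_on.get(n, ())) for n in nodes}
    dependents = {n: [] for n in nodes}
    for n, deps in depends_on.items():
        for d in deps:
            dependents[d].append(n)
    ready = deque(n for n in nodes if indegree[n] == 0)
    order = []
    while ready:
        n = ready.popleft()
        order.append(n)
        for m in dependents[n]:
            indegree[m] -= 1
            if indegree[m] == 0:
                ready.append(m)
    if len(order) != len(nodes):
        raise ValueError("cyclic dependencies: no valid update sequence")
    return order

# Two instances derived from template "T", plus an assembly using both.
seq = update_sequence({"I1": ["T"], "I2": ["T"], "A": ["I1", "I2"]})
assert seq[0] == "T" and seq[-1] == "A"
```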
4

Moore, Jennifer Anne. "Image Integration and On-Screen Digitizing Method of Geographic Information System Update and Maintenance Applied to the Hofmann Forest." NCSU, 2002. http://www.lib.ncsu.edu/theses/available/etd-20020419-104500.

Full text
Abstract:

The Hofmann Forest is a self-sustaining forest that provides the North Carolina State University College of Natural Resources with support for research, education, and extension service. Management of the Hofmann Forest requires historical records, a complete and current resource inventory, and the ability to model future forest conditions. A geographic information system (GIS) database was created for the Hofmann Forest in 1992 to meet these data objectives, but it was not maintained or used regularly, while ongoing forestry research and silvicultural activities constantly change resource conditions on the forest. This research examined a practical and accurate method for keeping the GIS database current. Digital imagery was integrated into the original GIS database, and silvicultural records were used to update the existing data layers. Digital orthophotography, in the form of USGS Digital Orthophoto Quarter-Quads (DOQQs), was the primary source of imagery, but where that imagery was unavailable or lacked sufficient spatial detail, unrectified aerial photographs were scanned, registered, and substituted. For the vegetation data layer, spatial and attribute updates were completed and evaluated for silvicultural operations covering over three thousand acres. Some updates involved only attribute changes. Spatial updates were completed with the digital orthophotography or digital aerial photographs; some involved fairly simple spatial editing, others more complex editing, and the updates that required digital aerial photographs were all spatially complex edits. Acreage estimates accompanied the silvicultural records, and GIS-derived area measurements were compared with them. There was no significant difference between the two measures of area, although some discrepancies were present.
A series of comparison tests was designed and performed to identify potential sources of the area discrepancies: the spatial complexity of the editing procedure, the source of digital imagery, and the size of the updated vegetation polygons. The degree of spatial complexity did not significantly contribute to area discrepancies, nor did the choice between DOQQs and digital aerial photographs. Polygon size was significantly negatively correlated with the discrepancies, showing that small absolute differences in area in small polygons produce large relative discrepancies. Differentially corrected global positioning system (GPS) data were used to assess the horizontal positional accuracy of the GIS data layers. Following National Map Accuracy Standard (NMAS) guidelines, a sample of 25 "well-defined" locations was collected using a Trimble GPS Pathfinder ProXR receiver with real-time differential correction. The same locations were identified on the Roads layer of the GIS database, the DOQQs, and the digital aerial photography. Root mean-square error (RMSE) was calculated for each data layer, using the GPS data as reference locations. Only the DOQQ-derived points met the NMAS Class 2 horizontal positional accuracy standard; RMSE for the aerial photography and the Roads layer exceeded the limiting RMSE for the NMAS Class 3 standard. Based on these results, DOQQs possess greater horizontal accuracy than the digital aerial photography and are the preferred imagery source for on-screen digitizing. Should greater resolution be required for a database update, orthorectification of the digital aerial photography could correct its horizontal positional errors.
Software packages have recently become available to orthorectify aerial photographs effectively and affordably. The presence of extraneous features in the vegetation layer almost certainly contributes to the area discrepancies: features such as windrows, fire ponds, and logging decks are included in vegetation polygon areas but not in the silvicultural records' area estimates. Future database improvements should consider subtracting these features (and their associated areas) from the vegetation layer and creating a separate database layer for each feature type. A methodology report was developed to accompany the GIS database as a reference for future updating. Continuous maintenance of the Hofmann Forest GIS database is necessary to provide timely information for on-site forest managers and research activities, and to preserve a record of forest conditions useful for present and future management decisions. On-screen digitizing with integrated digital imagery proved to be a feasible method for updating and maintaining the Hofmann Forest GIS database.

APA, Harvard, Vancouver, ISO, and other styles
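The RMSE comparison against GPS reference points is a short calculation (the coordinate values below are invented; the NMAS thresholds the thesis applies depend on map scale and accuracy class):

```python
import math

def horizontal_rmse(reference_pts, test_pts):
    """Root mean square of 2-D distances between GPS reference
    locations and the corresponding digitized points."""
    sq = [(xr - xt) ** 2 + (yr - yt) ** 2
          for (xr, yr), (xt, yt) in zip(reference_pts, test_pts)]
    return math.sqrt(sum(sq) / len(sq))

gps = [(0.0, 0.0), (10.0, 10.0)]
digitized = [(3.0, 4.0), (10.0, 10.0)]   # first point is 5 units off
assert abs(horizontal_rmse(gps, digitized) - math.sqrt(12.5)) < 1e-12
```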
5

Sommerlot, Andrew Richard. "Coupling Physical and Machine Learning Models with High Resolution Information Transfer and Rapid Update Frameworks for Environmental Applications." Diss., Virginia Tech, 2017. http://hdl.handle.net/10919/89893.

Full text
Abstract:
Few current modeling tools are designed to predict short-term, high-risk runoff from critical source areas (CSAs) in watersheds, which are significant sources of non-point source (NPS) pollution. This study couples the Soil and Water Assessment Tool-Variable Source Area (SWAT-VSA) model with the Climate Forecast System Reanalysis (CFSR) model and the Global Forecast System (GFS) short-term weather forecast to develop a CSA prediction tool that helps producers, landowners, and planners identify high-risk areas generating storm runoff and pollution. Short-term predictions of streamflow, runoff probability, and soil moisture were estimated in the South Fork of the Shenandoah River watershed in Virginia. To give land managers access to the CSA predictions, a web application based on free and open-source software was developed. The forecast system consists of three primary components: (1) the model, which preprocesses the necessary hydrologic forcings, runs the watershed model, and outputs spatially distributed VSA forecasts; (2) a data management structure, which converts high-resolution rasters into overlay web map tiles; and (3) the user interface, a web page that allows the user to interact with the processed output. The resulting framework satisfied most design requirements with free and open-source software and scored better than similar tools in usability metrics. One potential problem is that the CSA model, which uses physically based modeling techniques, requires significant computational time to execute and process. As an alternative, a deep learning (DL) model was therefore developed and trained on the process-based model's output. The DL model yielded a 9% increase in predictive power and a ten-fold decrease in run time compared with the physically based model.
Additionally, DL interpretation methods applicable beyond this study are described, including hidden-layer visualization and the extraction of equations describing a quantifiable amount of the variance in hidden-layer values. Finally, a large-scale analysis of soil phosphorus (P) levels was conducted in the Chesapeake Bay watershed, a current location of several short-term forecast tools. Using Bayesian inference, 31 years of county-scale soil P history were estimated, with the associated uncertainty for each estimate. These data will assist in the planning and implementation of short-term forecast tools with P management goals. The short-term modeling and communication tools developed in this work help fill a gap in scientific tools aimed at improving water quality by informing land managers' decisions.
APA, Harvard, Vancouver, ISO, and other styles
6

Bedewy, Ahmed M. "OPTIMIZING DATA FRESHNESS IN INFORMATION UPDATE SYSTEMS." The Ohio State University, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=osu1618573325086709.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Sang, Yu. "INFORMATION-UPDATE SYSTEMS: MODELS, ALGORITHMS, AND ANALYSIS." Diss., Temple University Libraries, 2019. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/576162.

Full text
Abstract:
Age of information (AoI) has been proposed as a new metric to measure the staleness of data. For time-sensitive information, it is critical to keep the AoI at a low level. A great deal of work has been done on the analysis and optimization of AoI in information-update systems. Prior studies on AoI optimization often consider a push model, which is concerned with when and how to "push" (i.e., generate and transmit) the updated information to the user. In stark contrast, we introduce a new pull model, which is more relevant for certain applications (such as a real-time stock quotes service), where a user sends requests to the servers to proactively "pull" the information of interest. Moreover, we propose to employ request replication to reduce the AoI. Interestingly, we find that under this new pull model, replication schemes capture a novel tradeoff between different levels of information freshness and different response times across the servers, which can be exploited to minimize the expected AoI at the user's side. Specifically, assuming a Poisson updating process at the servers and exponentially distributed response times with known expectation, we derive a closed-form formula for computing the expected AoI and obtain the optimal number of responses to wait for to minimize the expected AoI. Then, we extend our analysis to the setting where the user aims to maximize the utility, an exponential function of the negative AoI that represents the user's satisfaction with the timeliness of the received information. We similarly derive a closed-form formula for the expected utility and find the optimal number of responses to wait for. Further, we consider a more realistic scenario where the updating rate and the mean response time at the servers are unknown to the user. In this case, we formulate the utility maximization problem as a stochastic Multi-Armed Bandit (MAB) problem.
The formulated MAB problem has a special linear feedback graph, which can be leveraged to design policies with an improved regret upper bound. We also notice that one factor has been missing in most of the previous solutions on AoI minimization, which is the cost of performing updates. Therefore, we focus on the tradeoff between the AoI and the update cost, which is of significant importance in time-sensitive data-driven applications. We consider the applications where the information provider is directly connected to the data source, and the clients need to obtain the data from the information provider in a real-time manner (such as the real-time environmental monitoring system). The provider needs to update the data so that it can reply to the clients' requests with fresh information. However, the update cost limits the frequency that the server can refresh the data, which makes it important to design an efficient policy with optimal tradeoff between data freshness and update cost. We define the staleness cost, which reflects the AoI of the data and formulate the problem as the minimization over the summation of the update cost and the staleness cost. We first propose important guidelines of designing update policies in such information-update systems that can be applied to arbitrary request arrival processes. Then, we design an update policy with a simple threshold-based structure, which is easy to implement. Under the assumption of Poisson request arrival process, we derive the closed-form expression of the average cost of the threshold-based policy and prove its optimality among all online update policies. In almost all prior works, the analysis and optimization are based on traditional queueing models with the probabilistic approaches. However, in the traditional probabilistic study of general queueing models, the analysis is heavily dependent on the properties of specific distributions. 
Under this framework, it is also usually hard to handle distributions with heavy-tailed behavior. To that end, in this work we take an alternative approach and focus on the Peak Age of Information (PAoI), the largest age of each update shown to the end users. Specifically, we employ a recently developed analysis framework based on robust optimization and model the uncertainty in the stochastic arrival and service processes by uncertainty sets. This robust queueing framework enables us to approximate the steady-state PAoI performance of information-update systems with very general arrival and service processes, including those exhibiting heavy-tailed behavior. We first propose a new bound on the PAoI in the single-source system that performs much better than previous results, especially under light traffic. Then, we generalize it to multi-source systems with symmetric arrivals, which involves new technical challenges.
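The threshold-based structure described in this abstract can be illustrated with a short simulation. The following Python sketch is not the authors' analysis; it is a minimal, hypothetical model in which requests arrive as a Poisson process, the provider refreshes the data (at a fixed cost) whenever a request finds it older than a threshold, and each reply incurs a staleness cost proportional to the data's age. All parameter names are illustrative assumptions.

```python
import random

def simulate_threshold_policy(tau, update_cost, staleness_rate,
                              arrival_rate, horizon, seed=0):
    """Simulate a threshold-based update policy (illustrative sketch).

    Requests arrive as a Poisson process; when a request finds the
    data older than the threshold tau, the provider refreshes it
    (paying update_cost) before replying.  Staleness cost is the
    age of the data at each reply, weighted by staleness_rate.
    """
    rng = random.Random(seed)
    t = 0.0            # current time
    last_update = 0.0  # time of the most recent refresh
    total_cost = 0.0
    n_requests = 0
    while t < horizon:
        t += rng.expovariate(arrival_rate)  # next Poisson arrival
        age = t - last_update
        if age > tau:                        # refresh before replying
            total_cost += update_cost
            last_update = t
            age = 0.0
        total_cost += staleness_rate * age   # staleness (AoI) cost
        n_requests += 1
    return total_cost / n_requests           # average cost per request
```

Sweeping `tau` over a grid and keeping the value with the lowest simulated average cost mimics, numerically, the update-cost/staleness tradeoff that the paper's closed-form analysis resolves exactly.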
Temple University--Theses
APA, Harvard, Vancouver, ISO, and other styles
8

Weinlichová, Jana. "Návrh algoritmů pro modul informačního systému." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2008. http://www.nusl.cz/ntk/nusl-228040.

Full text
Abstract:
This Master's thesis deals with the design of algorithms for a new module of a company information system. The thesis begins by characterizing the ways in which an information system can be described. The IBM Lotus Notes environment is briefly introduced to specify the system under consideration. The next chapter covers the object-oriented analysis and design of the information system module using UML diagrams in the Enterprise Architect modeling tool. The third chapter analyzes and designs the module's connection to the current system, specifically the updating of data in a form. The thesis presents the designed algorithms in the Lotus Domino Designer environment, using the LotusScript and SQL languages and the Lotus Domino Connector for database access via ODBC. The final part proposes using a mapping tool to map the ICT infrastructure with the Change Management process according to the ITIL method, so that all changes in the developing system can be managed effectively.
APA, Harvard, Vancouver, ISO, and other styles
9

Thuresson, Marcus. "Wrapping XML-Sources to Support Update Awareness." Thesis, University of Skövde, Department of Computer Science, 2000. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-488.

Full text
Abstract:

Data warehousing is a generally accepted method of providing corporate decision support. Today, the majority of information in these warehouses originates from sources within a company, although changes often occur from the outside. Companies need to look outside their enterprises for valuable information, increasing their knowledge of customers, suppliers, competitors etc.

The largest and most frequently accessed information source today is the Web, which holds more and more useful business information. Today, the Web primarily relies on HTML, making mechanical extraction of information a difficult task. In the near future, XML is expected to replace HTML as the language of the Web, bringing more structure and content focus.

One problem when considering XML-sources in a data warehouse context is their lack of update awareness capabilities, which restricts eligible data warehouse maintenance policies. In this work, we wrap XML-sources in order to provide update awareness capabilities.

We have implemented a wrapper prototype that provides update awareness capabilities for autonomous XML-sources, especially change awareness, change activeness, and delta awareness. The prototype wrapper complies with recommendations and working drafts proposed by W3C, thereby being compliant with most off-the-shelf XML tools. In particular, change information produced by the wrapper is based on methods defined by the DOM, implying that any DOM-compliant software, including most off-the-shelf XML processing tools, can be used to incorporate identified changes in a source into an older version of it.

For the delta awareness capability we have investigated the possibility of using change detection algorithms proposed for semi-structured data. We have identified similarities and differences between XML and semi-structured data that affect delta awareness for XML sources. As a result of this effort, we propose an algorithm for change detection in XML sources. We also propose matching criteria for XML documents, which a document must satisfy to be eligible for change awareness extension.
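As a loose illustration of what change detection over tree-structured documents involves (not the algorithm proposed in this thesis), the following Python sketch compares two XML documents with the standard-library `xml.etree` module, matching children positionally by tag and reporting text updates, insertions, and deletions:

```python
import xml.etree.ElementTree as ET

def detect_changes(old_xml, new_xml, path=""):
    """Naive positional change detection between two XML documents.

    Returns a list of (operation, path) tuples.  Real delta-detection
    algorithms use smarter matching criteria; this sketch matches
    children by position only, for illustration.
    """
    old = ET.fromstring(old_xml) if isinstance(old_xml, str) else old_xml
    new = ET.fromstring(new_xml) if isinstance(new_xml, str) else new_xml
    changes = []
    here = f"{path}/{old.tag}"
    # Compare the element's own text content.
    if (old.text or "").strip() != (new.text or "").strip():
        changes.append(("update-text", here))
    old_children, new_children = list(old), list(new)
    # Recurse into positionally matched children.
    for o, n in zip(old_children, new_children):
        if o.tag != n.tag:
            changes.append(("replace", f"{here}/{o.tag}"))
        else:
            changes += detect_changes(o, n, here)
    # Leftover children are deletions or insertions.
    for o in old_children[len(new_children):]:
        changes.append(("delete", f"{here}/{o.tag}"))
    for n in new_children[len(old_children):]:
        changes.append(("insert", f"{here}/{n.tag}"))
    return changes
```

For example, comparing `<a><b>1</b><c>2</c></a>` against `<a><b>1</b><c>3</c><d>4</d></a>` reports a text update at `/a/c` and an insertion at `/a/d`.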

APA, Harvard, Vancouver, ISO, and other styles
10

Biondi, Mattia. "An Updated Emulated Architecture to Support the Study of Operating Systems." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2020. http://amslaurea.unibo.it/20751/.

Full text
Abstract:
One of the most effective ways to learn something new is by actively practising it, and there is perhaps no better way to study an Operating Systems course than by building your own OS. However, it is important to emphasize that realizing an operating system capable of running on real hardware could be an overly complex and unsuitable task for an undergraduate student. Nonetheless, it is possible to use a simplified computer system simulator to achieve the goal of teaching Computer Science foundations in the University environment, thus allowing students to experience a quite realistic representation of an operating system. µMPS has been created for this purpose: a pedagogically appropriate machine emulator, based around the MIPS R2/3000 microprocessor, which features an accessible architecture that includes a rich set of easily programmable devices. µMPS has an almost two-decade-long history of development, and the outcome of this thesis is the third version of the software, dubbed µMPS3. This second major revision aims to simplify, even more, the complexity of the emulator in order to lighten the load of work required of students during OS design and implementation. Two of these simplifications are the removal of the virtual memory bit, which allowed address translation to be turned on and off, and the replacement of the tape device, used as a storage device, with a new flash drive device, certainly something more familiar to the new generation of students. Thanks to the employment of this software and the feedback received over the last decade, it has been possible to realize not just this thesis, but also some major improvements, covering everything from the project build tools to the front-end, making µMPS a modern and reliable educational software.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Information update system"

1

Office, General Accounting. SEC: EDGAR update. Washington, D.C: The Office, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Office, General Accounting. Information management: Update on implementation of the 1996 Electronic Freedom of Information Act Amendments : report to congressional requesters. Washington, D.C: U.S. General Accounting Office, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Hawaii. Legislature. Office of the Legislative Auditor. An update on the Department of Education's financial management system and school information system: A report to the Governor and the Legislature of the State of Hawaii. Honolulu (465 S. King St., Suite 500, Honolulu 96813): The Auditor, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Boldt, Roger. Information technology update for transit. Washington, D.C: National Academy Press, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Office, General Accounting. Coast Guard: Update on marine information for safety and law enforcement system : report to the Subcommittee on Coast Guard and Maritime Transportation, Committee on Transportation and Infrastructure, House of Representatives. Washington, D.C: U.S. General Accounting Office, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Hawkins, Donald T. Online information retrieval bibliography: Twelfth update. Oxford: Learned Information, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Hawkins, Donald T. Online information retrieval bibliography: Eleventh update. Oxford: Learned Information, 1988.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Mullin, Robin C. Data update in a land information network. Kensington, N.S.W., Australia: School of Surveying, University of New South Wales, 1988.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Can DOD travelers book a trip?: Defense Travel System update : hearing before the Oversight and Investigations Subcommittee of the Committee on Armed Services, House of Representatives, One Hundred Eleventh Congress, first session, hearing held March 5, 2009. Washington: U.S. G.P.O., 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

United States. Congress. House. Committee on Oversight and Government Reform., ed. 2010 census: Progress on the development of the field data collection automation program and the decennial response integration system : joint hearing before the Subcommittee on Information Policy, Census, and National Archives and the Committee on Oversight and Government Reform, House of Representatives, One Hundred Tenth Congress, second session, April 9, 2008. Washington: U.S. G.P.O., 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Information update system"

1

Johanssen, Michael. "Update on System Virtualization Management." In Communications in Computer and Information Science, 125–34. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-88708-9_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Kim, Dong Kwan, Won-Tae Kim, and Seung-Min Park. "DSUENHANCER: A Dynamic Update System for Resource-Constrained Software." In Communications in Computer and Information Science, 195–201. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-26010-0_24.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Tagashira, Shigeaki, Keizo Saisho, Fumitake Inada, and Akira Fukuda. "A Copy Update Mechanism for a Mobile Information Announcement System." In Advances in Database Technologies, 266–77. Berlin, Heidelberg: Springer Berlin Heidelberg, 1999. http://dx.doi.org/10.1007/978-3-540-49121-7_23.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Bertino, Elisa, Barbara Carminati, Elena Ferrari, and Giovanni Mella. "Author-X – A System for Secure Dissemination and Update of XML Documents." In Databases in Networked Information Systems, 66–85. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-39845-5_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Pirola, Fabiana, Giuditta Pezzotta, Veronica Arioli, and Roberto Sala. "Design and Engineer Data-Driven Product Service System: A Methodology Update." In IFIP Advances in Information and Communication Technology, 367–75. Cham: Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-16411-8_43.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Bernert, Marie, and Fano Ramparany. "A Belief Update System Using an Event Model for Location of People in a Smart Home." In Lecture Notes in Computer Science, 17–32. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72308-8_2.

Full text
Abstract:
Artificial Intelligence applications often require maintaining a knowledge base about the observed environment. In particular, when the current knowledge is inconsistent with new information, it has to be updated. Such inconsistency can be due to erroneous assumptions or to changes in the environment. Here we consider the second case and develop a knowledge update algorithm based on event logic that takes into account constraints according to which the environment can evolve. These constraints take the form of events that modify the environment in a well-defined manner. The belief update triggered by a new observation is thus explained by a sequence of events. We then apply this algorithm to the problem of locating people in a smart home and show that taking into account past information and movement constraints improves location inference.
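The idea of explaining a belief update by a sequence of well-defined events can be sketched in a few lines. The following Python fragment is a hypothetical illustration, not the paper's algorithm: it uses a breadth-first search over an assumed room-adjacency graph to find a shortest sequence of 'move' events that reconciles a believed location with a new observation.

```python
from collections import deque

# Hypothetical room adjacency for a smart home (an assumption for
# illustration, not taken from the paper).
ADJACENT = {
    "bedroom": ["hallway"],
    "hallway": ["bedroom", "kitchen", "living_room"],
    "kitchen": ["hallway"],
    "living_room": ["hallway"],
}

def explain_update(believed, observed):
    """Return a shortest sequence of 'move' events that turns the
    believed location into the observed one, or None if impossible."""
    if believed == observed:
        return []
    frontier = deque([(believed, [])])
    seen = {believed}
    while frontier:
        room, events = frontier.popleft()
        for nxt in ADJACENT.get(room, []):
            if nxt in seen:
                continue
            path = events + [f"move({room} -> {nxt})"]
            if nxt == observed:
                return path
            seen.add(nxt)
            frontier.append((nxt, path))
    return None  # the observation cannot be explained by move events
```

An observation inconsistent with the current belief is thus "explained" by the returned event sequence, or rejected (None) when no admissible sequence exists.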
APA, Harvard, Vancouver, ISO, and other styles
7

Leyh, Christian, and Pauline Sander. "Critical Success Factors for ERP System Implementation Projects: An Update of Literature Reviews." In Lecture Notes in Business Information Processing, 45–67. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-17587-4_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Barrile, Vincenzo, Antonino Fotia, Ernesto Bernardo, and Giuliana Bilotta. "Road Cadastre an Innovative System to Update Information, from Big Data Elaboration." In Computational Science and Its Applications – ICCSA 2020, 709–20. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58811-3_51.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Kawamata, Taisuke, Susumu Fujimori, and Takako Akakura. "Student Authentication Method by Sequential Update of Face Information Registered in e-Learning System." In Human Interface and the Management of Information: Applications and Services, 138–45. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-40397-7_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Volochiy, Bogdan, Vitaliy Yakovyna, Oleksandr Mulyak, and Vyacheslav Kharchenko. "Availability Model of Critical Nuclear Power Plant Instrumentation and Control System with Non-Exponential Software Update Distribution." In Information and Communication Technologies in Education, Research, and Industrial Applications, 3–20. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-76168-8_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Information update system"

1

SBARDELLA, P., and R. BARICHELLO. "USING REMOTE SENSING TO UPDATE GEOGRAPHIC INFORMATION SYSTEM." In Proceedings of the First International Workshop on Multitemp 2001. WORLD SCIENTIFIC, 2002. http://dx.doi.org/10.1142/9789812777249_0049.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

S. S., Kolmogorova, Kolmogorov A. S., Baranov D. S., Biryukov S. V., Kolbina J. N., and Romanovskaya A. D. "SYSTEM OF DISTRIBUTED CONTROL OF TECHNOGENIC ELECTROMAGNETIC FIELDS WITH THE USE OF INTEGRATED SENSORS." In Mechanical Science and Technology Update. Omsk State Technical University, 2022. http://dx.doi.org/10.25206/978-5-8149-3453-6-2022-102-112.

Full text
Abstract:
The progress of all sectors of industry and the global digitalization of society cause a number of problems due to the emergence of electromagnetic fields of technogenic nature and their effects on technical and biological objects. The article deals with the measurement of electromagnetic field parameters in the practical task of a decentralized "cloud" information-analytical system using new shapes of sensors as IoT elements. The article presents the architecture of data collection from the new sensor shapes with automatic evaluation and technical prediction. A new configuration of sensing elements and mathematical processing is presented, together with a virtual instrument that realizes the platform with subsequent statistical evaluation. An intellectual system of data analysis in a decentralized measuring complex allows real-time assessment of critical parameters of the electromagnetic field.
APA, Harvard, Vancouver, ISO, and other styles
3

Bian, Zhifan, Yukun Li, Tinghai Yue, Pengfei Lei, Dexin Zhao, and Yingyuan Xiao. "An Information Update Method Towards Internal Search Engine." In 2015 12th Web Information System and Application Conference (WISA). IEEE, 2015. http://dx.doi.org/10.1109/wisa.2015.69.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hahner, J., C. Becker, P. J. Marron, and K. Rothermel. "Maintaining Update-Linearizability for Replicated Information in MANETs." In 2006 1st International Conference on Communication System Software and Middleware. IEEE, 2006. http://dx.doi.org/10.1109/comswa.2006.1665176.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Robb, Jim. "System Wide Information Management (SWIM): Program overview and status update." In 2014 Integrated Communications, Navigation and Surveillance Conference (ICNS). IEEE, 2014. http://dx.doi.org/10.1109/icnsurv.2014.6820078.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Sineva, Irina S., Vladislav Y. Denisov, and Vera D. Galinova. "Building Recommender System for Media with High Content Update Rate." In 2018 IEEE International Conference "Quality Management, Transport and Information Security, Information Technologies" (IT&QM&IS). IEEE, 2018. http://dx.doi.org/10.1109/itmqis.2018.8524903.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Pholprasit, Thunyasit, Suporn Pongnumkul, Chalermpol Saiprasert, Sarinthon Mangkorn-ngam, and Lalida Jaritsup. "LiveBusTrack : High-frequency location update information system for shuttle/bus riders." In 2013 13th International Symposium on Communications and Information Technologies (ISCIT). IEEE, 2013. http://dx.doi.org/10.1109/iscit.2013.6645922.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Xue-bo, Jin, and Wang Lei-lei. "Tracking in Distributed Multisensor System Update with Out-of-Sequence Information." In 2009 International Conference on Computational Intelligence and Natural Computing (CINC). IEEE, 2009. http://dx.doi.org/10.1109/cinc.2009.37.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Esfandiari, M., H. Ramapriyan, J. Behnke, and E. Sofinowski. "Earth observing system (EOS) data and information system (EOSDIS) — evolution update and future." In 2007 IEEE International Geoscience and Remote Sensing Symposium. IEEE, 2007. http://dx.doi.org/10.1109/igarss.2007.4423727.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Kim, Hyunjung, Jong Wook Kim, and Beakcheol Jang. "Indoor Positioning System using Sensor and Crowdsourcing Landmark Map Update." In 2019 International Conference on Green and Human Information Technology (ICGHIT). IEEE, 2019. http://dx.doi.org/10.1109/icghit.2019.00010.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Information update system"

1

Demeuov, Аrman, Zhanna Tilekova, Yerkin Tokpanov, Olena Hanchuk, Natalia Panteleeva, and Iryna Varfolomyeyeva. Use of GIS technology in geographical education. EDP Sciences, June 2021. http://dx.doi.org/10.31812/123456789/4619.

Full text
Abstract:
At the present stage, digital information technologies are creating a new education system focused on the global educational space. In general education schools, in connection with the adoption of the updated program, the section Geoinformatics and Cartography provides for developing map schemes, modeling, and conducting small studies on the topic being studied. As a result, digital technology has a place in geographical education. This is due to significant changes in the pedagogical and methodological approach to teaching geography and other disciplines. As a result, the education system has changed, the content of education has been updated, and a new approach and a new attitude to geoinformation technologies have appeared in schools. The article discusses the importance of computer technologies in the education system, including the effectiveness and necessity of using geoinformation technologies, and substantiates the relevance of geoinformation technologies in the teaching of geography.
APA, Harvard, Vancouver, ISO, and other styles
2

Mackley, Rob D., George V. Last, and Craig H. Allwardt. Hanford Borehole Geologic Information System (HBGIS) Updated User?s Guide for Web-based Data Access and Export. Office of Scientific and Technical Information (OSTI), September 2008. http://dx.doi.org/10.2172/969742.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Bakker, G., M. Heinen, H. P. A. Gooren, W. J. M. de Groot, F. B. T. Assinck, and E. W. J. Hummelink. Hydrofysische gegevens van de bodem in de Basisregistratie Ondergrond (BRO) en het Bodemkundig Informatie Systeem (BIS) : Update 2018. Wageningen: Wettelijke Onderzoekstaken Natuur & Milieu, 2019. http://dx.doi.org/10.18174/474161.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Bakker, G., M. Heinen, H. P. A. Gooren, W. J. M. de Groot, and P. D. Peters. Hydrofysische gegevens van de bodem in de Basisregistratie Ondergrond (BRO) en het Bodemkundig Informatie Systeem (BIS) : Update 2019. Wageningen: Wettelijke Onderzoekstaken Natuur & Milieu, 2020. http://dx.doi.org/10.18174/526509.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Chagas, Gabriel, Rafael Chagas, and Amanda Rangel. Effectiveness and Safety of Single Antiplatelet Therapy with P2Y12 Inhibitor Monotherapy versus Dual Antiplatelet Therapy After Percutaneous Coronary Intervention for Acute Coronary Syndrome: A Systematic Review and Meta-Analysis. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, July 2022. http://dx.doi.org/10.37766/inplasy2022.7.0097.

Full text
Abstract:
Review question / Objective: What are the effects of single antiplatelet therapy with P2Y12 inhibitor monotherapy versus dual antiplatelet therapy after percutaneous coronary intervention for acute coronary syndrome? Condition being studied: Antiplatelet therapy after percutaneous coronary intervention for acute coronary syndrome. Information sources: The databases will be Medical Literature Analysis and Retrieval System Online (MEDLINE), Excerpta Medica Database (Embase), and Cochrane Library. Searches were conducted on July 25, 2022 and will be updated on August 25, 2022. There will be no language or publication period restrictions.
APA, Harvard, Vancouver, ISO, and other styles
6

Belles, Randy, Gary T. Mays, Olufemi A. Omitaomu, and Willis P. Poore III. Updated Application of Spatial Data Modeling and Geographical Information Systems (GIS) for Identification of Potential Siting Options for Small Modular Reactors. Office of Scientific and Technical Information (OSTI), September 2012. http://dx.doi.org/10.2172/1052267.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

FEDOTKINA, S. A., O. V. MUZALEVA, and E. V. KHUGAEVA. RETROSPECTIVE ANALYSIS OF THE USE OF TELEMEDICINE TECHNOLOGIES FOR THE PREVENTION, DIAGNOSIS AND TREATMENT OF HYPERTENSION. Science and Innovation Center Publishing House, 2021. http://dx.doi.org/10.12731/978-0-615-67320-2-4-22.

Full text
Abstract:
Introduction. The economic losses associated with disability due to diseases of the circulatory system, as well as the costs of providing medical care to patients suffering from heart and vascular diseases, are increasing annually. The state preventive measures currently being carried out are of a delayed nature. The results of the medical examination of the population of the Russian Federation in recent years (2015-2019) indicate that the incidence of cardiovascular diseases, including hypertension, is at a fairly high level. In the middle of the last century, the Concept of risk factors for the development of chronic non-communicable diseases were formulated, in the structure of which cardiovascular diseases, including arterial hypertension, occupies one of the primary positions. The concept is based on the results of promising epidemiological studies, and, at present, is a methodological basis for planning and organizing primary prevention of cardiovascular diseases. The purpose of the study. Based on the analysis of literary sources (including foreign ones) containing experience in the use of telemedicine technologies, to assess their significance for the prevention, diagnosis and treatment of hypertension, as well as forecasting improvements in the quality of medical care when adapting to the use of clinical recommendations. Materials and methods. The article provides an analytical review of the use of modern telemedicine technologies in the prevention of hypertension. The results of the study and their discussion. 
The analysis of literary sources has shown that, in the context of the progress of information and telecommunication technologies in the healthcare system, a fundamentally new direction has appeared in the organization and provision of medical care to the population: telemedicine, which ensures a modern level of prevention, detection and treatment of chronic non-communicable diseases and delivers positive medical, social and economic performance indicators. To date, updates to the legislative framework of the Russian Federation are aimed at ensuring that medical care using telemedicine technologies becomes more widespread, taking into account the standards of medical care and clinical recommendations. Conclusion. Based on a review of literature sources, it has been established that the modern solution to the problem of improving the quality of medical care for patients, including those with hypertension, is medical care delivered with telemedicine technologies, which have proven their medical, social and economic effectiveness.
APA, Harvard, Vancouver, ISO, and other styles
8

Mahdavian, Farnaz. Germany Country Report. University of Stavanger, February 2022. http://dx.doi.org/10.31265/usps.180.

Full text
Abstract:
Germany is a parliamentary democracy (The Federal Government, 2021) with two politically independent levels of 1) Federal (Bund) and 2) State (Länder or Bundesländer), and has a highly differentiated decentralized system of Government and administration (Deutsche Gesellschaft für Internationale Zusammenarbeit, 2021). The 16 states in Germany have their own government and legislations which means the federal authority has the responsibility of formulating policy, and the states are responsible for implementation (Franzke, 2020). The Federal Government supports the states in dealing with extraordinary danger and the Federal Ministry of the Interior (BMI) supports the states' operations with technology, expertise and other services (Federal Ministry of Interior, Building and Community, 2020). Due to the decentralized system of government, the Federal Government does not have the power to impose pandemic emergency measures. In the beginning of the COVID-19 pandemic, in order to slowdown the spread of coronavirus, on 16 March 2020 the federal and state governments attempted to harmonize joint guidelines, however one month later State governments started to act more independently (Franzke & Kuhlmann, 2021). In Germany, health insurance is compulsory and more than 11% of Germany’s GDP goes into healthcare spending (Federal Statistical Office, 2021). Health related policy at the federal level is the primary responsibility of the Federal Ministry of Health. This ministry supervises institutions dealing with higher level of public health including the Federal Institute for Drugs and Medical Devices (BfArM), the Paul-Ehrlich-Institute (PEI), the Robert Koch Institute (RKI) and the Federal Centre for Health Education (Federal Ministry of Health, 2020). The first German National Pandemic Plan (NPP), published in 2005, comprises two parts. 
Part one, updated in 2017, provides a framework for the pandemic plans of the states and the implementation plans of the municipalities, and part two, updated in 2016, is the scientific part of the National Pandemic Plan (Robert Koch Institut, 2017). The joint Federal-State working group on pandemic planning was established in 2005. A pandemic plan for German citizens abroad was published by the German Foreign Office on its website in 2005 (Robert Koch Institut, 2017). In 2007, the federal and state Governments, under the joint leadership of the Federal Ministry of the Interior and the Federal Ministry of Health, simulated influenza pandemic exercise called LÜKEX 07, and trained cross-states and cross-department crisis management (Bundesanstalt Technisches Hilfswerk, 2007b). In 2017, within the context of the G20, Germany ran a health emergency simulation exercise with representatives from WHO and the World Bank to prepare for future pandemic events (Federal Ministry of Health et al., 2017). By the beginning of the COVID-19 pandemic, on 27 February 2020, a joint crisis team of the Federal Ministry of the Interior (BMI) and the Federal Ministry of Health (BMG) was established (Die Bundesregierung, 2020a). On 4 March 2020 RKI published a Supplement to the National Pandemic Plan for COVID-19 (Robert Koch Institut, 2020d), and on 28 March 2020, a law for the protection of the population in an epidemic situation of national scope (Infektionsschutzgesetz) came into force (Bundesgesundheitsministerium, 2020b). In the first early phase of the COVID-19 pandemic in 2020, Germany managed to slow down the speed of the outbreak but was less successful in dealing with the second phase. Coronavirus-related information and measures were communicated through various platforms including TV, radio, press conferences, federal and state government official homepages, social media and applications. 
In mid-March 2020, the federal and state governments implemented extensive measures nationwide for pandemic containment. Step by step, social distancing and shutdowns were enforced by all Federal States, involving closing schools, day-cares and kindergartens, pubs, restaurants, shops, prayer services, borders, and imposing a curfew. To support those affected financially by the pandemic, the German Government provided large economic packages (Bundesministerium der Finanzen, 2020). These measures were adapted to the COVID-19 situation and changed over the course of the pandemic. On 22 April 2020, the clinical trial of the corona vaccine was approved by the Paul Ehrlich Institute, and in late December 2020, the distribution of vaccinations in Germany and all other EU countries began.
APA, Harvard, Vancouver, ISO, and other styles
9

Brophy, Kenny, and Alison Sheridan, eds. Neolithic Scotland: ScARF Panel Report. Society of Antiquaries of Scotland, June 2012. http://dx.doi.org/10.9750/scarf.06.2012.196.

Full text
Abstract:
The main recommendations of the Panel report can be summarised as follows: The Overall Picture: more needs to be understood about the process of acculturation of indigenous communities; about the Atlantic, Breton strand of Neolithisation; about the ‘how and why’ of the spread of Grooved Ware use and its associated practices and traditions; and about reactions to Continental Beaker novelties which appeared from the 25th century. The Detailed Picture: Our understanding of developments in different parts of Scotland is very uneven, with Shetland and the north-west mainland being in particular need of targeted research. Also, here and elsewhere in Scotland, the chronology of developments needs to be clarified, especially as regards developments in the Hebrides. Lifeways and Lifestyles: Research needs to be directed towards filling the substantial gaps in our understanding of: i) subsistence strategies; ii) landscape use (including issues of population size and distribution); iii) environmental change and its consequences – and in particular issues of sea level rise, peat formation and woodland regeneration; and iv) the nature and organisation of the places where people lived; and to track changes over time in all of these. Material Culture and Use of Resources: In addition to fine-tuning our characterisation of material culture and resource use (and its changes over the course of the Neolithic), we need to apply a wider range of analytical approaches in order to discover more about manufacture and use. Some basic questions still need to be addressed (e.g. the chronology of felsite use in Shetland; what kind of pottery was in use, c 3000–2500, in areas where Grooved Ware was not used, etc.) and are outlined in the relevant section of the document. Our knowledge of organic artefacts is very limited, so research in waterlogged contexts is desirable.
Identity, Society, Belief Systems: Basic questions about the organisation of society need to be addressed: are we dealing with communities that started out as egalitarian, but (in some regions) became socially differentiated? Can we identify acculturated indigenous people? How much mobility, and what kind of mobility, was there at different times during the Neolithic? And our chronology of certain monument types and key sites (including the Ring of Brodgar, despite its recent excavation) requires to be clarified, especially since we now know that certain types of monument (including Clava cairns) were not built during the Neolithic. The way in which certain types of site (e.g. large palisaded enclosures) were used remains to be clarified. Research and methodological issues: There is still much ignorance of the results of past and current research, so more effective means of dissemination are required. Basic inventory information (e.g. the Scottish Human Remains Database) needs to be compiled, and Canmore and museum database information needs to be updated and expanded – and, where not already available online, placed online, preferably with a Scottish Neolithic e-hub that directs the enquirer to all the available sources of information. The Historic Scotland on-line radiocarbon date inventory needs to be resurrected and kept up to date. Under-used resources, including the rich aerial photography archive in the NMRS, need to have their potential fully exploited. Multi-disciplinary, collaborative research (and the application of GIS modelling to spatial data in order to process the results) is vital if we are to escape from the current ‘silo’ approach and address key research questions from a range of perspectives; and awareness of relevant research outside Scotland is essential if we are to avoid reinventing the wheel. 
Our perspective needs to encompass multi-scale approaches, so that developments within Scotland can be understood at a local, regional and wider level. Most importantly, the right questions need to be framed, and the right research strategies developed, in order to extract the maximum amount of information about the Scottish Neolithic.
APA, Harvard, Vancouver, ISO, and other styles
10

Rankin, Nicole, Deborah McGregor, Candice Donnelly, Bethany Van Dort, Richard De Abreu Lourenco, Anne Cust, and Emily Stone. Lung cancer screening using low-dose computed tomography for high risk populations: Investigating effectiveness and screening program implementation considerations: An Evidence Check rapid review brokered by the Sax Institute (www.saxinstitute.org.au) for the Cancer Institute NSW. The Sax Institute, October 2019. http://dx.doi.org/10.57022/clzt5093.

Full text
Abstract:
Background: Lung cancer is the number one cause of cancer death worldwide.(1) It is the fifth most commonly diagnosed cancer in Australia (12,741 cases diagnosed in 2018) and the leading cause of cancer death.(2) The number of years of potential life lost to lung cancer in Australia is estimated to be 58,450, similar to that of colorectal and breast cancer combined.(3) While tobacco control strategies are most effective for disease prevention in the general population, early detection via low-dose computed tomography (LDCT) screening in high-risk populations is a viable option for detecting asymptomatic disease in current (13%) and former (24%) Australian smokers.(4) The purpose of this Evidence Check review is to identify and analyse existing and emerging evidence for LDCT lung cancer screening in high-risk individuals to guide future program and policy planning.

Evidence Check questions: This review aimed to address the following questions:
1. What is the evidence for the effectiveness of lung cancer screening for higher-risk individuals?
2. What is the evidence of potential harms from lung cancer screening for higher-risk individuals?
3. What are the main components of recent major lung cancer screening programs or trials?
4. What is the cost-effectiveness of lung cancer screening programs (including studies of cost–utility)?

Summary of methods: The authors searched the peer-reviewed literature across three databases (MEDLINE, PsycINFO and Embase) for existing systematic reviews and original studies published between 1 January 2009 and 8 August 2019. Fifteen systematic reviews (of which eight were contemporary) and 64 original publications met the inclusion criteria set across the four questions.

Key findings

Question 1: What is the evidence for the effectiveness of lung cancer screening for higher-risk individuals?
There is sufficient evidence from systematic reviews and meta-analyses of combined (pooled) data from screening trials of high-risk individuals to indicate that LDCT examination is clinically effective in reducing lung cancer mortality. In 2011, the landmark National Lung Screening Trial (NLST, a large-scale randomised controlled trial [RCT] conducted in the US) reported a 20% (95% CI 6.8%–26.7%; P=0.004) relative reduction in mortality among long-term heavy smokers over three rounds of annual screening. High-risk eligibility criteria were defined as people aged 55–74 years with a smoking history of ≥30 pack-years (a pack-year being a year in which a smoker has consumed 20 or more cigarettes each day) and, for former smokers, ≥30 pack-years and having quit within the past 15 years.(5) All-cause mortality was reduced by 6.7% (95% CI 1.2%–13.6%; P=0.02). Initial data from the second landmark RCT, the NEderlands-Leuvens Longkanker Screenings ONderzoek (known as the NELSON trial), have found an even greater reduction of 26% (95% CI 9%–41%) in lung cancer mortality, with full trial results yet to be published.(6, 7) Pooled analyses, including several smaller-scale European LDCT screening trials insufficiently powered in their own right, collectively demonstrate a statistically significant reduction in lung cancer mortality (RR 0.82, 95% CI 0.73–0.91).(8) Despite the reduction in all-cause mortality found in the NLST, pooled analyses of seven trials found no statistically significant difference in all-cause mortality (RR 0.95, 95% CI 0.90–1.00).(8) However, cancer-specific mortality is currently the most relevant outcome in cancer screening trials. These seven trials demonstrated a significantly greater proportion of early-stage cancers in LDCT groups compared with controls (RR 2.08, 95% CI 1.43–3.03). Thus, when considering results across mortality outcomes and early-stage cancers diagnosed, LDCT screening is considered to be clinically effective.
Question 2: What is the evidence of potential harms from lung cancer screening for higher-risk individuals?

The harms of LDCT lung cancer screening include false positive tests and the consequences of unnecessary invasive follow-up procedures for conditions that are eventually diagnosed as benign. While LDCT screening leads to an increased frequency of invasive procedures, it does not result in greater mortality soon after an invasive procedure (in trial settings when compared with the control arm).(8) Overdiagnosis, exposure to radiation, psychological distress and an impact on quality of life are other known harms. Systematic review evidence indicates the benefits of LDCT screening are likely to outweigh the harms. The potential harms are likely to be reduced as refinements are made to LDCT screening protocols through: i) the application of risk prediction models (e.g. the PLCOm2012), which enable a more accurate selection of the high-risk population through the use of specific criteria (beyond age and smoking history); ii) the use of nodule management algorithms (e.g. Lung-RADS, PanCan), which assist in the diagnostic evaluation of screen-detected nodules and cancers (e.g. more precise volumetric assessment of nodules); and iii) more judicious selection of patients for invasive procedures. Recent evidence suggests a positive LDCT result may transiently increase psychological distress but does not have long-term adverse effects on psychological distress or health-related quality of life (HRQoL). With regard to smoking cessation, there is no evidence to suggest screening participation invokes a false sense of assurance in smokers, nor a reduction in motivation to quit. The NELSON and Danish trials found no difference in smoking cessation rates between LDCT screening and control groups. Higher net cessation rates, compared with the general population, suggest those who participate in screening trials may already be motivated to quit.
Question 3: What are the main components of recent major lung cancer screening programs or trials?

There are no systematic reviews that capture the main components of recent major lung cancer screening trials and programs. We extracted evidence from original studies and clinical guidance documents and organised this into key groups to form a concise set of components for potential implementation of a national lung cancer screening program in Australia:
1. Identifying the high-risk population: recruitment, eligibility, selection and referral
2. Educating the public, people at high risk and healthcare providers; this includes creating awareness of lung cancer, the benefits and harms of LDCT screening, and shared decision-making
3. Components necessary for health services to deliver a screening program:
a. Planning phase: e.g. human resources to coordinate the program, electronic data systems that integrate medical records information and link to an established national registry
b. Implementation phase: e.g. human and technological resources required to conduct LDCT examinations, interpretation of reports and communication of results to participants
c. Monitoring and evaluation phase: e.g. monitoring outcomes across patients, radiological reporting, compliance with established standards and a quality assurance program
4. Data reporting and research, e.g. audit and feedback to multidisciplinary teams, reporting outcomes to enhance international research into LDCT screening
5. Incorporation of smoking cessation interventions, e.g. specific programs designed for LDCT screening or referral to existing community or hospital-based services that deliver cessation interventions.
Most original studies are single-institution evaluations that contain descriptive data about the processes required to establish and implement a high-risk population-based screening program.
Across all studies there is a consistent message as to the challenges and complexities of establishing LDCT screening programs to attract people at high risk who will receive the greatest benefits from participation. With regard to smoking cessation, evidence from one systematic review indicates the optimal strategy for incorporating smoking cessation interventions into a LDCT screening program is unclear. There is widespread agreement that LDCT screening attendance presents a 'teachable moment' for cessation advice, especially among those people who receive a positive scan result. Smoking cessation is an area of significant research investment; for instance, eight US-based clinical trials are now underway that aim to address how best to design and deliver cessation programs within large-scale LDCT screening programs.(9)

Question 4: What is the cost-effectiveness of lung cancer screening programs (including studies of cost–utility)?

Assessing the value or cost-effectiveness of LDCT screening involves a complex interplay of factors including data on effectiveness and costs, and institutional context. A key input is data about the effectiveness of potential and current screening programs with respect to case detection, and the likely outcomes of treating those cases sooner (in the presence of LDCT screening) as opposed to later (in the absence of LDCT screening). Evidence about the cost-effectiveness of LDCT screening programs has been summarised in two systematic reviews. We identified a further 13 studies (five modelling studies, one discrete choice experiment and seven articles) that used a variety of methods to assess cost-effectiveness. Three modelling studies indicated LDCT screening was cost-effective in the settings of the US and Europe. Two studies, one from Australia and one from New Zealand, reported LDCT screening would not be cost-effective using NLST-like protocols.
We anticipate that, following the full publication of the NELSON trial, cost-effectiveness studies will likely be updated with new data that reduce uncertainty about factors that influence modelling outcomes, including the findings of indeterminate nodules.

Gaps in the evidence

There is a large and accessible body of evidence as to the effectiveness (Q1) and harms (Q2) of LDCT screening for lung cancer. Nevertheless, there are significant gaps in the evidence about the program components that are required to implement an effective LDCT screening program (Q3). Questions about LDCT screening acceptability and feasibility were not explicitly included in the scope. However, as the evidence is based primarily on US programs and UK pilot studies, the relevance to the local setting requires careful consideration. The Queensland Lung Cancer Screening Study provides feasibility data about clinical aspects of LDCT screening but little about program design. The International Lung Screening Trial is still in the recruitment phase and findings are not yet available for inclusion in this Evidence Check. The Australian Population Based Screening Framework was developed to "inform decision-makers on the key issues to be considered when assessing potential screening programs in Australia".(10) As the Framework is specific to population-based, rather than high-risk, screening programs, there is a lack of clarity about the transferability of its criteria. However, the Framework criteria do stipulate that a screening program must be acceptable to "important subgroups such as target participants who are from culturally and linguistically diverse backgrounds, Aboriginal and Torres Strait Islander people, people from disadvantaged groups and people with a disability".(10) An extensive search of the literature highlighted that there is very little information about the acceptability of LDCT screening to these population groups in Australia.
Yet they are part of the high-risk population.(10) There are also considerable gaps in the evidence about the cost-effectiveness of LDCT screening in different settings, including Australia. The evidence base in this area is rapidly evolving and is likely to include new data from the NELSON trial and incorporate data about the costs of targeted- and immuno-therapies as these treatments become more widely available in Australia.
APA, Harvard, Vancouver, ISO, and other styles
