Academic literature on the topic 'Records Australia Management Data processing'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Records Australia Management Data processing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Records Australia Management Data processing"

1

Goh, Elaine. "Clear skies or cloudy forecast?" Records Management Journal 24, no. 1 (March 11, 2014): 56–73. http://dx.doi.org/10.1108/rmj-01-2014-0001.

Abstract:
Purpose – Using the example of audiovisual materials, this paper aims to illustrate how records-related and archival legislation lags behind advances in technology. As more audiovisual materials are created on the cloud, questions arise about the applicability of national laws over the control, ownership, and custody of data and records. Design/methodology/approach – This paper analyses court cases relating to audiovisual materials in the cloud and archival legislation from three Commonwealth countries: Canada, Australia, and Singapore – representing North America, the Pacific, and Asia respectively. Findings – Current records-related and archival legislation does not effectively address the creation, processing, and preservation of records and data in a cloud environment. The paper identifies several records-related risks linked to the cloud – risks related to the ownership and custody of data, legal risks due to transborder data flow, and risks due to differing interpretations on the act of copying and ownership of audiovisual materials. Research limitations/implications – The paper identifies the need for records professionals to pay greater attention to the implications of the emerging cloud environment. There is a need for further research on how the concept of extraterritoriality and transborder laws can be applied to develop model laws for the management and preservation of records in the cloud. Originality/value – The paper identifies record-related risks linked to the cloud by analyzing court cases and archival legislation. The paper examines maritime law to find useful principles that the archival field could draw on to mitigate some of these risks.
2

Campos-Andaur, Paulina, Karen Padilla-Lobo, Nicolás Contreras-Barraza, Guido Salazar-Sepúlveda, and Alejandro Vega-Muñoz. "The Wine Effects in Tourism Studies: Mapping the Research Referents." Sustainability 14, no. 5 (February 23, 2022): 2569. http://dx.doi.org/10.3390/su14052569.

Abstract:
This research provides an empirical overview of articles and authors referring to research on wine tourism, analyzed from 2000 to 2021, and of their contribution to deepening Sustainable Development Goal (SDG) 8. The articles were examined through a bibliometric approach based on data from 199 records stored in the Web of Science (JCR), applying traditional bibliometric laws and using VOSviewer for processing the data and metadata. The results highlight an exponential, uninterrupted increase in scientific production between 2005 and 2020, concentrated in only 35 highly cited authors, with Australia holding the hegemonic position among the co-authorship networks of worldwide relevance. The main topics observed in the literature are local development through wine tourism, sustainability and nature conservation, and strategies for sustainable development. Finally, there are six articles with great worldwide influence in wine tourism studies that fully reflect the contribution made by researchers affiliated with Australian universities.
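As a rough illustration of the bibliometric tallies this abstract describes (papers per year, concentration of highly cited authors), here is a minimal sketch in Python assuming pandas; the column names and records are hypothetical stand-ins for a Web of Science export, not the study's data.

```python
import pandas as pd

# Hypothetical stand-in for a Web of Science export:
# "PY" = publication year, "AU" = semicolon-separated authors.
records = pd.DataFrame({
    "PY": [2005, 2012, 2018, 2018, 2020, 2020],
    "AU": ["Smith, A", "Smith, A; Lee, B", "Lee, B", "Getz, D; Brown, G",
           "Smith, A; Getz, D", "Brown, G"],
})

papers_per_year = records["PY"].value_counts().sort_index()    # growth trend over time
authors = records["AU"].str.split(";").explode().str.strip()   # one row per author-paper pair
author_productivity = authors.value_counts()                   # concentration of output

print(papers_per_year)
print(author_productivity.head(35))                            # the highly cited core
```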
3

Haile-Mariam, M., E. Schelfhorst, and M. E. Goddard. "Effect of data collection methods on the availability of calving ease, fertility and herd health data for evaluating Australian dairy cattle." Australian Journal of Experimental Agriculture 47, no. 6 (2007): 664. http://dx.doi.org/10.1071/ea05267.

Abstract:
There is concern in the Australian dairy industry that the fertility, calving ease and disease resistance of cows is declining and that this decline is, at least in part, a genetic change. Improvement in these traits might be achieved through better herd management and genetic selection. Both these strategies are dependant on the availability of suitable data. The Australian Dairy Herd Improvement Scheme publishes estimated breeding values for fertility, calving ease and somatic cell count. However, the accuracy of the estimated breeding values is limited by the amount and quality of data collected. This paper reports on a project conducted to identify a more efficient system for collecting non-production data, with the hypothesis that quantity and quality of data collected would improve if farmers used electronic data collection methods instead of ‘traditional’ methods, such as writing in a notebook. Of 78 farmers involved in the trial, 51 used a PALM handheld (PALM group), 18 wrote data on paper and later entered it in their farm computer (PC group) and nine submitted a paper record to their data processing centres for entry into the centres’ computers (PAPER group). Data collected from these 78 trial herds during the trial period (2002–04) were compared to data collected from 88 similar non-trial farms, which kept records on PC or paper. The ratio of number of events (health, calving ease or fertility) recorded to number of calvings was considered as a measure of level of recording. The results showed that, after adjusting for location and level of recording before the trial started, the PALM group collected significantly more calving ease, pregnancy test and other fertility data per calving than farmers who were not involved in the trial and PAPER and PC groups. The number of records collected by the PALM group increased from 0.13 pregnancy tests in 2001 to 0.36 per calving in 2004, whereas there was little change in the amount of data collected by the other groups. Similarly, the number of calving ease records increased from 0.26 in 2001 to 0.33 in 2004 and the number of heats recorded increased from 0.02 in 2001 to 0.12 in 2004. This increase in data capture among farmers using the PALM was partly due to an increase in the number of farmers who submitted any data at all. For instance, of the PALM group, 86% sent data on calving ease and 61% on pregnancy, as compared to those from the PC and PAPER groups (below 57%) or those who were not involved in the trial (below 44%). When farmers who at least submitted one record of each type of data are considered, farmers in the PALM group still submitted significantly more fertility event data than those who were not involved in the trial and those in the PAPER group. The quality of the data did not appear to be affected by the data collection methods, though the completeness of the mating data was better in PALM and PC users. The use of electronic data entry on farms would increase the amount of data available for the calculation of estimated breeding values and hence the accuracy of these values for fertility, calving ease and health traits.
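A minimal sketch of the "events recorded per calving" measure used in this study, assuming pandas; the tables and field names below are illustrative, not the project's actual schema.

```python
import pandas as pd

# Hypothetical event and calving tables, one row per recorded event / calving.
events = pd.DataFrame({
    "group": ["PALM", "PALM", "PALM", "PC", "PAPER"],
    "year":  [2004, 2004, 2004, 2004, 2004],
    "event_type": ["pregnancy_test", "pregnancy_test", "calving_ease",
                   "pregnancy_test", "calving_ease"],
})
calvings = pd.DataFrame({
    "group": ["PALM"] * 8 + ["PC"] * 6 + ["PAPER"] * 6,
    "year":  [2004] * 20,
})

n_events = events.groupby(["group", "year", "event_type"]).size().unstack("event_type")
n_calvings = calvings.groupby(["group", "year"]).size()

# Events recorded per calving, by collection-method group and event type
print(n_events.div(n_calvings, axis=0).round(2))
```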
4

van Gemert, Caroline, Rebecca Guy, Mark Stoove, Wayne Dimech, Carol El-Hayek, Jason Asselin, Clarissa Moreira, et al. "Pathology Laboratory Surveillance in the Australian Collaboration for Coordinated Enhanced Sentinel Surveillance of Sexually Transmitted Infections and Blood-Borne Viruses: Protocol for a Cohort Study." JMIR Research Protocols 8, no. 8 (August 8, 2019): e13625. http://dx.doi.org/10.2196/13625.

Abstract:
Background Passive surveillance is the principal method of sexually transmitted infection (STI) and blood-borne virus (BBV) surveillance in Australia whereby positive cases of select STIs and BBVs are notified to the state and territory health departments. A major limitation of passive surveillance is that it only collects information on positive cases and notifications are heavily dependent on testing patterns. Denominator testing data are important in the interpretation of notifications. Objective The aim of this study is to establish a national pathology laboratory surveillance system, part of a larger national sentinel surveillance system called ACCESS (the Australian Collaboration for Coordinated Enhanced Sentinel Surveillance). ACCESS is designed to utilize denominator testing data to understand trends in case reporting and monitor the uptake and outcomes of testing for STIs and BBVs. Methods ACCESS involves a range of clinical sites and pathology laboratories, each with a separate method of recruitment, data extraction, and data processing. This paper includes pathology laboratory sites only. First established in 2007 for chlamydia only, ACCESS expanded in 2012 to capture all diagnostic and clinical monitoring tests for STIs and BBVs, initially from pathology laboratories in New South Wales and Victoria, Australia, to at least one public and one private pathology laboratory in all Australian states and territories in 2016. The pathology laboratory sentinel surveillance system incorporates a longitudinal cohort design whereby all diagnostic and clinical monitoring tests for STIs and BBVs are collated from participating pathology laboratories in a line-listed format. An anonymous, unique identifier will be created to link patient data within and between participating pathology laboratory databases and to clinical services databases. Using electronically extracted, line-listed data, several indicators for each STI and BBV can be calculated, including the number of tests, unique number of individuals tested and retested, test yield, positivity, and incidence. Results To date, over 20 million STI and BBV laboratory test records have been extracted for analysis for surveillance monitoring nationally. Recruitment of laboratories is ongoing to ensure appropriate coverage for each state and territory; reporting of indicators will occur in 2019 with publication to follow. Conclusions The ACCESS pathology laboratory sentinel surveillance network is a unique surveillance system that collects data on diagnostic testing, management, and care for and of STIs and BBVs. It complements the ACCESS clinical network and enhances Australia’s capacity to respond to STIs and BBVs. International Registered Report Identifier (IRRID) DERR1-10.2196/13625
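A small sketch of two steps the protocol describes, creating an anonymous linkage key and computing per-test indicators from line-listed results, assuming pandas; the hashing recipe and column names are assumptions for illustration, not the ACCESS specification.

```python
import hashlib
import pandas as pd

def linkage_key(first_name: str, surname: str, dob: str, sex: str) -> str:
    """Hash identifying fields into an irreversible, repeatable linkage key."""
    raw = f"{first_name.strip().upper()}|{surname.strip().upper()}|{dob}|{sex}"
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

# Hypothetical line-listed laboratory results
tests = pd.DataFrame({
    "first_name": ["Ann", "Ann", "Tom", "Tom"],
    "surname":    ["Lee", "Lee", "Ng", "Ng"],
    "dob":        ["1990-02-01", "1990-02-01", "1985-07-11", "1985-07-11"],
    "sex":        ["F", "F", "M", "M"],
    "test":       ["chlamydia", "chlamydia", "HIV", "chlamydia"],
    "result":     ["negative", "positive", "negative", "negative"],
    "year":       [2017, 2018, 2018, 2018],
})
tests["patient_key"] = tests.apply(
    lambda r: linkage_key(r["first_name"], r["surname"], r["dob"], r["sex"]), axis=1
)

# Number of tests, unique individuals tested, and positivity per test and year
indicators = tests.groupby(["test", "year"]).agg(
    n_tests=("result", "size"),
    n_individuals=("patient_key", "nunique"),
    positivity=("result", lambda s: (s == "positive").mean()),
)
print(indicators)
```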
5

Schlumpf, Heidi, Nina Gaze, Hugh Grenfell, Frances Duff, Kelly Hall, Judith Charles, and Benjamin Mortensen. "Data Detectives - The Backlog Cataloguing Project at Auckland War Memorial Museum." Biodiversity Information Science and Standards 2 (June 15, 2018): e25194. http://dx.doi.org/10.3897/biss.2.25194.

Abstract:
The Collection Access and Readiness Programme (CARP) is a unique, well-defined programme with committed funding at Auckland War Memorial Museum (AWMM). In the Natural Sciences department, CARP has funded the equivalent of five positions over five collecting areas for four years. These are filled by six part-time collection technicians and a senior full-time manager. As Collection Technicians, our role, across Botany, Entomology, Geology, Marine, and Palaeontology, is to digitise acquisitions prior to December 2012. We are processing the backlogs of our collections, which are prioritised across all museum activities in distinct taxonomic projects. The cataloguing method involves gathering and verifying all available information and entering data into Vernon, our collections management system (https://vernonsystems.com/products/vernon-cms/), with specifically designed record standards aligned to Darwin Core (Wieczorek et al. 2012). CARP has allowed us the freedom to explore backlog collections, some of which have not been fully processed, revealing mysteries that would otherwise have sat undiscovered, and to resolve uncertainties across the collections. For example, in Botany, cataloguing the foreign ferns reveals previously unrealised type specimens; in Marine, cataloguing all 9117 specimen lots of the New Zealand Bivalvia collection, brought classification and locality data uncertainties to resolution. There are multiple projects running concurrently in each collecting area, continually enriching our collection data. In turn, this is opening up a far wider range of information to the public through our online collection portal, AWMM Collections Online http://www.aucklandmuseum.com/discover/collections-online (currently 800,000 records). Open accessibility promotes careful consideration of how and what data we deliver, as it is disseminated through global portals, such as the Global Biodiversity Information Facility (GBIF) and Atlas of Living Australia (ALA). Collections that have often had no more attention than recording of their original labels, have interesting stories beyond “just” cataloguing them. As cataloguers, we have found that the uncertainties or sometimes apparent lack of detail increases our engagement with our collections. Rather than solely copying information into the database, we become detectives, resolving uncertainties and verifying the background of our objects, collection sites and collectors. This engagement and the global reach of our data mean that we are invested in the programme, so that data entry continuity and accuracy are maximised. Our presentation will give an overview of the CARP and our method, and a look at our progress two years in, highlighting some of our discoveries and how the uncertainty in our data allows us to engage more with our collections.
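As an illustration of aligning a catalogue record with Darwin Core, a minimal Python sketch; the source field names and values are invented, and the Darwin Core terms shown are only a small subset of the standard.

```python
# Hypothetical backlog specimen record as exported from a collections system
source = {
    "reg_no": "MA70001",
    "taxon": "Paphies australis",
    "locality": "Auckland, Waitemata Harbour",
    "collector": "H. Grenfell",
    "date": "1998-03-14",
    "lots": 3,
}

# Mapping onto Darwin Core terms (http://rs.tdwg.org/dwc/terms/) for delivery
# to portals such as GBIF and the Atlas of Living Australia
darwin_core = {
    "catalogNumber": source["reg_no"],
    "scientificName": source["taxon"],
    "locality": source["locality"],
    "recordedBy": source["collector"],
    "eventDate": source["date"],
    "individualCount": source["lots"],
    "basisOfRecord": "PreservedSpecimen",
}
print(darwin_core)
```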
6

Oruganti, Yagna. "Technology Focus: Data Analytics (October 2021)." Journal of Petroleum Technology 73, no. 10 (October 1, 2021): 60. http://dx.doi.org/10.2118/1021-0060-jpt.

Abstract:
With a moderate- to low-oil-price environment being the new normal, improving process efficiency, thereby leading to hydrocarbon recovery at reduced costs, is becoming the need of the hour. The oil and gas industry generates vast amounts of data that, if properly leveraged, can generate insights that lead to recovering hydrocarbons with reduced costs, better safety records, lower costs associated with equipment downtime, and reduced environmental footprint. Data analytics and machine-learning techniques offer tremendous potential in leveraging the data. An analysis of papers in OnePetro from 2014 to 2020 illustrates the steep increase in the number of machine-learning-related papers year after year. The analysis also reveals reservoir characterization, formation evaluation, and drilling as domains that have seen the highest number of papers on the application of machine-learning techniques. Reservoir characterization in particular is a field that has seen an explosion of papers on machine learning, with the use of convolutional neural networks for fault detection, seismic imaging and inversion, and the use of classical machine-learning algorithms such as random forests for lithofacies classification. Formation evaluation is another area that has gained a lot of traction, with applications such as the use of classical machine-learning techniques such as support vector regression to predict rock mechanical properties and the use of deep-learning techniques such as long short-term memory to predict synthetic logs in unconventional reservoirs. Drilling is another domain where a tremendous amount of work has been done, with papers on optimizing drilling parameters using techniques such as genetic algorithms, using automated machine-learning frameworks for bit dull grade prediction, and applying natural language processing for stuck-pipe prevention and reduction of nonproductive time. As the application of machine learning toward solving various problems in the upstream oil and gas industry proliferates, explainable artificial intelligence or machine-learning interpretability becomes critical for data scientists and business decision-makers alike. Data scientists need the ability to explain machine-learning models to executives and stakeholders to verify hypotheses and build trust in the models. One of the three highlighted papers used Shapley additive explanations, a game-theory-based approach to explaining machine-learning outputs, to provide a layer of interpretability to its machine-learning model for identification of geomechanical facies along horizontal wells. A cautionary note: while there is significant promise in applying these techniques, there remain many challenges in capitalizing on the data, including the lack of common data models in the industry, data silos, data stored in on-premises resources, slow migration of data to the cloud, legacy databases and systems, lack of digitization of older/legacy reports and well logs, and lack of standardization in data-collection methodologies across different facilities and geomarkets, to name a few. I would like to invite readers to review the selection of papers to get an idea of various applications in the upstream oil and gas space where machine-learning methods have been leveraged. The highlighted papers cover the topics of fatigue damage of marine risers, well performance optimization, and identification of frackable, brittle, and producible rock along horizontal wells using drilling data.
Recommended additional reading at OnePetro: www.onepetro.org. SPE 201597 - Improved Robustness in Long-Term Pressure-Data Analysis Using Wavelets and Deep Learning by Dante Orta Alemán, Stanford University, et al. SPE 202379 - A Network Data Analytics Approach to Assessing Reservoir Uncertainty and Identification of Characteristic Reservoir Models by Eugene Tan, the University of Western Australia, et al. OTC 30936 - Data-Driven Performance Optimization in Section Milling by Shantanu Neema, Chevron, et al.
7

Hallinan, Christine Mary, Sedigheh Khademi Habibabadi, Mike Conway, and Yvonne Ann Bonomo. "Social media discourse and internet search queries on cannabis as a medicine: A systematic scoping review." PLOS ONE 18, no. 1 (January 20, 2023): e0269143. http://dx.doi.org/10.1371/journal.pone.0269143.

Abstract:
The use of cannabis for medicinal purposes has increased globally over the past decade since patient access to medicinal cannabis has been legislated across jurisdictions in Europe, the United Kingdom, the United States, Canada, and Australia. Yet, evidence relating to the effect of medicinal cannabis on the management of symptoms for a suite of conditions is only just emerging. Although there is considerable engagement from many stakeholders to add to the evidence base through randomized controlled trials, many gaps in the literature remain. Data from real-world and patient-reported sources can provide opportunities to address this evidence deficit. Such real-world data can be captured from a variety of sources, such as routinely collected health care and health services records, including patient-generated data from medical, administrative and claims data, patient-reported data from surveys, wearable trackers, patient registries, and social media. In this systematic scoping review, we seek to understand the utility of online user-generated text as a source of insight into the use of cannabis as a medicine. We aimed to systematically search the published literature to examine the extent, range, and nature of research that utilises user-generated content to examine cannabis as a medicine. The objective of this methodological review is to synthesise primary research that uses social media discourse and internet search engine queries to answer the following questions: (i) In what way is online user-generated text used as a data source in the investigation of cannabis as a medicine? (ii) What are the aims, data sources, methods, and research themes of studies using online user-generated text to discuss the medicinal use of cannabis? We conducted a manual search of primary research studies which used online user-generated text as a data source using the MEDLINE, Embase, Web of Science, and Scopus databases in October 2022. Editorials, letters, commentaries, surveys, protocols, and book chapters were excluded from the review. Forty-two studies were included in this review: twenty-two used manually labelled data, four used existing metadata (Google Trends/geolocation data), two used data manually coded through crowdsourcing services, two used automated coding supplied by a social media analytics company, and fifteen used computational methods for annotating data. Our review reflects a growing interest in the use of user-generated content for public health surveillance. It also demonstrates the need for a systematic approach to evaluating the quality of social media studies and highlights the utility of automatic processing and computational methods (machine learning technologies) for large social media datasets. This systematic scoping review has shown that user-generated content as a data source for studying cannabis as a medicine provides another means to understand how cannabis is perceived and used in the community. As such, it provides another potential 'tool' with which to engage in pharmacovigilance of, not only cannabis as a medicine, but also other novel therapeutics as they enter the market.
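Several of the reviewed studies used computational methods to annotate posts; a minimal sketch of that kind of text classification, assuming scikit-learn, with toy posts and labels standing in for real annotated data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy user-generated posts and illustrative labels
posts = [
    "CBD oil finally helped my chronic pain",
    "great party last night",
    "my doctor prescribed medicinal cannabis for epilepsy",
    "new phone arrived today",
]
labels = [1, 0, 1, 0]   # 1 = medicinal-use discourse, 0 = unrelated

# Bag-of-words features plus a linear classifier for annotating new posts
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)
print(model.predict(["does cannabis help with insomnia?"]))
```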
8

Unwin, Elizabeth, James Codde, Louise Gill, Suzanne Stevens, and Timothy Nelson. "The WA Hospital Morbidity Data System: An Evaluation of its Performance and the Impact of Electronic Data Transfer." Health Information Management 26, no. 4 (December 1996): 189–92. http://dx.doi.org/10.1177/183335839702600407.

Abstract:
This paper evaluates the performance of the Hospital Morbidity Data System, maintained by the Health Statistics Branch (HSB) of the Health Department of Western Australia (WA). The time taken to process discharge summaries was compared in the first and second halves of 1995, using the number of weeks taken to process 90% of all discharges and the percentage of records processed within four weeks as indicators of throughput. Both the hospitals and the HSB showed improvements in timeliness during the second half of the year. The paper also examines the impact of a recently introduced electronic data transfer system for WA country public hospitals on the timeliness of morbidity data. The processing time of country hospital records by the HSB was reduced to a similar time as for metropolitan hospitals, but the processing time in the hospitals increased, resulting in little improvement in total processing time.
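A minimal sketch of the two timeliness indicators the evaluation uses, the number of weeks taken to process 90% of discharges and the percentage of records processed within four weeks, computed here from an invented list of per-record processing times.

```python
import numpy as np

# Hypothetical per-record processing times, in weeks from discharge to processing
processing_weeks = np.array([1, 2, 2, 3, 3, 4, 4, 5, 6, 8, 10, 14])

weeks_to_90pct = np.percentile(processing_weeks, 90)
pct_within_4_weeks = (processing_weeks <= 4).mean() * 100

print(f"90% of discharges processed within {weeks_to_90pct:.1f} weeks")
print(f"{pct_within_4_weeks:.0f}% of records processed within 4 weeks")
```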
9

Mesibov, Robert. "An audit of some processing effects in aggregated occurrence records." ZooKeys 751 (April 20, 2018): 129–46. http://dx.doi.org/10.3897/zookeys.751.24791.

Abstract:
A total of ca 800,000 occurrence records from the Australian Museum (AM), Museums Victoria (MV) and the New Zealand Arthropod Collection (NZAC) were audited for changes in selected Darwin Core fields after processing by the Atlas of Living Australia (ALA; for AM and MV records) and the Global Biodiversity Information Facility (GBIF; for AM, MV and NZAC records). Formal taxon names in the genus- and species-groups were changed in 13–21% of AM and MV records, depending on dataset and aggregator. There was little agreement between the two aggregators on processed names, with names changed in two to three times as many records by one aggregator alone compared to records with names changed by both aggregators. The type status of specimen records did not change with name changes, resulting in confusion as to the name with which a type was associated. Data losses of up to 100% were found after processing in some fields, apparently due to programming errors. The taxonomic usefulness of occurrence records could be improved if aggregators included both original and the processed taxonomic data items for each record. It is recommended that end-users check original and processed records for data loss and name replacements after processing by aggregators.
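A sketch of the audit logic described above, comparing source-supplied names with aggregator-processed names and counting how often each aggregator changed them; the merged export and its column names are assumptions for illustration.

```python
import pandas as pd

# Hypothetical merged export of original and aggregator-processed records
recs = pd.DataFrame({
    "occurrenceID":  ["am:1", "am:2", "mv:3", "mv:4"],
    "name_original": ["Tasmaniosoma compitale", "Paphies australis",
                      "Atrax robustus", "Ixodes holocyclus"],
    "name_ala":      ["Tasmaniosoma compitale", "Paphies australis",
                      "Atrax sp.", "Ixodes holocyclus"],
    "name_gbif":     ["Tasmaniosoma", "Paphies australis",
                      "Atrax robustus", "Ixodes"],
})

changed_ala = recs["name_ala"].ne(recs["name_original"])
changed_gbif = recs["name_gbif"].ne(recs["name_original"])

print(f"ALA changed the name in {changed_ala.mean():.0%} of records")
print(f"GBIF changed the name in {changed_gbif.mean():.0%} of records")
print(f"Both aggregators changed it in {(changed_ala & changed_gbif).mean():.0%}")
print(f"Only one aggregator changed it in {(changed_ala ^ changed_gbif).mean():.0%}")
```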
10

van Ginneken, A. M. "Modelling Domain-Knowledge: a Step toward Intelligent Data Management." Methods of Information in Medicine 32, no. 04 (1993): 270–71. http://dx.doi.org/10.1055/s-0038-1634940.


Dissertations / Theses on the topic "Records Australia Management Data processing"

1

Wong, Sze-nga, and 王絲雅. "The impact of electronic health record on diabetes management : a systematic review." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2013. http://hdl.handle.net/10722/193850.

Abstract:
Objectives: To investigate the impact of the electronic health record (EHR) on diabetes management by examining the effectiveness of EHR implementation and its impact on quality of care and cost-effectiveness. Methods: Three databases, PubMed, Ovid Medline and Google Scholar, were searched with specific keyword combinations including electronic medical record, electronic health record, and diabetes. Quality appraisal and extraction of data were conducted on literature that met the inclusion criteria. Results: Ten studies, covering a total of 204,251 participants with diabetes, were included in this review. All subjects, with similar demographic and clinical characteristics, were from clinic and primary care settings using EHRs. Different outcome measures were compared to evaluate the effectiveness of EHRs on quality of care and cost-effectiveness. Discussion: The impact of EHRs on the effectiveness of diabetes management, potential barriers to adoption, and the limitations of EHR implementation were discussed. These suggest that further research is needed to provide stronger evidence for widespread use of EHRs in Hong Kong as a future public health direction. Conclusion: In this systematic review, EHRs showed potential benefit in improving the quality of care and reducing health care expenditure in the long term. Patient safety and efficiency are yet to be covered in the studies. Further research is needed on the acceptability and applicability of the use of EHRs in Hong Kong.
Master of Public Health (Public Health)
2

Chava, Nalini. "Administrative reporting for a hospital document scanning system." Virtual Press, 1996. http://liblink.bsu.edu/uhtbin/catkey/1014839.

Abstract:
This thesis examines the manual hospital document retrieval system and the electronic document scanning system. From this examination, requirements are listed for the Administrative Reporting for the Hospital Document Scanning System, which will provide better service and reliability than the previous systems. To ensure that the requirements can be met, they are developed into a working system named the Administrative Reporting for the Hospital Document Scanning System (ARHDSS).
Department of Computer Science
3

Harmse, Magda Susanna. "Physicians' perspectives on personal health records: a descriptive study." Thesis, Nelson Mandela Metropolitan University, 2016. http://hdl.handle.net/10948/6876.

Abstract:
A Personal Health Record (PHR) is an electronic record of a patient’s health-related information that is managed by the patient. The patient can give access to other parties, such as healthcare providers and family members, as they see fit. These parties can use the information in emergency situations, in order to help improve the patient’s healthcare. PHRs have an important role to play in ensuring that a patient’s complete health history is available to his healthcare providers at the point of care. This is especially true in South Africa, where the majority of healthcare organizations still rely on paper-based methods of record-keeping. Research indicates that physicians play an important role in encouraging the adoption of PHRs amongst patients. Whilst various studies have focused on the perceptions of South African citizens towards PHRs, to date no research has focused on the perceptions of South African physicians. Considering the importance of physicians in encouraging the adoption of PHRs, the problem being addressed by this research project thus relates to the lack of information relating to the perceptions of South African physicians of PHRs. Physicians with private practices at private hospitals in Port Elizabeth, South Africa were surveyed in order to determine their perceptions towards PHRs. Results indicate perceptions regarding benefits to the physician and the patient, as well as concerns to the physician and the patient. The levels of trust in various potential PHR providers and the potential uses of a PHR for the physician were also explored. The results of the survey were compared with the results of relevant international literature in order to describe the perceptions of physicians towards PHRs.
4

Gibbs, Edward. "A business plan to launch a document management product in the United Kingdom." Thesis, Stellenbosch : University of Stellenbosch, 2009. http://hdl.handle.net/10019.1/4958.

Abstract:
Thesis (MBA (Business Management))--University of Stellenbosch, 2009.
ENGLISH ABSTRACT: A Document Management System (DMS) can help businesses speed-up as well as reduce the number of mundane and repetitive tasks relating to documents. These benefits can assist management cut costs, reduce errors, automate frequently performed tasks as well as store information electronically in a safe and accessible way. Making IT Simple (the business) is a start-up business that has entered into a partnership agreement with INVU, Europe's fastest growing DMS Software developer (INVU, 2008). This agreement allows the business to sell INVU software without restriction by volume or geography to all sectors of industry. INVU products are designed to make business administration, and especially managing documents a simple and efficient process. These two principle product features support the business' objectives and marketing push by delivering easy-to-use software that helps customers reduce costs by speeding-up as well as reduce the number of daily administrative tasks performed using documents. In order to establish which market sector to target, the Directors conducted an industry analysis (Appendix I) which has identified opportunities in the farming and agricultural Sector. The three primary reasons are: 1) Sustainable sector growth of 30% per annum (UK Agriculture, 2007), 2) no known DMS competition within Farming and 3) the Directors have an established Network and detailed knowledge of the sector. Although farming and agriculture will be the main focus of marketing activity and communication, other industries, such as property letting agencies, are also seen as future opportunities for the business. Target Market and Projections Market Research is based upon 138 questionnaire responses that have enabled the Directors to develop a product package which combines the product, a DMS license and software, together with the necessary hardware and maintenance support sufficient to satisfy the target market's needs. The target market is defined as a farm business, predominantly farming crops or cattle and/or sheep from between 41 and 80 hectares of productive land. It has up to 20 full time employees, half of whom are involved in the business' administration. This admin comprises mainly of financial accounts and Government Department returns which are processed on as many as two computers which have email, Microsoft Office and accounting software packages loaded on. For security and access the target market store their records in filing cabinets for up to 15 years, mostly in paper form. A priority for business' administration within this market is the simplicity and easy access of its records and with all records being in one place. The sales forecasts of 7, 16 and 30 units over the first three years 2009, 2010 and 2011 respectively are deemed achievable by the Directors, having received reassurance from both formal interviews (Radley, 2008) and conversations with potential customers. The business sees their target market strategy and their lean cost base as being their competitive advantage together with the impression that none of INVU's DMS competitors are currently focussed upon the agriculture sector. This gives the business a potential first mover advantage which will be supported by leveraging the marketing efforts and the existing network of contacts to deliver the specifically designed sales process for the target market. The suite of products, which are leased by the customer over 36 months, cost £15,016 and have a Gross Profit of 58%. 
This gross profit then funds the running and maintenance of the support service provided by the company. Return on Equity over the 3 years of the Business Plan is strong at 60% given that there is a loss of £50,000 in Year 1. Year two generates a small profit of £24,000 with a healthy £64,000 in Year 3 onwards. Return on sales is 10% in year 2 growing to 14% in year 3. Break even point is in July 2011 (month 31) with the business cash positive in August 2010 (month 20). The financial risks are calculated as low due to the lease funding providing cash within 14 days of a signed document, plus there is no cash risk as the requirement to hold inventory is negligible. The balance sheet requires equity funding of £40,000 which is provided by the two directors at £20,000 each, plus a 60 month working-capital bank loan which is forecast to be repaid on month 25. There are 46,000 farms in UK so, in this market sector alone there are potentially 9,200 (20%) sales presentations to be completed based upon the market survey data. On projected performance this would currently take the business 460 months to complete. The business case shows an improving performance month on month based upon an improvement in sales skills, product portfolio and brand awareness. The two employees Edward Gibbs and Mathew Easterbrook, both of whom are Directors, have each invested £20,000 in equity in order to start-up the business. They have 28 years of management experience between them and offer complementing specialities in the IT, sales, farming and finance areas. Their business objectives are to generate cash and satisfy customer needs by selling products at the right price with a sustainable gross profit margin whilst being commercially aggressive on costs. Their simple and equitable company structure reflects their excellent relationship and the balance of power is shared equally. The product is a compliment of hardware, software and support service that is tailored to meet a customer's needs (Table 1).
5

Chipfumbu, Colletor Tendeukai. "Engendering the meaningful use of electronic medical records: a South African perspective." Thesis, Nelson Mandela Metropolitan University, 2016. http://hdl.handle.net/10948/18420.

Abstract:
Theoretically, the use of Electronic Medical Records (EMRs) holds promise of numerous benefits in healthcare provision, including improvement in continuity of care, quality of care and safety. However, in practice, there is evidence that the adoption of electronic medical records has been slow and where adopted, often lacks meaningful use. Thus there is a clear dichotomy between the ambitions for EMR use and the reality of EMR implementation. In the USA, a legislative approach was taken to turn around the situation. Other countries such as Canada and European countries have followed suit (in their own way) to address the adoption and meaningful use of electronic medical records. The South African e-Health strategy and the National Health Normative Standards Framework for Interoperability in eHealth in South Africa documents both recommend the adoption of EMRs. Much work has been done to establish a baseline for standards to ensure interoperability and data portability of healthcare applications and data. However, even with the increased focus on e-Health, South Africa remains excessively reliant on paper-based medical records. Where health information technologies have been adopted, there is lack of coordination between and within provinces, leading to a multitude of systems and vendors. Thus there is a lack of systematic adoption and meaningful use of EMRs in South Africa. The main objective of this research is to develop the components required to engender meaningful use of electronic medical records in the South African healthcare context. The main contributors are identified as EMR certification and consistent, proper use of certified EMRs. Literature review, a Delphi study and logical argumentation are used to develop the relevant components for the South African healthcare context. The benefits of EMRs can only be realized through systematic adoption and meaningful use of EMRs, thus this research contributes to providing a road map for engendering the meaningful use of EMRs with the ultimate aim of improving healthcare in the South African healthcare landscape.
6

Ahern, Anthony J. "The management of information technology investments in the Australian ambulance services." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 1994. https://ro.ecu.edu.au/theses/1105.

Abstract:
Information Technology plays a significant role in the administration and operation of most organisations today. This is certainly the case with each of the Australian Ambulance Services. With the rapid increase in the use of Information Technology and the expectations about its use held by both staff and the general public, Ambulance Service management is faced with the dilemma of trying to ensure that their organisations gain the full advantage of advances in Information Technology while at the same time ensuring that investments in IT are maintained at appropriate levels that will yield the maximum return on investment in terms of the Ambulance Service achieving its mission and objectives. The research considers three questions: How are IT investment decisions determined? How are levels of IT investment determined? Do IT investments contribute to the organisation's overall effectiveness? The general feeling among the ambulance service CEOs is that the investment in IT has been worthwhile in terms of contributing to the organisation being more effective. These findings are contrary to a study by United Research/Business Week, described by LaPlante (1988), in which less than half of the CEOs surveyed felt that their organisation did an excellent job of linking computer strategy to corporate goals.
7

Forsyth, Rowena, Public Health & Community Medicine, Faculty of Medicine, UNSW. "Tricky technology, troubled tribes: a video ethnographic study of the impact of information technology on health care professionals' practices and relationships." Awarded by: University of New South Wales, School of Public Health and Community Medicine, 2006. http://handle.unsw.edu.au/1959.4/30175.

Abstract:
Whilst technology use has always been a part of the practice of health care delivery, more recently, information technology has been applied to aspects of clinical work concerned with documentation. This thesis presents an analysis of the ways that two professional groups, one clinical and one ancillary, at a single hospital cooperatively engage in a work practice that has recently been computerised. It investigates the way that a clinical group's approach to and actual use of the system creates problems for the ancillary group. It understands these problems to arise from the contrasting ways that the groups position their use of documentation technology in their local definitions of professional status. The data on which analysis of these practices is based includes 16 hours of video recordings of the work practices of the two groups as they engage with the technology in their local work settings as well as video recordings of a reflexive viewing session conducted with participants from the ancillary group. Also included in the analysis are observational field notes, interviews and documentary analysis. The analysis aimed to produce a set of themes grounded in the specifics of the data, and drew on TLSTranscription software for the management and classification of video data. This thesis seeks to contribute to three research fields: health informatics, sociology of professions and social science research methodology. In terms of health informatics, this thesis argues for the necessity for health care information technology design to understand and incorporate the work practices of all professional groups who will be involved in using the technology system or whose work will be affected by its introduction. In terms of the sociology of professions, this thesis finds doctors and scientists to belong to two distinct occupational communities that each utilise documentation technology to different extents in their displays of professional competence. Thirdly, in terms of social science research methodology, this thesis speculates about the possibility for viewing the engagement of the groups with the research process as indicative of their reactions to future sources of outside perturbance to their work.
8

Robinson, Jeffrey Brett. University of Western Sydney, College of Science, Technology and Environment, School of Environment and Agriculture. "Understanding and applying decision support systems in Australian farming systems research." THESIS_CSTE_EAG_Robinson_J.xml, 2005. http://handle.uws.edu.au:8081/1959.7/642.

Abstract:
Decision support systems (DSS) are usually based on computerised models of biophysical and economic systems. Despite early expectations that such models would inform and improve management, adoption rates have been low, and implementation of DSS is now "critical". The reasons for this are unclear, and the aim of this study is to learn to better design, develop and apply DSS in farming systems research (FSR). Previous studies have explored the merits of quantitative tools including DSS, and suggested changes leading to greater impact. In Australia, the changes advocated have been: simple, flexible, low-cost economic tools; emphasis on farmer learning through soft systems approaches; understanding the socio-cultural contexts of using and developing DSS; farmer and researcher co-learning from simulation modelling; and increasing user participation in DSS design and implementation. Twenty-four simple criteria were distilled from these studies, and their usefulness in guiding the development and application of DSS was assessed in six FSR case studies. The case studies were also used to better understand farmer learning through models of decision making and learning. To make DSS useful complements to farmers' existing decision-making repertoires, they should be based on: (i) a decision-oriented development process, (ii) identifying a motivated and committed audience, (iii) a thorough understanding of the decision-maker's context, (iv) using learning as the yardstick of success, and (v) understanding the contrasts, contradictions and conflicts between researcher and farmer decision cultures.
Doctor of Philosophy (PhD)
9

Ling, Meng-Chun. "Senior health care system." CSUSB ScholarWorks, 2005. https://scholarworks.lib.csusb.edu/etd-project/2785.

Abstract:
Senior Health Care System (SHCS) is created for users to enter participants' conditions and store information in a central database. When users are ready for quarterly assessments the system generates a simple summary that can be reviewed, modified, and saved as part of the summary assessments, which are required by Federal and California law.
10

Bantom, Simlindile Abongile. "Accessibility to patients’ own health information: a case in rural Eastern Cape, South Africa." Thesis, Cape Peninsula University of Technology, 2016. http://hdl.handle.net/20.500.11838/2411.

Abstract:
Thesis (MTech (Information Technology))--Cape Peninsula University of Technology, 2016.
Access to healthcare is regarded as a basic and essential human right. It is widely known that ICT solutions have potential to improve access to healthcare, reduce healthcare cost, reduce medical errors, and bridge the digital divide between rural and urban healthcare centres. The access to personal healthcare records is, however, an astounding challenge for both patients and healthcare professionals alike, particularly within resource-restricted environments (such as rural communities). Most rural healthcare institutions have limited or non-existent access to electronic patient healthcare records. This study explored the accessibility of personal healthcare records by patients and healthcare professionals within a rural community hospital in the Eastern Cape Province of South Africa. The case study was conducted at the St. Barnabas Hospital with the support and permission from the Faculty of Informatics and Design, Cape Peninsula University of Technology and the Eastern Cape Department of Health. Semi-structured interviews, observations, and interactive co-design sessions and focus groups served as the main data collection methods used to determine the accessibility of personal healthcare records by the relevant stakeholders. The data was qualitatively interpreted using thematic analysis. The study highlighted the various challenges experienced by healthcare professionals and patients, including time-consuming manual processes, lack of infrastructure, illegible hand-written records, missing records and illiteracy. A number of recommendations for improved access to personal healthcare records are discussed. The significance of the study articulates the imperative need for seamless and secure access to personal healthcare records, not only within rural areas but within all communities.

Books on the topic "Records Australia Management Data processing"

1

Takeda, Hiroshi. E-Health: First IMIA/IFIP Joint Symposium, E-Health 2010, Held as Part of WCC 2010, Brisbane, Australia, September 20-23, 2010. Proceedings. Berlin, Heidelberg: IFIP International Federation for Information Processing, 2010.

2

Ginn, Mary L. (Mary Lea), 1947-, ed. Records management. 9th ed. Mason, OH: South-Western Cengage Learning, 2011.

3

Stewart, Jeffrey Robert. Records and database management. 4th ed. New York: Gregg Division, McGraw-Hill, 1989.

4

Steve, Clark. Guidelines on computer-assisted records management. Ottawa, Ont: National Archives of Canada, Govt. Records Branch, 1988.

5

Understanding data and information systems for recordkeeping. New York: Neal-Schuman Publishers, 2008.

6

Vecchioli, Lisa. Evaluating student records management software. [College Park, Md.]: ERIC Clearinghouse on Assessment and Evaluation, 1998.

7

Guard, United States Coast. Military personnel data records (PDR) system. [Washington, DC] (2100 Second St., S.W., Washington 20593-0001): U.S. Dept. of Transportation, U.S. Coast Guard, 1994.

8

Guard, United States Coast. Military personnel data records (PDR) system. [Washington, DC] (2100 Second St., S.W., Washington 20593-0001): U.S. Dept. of Transportation, U.S. Coast Guard, 1994.

9

Ballast, David Kent. Computerized records management in architectural and engineering offices: A bibliography. Monticello, Ill., USA: Vance Bibliographies, 1988.

10

Health, Zambia Ministry of. Procedure manual for data management for HIV/AIDS services. Lusaka: Govt. of the Republic of Zambia, Ministry of Health, 2008.


Book chapters on the topic "Records Australia Management Data processing"

1

Kwanya, Tom. "Big Data in Land Records Management in Kenya: A Fit and Viability Analysis." In Lecture Notes in Business Information Processing, 15–24. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-08618-7_2.

2

Xu, Jianghong, Cherry Jiang, Yezun Qu, Wenting Zhong, and Zhiwang Gan. "Analysis and FP-Growth Algorithm Design on Discount Data of Department Store Members." In Proceeding of 2021 International Conference on Wireless Communications, Networking and Applications, 797–804. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-2456-9_80.

Abstract:
This paper mainly studies the discount data of department store members. The research shows that the total supply of discounted goods and the number of reward points issued have the most significant relationship with the customer activation rate, and that increasing the discount rate and coverage would raise the activation rate of inactive and invalid members. Increasing the points rate may have a stronger incentive effect on active members, but it has no obvious incentive effect on inactive and ineffective members. In addition, by integrating the commodity records of each purchase and analyzing association rules, commodity combinations with associated consumption relationships are obtained, and an analysis model of commodity portfolio association rules is established. The paper draws mainly on the member information table, the sales records, the detailed member consumption list, and the merchandise information table; through data processing and analysis, abnormal data are rejected in preparation for subsequent processing. By analyzing the characteristics of member consumption and the differences between member and non-member consumption, marketing suggestions can be provided to the store manager. An FP-growth algorithm is designed to evaluate the purchasing power of members based on their gender, length of membership, age and consumption frequency, and each parameter of the model is explained, so as to improve the management level of the shopping mall. On this basis, suggestions for promotional activities in shopping malls are given.
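A compact sketch of the association-rule step this chapter describes, assuming the mlxtend library's FP-growth implementation; the transactions below are invented examples rather than the chapter's department-store data.

```python
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import fpgrowth, association_rules

# Invented per-purchase commodity records (one list per transaction)
transactions = [
    ["cosmetics", "handbag"],
    ["cosmetics", "handbag", "scarf"],
    ["menswear", "belt"],
    ["cosmetics", "scarf"],
    ["handbag", "scarf"],
]

# One-hot encode the transactions, mine frequent itemsets with FP-growth,
# then derive association rules for commodity combinations bought together
te = TransactionEncoder()
onehot = pd.DataFrame(te.fit_transform(transactions), columns=te.columns_)

itemsets = fpgrowth(onehot, min_support=0.3, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.6)
print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])
```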
APA, Harvard, Vancouver, ISO, and other styles
3

Nguyen, Chinh, Rosemary Stockdale, Helana Scheepers, and Jason Sargent. "Electronic Records Management - An Old Solution to a New Problem." In Big Data, 2249–74. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-9840-6.ch102.

Full text
Abstract:
The rapid development of technology and the interactive nature of Government 2.0 (Gov 2.0) are generating large data sets for government, resulting in a struggle to control, manage, and extract the right information. Therefore, research into these large data sets (termed Big Data) has become necessary. Governments are now spending significant sums on storing and processing vast amounts of information because of the proliferation and complexity of Big Data and a lack of effective records management. Electronic Records Management (ERM), on the other hand, is an established method for controlling and governing an organisation's important data. This paper investigates the challenges identified from reviewing the literature on Gov 2.0, Big Data, and ERM in order to develop a better understanding of how ERM can be applied to Big Data to extract useable information in the context of Gov 2.0. The paper suggests that a key building block in providing useable information to stakeholders could potentially be ERM with its well-established governance policies. A framework is constructed to illustrate how ERM can play a role in the context of Gov 2.0. Future research is necessary to address the specific constraints and expectations placed on governments in terms of data retention and use.
APA, Harvard, Vancouver, ISO, and other styles
4

Marimuthu, Ramalatha, Shivappriya S. N., and Saroja M. N. "Generation and Management of Data for Healthcare and Health Diagnostics." In Theory and Practice of Business Intelligence in Healthcare, 106–32. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-2310-0.ch005.

Full text
Abstract:
Healthcare analytics deals with patient records, effective management of hospitals, and clinical care. However, the big data available are still not sufficient for focused research, as finding insights in complex, noisy, heterogeneous, and voluminous data takes time and effort, while a smaller, well-curated clinical data set can be more effective for decision making. Healthcare data also vary in their collection and processing methods: patient records generate structured data, wearable technologies generate semi-structured data, and X-rays and other images provide unstructured data. Storing and extracting information from structured, semi-structured, and unstructured data is a challenging task, and different machine learning techniques can simplify the process. The chapter discusses data characteristics, the identification of critical attributes, and various classification and optimization algorithms for decision-making purposes. The purpose of the discussion is to create a basis for selecting algorithms based on data size, temporal validity, and the outcomes expected.
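To illustrate the classification step described above, here is a minimal sketch of fitting a classifier to structured patient-record attributes with scikit-learn. The features, labels, and the choice of a decision tree are placeholders, not the chapter's data or its recommended algorithm.

from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic structured records: [age, systolic_bp, bmi]; labels are illustrative.
X = [[45, 130, 27], [62, 150, 31], [33, 118, 22], [58, 145, 29],
     [25, 110, 20], [70, 160, 33], [41, 125, 25], [66, 155, 30]]
y = [0, 1, 0, 1, 0, 1, 0, 1]  # 1 = higher-risk outcome (hypothetical)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

In practice the critical-attribute selection the chapter discusses would happen before such a fit, and unstructured sources (images, notes) would need separate feature-extraction pipelines.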
APA, Harvard, Vancouver, ISO, and other styles
5

Cuzzocrea, Alfredo. "Synopsis Data Structures for Representing, Querying, and Mining Data Streams." In Handbook of Research on Innovations in Database Technologies and Applications, 701–15. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-60566-242-8.ch075.

Full text
Abstract:
Data-stream query processing and mining is an emerging challenge for the database research community, and the issue has recently gained attention from both the academic and the industrial world. Data streams are continuously arriving data produced by non-traditional information sources. They are generated by a growing number of data sources, including performance measurements in network monitoring and traffic management systems, call detail records in telecommunication systems, transactions in retail chains, ATM operations, log records generated by Web servers, sensor network data, RFID (radio frequency identification) readings, and so forth.
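One widely used synopsis structure for such unbounded streams is a fixed-size uniform sample maintained by reservoir sampling; a minimal sketch follows. The record format is invented for illustration, and the chapter surveys several other synopsis structures (sketches, histograms, wavelets) beyond this one.

import random

def reservoir_sample(stream, k, rng=random.Random(42)):
    """Keep a uniform random sample of k items from a stream of unknown length."""
    reservoir = []
    for i, record in enumerate(stream):
        if i < k:
            reservoir.append(record)
        else:
            j = rng.randint(0, i)   # position to (maybe) replace
            if j < k:
                reservoir[j] = record
    return reservoir

# E.g. a stream of call-detail-record identifiers (illustrative values).
stream = (f"cdr-{i}" for i in range(1_000_000))
print(reservoir_sample(stream, k=5))

The key property is constant memory: only k records are retained no matter how long the stream runs, which is exactly the trade-off synopsis data structures are designed around.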
APA, Harvard, Vancouver, ISO, and other styles
6

Mukhtar, Wafaa Faisal, and Eltayeb Salih Abuelyaman. "Opportunities and Challenges of Big Data in Healthcare." In Handbook of Research on Healthcare Administration and Management, 47–58. IGI Global, 2017. http://dx.doi.org/10.4018/978-1-5225-0920-2.ch004.

Full text
Abstract:
Healthcare big data stream from multiple information sources at an alarming volume, velocity, and variety, and the challenge facing the healthcare industry is extracting meaningful value from such sources. This chapter investigates the diversity and forms of data in the healthcare sector, reviews the methods used in recent years to search and analyze these data, and examines the use of machine learning and data mining techniques to mine useful knowledge from them. The chapter also highlights innovative systems and tools that identify suitable approaches for different kinds of healthcare data and raise the standard of care, and it recaps the relevant tools and data collection methods. The authors emphasize some of the ethical and data privacy issues involved in processing these records.
APA, Harvard, Vancouver, ISO, and other styles
7

Devi, K. Renuka. "Preserving the Healthcare Data by Using PPDM Technique." In Handbook of Research on Complexities, Management, and Governance in Healthcare, 300–317. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-6684-6044-3.ch022.

Full text
Abstract:
Recent advances in technology have catalysed the processing of vast amounts of data and the need to extract and store those data in electronic form, and data mining plays a significant part in this process. In the realm of data mining, privacy and security are two important problems, and privacy-preserving data mining (PPDM) has arisen to address them. In the healthcare domain, data mining processes sensitive information such as names, ages, and health records; PPDM therefore plays a key role in securing that sensitive information and protecting it from intruders and from leakage. In this paper, PPDM techniques, namely an anonymization technique and a cryptographic technique, are analysed in depth with respect to their execution time, together with a comparison of which method is more efficient for protecting data from unauthorized access in the healthcare domain.
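As a rough illustration of the two technique families compared in the chapter, the sketch below generalizes a quasi-identifier (a simple anonymization step) and replaces a direct identifier with a salted cryptographic hash. The field names, age bands, and salt handling are assumptions for illustration, not the chapter's exact procedures.

import hashlib

def pseudonymize(identifier: str, salt: str) -> str:
    """Replace a direct identifier with a salted SHA-256 pseudonym."""
    return hashlib.sha256((salt + identifier).encode()).hexdigest()[:16]

def generalize_age(age: int) -> str:
    """Generalize an exact age into a 10-year band (a basic k-anonymity step)."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

record = {"name": "A. Patient", "age": 37, "diagnosis": "hypertension"}
anonymized = {
    "patient_id": pseudonymize(record["name"], salt="demo-salt"),
    "age_band": generalize_age(record["age"]),
    "diagnosis": record["diagnosis"],
}
print(anonymized)

The execution-time comparison the chapter reports would time such generalization steps against cryptographic operations over a full data set, which this toy example does not attempt.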
APA, Harvard, Vancouver, ISO, and other styles
8

Friedman, Nurit L., and Nava Pliskin. "Demonstrating Value-Added Utilization of Existing Databases for Organizational Decision-Support." In Advances in Information Resources Management, 275–93. IGI Global, 2004. http://dx.doi.org/10.4018/978-1-59140-253-4.ch013.

Full text
Abstract:
Managers of healthcare organizations are increasingly aware that the potential of medical information systems exceeds mere support of routine administrative and clinical transaction processing. This chapter describes a case study about Maccabi Health Services, the second largest health maintenance organization in Israel and the first one to computerize clinical records resulting from routine transactions in doctors’ offices, laboratories, and pharmacies. In this case about decision-making support practices, recycling the content of existing databases made it possible to discover patterns of sub-optimal treatment without having to invest time and money in additional data-collection procedures. The case study thus demonstrates value-added utilization of patient data, beyond uses intended at the beginning, for effectively supporting the implementation and evaluation of disease-management programs. Lessons learned about organizational benefits reaped from the organization’s decision-support practices include implications for such initiatives as data warehousing, data mining, and online analytical processing.
APA, Harvard, Vancouver, ISO, and other styles
9

Dale, Peter, and John McLaughlin. "Land Information Management." In Land Administration. Oxford University Press, 2000. http://dx.doi.org/10.1093/oso/9780198233909.003.0012.

Full text
Abstract:
A key component of land administration is the management of land and property related data. Such data may be held in manual or digital form although, increasingly, all land related records are being computerized for ease of storage and retrieval. Data are raw collections of facts that, from a land administration perspective, may be gathered and written down as numbers and text, for instance in a surveyor’s field book, or collected and stored digitally through the use of ‘data loggers’ and computers. They may also be held graphically as on maps or aerial photographs. Data become information when processed into a form meaningful to a decision-maker. The usefulness of this information will depend upon the quality of the data and especially on the extent to which they are up to date, accurate, complete, comprehensive, understandable, and accessible. Even then, good data do not necessarily produce good management decisions since other factors may be involved, such as the qualities of the data user; the converse is however true, namely that poor quality data will almost certainly result in bad decision-making. Land and property related data are increasingly managed within formal land information systems (LIS). As with all information systems, LIS use a combination of human and technical resources, together with a set of organizing procedures, to produce information in support of management activities (Dale and McLaughlin 1988). Increasingly, the technologies that drive the data processing are components of geographic information systems (GIS). There has been much debate about the nature of GIS, some seeing them as sets of hardware, software, and data while others have seen them as all-embracing institutional arrangements of which the technology is only part. In the following discussion, GIS will be treated as the former and restricted to the acquisition and assembly of spatial data; their processing, storage, and maintenance; and their retrieval, analysis, and dissemination. By analogy with the motor car, GIS are the engines that power the car and data are the fuel; operating a transportation system is, however, a more complex process.
APA, Harvard, Vancouver, ISO, and other styles
10

Chae, Sena, Jiyoun Song, Marietta Ojo, and Maxim Topaz. "Identifying Heart Failure Symptoms and Poor Self-Management in Home Healthcare: A Natural Language Processing Study." In Studies in Health Technology and Informatics. IOS Press, 2021. http://dx.doi.org/10.3233/shti210653.

Full text
Abstract:
The goal of this natural language processing (NLP) study was to identify patients in home healthcare with heart failure symptoms and poor self-management (SM). The preliminary lists of symptoms and poor SM status were identified, NLP algorithms were used to refine the lists, and NLP performance was evaluated using 2.3 million home healthcare clinical notes. The overall precision to identify patients with heart failure symptoms and poor SM status was 0.86. The feasibility of methods was demonstrated to identify patients with heart failure symptoms and poor SM documented in home healthcare notes. This study facilitates utilizing key symptom information and patients’ SM status from unstructured data in electronic health records. The results of this study can be applied to better individualize symptom management to support heart failure patients’ quality-of-life.
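A minimal sketch of the general idea, rule-based matching of a symptom lexicon against note text followed by a precision calculation, appears below. The terms, notes, and gold labels are invented, and the study's actual NLP pipeline over 2.3 million notes was considerably more sophisticated.

import re

# Hypothetical symptom lexicon; the study's actual lists were clinically curated.
SYMPTOM_TERMS = [r"shortness of breath", r"dyspnea", r"orthopnea",
                 r"leg swelling", r"edema", r"fatigue"]
PATTERN = re.compile("|".join(SYMPTOM_TERMS), re.IGNORECASE)

def has_symptom(note: str) -> bool:
    return PATTERN.search(note) is not None

# Toy evaluation against a manual chart review (1 = symptom truly present).
notes = ["Patient reports shortness of breath at night.",
         "No complaints today; wound healing well.",
         "Mild leg swelling noted, advised to elevate legs."]
gold = [1, 0, 1]
predictions = [int(has_symptom(n)) for n in notes]

true_pos = sum(p and g for p, g in zip(predictions, gold))
precision = true_pos / max(sum(predictions), 1)
print("precision:", precision)

The reported overall precision of 0.86 corresponds to the same ratio (true positives over all positive predictions) computed against chart-review annotations at scale.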
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Records Australia Management Data processing"

1

Tiku, Sanjay, Aaron Dinovitzer, and Scott Ironside. "Effect of Pressure Loading on Integrity Management." In 2008 7th International Pipeline Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/ipc2008-64618.

Full text
Abstract:
Integrity assessment or life predictions for in-service pipelines are sensitive to the assumptions they rely upon. One significant source of uncertainty is the pipeline operating pressure data often captured and archived using a Supervisory Control and Data Acquisition (SCADA) system. SCADA systems may be programmed to collect and archive data differently from one pipeline to another and the resulting pressure records can be significantly different on the basis of the sampling techniques, data processing and the distance from pump and compressor stations. This paper illustrates some of the issues involved in pressure load characterization and is based upon work sponsored by the Pipeline Research Council International (PRCI). A series of sensitivity studies using fatigue crack growth calculations have been carried out to evaluate several factors that can influence crack stability and growth predictions that are often employed in pipeline integrity planning and repair programs. The results presented will highlight the issues related to performing integrity management based upon pump/compressor discharge or suction SCADA data to characterize the potential severity of pressure fluctuation or peak pressure dependent defects, illustrate the differences in fatigue crack growth rates along a pipeline segment and demonstrate the complexity of pressure cycle severity characterization, based upon distance from discharge, elevation, hydraulic gradient, for different sites along the pipeline route.
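To make the sensitivity of such calculations to pressure-cycle characterization concrete, the sketch below converts pressure cycles to hoop-stress ranges and integrates a Paris-law crack-growth increment. The geometry, material constants, and cycle list are illustrative assumptions; a real assessment would rainflow-count the SCADA pressure record and use a full fracture model, as the paper describes.

import math

# Illustrative pipe geometry and Paris-law constants (not from the paper).
D = 0.610      # outside diameter, m
t = 0.0095     # wall thickness, m
C = 3.0e-12    # Paris coefficient, da/dN in m/cycle with dK in MPa*sqrt(m)
m = 3.0        # Paris exponent
Y = 1.12       # geometry factor for a surface crack (assumed)

def hoop_stress(p_mpa: float) -> float:
    """Barlow hoop stress (MPa) for internal pressure p (MPa)."""
    return p_mpa * D / (2 * t)

def grow_crack(a0: float, pressure_cycles):
    """Integrate Paris-law growth over (p_min, p_max) pressure cycles."""
    a = a0
    for p_min, p_max in pressure_cycles:
        delta_sigma = hoop_stress(p_max) - hoop_stress(p_min)
        delta_k = Y * delta_sigma * math.sqrt(math.pi * a)
        a += C * delta_k ** m
    return a

cycles = [(4.0, 7.0)] * 10_000   # hypothetical large pressure cycles, MPa
print("final depth (mm):", grow_crack(0.002, cycles) * 1000)

Because growth scales with the stress range raised to the Paris exponent, the same pipeline segment characterized from discharge-side versus suction-side SCADA data can yield very different predicted lives, which is the point the sensitivity studies illustrate.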
APA, Harvard, Vancouver, ISO, and other styles
2

Switzner, Nathan, Peter Veloo, Michael Rosenfeld, Troy Rovella, and Jonathan Gibbs. "An Approach to Establishing Manufacturing Process and Vintage of Line Pipe Using In-Situ Nondestructive Examination and Historical Manufacturing Data." In 2020 13th International Pipeline Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/ipc2020-9589.

Full text
Abstract:
Abstract The October 2019 revisions to US federal rules governing natural gas pipelines require Operators to establish vintage and manufacturing process for line-pipe assets with incomplete records. Vintage and manufacturing process are considerations when establishing populations of pipe for maximum allowable operating pressure (MAOP) reconfirmation, materials verification, and integrity management programs. Additionally, the rule changes establish an allowance to utilize in-situ nondestructive examination (NDE) technologies to verify line-pipe material properties including strength, composition, microstructure, and hardness. Economic and market demands have driven changes in steelmaking technologies and pipe-forming approaches. Knowledge of the relationships between processing, microstructure and mechanical properties have been fundamental to the evolution of steel line pipe manufacturing. Product specifications and standards for the manufacture and testing of pipe and tube have crystallized this evolution as performance expectations increased. The resulting manufacturing process changes have left a variety of “fingerprints” observable from in-situ materials verification NDE data, when analyzed holistically. The purpose of this work is to enable operators to begin leveraging these fingerprints to illuminate the vintage and manufacturing process of their line pipe assets using the NDE data. A method is proposed to re-establish line-pipe asset manufacturing and vintage records using historical line pipe manufacturing practices and NDE materials verification data.
APA, Harvard, Vancouver, ISO, and other styles
3

Turini, Turini, Willy Eka Septian, and Widya Jati Lestari. "ACCOUNTING INFORMATION SYSTEM FOR SPARE PARTS CASH SALES USING THE CASH BASIS METHOD AT THE VIE JAYA MOTOR WORKSHOP CIREBON." In Global Conference on Business and Management Proceedings. Goodwood Conferences, 2022. http://dx.doi.org/10.35912/gcbm.v1i1.13.

Full text
Abstract:
Bengkel Vie Jaya Motor is a business unit engaged in the sale of motorcycle spare parts and the provision of motorcycle servicing. The workshop still handles sales manually: every transaction is recorded in a daily sales report book without detailed information, so preparing reports takes a long time. Given these problems, the purpose of this research is to create an accounting information system for cash sales of motorcycle spare parts that can give the workshop an overview of its data processing. Data were collected through interviews and library research, drawing on books, the internet, and other sources related to the problem, with the aim of making it easier to solve. The analysis and system design stages are described using flowcharts, flowmaps, context diagrams, DFDs, and ERDs, followed by the design of the software itself, which uses PHP (Hypertext Preprocessor) as the programming language and MySQL for database management. The resulting cash sales accounting information system records and calculates cash sales transactions for motorcycle spare parts using the cash basis method, chosen because revenue is recognized and recorded only when a transaction has actually occurred and the money has been received or paid out. The system can also streamline business activities by speeding up the preparation of sales journals and reports.
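A minimal sketch of the cash-basis recognition rule described above, recording revenue only when cash is actually received, is shown below in Python for illustration; the class, account names, and amounts are hypothetical, and the actual system was implemented in PHP with a MySQL database.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class CashBasisJournal:
    entries: list = field(default_factory=list)

    def record_cash_sale(self, when: date, item: str, amount: float):
        """Under the cash basis, revenue is journalized only when cash is received."""
        self.entries.append({"date": when, "debit": ("Cash", amount),
                             "credit": ("Spare Parts Sales", amount),
                             "memo": item})

    def sales_report(self, day: date) -> float:
        return sum(e["debit"][1] for e in self.entries if e["date"] == day)

journal = CashBasisJournal()
journal.record_cash_sale(date(2022, 3, 1), "spark plug", 35000.0)
journal.record_cash_sale(date(2022, 3, 1), "brake pad", 90000.0)
print("Cash sales on 2022-03-01:", journal.sales_report(date(2022, 3, 1)))

The same rule in the production system simply means that sales journals and daily reports are generated directly from recorded cash receipts rather than from invoices or receivables.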
APA, Harvard, Vancouver, ISO, and other styles
4

Tsai, Hanchung, Yung Y. Liu, and James Shuler. "RFID Technology for Environmental Remediation and Radioactive Waste Management." In ASME 2010 13th International Conference on Environmental Remediation and Radioactive Waste Management. ASMEDC, 2010. http://dx.doi.org/10.1115/icem2010-40218.

Full text
Abstract:
An advanced Radio Frequency Identification (RFID) system capable of tracking and monitoring a wide range of materials and components—from fissionable stocks to radioactive wastes—has been developed. The system offers a number of advantages, including enhanced safety, security and safeguards, reduced personnel exposure to radiation, and improved inventory control and cost-effectiveness. Using sensors, RFID tags can monitor the state of health of the tracked items and trigger alarms instantly when the normal ranges are violated. Nonvolatile memories in the tags can store sensor data, event records, as well as a contents manifest. Gamma irradiation tests showed that the tag components possess significant radiation resistance. Long-life batteries and smart management circuitries permit the tags to operate for up to 10 years without battery replacement. The tags have a near universal form factor, i.e., they can fit different package types. The read range is up to >100 m with no line-of-sight required. With careful implementation, even a large-size processing or storage facility with a complex configuration can be monitored with a handful of readers in a network. In transportation, by incorporating Global Positioning System (GPS), satellite/cellular communication technology, and secure Internet, situation awareness is assured continuously. The RFID system, when integrated with Geographic Information System (GIS) technology, can promptly provide content- and event-specific information to first responders and emergency management teams in case of incidents. In stand-alone applications, the monitoring and tracking data are contained within the local computer. With a secure Internet, information can be shared within the complex or even globally in real time. As with the deployment of any new technology, overcoming the cultural resistance is part of the developmental process. With a strong institutional support and multiple successful live demonstrations, the cultural resistance has been mostly overcome. As a result, implementation of the RFID technology is taking place at several of U.S. Department of Energy sites and laboratories for processing, storage, and transportation applications.
APA, Harvard, Vancouver, ISO, and other styles
5

Asadi, Sadegh, and Abbas Khaksar. "Analytical and Numerical Sand Production Prediction Calibrated with Field Data, Example from High-Rate Gas Wells." In SPE Asia Pacific Oil & Gas Conference and Exhibition. SPE, 2022. http://dx.doi.org/10.2118/210776-ms.

Full text
Abstract:
Abstract Sand production prediction is essential from the early stages of field development planning for well completion design and later for production management. Unconsolidated and weakly consolidated sandstones are prone to fail at low flowing bottom hole pressures during hydrocarbon production. To predict the sand-free drawdown, a robust sand prediction model that integrates near-wellbore and in-situ stresses, rock mechanical properties, well trajectory, reservoir pressure, and production and depletion trends is required. Sanding prediction models should be calibrated with field data such as production and well test observations. In the absence of field data, numerical techniques can provide a reliable estimate of the potential onset and severity of sanding at various reservoir pressures. In this study, analytical and finite-element numerical models are independently used to predict the onset of sanding and the volume of produced sand from high-rate gas wells with weakly consolidated sandstone reservoirs onshore Western Australia. The analytical method uses a poro-elastic model and core-calibrated, log-derived rock strength profiles with an empirical effective rock strength factor (ESF). In the study, the ESF was calibrated against documented field sanding observations from a well test extended flow period at the initial reservoir pressure under a low drawdown pressure. The numerical method uses a poro-elasto-plastic model defined from triaxial core tests. The rock failure criterion in the numerical method is based on a critical strain limit (CSL) corresponding to the failure of the inner wall of thick-walled cylinder core tests that can also satisfy the existing wells' sanding observations. To verify the onset and severity of sanding predicted by the analytical model, numerical simulations for an identical sandstone interval are developed to investigate the corresponding CSL. This combined analytical and numerical modelling calibrated with field data provided high confidence in the sanding evaluation and its application for future well completion and sand management decisions. The analytical model was finally used for sanding assessment over field-life pressure conditions because of its processing simplicity, speed, and flexibility in assessing various pressure and rock strength scenarios with sensitivity analysis over the whole production interval, compared with the numerical method, which is more suitable for single-depth, single-pressure conditions and well and perforation trajectory modelling.
APA, Harvard, Vancouver, ISO, and other styles
6

Ren, Weiju, and Lianshan Lin. "Current Status of the ASME Materials Properties Database Development." In ASME 2017 Pressure Vessels and Piping Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/pvp2017-65754.

Full text
Abstract:
The ASME Materials Properties Database has been under development in the past few years to support the ASME Codes and Standards under the supervision of the Boiler and Pressure Vessel Code Committee on Materials. With the guidance of its Working Group on Materials Database, the project has completed the Phase I development for the Data File Warehouse that offers a depository for various files containing ASME Code Week records, materials test data from codification inquiries, and information associated with code rules development. While the database is in operation, the development has continued into Phase II to create a relational Digital Database that offers customized and relational schemas with advanced software functionalities and tools facilitating digital data processing and management. This paper discusses the current status of the project including its development management, database components and features, business operation, and future growth. Some issues and prospective resolutions for meeting the needs and requirements from Codes and Standards are also discussed.
APA, Harvard, Vancouver, ISO, and other styles
7

Yu, John P., Chengwei Lei, Duncan Wong, Jason Choi, and Jason Cotton. "Blockchain Applications in Oilfield Underground Injection Operations." In International Petroleum Technology Conference. IPTC, 2021. http://dx.doi.org/10.2523/iptc-21786-ms.

Full text
Abstract:
Abstract This research project has successfully built a Distributed Ledger Technology (DLT) based prototype using R3 Corda open source, intended for use in oil & gas underground injection control (UIC) operations to protect underground aquifers. The DLT prototype is a permissioned network that allows oil & gas companies to create, disseminate, and trace immutable records. The network enables oil and gas companies, the government regulatory agency, and all other participants to share secure records such as well information while maintaining data integrity, traceability, and security. The purpose is to create a network of trust among all the stakeholders in the UIC processes for underground aquifer protection. In this DLT network, a company submits well information, which is digitally signed and notarized. Unauthorized changes to the information, its ownership, or its history become infeasible, thanks to the underlying cryptographic technologies of DLT. The network is designed so that stored and communicated information has a high level of trustworthiness, and every participant gets simultaneous access to a common view of the data. The Corda platform also provides multiple functionalities, e.g., Smart Contracts, Vault, Identity Management, Scheduler, and Notary Services, many of which automate data processing within the DLT databases. The project's results are expected to enhance public safety and improve the aquifer protection review and operational processes. Kern County is uniquely poised for a project to develop more streamlined, effective, and entirely digitized DLT-based workflows that will secure regional environmental data integrity. Water contamination is a primary concern in a region where water and petroleum play vital roles in the economy, and both industry and regulatory agencies pay close attention to environmental quality. Data integrity is a primary concern for those who monitor and analyze environmental data, and monitoring and forecasting based on available immutable data are imperative to mitigate complications. We have converted the manual workflow into DLT applications that take advantage of the platform's built-in functionalities. The new review process can avoid repetitive reviews among all participants and shorten the approval time, and the embedded smart contracts on the DLT network help automate the workflows, eliminating human errors and improving turnaround time. The prototype proves the concept of using DLT and demonstrates that DLT can be successfully implemented in energy technology. The prototype can be further expanded to all the UIC processes, such as thermal, wastewater disposal, waterflood, and gas injection & disposal, representing substantial cost and time savings for oil and gas companies. The results of this analysis could provide the government with valuable information for significant policy and regulation decisions to further benefit the community and society.
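The tamper-evidence property the prototype relies on can be illustrated with a simple hash-chained ledger, sketched below. This is a conceptual Python example with hypothetical record fields, not the R3 Corda API or the project's actual flows and contracts.

import hashlib
import json

def record_hash(record: dict, prev_hash: str) -> str:
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(ledger: list, record: dict) -> None:
    """Append a record that commits to the hash of the previous entry."""
    prev = ledger[-1]["hash"] if ledger else "GENESIS"
    ledger.append({"record": record, "prev": prev,
                   "hash": record_hash(record, prev)})

def verify(ledger: list) -> bool:
    """Recompute the chain; any edit to an earlier record breaks verification."""
    prev = "GENESIS"
    for entry in ledger:
        if entry["prev"] != prev or entry["hash"] != record_hash(entry["record"], prev):
            return False
        prev = entry["hash"]
    return True

ledger = []
append(ledger, {"api_well": "123-45678", "operation": "UIC permit filed"})
append(ledger, {"api_well": "123-45678", "operation": "injection volume report"})
print("ledger intact:", verify(ledger))
ledger[0]["record"]["operation"] = "tampered"
print("after tampering:", verify(ledger))

In the Corda-based prototype, notarization and digital signatures provide the equivalent guarantee across organizations, so regulators and operators can trust a shared, immutable view of well records.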
APA, Harvard, Vancouver, ISO, and other styles
8

Spencer, Kevin, Wilson Santamaria, Jane Dawson, and Hong Lu. "Advanced Engineering Critical Assessments of Seam Weld Features in Pressure Cycled Hazardous Liquid Pipelines." In 2008 7th International Pipeline Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/ipc2008-64312.

Full text
Abstract:
The performance of older ERW pipelines has raised concerns regarding their ability to reliably transport product to market. Low-toughness or “dirty” steels combined with time-dependent threats such as surface-breaking defects, selective corrosion, and hook cracks are of particular concern in hazardous liquid pipelines, which are inevitably subject to cyclic loading, increasing both the probability and the rate of crack growth. The existing methods of evaluating seam weld flaws, hydrostatically testing the pipeline or In-Line Inspection (ILI) with an appropriate technology, are well established. Hydrostatic testing, whilst providing a quantified level of safety, is often impracticable due to associated costs, logistics, and the possibility of multiple failures during the test. ILI technologies have become more sophisticated and as a result can accurately detect and size both critical and sub-critical flaws within the pipeline. However, the vast amounts of data generated can often be daunting for a pipeline operator, especially when tool tolerances and future growth must be accounted for. For either method, extensive knowledge of the benefits and disadvantages is required to assess which is the more appropriate for a particular pipeline segment. This paper describes advances in the interpretation of seam weld flaws detected by ILI and how they can be applied to an Integrity Management Plan. Signal processing improvements, validated by in-field verifications, have enabled detailed profiles of surface-breaking defects at seam welds in ERW pipelines to be determined. Using these profiles along with established fracture and fatigue analysis methods allows for reductions in the unnecessary conservatism previously associated with the assessment of seam weld flaws detected by ILI. Combining these results with other available data, e.g. dig verifications and previous hydrostatic testing records, enables more realistic and better-informed integrity and maintenance planning decisions to be made. A real case study conducted in association with a pipeline operator is detailed in the paper and quantifies the benefits that can be realised by using these advanced assessment techniques to safely and economically manage assets going forward.
APA, Harvard, Vancouver, ISO, and other styles
9

Karpaya, Shaturrvetan, Sulaiman Sidek, Dani Angga Ab Ghani, Hazrina Ab Rahman, Aivin Yong, Venukumaran Subbarayan, and Simon Chew. "Unlocking Surface Constraints for High Temperature Gas Field: Production Network Compositional Variation Analysis for Wet Gas Meter PVT Calibration Approach." In International Petroleum Technology Conference. IPTC, 2021. http://dx.doi.org/10.2523/iptc-21460-ms.

Full text
Abstract:
Abstract The installation of a Wet Gas Metering System (WGMS) on a platform for real-time measurement of liquid and gas production rates, and for performance monitoring as part of reservoir and production optimization management, is quite common nowadays in Malaysia. Understanding well production deliverability as measured by these Wet Gas Meters (WGM), which provide the reported production rates contributed by the wells, is paramount, since the produced fluids are eventually processed by various surface equipment at the central processing platform before being transported to onshore facilities. Traditional WGM are known to operate within ±10% accuracy, and confidence in the measurement of the produced fluids can be improved either by updating the meter with an accurate PVT flash table or by combining this with results from a tracer dilution technique for data verification. Sarawak Gas comprises a number of gas fields offshore East Malaysia, predominantly of carbonate formation type; one (1) of the fields, operated by PETRONAS Carigali Sdn. Bhd. (PCSB), is a high-temperature accumulation in which the temperature at the Gas Water Contact (GWC) is approximately 185°C and the full wellstream Flowing Tubing Head Temperature (FTHT) is recorded at 157°C. Cumulative field production readings from the WGM of five (5) wells showed a 9.1% difference compared with the export gas meter readings. As part of a strategy to provide maximum operational flexibility, the accuracy of the WGM must be improved, given that the wells have higher Technical Potential (TP) but are limited by the threshold of the multi-stage surface processing capacity. This also affects the commerciality of the field in recovering the capital investment and generating additional revenue, especially when there is a surge in network gas demand, because the field is unable to swiftly ramp up production to meet higher gas demand while the reported cumulative WGM production figures exceed the surface equipment Safe Operating Envelope (SOE). Our approach begins with a mass balance check between the WGMS and the export meter, including fuel, flare, and Produced Water Discharge (PWD), to verify mass conservation by phase: regardless of the phase changes that occur at topside, the total mass of gas, condensate, and water should be conserved, provided the metering equipment measures precisely. Tracer dilution measurements of gas, condensate, and water flowrates were used to verify the latest calibrated Water Gas Ratio (WGR) and Condensate Gas Ratio (CGR) readings input into the WGM. PVT separator samples were also taken via a mini-separator for compositional analysis of both gas and condensate and mathematically recombined at the multi-rate CGR readings to generate a representative PVT compositional table. Simultaneously, a process model simulation run was conducted using the full wellstream PVT input to validate total field production at the export point. This paper presents a practical approach to balancing the account, ensuring the topside SOE is respected, and improving the PVT composition at the WGM for a high-temperature field, with emphasis on understanding the compositional variations across the production network that cause significant differences in total field production between the WGM and the allocation meter.
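A minimal sketch of the phase-wise mass balance check that starts the workflow is shown below; the stream names and rates are invented for illustration, and the real reconciliation uses measured WGM, export, fuel, flare, and PWD meter data.

# Hypothetical daily mass rates in tonnes/day, by phase.
wgm_wells = [
    {"gas": 510.0, "condensate": 42.0, "water": 6.0},
    {"gas": 480.0, "condensate": 38.0, "water": 5.5},
    {"gas": 455.0, "condensate": 35.0, "water": 5.0},
]
outlets = {"export_gas": 1380.0, "export_condensate": 113.0,
           "fuel_gas": 28.0, "flare_gas": 12.0, "produced_water_discharge": 16.0}

total_in = sum(sum(w.values()) for w in wgm_wells)
total_out = sum(outlets.values())
imbalance_pct = 100.0 * (total_in - total_out) / total_out

print(f"wells (WGM) total: {total_in:.1f} t/d")
print(f"outlets total:     {total_out:.1f} t/d")
print(f"imbalance:         {imbalance_pct:+.1f}%  "
      "(large values point to WGM PVT/CGR/WGR calibration issues)")

A persistent imbalance of the kind reported in the abstract (9.1% between cumulative WGM readings and the export meter) is the trigger for re-verifying CGR/WGR inputs with tracer dilution tests and regenerating the PVT flash table.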
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Records Australia Management Data processing"

1

Rankin, Nicole, Deborah McGregor, Candice Donnelly, Bethany Van Dort, Richard De Abreu Lourenco, Anne Cust, and Emily Stone. Lung cancer screening using low-dose computed tomography for high risk populations: Investigating effectiveness and screening program implementation considerations: An Evidence Check rapid review brokered by the Sax Institute (www.saxinstitute.org.au) for the Cancer Institute NSW. The Sax Institute, October 2019. http://dx.doi.org/10.57022/clzt5093.

Full text
Abstract:
Background Lung cancer is the number one cause of cancer death worldwide.(1) It is the fifth most commonly diagnosed cancer in Australia (12,741 cases diagnosed in 2018) and the leading cause of cancer death.(2) The number of years of potential life lost to lung cancer in Australia is estimated to be 58,450, similar to that of colorectal and breast cancer combined.(3) While tobacco control strategies are most effective for disease prevention in the general population, early detection via low dose computed tomography (LDCT) screening in high-risk populations is a viable option for detecting asymptomatic disease in current (13%) and former (24%) Australian smokers.(4) The purpose of this Evidence Check review is to identify and analyse existing and emerging evidence for LDCT lung cancer screening in high-risk individuals to guide future program and policy planning. Evidence Check questions This review aimed to address the following questions: 1. What is the evidence for the effectiveness of lung cancer screening for higher-risk individuals? 2. What is the evidence of potential harms from lung cancer screening for higher-risk individuals? 3. What are the main components of recent major lung cancer screening programs or trials? 4. What is the cost-effectiveness of lung cancer screening programs (include studies of cost–utility)? Summary of methods The authors searched the peer-reviewed literature across three databases (MEDLINE, PsycINFO and Embase) for existing systematic reviews and original studies published between 1 January 2009 and 8 August 2019. Fifteen systematic reviews (of which 8 were contemporary) and 64 original publications met the inclusion criteria set across the four questions. Key findings Question 1: What is the evidence for the effectiveness of lung cancer screening for higher-risk individuals? There is sufficient evidence from systematic reviews and meta-analyses of combined (pooled) data from screening trials (of high-risk individuals) to indicate that LDCT examination is clinically effective in reducing lung cancer mortality. In 2011, the landmark National Lung Cancer Screening Trial (NLST, a large-scale randomised controlled trial [RCT] conducted in the US) reported a 20% (95% CI 6.8% – 26.7%; P=0.004) relative reduction in mortality among long-term heavy smokers over three rounds of annual screening. High-risk eligibility criteria was defined as people aged 55–74 years with a smoking history of ≥30 pack-years (years in which a smoker has consumed 20-plus cigarettes each day) and, for former smokers, ≥30 pack-years and have quit within the past 15 years.(5) All-cause mortality was reduced by 6.7% (95% CI, 1.2% – 13.6%; P=0.02). Initial data from the second landmark RCT, the NEderlands-Leuvens Longkanker Screenings ONderzoek (known as the NELSON trial), have found an even greater reduction of 26% (95% CI, 9% – 41%) in lung cancer mortality, with full trial results yet to be published.(6, 7) Pooled analyses, including several smaller-scale European LDCT screening trials insufficiently powered in their own right, collectively demonstrate a statistically significant reduction in lung cancer mortality (RR 0.82, 95% CI 0.73–0.91).(8) Despite the reduction in all-cause mortality found in the NLST, pooled analyses of seven trials found no statistically significant difference in all-cause mortality (RR 0.95, 95% CI 0.90–1.00).(8) However, cancer-specific mortality is currently the most relevant outcome in cancer screening trials. 
These seven trials demonstrated a significantly greater proportion of early stage cancers in LDCT groups compared with controls (RR 2.08, 95% CI 1.43–3.03). Thus, when considering results across mortality outcomes and early stage cancers diagnosed, LDCT screening is considered to be clinically effective. Question 2: What is the evidence of potential harms from lung cancer screening for higher-risk individuals? The harms of LDCT lung cancer screening include false positive tests and the consequences of unnecessary invasive follow-up procedures for conditions that are eventually diagnosed as benign. While LDCT screening leads to an increased frequency of invasive procedures, it does not result in greater mortality soon after an invasive procedure (in trial settings when compared with the control arm).(8) Overdiagnosis, exposure to radiation, psychological distress and an impact on quality of life are other known harms. Systematic review evidence indicates the benefits of LDCT screening are likely to outweigh the harms. The potential harms are likely to be reduced as refinements are made to LDCT screening protocols through: i) the application of risk predication models (e.g. the PLCOm2012), which enable a more accurate selection of the high-risk population through the use of specific criteria (beyond age and smoking history); ii) the use of nodule management algorithms (e.g. Lung-RADS, PanCan), which assist in the diagnostic evaluation of screen-detected nodules and cancers (e.g. more precise volumetric assessment of nodules); and, iii) more judicious selection of patients for invasive procedures. Recent evidence suggests a positive LDCT result may transiently increase psychological distress but does not have long-term adverse effects on psychological distress or health-related quality of life (HRQoL). With regards to smoking cessation, there is no evidence to suggest screening participation invokes a false sense of assurance in smokers, nor a reduction in motivation to quit. The NELSON and Danish trials found no difference in smoking cessation rates between LDCT screening and control groups. Higher net cessation rates, compared with general population, suggest those who participate in screening trials may already be motivated to quit. Question 3: What are the main components of recent major lung cancer screening programs or trials? There are no systematic reviews that capture the main components of recent major lung cancer screening trials and programs. We extracted evidence from original studies and clinical guidance documents and organised this into key groups to form a concise set of components for potential implementation of a national lung cancer screening program in Australia: 1. Identifying the high-risk population: recruitment, eligibility, selection and referral 2. Educating the public, people at high risk and healthcare providers; this includes creating awareness of lung cancer, the benefits and harms of LDCT screening, and shared decision-making 3. Components necessary for health services to deliver a screening program: a. Planning phase: e.g. human resources to coordinate the program, electronic data systems that integrate medical records information and link to an established national registry b. Implementation phase: e.g. human and technological resources required to conduct LDCT examinations, interpretation of reports and communication of results to participants c. Monitoring and evaluation phase: e.g. 
monitoring outcomes across patients, radiological reporting, compliance with established standards and a quality assurance program 4. Data reporting and research, e.g. audit and feedback to multidisciplinary teams, reporting outcomes to enhance international research into LDCT screening 5. Incorporation of smoking cessation interventions, e.g. specific programs designed for LDCT screening or referral to existing community or hospital-based services that deliver cessation interventions. Most original studies are single-institution evaluations that contain descriptive data about the processes required to establish and implement a high-risk population-based screening program. Across all studies there is a consistent message as to the challenges and complexities of establishing LDCT screening programs to attract people at high risk who will receive the greatest benefits from participation. With regards to smoking cessation, evidence from one systematic review indicates the optimal strategy for incorporating smoking cessation interventions into a LDCT screening program is unclear. There is widespread agreement that LDCT screening attendance presents a ‘teachable moment’ for cessation advice, especially among those people who receive a positive scan result. Smoking cessation is an area of significant research investment; for instance, eight US-based clinical trials are now underway that aim to address how best to design and deliver cessation programs within large-scale LDCT screening programs.(9) Question 4: What is the cost-effectiveness of lung cancer screening programs (include studies of cost–utility)? Assessing the value or cost-effectiveness of LDCT screening involves a complex interplay of factors including data on effectiveness and costs, and institutional context. A key input is data about the effectiveness of potential and current screening programs with respect to case detection, and the likely outcomes of treating those cases sooner (in the presence of LDCT screening) as opposed to later (in the absence of LDCT screening). Evidence about the cost-effectiveness of LDCT screening programs has been summarised in two systematic reviews. We identified a further 13 studies—five modelling studies, one discrete choice experiment and seven articles—that used a variety of methods to assess cost-effectiveness. Three modelling studies indicated LDCT screening was cost-effective in the settings of the US and Europe. Two studies—one from Australia and one from New Zealand—reported LDCT screening would not be cost-effective using NLST-like protocols. We anticipate that, following the full publication of the NELSON trial, cost-effectiveness studies will likely be updated with new data that reduce uncertainty about factors that influence modelling outcomes, including the findings of indeterminate nodules. Gaps in the evidence There is a large and accessible body of evidence as to the effectiveness (Q1) and harms (Q2) of LDCT screening for lung cancer. Nevertheless, there are significant gaps in the evidence about the program components that are required to implement an effective LDCT screening program (Q3). Questions about LDCT screening acceptability and feasibility were not explicitly included in the scope. However, as the evidence is based primarily on US programs and UK pilot studies, the relevance to the local setting requires careful consideration. The Queensland Lung Cancer Screening Study provides feasibility data about clinical aspects of LDCT screening but little about program design. 
The International Lung Screening Trial is still in the recruitment phase and findings are not yet available for inclusion in this Evidence Check. The Australian Population Based Screening Framework was developed to “inform decision-makers on the key issues to be considered when assessing potential screening programs in Australia”.(10) As the Framework is specific to population-based, rather than high-risk, screening programs, there is a lack of clarity about transferability of criteria. However, the Framework criteria do stipulate that a screening program must be acceptable to “important subgroups such as target participants who are from culturally and linguistically diverse backgrounds, Aboriginal and Torres Strait Islander people, people from disadvantaged groups and people with a disability”.(10) An extensive search of the literature highlighted that there is very little information about the acceptability of LDCT screening to these population groups in Australia. Yet they are part of the high-risk population.(10) There are also considerable gaps in the evidence about the cost-effectiveness of LDCT screening in different settings, including Australia. The evidence base in this area is rapidly evolving and is likely to include new data from the NELSON trial and incorporate data about the costs of targeted- and immuno-therapies as these treatments become more widely available in Australia.
APA, Harvard, Vancouver, ISO, and other styles