Dissertations / Theses on the topic 'Meteorology Australia Data processing'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 37 dissertations / theses for your research on the topic 'Meteorology Australia Data processing.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Kutzner, Kendy. "Processing MODIS Data for Fire Detection in Australia." Thesis, Universitätsbibliothek Chemnitz, 2002. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-200200831.

Full text
Abstract:
The aim of this work was to use remote sensing data from the MODIS instrument on the Terra satellite to detect bush fires in Australia. This included preprocessing the demodulator output, bit synchronization and reassembly of data packets. IMAPP was used to do the geolocation and data calibration. The fire detection used a combination of fixed threshold techniques with difference tests and background comparisons. The results were projected onto a rectangular latitude/longitude map to remedy the bow tie effect. The algorithms were implemented in C and Matlab. It proved to be possible to detect fires in the available data. The results were compared with fire detections done by NASA and fire detections based on other sensors, and were found to be very similar.
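The combination of a fixed threshold, a band-difference test and a comparison against background statistics described in this abstract is the general shape of contextual fire-detection algorithms. The sketch below illustrates that idea only; it is not Kutzner's implementation, and the band names, window size and threshold values are invented placeholders.

    import numpy as np

    def detect_fires(t4, t11, abs_thresh=360.0, diff_thresh=10.0, k=3.0, win=10):
        """Toy contextual fire test on brightness-temperature grids (kelvin).

        t4, t11 : 2-D arrays of ~4 um and ~11 um brightness temperatures.
        The thresholds are illustrative placeholders, not the values used
        in the thesis or in the operational MODIS algorithm.
        """
        fires = np.zeros(t4.shape, dtype=bool)
        diff = t4 - t11
        for i in range(win, t4.shape[0] - win):
            for j in range(win, t4.shape[1] - win):
                # absolute (fixed) threshold test
                if t4[i, j] > abs_thresh:
                    fires[i, j] = True
                    continue
                # background statistics from the surrounding window,
                # excluding the candidate pixel itself
                window = t4[i - win:i + win + 1, j - win:j + win + 1].copy()
                window[win, win] = np.nan
                bg_mean = np.nanmean(window)
                bg_std = np.nanstd(window)
                # difference test combined with background comparison
                if diff[i, j] > diff_thresh and t4[i, j] > bg_mean + k * bg_std:
                    fires[i, j] = True
        return fires

    # Tiny synthetic example: a uniform background with one warm anomaly
    t11 = np.full((40, 40), 290.0)
    t4 = np.full((40, 40), 295.0)
    t4[20, 20] = 330.0
    print(detect_fires(t4, t11).sum(), "fire pixel(s) flagged")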
APA, Harvard, Vancouver, ISO, and other styles
2

Wong, Ka-yan (王嘉欣). "Positioning patterns from multidimensional data and its applications in meteorology." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B39558630.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Fernando, Dweepika Achela Kumarihamy. "On the application of artificial neural networks and genetic algorithms in hydro-meteorological modelling." Thesis, Hong Kong : University of Hong Kong, 1997. http://sunzi.lib.hku.hk/hkuto/record.jsp?B18618546.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Kutzner, Kendy. "Processing MODIS Data for Fire Detection in Australia = Verarbeitung von MODIS Daten zur Feuererkennung in Australien." [S.l.: s.n.], 2001. http://www.bsz-bw.de/cgi-bin/xvms.cgi?SWB10358966.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Shi, Zhiqun. "Automatic interpretation of potential field data applied to the study of overburden thickness and deep crustal structures, South Australia." Title page, contents and abstract only, 1993. http://web4.library.adelaide.edu.au/theses/09PH/09phs5548.pdf.

Full text
Abstract:
Bibliography: leaves 189-203. Deals with two interpretation methods, a computer program system AUTOMAG and spectral analysis, used for studying overburden thickness and density structure of the crust. The methods were applied to the Gawler Craton, Eyre Peninsula.
APA, Harvard, Vancouver, ISO, and other styles
6

Brown, Roger George. "The impact of the introduction of the graphics calculator on system wide 'high stakes' end of secondary school mathematics examinations." Swinburne University of Technology, 2005. http://adt.lib.swin.edu.au./public/adt-VSWT20051117.121210.

Full text
Abstract:
There has been widespread interest in the potential impact of the graphics calculator on system-wide 'high stakes' end of secondary school mathematics examinations. This thesis has focused on one aspect, the way in which examiners have gone about writing examination questions in a graphics calculator assumed environment. Two aspects of this issue have been investigated. The first concerns the types of questions that can be asked in a graphics calculator assumed environment and their frequency of use. The second addresses the level of skills assessed and whether the introduction of the graphics calculator has been associated with an increase in difficulty, as has frequently been suggested. A descriptive case study methodology was used with three examination boards: the Danish Ministry of Education, the Victorian Curriculum and Assessment Authority and the International Baccalaureate Organization. Four distinct categories of questions were identified, which differed according to the potential for the graphics calculator to contribute to the solution of the question and the freedom the student was then given to make use of this potential. While all examination boards made use of the full range of questions, the tendency was to underuse questions which required the use of the calculator for their solution. With respect to the level of skills assessed, it was found that both prior to and after the introduction of the graphics calculator, all three examination boards used question types that primarily tested lower-level mathematical skills. With some exceptions, where graphics calculator active questions have been used, the tendency has been to continue to ask routine mechanistic questions. In this regard, there is no evidence of the introduction of the graphics calculator being associated with either lowering or raising the level of the mathematical skills assessed. For all cases studied, the graphics calculator was introduced with minimal change to the curriculum and examination policies. The role of the graphics calculator in the enacted curriculum was left implicit. The resulting examinations were consistent with the stated policies. However, the inexperience of some examiners and a general policy of containment or minimal change enabled examiners to minimise the impact of the introduction of the graphics calculators on assessment.
APA, Harvard, Vancouver, ISO, and other styles
7

Robinson, Jeffrey Brett. "Understanding and applying decision support systems in Australian farming systems research." Thesis, University of Western Sydney, College of Science, Technology and Environment, School of Environment and Agriculture, 2005. http://handle.uws.edu.au:8081/1959.7/642.

Full text
Abstract:
Decision support systems (DSS) are usually based on computerised models of biophysical and economic systems. Despite early expectations that such models would inform and improve management, adoption rates have been low, and implementation of DSS is now “critical”. The reasons for this are unclear, and the aim of this study is to learn to better design, develop and apply DSS in farming systems research (FSR). Previous studies have explored the merits of quantitative tools including DSS, and suggested changes leading to greater impact. In Australia, the changes advocated have been: simple, flexible, low-cost economic tools; emphasis on farmer learning through soft systems approaches; understanding the socio-cultural contexts of using and developing DSS; farmer and researcher co-learning from simulation modelling; and increasing user participation in DSS design and implementation. Twenty-four simple criteria were distilled from these studies, and their usefulness in guiding the development and application of DSS was assessed in six FSR case studies. The case studies were also used to better understand farmer learning through models of decision making and learning. To make DSS useful complements to farmers' existing decision-making repertoires, they should be based on: (i) a decision-oriented development process, (ii) identifying a motivated and committed audience, (iii) a thorough understanding of the decision-maker's context, (iv) using learning as the yardstick of success, and (v) understanding the contrasts, contradictions and conflicts between researcher and farmer decision cultures.
Doctor of Philosophy (PhD)
APA, Harvard, Vancouver, ISO, and other styles
8

Roman, Diego. "Modelagem computacional de dados: um sistema de tomada de decisão para gestão de recursos agrometeorológicos - SIAGRO." Universidade do Estado do Rio de Janeiro, 2007. http://www.bdtd.uerj.br/tde_busca/arquivo.php?codArquivo=764.

Full text
Abstract:
Since most applications involving the influence of climate on agriculture require a great amount of data that are usually unavailable, a computational tool is needed to help organize the necessary data. The computational system SIAGRO was developed to support this demand from users of climate information in agriculture. From data collected at 15-minute intervals, the system makes it possible to register other stations, import climatic data, calculate evapotranspiration by different methods (Thornthwaite; Camargo; Thornthwaite modified by Camargo; and Hargreaves and Samani), apply a climatic classification and determine averages for different periods of time from daily data. The system presents its results in graphics and tables, which can be copied for use in other computer applications or compared with results from other weather stations registered in the system. Supplying SIAGRO with information useful for irrigation scheduling, and for increasing the efficiency of water use by crops, allowed the evaluation of three reference methods for estimating evapotranspiration through correlation with data obtained in a constant water table lysimeter. The data were collected daily and processed on a monthly basis. The performance of the methods was evaluated using the correlation coefficient r and the Willmott agreement index d. The results showed that the best estimate was obtained with the Thornthwaite modified by Camargo model, which showed the best fit to the lysimeter data, with an index d equal to 0.91.
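The verification statistics mentioned above can be made concrete with a small sketch. The function below computes the correlation coefficient r and Willmott's index of agreement d; it is a generic illustration, and the monthly evapotranspiration values are invented, not the thesis data.

    import numpy as np

    def willmott_d(predicted, observed):
        """Willmott's index of agreement d (1 = perfect agreement)."""
        p, o = np.asarray(predicted, float), np.asarray(observed, float)
        o_mean = o.mean()
        num = np.sum((p - o) ** 2)
        den = np.sum((np.abs(p - o_mean) + np.abs(o - o_mean)) ** 2)
        return 1.0 - num / den

    # Invented monthly evapotranspiration values (mm), for illustration only
    lysimeter = np.array([112.0, 98.0, 87.0, 60.0, 45.0, 38.0])
    estimated = np.array([105.0, 101.0, 80.0, 64.0, 48.0, 35.0])

    r = np.corrcoef(estimated, lysimeter)[0, 1]
    d = willmott_d(estimated, lysimeter)
    print(f"r = {r:.2f}, d = {d:.2f}")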
APA, Harvard, Vancouver, ISO, and other styles
9

Ahern, Anthony J. "The management of information technology investments in the Australian ambulance services." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 1994. https://ro.ecu.edu.au/theses/1105.

Full text
Abstract:
Information technology plays a significant role in the administration and operation of most organisations today. This is certainly the case with each of the Australian ambulance services. With the rapid increase in the use of information technology and the expectations about its use held by both staff and the general public, ambulance service management is faced with the dilemma of trying to ensure that the organisation can take full advantage of advances in information technology, while at the same time ensuring that investments in IT are maintained at appropriate levels that will deliver the maximum return on the investment in terms of the ambulance service achieving its mission and objectives. The research considers three questions: How are IT investment decisions determined? How are levels of IT investment determined? Do IT investments contribute to the organisation's overall effectiveness? The general feeling among the ambulance service CEOs is that the investment in IT has been worthwhile in terms of contributing to a more effective organisation. These findings are contrary to a study by United Research/Business Week described by LaPlante (1988), in which less than half of the CEOs surveyed felt that their organisation did an excellent job of linking computer strategy to corporate goals.
APA, Harvard, Vancouver, ISO, and other styles
10

Forsyth, Rowena (Public Health & Community Medicine, Faculty of Medicine, UNSW). "Tricky technology, troubled tribes: a video ethnographic study of the impact of information technology on health care professionals' practices and relationships." Awarded by: University of New South Wales, School of Public Health and Community Medicine, 2006. http://handle.unsw.edu.au/1959.4/30175.

Full text
Abstract:
Whilst technology use has always been a part of the practice of health care delivery, more recently, information technology has been applied to aspects of clinical work concerned with documentation. This thesis presents an analysis of the ways that two professional groups, one clinical and one ancillary, at a single hospital cooperatively engage in a work practice that has recently been computerised. It investigates the way that a clinical group's approach to and actual use of the system creates problems for the ancillary group. It understands these problems to arise from the contrasting ways that the groups position their use of documentation technology in their local definitions of professional status. The data on which analysis of these practices is based include 16 hours of video recordings of the work practices of the two groups as they engage with the technology in their local work settings, as well as video recordings of a reflexive viewing session conducted with participants from the ancillary group. Also included in the analysis are observational field notes, interviews and documentary analysis. The analysis aimed to produce a set of themes grounded in the specifics of the data, and drew on TLSTranscription software for the management and classification of video data. This thesis seeks to contribute to three research fields: health informatics, sociology of professions and social science research methodology. In terms of health informatics, this thesis argues for the necessity for health care information technology design to understand and incorporate the work practices of all professional groups who will be involved in using the technology system or whose work will be affected by its introduction. In terms of the sociology of professions, this thesis finds doctors and scientists to belong to two distinct occupational communities that each utilise documentation technology to different extents in their displays of professional competence. Thirdly, in terms of social science research methodology, this thesis speculates about the possibility of viewing the engagement of the groups with the research process as indicative of their reactions to future sources of outside perturbance to their work.
APA, Harvard, Vancouver, ISO, and other styles
11

Bottomley, Laura Jones. "The application of IBM PC's and distrometers in a satellite propagation experiment." Thesis, Virginia Polytechnic Institute and State University, 1985. http://hdl.handle.net/10919/90919.

Full text
Abstract:
This thesis describes the use of a distrometer and two IBM PCs to collect data in a large propagation experiment. The uses and methods of collecting drop size distributions are discussed, as are the uses of IBM PCs for both data collection and control. Methods for enabling the PCs to operate in real time are also included.
M.S.
APA, Harvard, Vancouver, ISO, and other styles
12

Thompson, Dean (Dean Barrie) 1974. "Dynamic reconfiguration under real-time constraints." Monash University, School of Computer Science and Software Engineering, 2002. http://arrow.monash.edu.au/hdl/1959.1/7991.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Sadler, Rohan. "Image-based modelling of pattern dynamics in a semiarid grassland of the Pilbara, Australia." University of Western Australia. School of Plant Biology, 2007. http://theses.library.uwa.edu.au/adt-WU2007.0155.

Full text
Abstract:
[Truncated abstract] Ecologists are increasingly interested in quantifying local interacting processes and their impacts on spatial vegetation patterns. In arid and semiarid ecosystems, theoretical models (often spatially explicit) of dynamical system behaviour have been used to provide insight into changes in vegetation patterning and productivity triggered by ecological events, such as fire and episodic rainfall. The incorporation of aerial imagery of vegetation patterning into current theoretical models remains a challenge, as few theoretical models may be inferred directly from ecological data, let alone imagery. However, if conclusions drawn from theoretical models were well supported by image data then these models could serve as a basis for improved prediction of complex ecosystem behaviour. The objective of this thesis is therefore to innovate methods for inferring theoretical models of vegetation dynamics from imagery. ... These results demonstrate how an ad hoc inference procedure returns biologically meaningful parameter estimates for a germ-grain model of T. triandra vegetation patterning, with VLSA photography as data. Various aspects of the modelling and inference procedures are discussed in the concluding chapter, including possible future extensions and alternative applications for germ-grain models. I conclude that the state-and-transition model provides an effective exploration of an ecosystem's dynamics, and complements spatially explicit models designed to test specific ecological mechanisms. Significantly, both types of models may now be inferred from image data through the methodologies I have developed, and can provide an empirical basis to theoretical models of complex vegetation dynamics used in understanding and managing arid (and other) ecological systems.
APA, Harvard, Vancouver, ISO, and other styles
14

Bligh, W. O. M. "Application of machine learning and connectionist modeling to an Australian dairy database." Thesis, Queensland University of Technology, 2000. https://eprints.qut.edu.au/36851/1/36851_Bligh_2000.pdf.

Full text
Abstract:
The Australian Dairy Herd Improvement Scheme (ADIDS) provides a database containing both raw and processed data relating to milk production in Australia. This thesis provides estimations of potential milk production for dairy breeding using dairy animal data and artificial neural networks (ANNs). By predicting daughter milk production from data representative of dams in a herd and artificial insemination sires, an evaluation of those potential daughter results can lead to the selection of a breeding sire for that herd. Relevant data fields and derived attributes from the dairy database that significantly affect daughter milk production are utilised in the development of a prediction model. Further research into data fields proven to influence daughter milk production results in a set of rules extracted for human interpretation. Only Victorian data is used in this study.
APA, Harvard, Vancouver, ISO, and other styles
15

Wilmot, Peter Nicholas. "Modelling cooling tower risk for Legionnaires' Disease using Bayesian Networks and Geographic Information Systems." Title page, contents and conclusion only, 1999. http://web4.library.adelaide.edu.au/theses/09SIS.M/09sismw744.pdf.

Full text
Abstract:
Includes bibliographical references (leaves 115-120). Establishes a Bayesian Belief Network (BBN) to model the uncertainty of aerosols released from cooling towers, and uses Geographic Information Systems (GIS) to create a wind dispersal model and identify potential cooling towers as the source of infection. Demonstrates the use of GIS and BBN in environmental epidemiology and the power of spatial information in the area of health.
APA, Harvard, Vancouver, ISO, and other styles
16

Galybin, Konstantin A. "P-wave velocity model for the southwest of the Yilgarn Craton, Western Australia and its relation to the local geology and seismicity." University of Western Australia. School of Earth and Geographical Sciences, 2007. http://theses.library.uwa.edu.au/adt-WU2007.0167.

Full text
Abstract:
[Truncated abstract] A number of controlled and natural seismic sources are utilised to model the P-wave velocity structure of the southwest of the Yilgarn Craton, Western Australia. The Yilgarn Craton is one of the largest pieces of Archaean crust in the world and is known for its gold and nickel deposits in the east and intraplate seismicity in the west. The aim of the project is to link 2D and 3D models of variations in seismic velocity with the local seismicity and geology. A new set of seismic refraction data, acquired in 25 overlapping deployments between 2002 and 2005, has been processed, picked and analysed using forward modelling. The data comprise two perpendicular traverses of three-component recordings of various delay-fired blasts from local commercial quarries. The data were processed using a variety of techniques. Tests were carried out on a number of data enhancement and picking procedures in order to determine the best method for enhancement of delay-fired data. A new method for automatic phase recognition is presented, where the maximum of the derivative of the rectilinearity of a trace is taken as the first break. Complete shot gathers with first break picks for each seismic source are compiled from the overlapping deployments. ... The starting 3D model was based on the models produced by 2D forward modelling. 14 iterations were carried out and the best-fit 3D model was achieved at the 10th iteration. It is 35% better than the current model used to locate earthquakes in this region. The resultant velocity block model was used to construct a density block model. A relative gravity map of the southwest of the Yilgarn Craton was made. The results of 2D forward modelling, 3D tomography and forward gravity modelling have been compared and it was found that the HVZ is present in all models. Such a zone has been previously seen on a single seismic refraction profile, but it is the first time this zone has been mapped in 3D. The gravity high produced by the zone coincides with the gravity high observed in reality. There is strong evidence that suggests that the HVZ forms part of the Archaean terrane boundary within the Yilgarn Craton. The distribution of the local seismicity was then discussed in the framework of the new 3D velocity model. A hypothesis, that the primary control on the seismicity in the study area is rotation of the major horizontal stress orientation, is presented. It is also argued that the secondary control on seismicity in the SWSZ is accommodation of movements along major faults.
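The automatic phase-recognition rule described in the abstract (taking the maximum of the derivative of a trace's rectilinearity as the first break) can be sketched as follows. This is a generic polarisation-analysis illustration, not the code used in the thesis; the window length and the synthetic traces are arbitrary choices.

    import numpy as np

    def rectilinearity(z, n, e, win=50):
        """Sliding-window rectilinearity of a three-component trace.

        Uses the eigenvalues of the 3x3 covariance matrix in each window:
        rect = 1 - (l2 + l3) / (2 * l1), with l1 >= l2 >= l3.
        """
        nsamp = len(z)
        rect = np.zeros(nsamp)
        for k in range(win, nsamp - win):
            seg = np.vstack([z[k - win:k + win],
                             n[k - win:k + win],
                             e[k - win:k + win]])
            evals = np.sort(np.linalg.eigvalsh(np.cov(seg)))[::-1]
            rect[k] = 1.0 - (evals[1] + evals[2]) / (2.0 * evals[0])
        return rect

    def pick_first_break(z, n, e, dt=0.002, win=50):
        """First-break pick as the time where d(rect)/dt is largest."""
        rect = rectilinearity(z, n, e, win)
        return np.argmax(np.gradient(rect, dt)) * dt

    # Synthetic example: noise followed by a rectilinear arrival on Z
    rng = np.random.default_rng(0)
    t = np.arange(4000)
    z = rng.normal(0, 0.1, 4000)
    n_comp = rng.normal(0, 0.1, 4000)
    e = rng.normal(0, 0.1, 4000)
    z[2000:] += np.sin(0.05 * t[2000:])          # strong vertical arrival
    print("picked first break (s):", pick_first_break(z, n_comp, e))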
APA, Harvard, Vancouver, ISO, and other styles
17

Bergfors, Anund. "Using machine learning to identify the occurrence of changing air masses." Thesis, Uppsala universitet, Institutionen för teknikvetenskaper, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-357939.

Full text
Abstract:
In the forecast data post-processing at the Swedish Meteorological and Hydrological Institute (SMHI), a regular Kalman filter is used to debias the two-metre air temperature forecast of the physical models by controlling towards air temperature observations. The Kalman filter, however, diverges when encountering greater nonlinearities in shifting weather patterns, and can only be manually reset when a new air mass has stabilized itself within its operating region. This project aimed to automate this process by means of a machine learning approach. The methodology was, at its base, supervised learning: the air mass shift occurrences in the data were first labelled algorithmically, and a logistic regression model was then trained on them. Observational data from the latest twenty years of the Uppsala automatic meteorological station were used for the analysis. A simple pipeline for loading, labelling, training on and visualizing the data was built. As a work in progress, the operating regime was more of a semi-supervised one, which in the long run could also be a necessary and fruitful strategy. In conclusion, the logistic regression model appeared quite able to handle and infer from the dynamics of air temperatures, albeit not robustly tested, correctly classifying 77% of the labelled data. This work was presented at Uppsala University on 1 June 2018, and later on 20 June at SMHI.
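A rough sketch of the supervised set-up described above (algorithmic labelling followed by a logistic regression classifier) is given below. The labelling rule, the features and the synthetic temperature series are invented placeholders, not SMHI's or the author's method.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Invented example data: hourly 2 m air temperature observations
    rng = np.random.default_rng(1)
    temps = pd.Series(10 + 5 * np.sin(np.arange(5000) * 2 * np.pi / 24)
                      + rng.normal(0, 1, 5000))

    # Simple features over a trailing 24 h window
    feats = pd.DataFrame({
        "change_24h": temps.diff(24),
        "std_24h": temps.rolling(24).std(),
        "trend_6h": temps.diff(6),
    }).dropna()

    # Placeholder labelling rule: call it an air-mass shift when the 24 h
    # change is unusually large (the thesis labels shifts algorithmically,
    # but not with this exact rule)
    labels = (feats["change_24h"].abs() > 4).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        feats, labels, test_size=0.25, shuffle=False)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))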
APA, Harvard, Vancouver, ISO, and other styles
18

Amein, Hussein Aly Abbass. "Computational intelligence techniques for decision making : with applications to the dairy industry." Thesis, Queensland University of Technology, 2000. https://eprints.qut.edu.au/36867/1/36867_Digitised%20Thesis.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Zheng, Letian. "Spatio-temporal models of Australian rainfall and temperature data." PhD thesis, 2011. http://hdl.handle.net/1885/149934.

Full text
Abstract:
This thesis presents three essays on the analysis of historical meteorological data in Australia. The Australian Bureau of Meteorology has established a network of more than one thousand stations across Australia that have recordings dating from early last century, resulting in a large dataset of meteorological records. These data provide important information on the dynamics of the Australian climate system, and systematic investigation using these data can help us to better understand our climate and prepare for possible changes. The purpose of this thesis is to develop models and methods to analyse such meteorological data from a statistical perspective. In Chapter 2, a spatio-temporal model is developed based on monthly average temperature data at 177 locations in south-eastern Australia over 40 years. Guided by a preliminary analysis, a model with components dealing with spatially varying mean and seasonality, short-term and long-term temporal trends is built, and the space-time interaction is modelled by the kernel-convolution method. It is shown that the temperature has become warmer in most of south-eastern Australia during the period under investigation. In Chapter 3, a new duration-dependent Hidden Markov Model is proposed as an extension to the Hidden Markov Model (HMM) and Non-homogeneous Hidden Markov Model (NHMM), which assume that the transition probabilities are either constant or only depend on some independent variables. The possibility of duration-dependent effects is formally considered in this chapter, where the transition probabilities are allowed to be explicitly correlated to duration, that is, how long the hidden system has been in the current state. This approach is used to model the daily rainfall amount at 5 locations in Darwin, Northern Territory. For data arising from climate phenomena, such as the temperature and rainfall data considered here, it is common for outliers to be present. The presence of outliers could unduly influence the results of any analysis that is conducted and make conclusions non-robust. But it is often difficult to detect them simultaneously because of the masking effect. Motivated by this problem, a general method is proposed in Chapter 4 for identifying multiple influential observations in regression models. The ability of this method is tested and illustrated by both a thorough simulation and several examples.
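The duration-dependent transition idea of Chapter 3 can be illustrated with a two-state (dry/wet) occurrence chain in which the probability of leaving the current state depends on how long the chain has stayed there. The logistic form and coefficients below are illustrative only, not the model fitted in the thesis.

    import numpy as np

    def leave_prob(duration, a=-1.5, b=0.15):
        """Probability of leaving the current state after `duration` days,
        modelled here with an illustrative logistic link."""
        return 1.0 / (1.0 + np.exp(-(a + b * duration)))

    def simulate_occurrence(n_days=365, seed=0):
        """Simulate a dry(0)/wet(1) sequence with duration-dependent switching."""
        rng = np.random.default_rng(seed)
        state, duration, seq = 0, 1, []
        for _ in range(n_days):
            seq.append(state)
            if rng.random() < leave_prob(duration):
                state, duration = 1 - state, 1   # switch state, reset duration
            else:
                duration += 1                    # stay, duration grows
        return np.array(seq)

    seq = simulate_occurrence()
    print("wet-day fraction:", seq.mean())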
APA, Harvard, Vancouver, ISO, and other styles
20

Torok, Simon James. "The development of a high quality historical temperature data base for Australia." 1996. http://repository.unimelb.edu.au/10187/2407.

Full text
Abstract:
A high quality, historical surface air temperature data set is essential for the reliable investigation of climate change and variability. In this study, such a data set has been prepared for Australia by adjusting raw mean annual temperature data for inhomogeneities associated with station relocations, changes in exposure, and other problems. Temperature records from long-term stations were collated from the set of all raw data held by the Australian Bureau of Meteorology. These long-term records were extended by combining stations and manually entering previously unused archived temperature measurements. An objective procedure was developed to determine the necessary adjustments, in conjunction with complementary statistical methods and station history documentation. The objective procedure involved creating a reference time series for each long-term station from the median values at surrounding, well-correlated stations. Time series of annual mean maximum and mean minimum temperatures have been produced for 224 stations, and the adjusted dataset has been made available to the research community. The adjusted data are likely to be more representative of real climatic variations than the raw data due to the removal of discontinuities. The adjusted data set has been compared with previously used temperature data sets, and with data sets of other parameters. The adjusted data set provides adequate spatial coverage of Australia back to 1910. Additional adjusted data are available prior to this date at many stations. Trends in the annual mean maximum, minimum, the mean of the maximum and minimum, and the range between the maximum and minimum, have been calculated at each site. Maximum and minimum temperatures have increased since about 1950, with minimum temperatures increasing faster than maximum temperatures.
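The reference-series step described above (a median built from well-correlated neighbouring stations, compared against the candidate station to locate discontinuities) might be sketched as follows. The correlation cut-off, the use of first differences and the synthetic data are placeholders, not the thesis procedure in detail.

    import numpy as np
    import pandas as pd

    def reference_series(df, candidate, min_corr=0.7):
        """Median of first-differenced neighbour series that correlate well
        with the candidate station, accumulated back to a level series."""
        diffs = df.diff()
        corr = diffs.corr()[candidate].drop(candidate)
        neighbours = corr[corr >= min_corr].index
        ref_diff = diffs[neighbours].median(axis=1)
        return ref_diff.cumsum() + df[candidate].iloc[0]

    def candidate_minus_reference(df, candidate, min_corr=0.7):
        """Difference series used to spot inhomogeneities (jumps)."""
        return df[candidate] - reference_series(df, candidate, min_corr)

    # Synthetic example: 4 stations sharing a common signal, with an
    # artificial 0.8 degC jump introduced at station A in 1970
    years = pd.Index(range(1910, 2000), name="year")
    rng = np.random.default_rng(2)
    base = pd.Series(rng.normal(0, 0.5, len(years)), index=years)
    df = pd.DataFrame({s: base + rng.normal(0, 0.2, len(years))
                       for s in ["A", "B", "C", "D"]})
    df.loc[1970:, "A"] += 0.8
    print(candidate_minus_reference(df, "A").loc[1965:1975].round(2))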
APA, Harvard, Vancouver, ISO, and other styles
21

"high-resolution rapidly-updated meteorological data analysis system for aviation applications." 2008. http://library.cuhk.edu.hk/record=b5893736.

Full text
Abstract:
Thesis (M.Phil.)--Chinese University of Hong Kong, 2008.
Includes bibliographical references (leaves 76-78).
Abstracts in English and Chinese.
Table of contents: Abstract; Acknowledgement.
1. Introduction: 1.1 Overview; 1.2 Review on Windshear.
2. Review of the Weather Radar System: 2.1 Introduction; 2.2 Reflectivity Measurement; 2.3 Velocity Measurement; 2.4 The Doppler Dilemma; 2.5 TDWR and LIDAR used in Hong Kong.
3. Design of the System: 3.1 The Wind Analysis; 3.2 The Cloud Analysis; 3.3 Settings of the Domain.
4. Data Preparation: 4.1 Background Field; 4.2 Non-radar Observation Data; 4.3 The Radar Data.
5. A Study on Sea Breeze: 5.1 The Physical Origin of Sea Breeze; 5.2 Case Study on 10 March 2006.
6. A Study on Tropical Cyclone: 6.1 The Physics of Tropical Cyclone; 6.2 Case Study on 3 August 2006.
7. A Study on Microburst: 7.1 The Physical Origin of Microburst; 7.2 Case Study on 8 June 2007.
8. Discussions and Conclusions: 8.1 Discussions; 8.2 Conclusions.
Appendix A. Derivation of Radar Equation: A.1 Radar Equation for Point Target; A.2 Radar Equation for Distributed Targets.
Appendix B. Technical Details: B.1 Hardware and Timing; B.2 Programming Issues.
Bibliography.
APA, Harvard, Vancouver, ISO, and other styles
22

Dabrowska-Zielinska, Katarzyna. "Inferring evapotranspiration from remotely sensed thermal radiation data." PhD thesis, 1987. http://hdl.handle.net/1885/138717.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Fibbens, Michael. "The application of personal computers to direct comparison valuation: a residential mass appraisal investigation." Thesis, University of Western Sydney, Faculty of Management, School of Land Economy, 1993. http://handle.uws.edu.au:8081/1959.7/22664.

Full text
Abstract:
This thesis examines the use of computer technology in the direct comparison method of valuation within the context of mass appraisal for rating and taxing purposes. Computer database, spreadsheet and statistical packages were utilised in the analysis of seven hundred and fifty residential land sales from two areas of Western Sydney. The prime objective of the research was to provide valuers with a personal computer based system that could be applied to direct comparison valuation. Two main approaches were taken in the analysis of sales and prediction of values. The first comprised stepwise multiple regression analysis. The second comprised the adjustment grid (or matrix) approach (which is based on paired sales techniques assisted by spreadsheet technology). The study has been limited to the analysis and prediction of values for residential land. Valuers who carry out mass appraisals may also be required to value industrial, commercial, and retail lands. Therefore, there is a need for further research relating to the applicability of the computerised techniques discussed in this thesis.
Master of Commerce (Hons.)
APA, Harvard, Vancouver, ISO, and other styles
24

McVicar, Timothy Richard. "Monitoring regional moisture availability using AVHRR data : its application for drought assessment." PhD thesis, 2001. http://hdl.handle.net/1885/146031.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Donaldson, William S. "Integrating real-time weather data with dynamic crop development models." Thesis, 1991. http://hdl.handle.net/1957/36712.

Full text
Abstract:
Crop development models are commonly used in research. However, their use as crop management tools for growers is rare. Decision support systems (DSS), which combine crop models with expert systems, are being developed to provide management assistance to growers. Researchers at Oregon State University are in the process of developing a DSS. Research was conducted to develop a computer program to provide current and generated weather data for use by the DSS. The objectives of this research were to obtain a weather station, develop a set of quality control procedures to check data from the station, obtain a weather generator program, and create a weather data manager program to implement the above objectives. A weather station was obtained and was placed near two existing weather stations for ten months. Data from the weather station were compared with those from the other two stations for values of monthly average maximum temperature, minimum temperature, daily total solar radiation and monthly total precipitation. The weather station performed well. Only measurements of total daily solar radiation were consistently different from the other stations. Based on a comparison of the weather station with an Eppley pyranometer, a factor was calculated to correct the solar radiation readings. The quality control procedures used on the weather data were adapted from automated procedures given in the literature. When tested, the procedures performed as desired. When used on actual data from the weather station, the values that failed the procedures were apparently legitimate values. Options were added to the data manager program that allow the user to quickly decide what to do with failed values. For a weather data generator, WGEN was chosen from the generators presented in the literature. An input parameter file was created for the Corvallis, Oregon area and thirty years of data were generated. Monthly means from these data were compared with thirty-year historical monthly means for Corvallis. Precipitation data from WGEN compared well with the historical data. The generated data for maximum and minimum temperature and daily total solar radiation differed greatly from the historical data. It is believed that the input parameters for the Corvallis area suggested by the authors of WGEN are not appropriate. The weather data manager program was written in the C programming language, and occupies approximately 98 kilobytes of disk space, not including the eleven files created directly and indirectly by the program. The main functions of the program are: 1) retrieving data from the weather station and performing quality control procedures on the data (allowing the user to decide what to do with values that failed QC); 2) viewing and editing of files by the user; 3) weather data generation (creating a file of only generated data or appending generated data to the file of current data from the weather station to create a file containing a full year of weather data); and 4) miscellaneous functions (monitoring the weather station, setting the calendar in the station's datalogger, and changing information used by the data manager program). It is hoped that this program will be a significant contribution towards the development of a decision support system.
Graduation date: 1992
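Automated quality-control checks of the kind adapted in this work (range and internal-consistency tests on daily records) commonly look like the sketch below. The variable names and limits are invented placeholders rather than the thresholds used in the thesis.

    # Illustrative daily record (units: degC, degC, mm, MJ/m2)
    day = {"tmax": 24.0, "tmin": 26.0, "precip": -1.0, "solar": 18.5}

    # Placeholder climatological limits for the range checks
    LIMITS = {"tmax": (-30, 50), "tmin": (-40, 40),
              "precip": (0, 300), "solar": (0, 35)}

    def qc_flags(record, limits=LIMITS):
        """Return a list of quality-control failures for one daily record."""
        failures = []
        for var, (lo, hi) in limits.items():
            if not lo <= record[var] <= hi:
                failures.append(f"{var}={record[var]} outside [{lo}, {hi}]")
        # internal consistency: maximum must not be below minimum temperature
        if record["tmax"] < record["tmin"]:
            failures.append("tmax < tmin")
        return failures

    print(qc_flags(day))   # the user decides what to do with failed values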
APA, Harvard, Vancouver, ISO, and other styles
26

Wong, Anthony Kar Man. "Theoretical investigation of Australian designed reinforced concrete frames subjected to earthquake loading." Thesis, 1999. http://hdl.handle.net/2440/114585.

Full text
Abstract:
Studies the behaviour of reinforced concrete frame structures designed in accordance with the AS3600 concrete structures code, using a non-linear computer model. A computer model of a multi-storey, multi-bay prototype structure was created.
Thesis (M.Eng.Sc.) -- University of Adelaide, Dept. of Civil and Environmental Engineering, 1999
APA, Harvard, Vancouver, ISO, and other styles
27

Hu, Jigao. "Data visualization & TQM implementation : a study of the implementation of data visualization in total quality management in Victorian manufacturing industry." Thesis, 1995. https://vuir.vu.edu.au/18177/.

Full text
Abstract:
Introduction: Data visualisation (DV) is the process of creating and presenting a chart given a set of active data and sets of attribute and entity constraints. It rapidly and interactively investigates large multivariate and multidisciplinary data sets to detect trends, correlations, and anomalies. Data visualisation is the latest analytical tool for both technical computer users and business computer users. Total Quality Management (TQM) is continuous improvement in the performance of all processes and the products and services that are the outcomes of those processes. In quality management, DV is one of the three new tools that complement the existing seven, which are flow charts, Ishikawa or cause-and-effect diagrams, Pareto charts, histograms, run charts and graphs, scattergrams and control charts. It lets quality control engineers readily see the real reasons for quality problems by presenting the data in up to six dimensions. Methodology: A survey by mail questionnaire was conducted to collect data from one hundred Victorian manufacturing companies. Responses were received from 52 of the 100 companies. The sample size for each analysis varies from 49 to 52. The source for company information was Kompass Australia 1994/1995. The statistical analysis tool used was Statistica. Major findings: TQM program implementation tends to be more complete in companies with more employees. Word-processing software is adopted by all companies in TQM practice, mostly for producing a quality instructional manual. Spreadsheet and database packages are the second and third most commonly used software. Companies that have completed their formal TQM program implementation generally, though not always, use computer software in more aspects of their TQM practice than companies at lower TQM stages. Two-dimensional DV techniques are more commonly used than three-dimensional ones, with 2-D colour and 2-D shade the most widely used. The 3-D animation tool remains to be explored. DV features are generally important for all users. The ability to handle complex data is more important for companies at a higher stage of TQM program implementation than for companies at lower stages.
APA, Harvard, Vancouver, ISO, and other styles
28

Reverter-Rambaldi, Marcel. "Topic Modelling in Spontaneous Speech Data." Thesis, 2022. http://hdl.handle.net/1885/281664.

Full text
Abstract:
The development of large-scale language corpora has highlighted the increasing need for automated methods to assist humans in the inefficient task of sorting and labelling language transcripts by semantic contents (i.e. topics). One approach to semantic labelling involves using a class of unsupervised machine-learning algorithms known as “topic modelling”. These algorithms process a document (e.g. a transcript) and identify clusters representing words that occur in proximity to each other in the document. To date, topic modelling has been applied widely to written language, including newspapers, academic articles, and business reports, but much less to spontaneous speech data. The linguistics literature has identified the need to apply more qualitative and analytic approaches when judging and improving topic modelling for future use. My research applies topic-modelling algorithms to transcripts from sociolinguistic interviews compiled for the Sydney Speaks Project. I apply certain modifications to improve topic modelling's performance, including the use of a custom stoplist, a human benchmark for measuring efficacy, and linguistically based text partitioning. The findings support the idea that text partitioning and a custom stoplist produce results that align better with the human benchmark.
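As a generic illustration of the kind of unsupervised topic modelling described above, and of the use of a custom stoplist, scikit-learn's LDA implementation can be run as follows. The toy "transcripts" and stoplist are placeholders, not the Sydney Speaks data or the thesis pipeline.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Toy stand-ins for interview transcripts
    docs = [
        "we played cricket at the beach every summer with my cousins",
        "mum cooked dinner and we talked about school and homework",
        "the cricket team trained near the beach before the summer season",
        "after school I did homework then helped cook dinner with mum",
    ]

    # A small custom stoplist (placeholder words, not the thesis stoplist)
    stoplist = ["the", "and", "we", "at", "my", "about", "with", "then",
                "after", "before", "every"]

    vec = CountVectorizer(stop_words=stoplist)
    X = vec.fit_transform(docs)

    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

    # Show the top words in each topic cluster
    terms = vec.get_feature_names_out()
    for k, weights in enumerate(lda.components_):
        top = [terms[i] for i in weights.argsort()[::-1][:4]]
        print(f"topic {k}: {top}")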
APA, Harvard, Vancouver, ISO, and other styles
29

Fleming, Nicholas S. "Sustainability and water resources management for the northern Adelaide Plains, South Australia / Nicholas S. Fleming." 1999. http://hdl.handle.net/2440/19525.

Full text
Abstract:
Includes bibliographical references (64 p.)
2 v. : ill., maps (chiefly col.) ; 30 cm.
Title page, contents and abstract only. The complete thesis in print form is available from the University Library.
The concept of sustainable development is explored with a focus upon water resources and urban development. Simulation of urban growth patterns and water resources management has been undertaken as part of the case study. An artificial neural network technique has been employed to model regional water consumption.
Thesis (Ph.D.)--University of Adelaide, Dept. of Civil and Environmental Engineering, 1999?
APA, Harvard, Vancouver, ISO, and other styles
30

Zhang, Jingyuan. "Web geospatial visualisation for clustering analysis of epidemiological data." Thesis, 2014. https://vuir.vu.edu.au/25917/.

Full text
Abstract:
Public health is a major factor in reducing disease around the world. Today, most governments recognise the importance of public health surveillance in monitoring and clarifying the epidemiology of health problems. As part of public health surveillance, public health professionals utilise the results of epidemiological analysis to reform health care policy and health service plans. There are many health reports on epidemiological analysis within government departments, but the public are not authorised to access these reports because of commercial software restrictions. Although governments publish many reports of epidemiological analysis, the reports are coded in epidemiology terminology and are almost impossible for the public to fully understand. In order to improve public awareness, there is an urgent need for government to produce more easily understandable epidemiological analyses and to provide an open-access reporting system with minimum cost. Inevitably, this poses challenges for IT professionals to develop a simple, easily understandable and freely accessible system for public use. It requires not only identifying a data analysis algorithm which can make epidemiological analysis reports easily understood, but also choosing a platform which can facilitate the visualisation of epidemiological analysis reports with minimum cost. In this thesis, there were two major research objectives: the clustering analysis of epidemiological data and the geospatial visualisation of the results of the clustering analysis. SOM, FCM and k-means, the three commonly used clustering algorithms for health data analysis, were investigated. After a number of experiments, k-means was identified, based on Davies-Bouldin index validation, as the best clustering algorithm for epidemiological data. The geospatial visualisation requires a Geo-Mashups engine and geospatial layer customisation. Because of the capacity and many successful applications of free geospatial web services, Google Maps was chosen as the geospatial visualisation platform for epidemiological reporting.
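The model-selection step described above (running k-means and validating with the Davies-Bouldin index, where lower is better) can be sketched with scikit-learn. The synthetic points stand in for the epidemiological variables used in the thesis; the range of k values is an arbitrary choice.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import davies_bouldin_score

    # Synthetic stand-in for area-level epidemiological rates (3 latent groups)
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(loc, 0.5, size=(50, 2))
                   for loc in ([0, 0], [4, 4], [0, 5])])

    # Pick the number of clusters with the lowest Davies-Bouldin index
    scores = {}
    for k in range(2, 7):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        scores[k] = davies_bouldin_score(X, labels)

    best_k = min(scores, key=scores.get)
    print({k: round(v, 3) for k, v in scores.items()}, "-> best k:", best_k)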
APA, Harvard, Vancouver, ISO, and other styles
31

Bassett, Cameron. "Cloud computing and innovation: its viability, benefits, challenges and records management capabilities." Diss., 2015. http://hdl.handle.net/10500/20149.

Full text
Abstract:
This research investigated the potential benefits, risks and challenges, innovation properties and viability of cloud computing for records management in an Australian organisation within the mining software development sector. The research involved a case study analysis as well as a literature analysis. The literature analysis identified ten potential benefits of cloud computing, as well as ten risks and challenges associated with it. It further identified aspects which need to be addressed when adopting cloud computing in order to promote innovation within an organisation. The case study analysis was compared against this literature review in order to determine cloud computing's viability for records management for Company X (the company in the case study). Cloud computing was found to be viable for Company X. However, there are certain aspects which need to be discussed and clarified with the cloud service provider beforehand in order to mitigate possible risks and compliance issues. It is also recommended that a cloud service provider who complies with international standards, such as ISO 15489, be selected. The viability of cloud computing for organisations similar to Company X (mining software development) follows a related path. These organisations need to ensure that the service provider is compliant with laws in their local jurisdiction, such as the Electronic Transactions Act 1999 (Australia, 2011:14-15), as well as laws where their data (in the cloud) may be hosted. The benefits, risks and challenges of records management and cloud computing are applicable to these similar organisations; however, mitigation of these risks needs to be discussed with a cloud service provider beforehand. From an innovation perspective, cloud computing is able to promote innovation within an organisation if certain antecedents are dealt with. Furthermore, if cloud computing is successfully adopted, it should promote innovation within organisations.
Information Science
M. Inf.
APA, Harvard, Vancouver, ISO, and other styles
32

Sadoddin, Amir. "Bayesian network models for integrated catchment-scale management of salinity." PhD thesis, 2006. http://hdl.handle.net/1885/150932.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Bowden, G. J. (Gavin James). "Forecasting water resources variables using artificial neural networks." 2003. http://web4.library.adelaide.edu.au/theses/09PH/09phb7844.pdf.

Full text
Abstract:
"February 2003." Corrigenda for, inserted at back Includes bibliographical references (leaves 475-524 ) A methodology is formulated for the successful design and implementation of artificial neural networks (ANN) models for water resources applications. Attention is paid to each of the steps that should be followed in order to develop an optimal ANN model; including when ANNs should be used in preference to more conventional statistical models; dividing the available data into subsets for modelling purposes; deciding on a suitable data transformation; determination of significant model inputs; choice of network type and architecture; selection of an appropriate performance measure; training (optimisation) of the networks weights; and, deployment of the optimised ANN model in an operational environment. The developed methodology is successfully applied to two water resorces case studies; the forecasting of salinity in the River Murray at Murray Bridge, South Australia; and the the forecasting of cyanobacteria (Anabaena spp.) in the River Murray at Morgan, South Australia.
APA, Harvard, Vancouver, ISO, and other styles
34

Ghile, Yonas Beyene. "Development of a framework for an integrated time-varying agrohydrological forecast system for southern Africa." Thesis, 2007. http://hdl.handle.net/10413/352.

Full text
Abstract:
Policy makers, water managers, farmers and many other sectors of the society in southern Africa are confronting increasingly complex decisions as a result of the marked day-to-day, intra-seasonal and inter-annual variability of climate. Hence, forecasts of hydro-climatic variables with lead times of days to seasons ahead are becoming increasingly important to them in making more informed risk-based management decisions. With improved representations of atmospheric processes and advances in computer technology, a major improvement has been made by institutions such as the South African Weather Service, the University of Pretoria and the University of Cape Town in forecasting southern Africa’s weather at short lead times and its various climatic statistics for longer time ranges. In spite of these improvements, the operational utility of weather and climate forecasts, especially in agricultural and water management decision making, is still limited. This is so mainly because of a lack of reliability in their accuracy and the fact that they are not suited directly to the requirements of agrohydrological models with respect to their spatial and temporal scales and formats. As a result, the need has arisen to develop a GIS based framework in which the “translation” of weather and climate forecasts into more tangible agrohydrological forecasts such as streamflows, reservoir levels or crop yields is facilitated for enhanced economic, environmental and societal decision making over southern Africa in general, and in selected catchments in particular. This study focuses on the development of such a framework. As a precursor to describing and evaluating this framework, however, one important objective was to review the potential impacts of climate variability on water resources and agriculture, as well as assessing current approaches to managing climate variability and minimising risks from a hydrological perspective. With the aim of understanding the broad range of forecasting systems, the review was extended to the current state of hydro-climatic forecasting techniques and their potential applications in order to reduce vulnerability in the management of water resources and agricultural systems. This was followed by a brief review of some challenges and approaches to maximising benefits from these hydro-climatic forecasts. A GIS based framework has been developed to serve as an aid to process all the computations required to translate near real time rainfall fields estimated by remotely sensed tools, as well as daily rainfall forecasts with a range of lead times provided by Numerical Weather Prediction (NWP) models into daily quantitative values which are suitable for application with hydrological or crop models. Another major component of the framework was the development of two methodologies, viz. the Historical Sequence Method and the Ensemble Re-ordering Based Method for the translation of a triplet of categorical monthly and seasonal rainfall forecasts (i.e. Above, Near and Below Normal) into daily quantitative values, as such a triplet of probabilities cannot be applied in its original published form into hydrological/crop models which operate on a daily time step. The outputs of various near real time observations, of weather and climate models, as well as of downscaling methodologies were evaluated against observations in the Mgeni catchment in KwaZulu-Natal, South Africa, both in terms of rainfall characteristics as well as of streamflows simulated with the daily time step ACRU model. 
A comparative study of rainfall derived from daily reporting raingauges, ground-based radars, satellites and merged fields indicated that the raingauge and merged rainfall fields displayed relatively realistic results and that they may be used to simulate the “now state” of a catchment at the beginning of a forecast period. The performance of three NWP models, viz. the C-CAM, UM and NCEP-MRF, was found to vary from one event to another. However, the C-CAM model showed a general tendency of under-estimation whereas the UM and NCEP-MRF models suffered from significant over-estimation of the summer rainfall over the Mgeni catchment. Ensembles of streamflows simulated with the ACRU model, using ensembles of rainfalls derived from both the Historical Sequence Method and the Ensemble Re-ordering Based Method, showed reasonably good results for most of the selected months and seasons for which they were tested, which indicates that the two methods of transforming categorical seasonal forecasts into ensembles of daily quantitative rainfall values are useful for various agrohydrological applications in South Africa and possibly elsewhere. The use of the Ensemble Re-ordering Based Method was also found to be quite effective in generating the transitional probabilities of rain days and dry days as well as the persistence of dry and wet spells within forecast cycles, all of which are important in the evaluation and forecasting of streamflows and crop yields, as well as droughts and floods. Finally, future areas of research which could facilitate the practical implementation of the framework were identified.
Thesis (Ph.D.)-University of KwaZulu-Natal, Pietermaritzburg, 2007.
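The Historical Sequence Method described above amounts, in outline, to resampling historical daily rainfall sequences in proportion to the forecast tercile probabilities. The sketch below is a simplified illustration of that idea, not the thesis implementation; the archive, season length and forecast probabilities are invented.

    import numpy as np

    rng = np.random.default_rng(0)

    # Invented archive: 30 historical years of daily rainfall for one 90-day season
    years = np.arange(1971, 2001)
    archive = {y: rng.gamma(shape=0.4, scale=8.0, size=90) for y in years}

    # Classify each historical year into terciles by its seasonal total
    totals = np.array([archive[y].sum() for y in years])
    lower, upper = np.quantile(totals, [1 / 3, 2 / 3])
    category = np.where(totals < lower, "below",
                        np.where(totals > upper, "above", "near"))

    # Example categorical seasonal forecast (probabilities sum to 1)
    forecast = {"above": 0.5, "near": 0.3, "below": 0.2}

    def sample_ensemble(n_members=10):
        """Draw historical years with probability proportional to the forecast."""
        probs = np.array([forecast[c] / (category == c).sum() for c in category])
        members = rng.choice(years, size=n_members, replace=True,
                             p=probs / probs.sum())
        return np.array([archive[y] for y in members])   # daily sequences

    ens = sample_ensemble()
    print("ensemble member seasonal totals:", ens.sum(axis=1).round(0))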
APA, Harvard, Vancouver, ISO, and other styles
35

Massey, Philip. "An expert system for a legal office." Thesis, 1995. https://vuir.vu.edu.au/18190/.

Full text
Abstract:
The subject of this thesis is the use of information technology (IT) to assist in the resolution of cases that appear before the Family Law Court in Australia. IT is used to assist in two processes. The first process is the intelligent gathering and preparation of information for Family Law cases. The second process is the modelling of case decisions by the Family Law Court. The first process is covered in this thesis; the second is part of Andrew Stranieri's doctoral research at La Trobe University. Legal Interaction Charts (LICs) are developed to model the procedures a Family Law solicitor carries out when gathering information and preparing a case. Sequenced Event Charts (SECs) are developed to implement LICs on a computer.
APA, Harvard, Vancouver, ISO, and other styles
36

O'Connor, Bill. "Solutions to problems encountered during the adoption and management of new colour measuring and control technology in the textile industry." Thesis, 1995. https://vuir.vu.edu.au/18199/.

Full text
Abstract:
This research identifies the key factors involved in the successful adoption of a computerised match prediction system in the textile industry. The adoption of this technology has created big problems for many companies, and few have succeeded without difficulty. Five companies adopting the technology were investigated to identify common problem areas. These areas were compared with the results of a literature review. A case study format was used to study in greater detail two companies in the carpet industry and their adoption of this system. One company was remarkably successful whilst the other company succeeded after much delay and difficulty. The literature relating to technological change and its effects on employees indicates that the problems involve management, environmental, technical and social factors. Hence four research questions concerning prescriptive and contextual factors are tested by case study research and a cultural survey of all involved at both sites. Factors like the importance of strategy, management support and training are examined. The impact of culture, management style and fear of change is closely investigated. The results, whilst not conclusive, do give a good indication of the areas for special attention and the key factors, should the adoption of a computerised match prediction system be contemplated. The key factors form the basis of the conclusions that training, management support and the presence of a knowledgeable champion to drive the implementation were crucial, whereas there was very little evidence of fear of the technology. Culture and management style were found to have an impact insofar as they direct the companies' approach to adopting the technology and influence how decisions are made and problems solved.
APA, Harvard, Vancouver, ISO, and other styles
37

Beerval, Ravichandra Kavya Urs. "Spatiotemporal analysis of extreme heat events in Indianapolis and Philadelphia for the years 2010 and 2011." Thesis, 2014. http://hdl.handle.net/1805/4083.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
Over the past two decades, northern parts of the United States have experienced extreme heat conditions. Some of the notable heat wave impacts occurred in Chicago in 1995, with over 600 reported deaths, and in Philadelphia in 1993, with over 180 reported deaths. The distribution of extreme heat events in Indianapolis has varied since the year 2000. The urban heat island effect has caused temperatures to rise unusually high during the summer months. Although the number of reported deaths in Indianapolis is smaller when compared to Chicago and Philadelphia, the heat wave in the year 2010 primarily affected the vulnerable population comprised of the elderly and the lower socio-economic groups. Studying the spatial distribution of high temperatures in the vulnerable areas helps not only to determine the extent of the heat-affected areas, but also to devise strategies and methods to plan for, mitigate, and tackle extreme heat. In addition, examining spatial patterns of vulnerability can aid in the development of a heat warning system to alert the populations at risk during extreme heat events. This study focuses on the qualitative and quantitative methods used to measure extreme heat events. Land surface temperatures obtained from Landsat TM images provide a useful means by which the spatial distribution of temperatures can be studied in relation to temporal changes and socioeconomic vulnerability. The percentile method used helps to determine the vulnerable areas and their extents. The maximum temperatures measured using LST conversion of the original digital number values of the Landsat TM images are reliable in terms of identifying the heat-affected regions.
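The digital-number-to-temperature conversion mentioned above follows the standard two-step procedure for the Landsat 5 TM thermal band (DN to at-sensor radiance, then radiance to brightness temperature), with a percentile cut-off applied to map extreme-heat areas. The calibration constants below are commonly cited TM band 6 values but should be read from the scene metadata in practice; the DN grid and the 95th-percentile threshold are illustrative.

    import numpy as np

    # Commonly cited Landsat 5 TM band 6 calibration constants; the gain/offset
    # should really be taken from each scene's metadata
    K1, K2 = 607.76, 1260.56          # W/(m^2 sr um), kelvin
    LMIN, LMAX = 1.238, 15.303        # W/(m^2 sr um)
    QCALMIN, QCALMAX = 1.0, 255.0

    def dn_to_brightness_temp(dn):
        """Convert band 6 digital numbers to at-sensor brightness temperature (K)."""
        radiance = (LMAX - LMIN) / (QCALMAX - QCALMIN) * (dn - QCALMIN) + LMIN
        return K2 / np.log(K1 / radiance + 1.0)

    # Illustrative DN grid standing in for a Landsat TM thermal band subset
    rng = np.random.default_rng(0)
    dn = rng.integers(120, 180, size=(100, 100)).astype(float)
    temp_c = dn_to_brightness_temp(dn) - 273.15

    # Percentile approach: flag pixels above the scene's 95th percentile as
    # extreme-heat areas (the cut-off here is an illustrative choice)
    hot = temp_c >= np.percentile(temp_c, 95)
    print("extreme-heat pixels:", int(hot.sum()), "of", hot.size)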
APA, Harvard, Vancouver, ISO, and other styles
