Journal articles on the topic 'Pensions Australia Data processing'

Consult the top 50 journal articles for your research on the topic 'Pensions Australia Data processing.'

1

Murphy, Tara, Peter Lamb, Christopher Owen, and Malte Marquarding. "Data Storage, Processing, and Visualization for the Australia Telescope Compact Array." Publications of the Astronomical Society of Australia 23, no. 1 (2006): 25–32. http://dx.doi.org/10.1071/as05033.

Abstract:
We present three Virtual Observatory tools developed at the Australia Telescope National Facility (ATNF) for the storage, processing and visualization of Australia Telescope Compact Array (ATCA) data. These are the Australia Telescope Online Archive, a prototype data-reduction pipeline, and the Remote Visualization System. These tools were developed in the context of the Virtual Observatory and were intended to be both useful for astronomers and technology demonstrators. We discuss the design and implementation of these tools, as well as issues that should be considered when developing similar systems for future telescopes.
2

Hidayanti, Erni. "KUALITAS PELAYANAN PENETAPAN PENSIUN OTOMATIS BERBASIS LESS PAPER BAGI PEGAWAI NEGERI SIPIL DI LINGKUNGAN PEMERINTAH PROVINSI SULAWESI UTARA [The quality of less paper-based automatic pension determination services for civil servants in the North Sulawesi Provincial Government]." AKSELERASI: Jurnal Ilmiah Nasional 3, no. 2 (July 16, 2021): 80–91. http://dx.doi.org/10.54783/jin.v3i2.412.

Abstract:
This study examines the quality of service for determining less paper-based automatic pensions for civil servants in the North Sulawesi Provincial Government. The research is descriptive and qualitative, with data collected through interviews, observation and documentation. Implementation of less paper-based automatic pension determination for civil servants in the North Sulawesi Provincial Government has not gone well, owing to the absence of Standard Operating Procedures (SOP) for the process, a lack of transparency, weak socialization, weak data reconciliation, a shortage of apparatus (personnel) resources, an absence of innovation in developing less paper-based pension processing, and the limited response of civil servants approaching the retirement age limit to the new arrangement. A strong commitment is therefore required from the North Sulawesi Provincial Government, and especially from the Regional Personnel Board of North Sulawesi Province, to innovate in developing less paper-based automatic pension determination so that the pension service flow meets service quality standards and pension recipients can receive their pension decisions more easily.
3

Pettit, C. J., S. N. Lieske, and S. Z. Leao. "BIG BICYCLE DATA PROCESSING: FROM PERSONAL DATA TO URBAN APPLICATIONS." ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences III-2 (June 2, 2016): 173–79. http://dx.doi.org/10.5194/isprsannals-iii-2-173-2016.

Abstract:
Understanding the flows of people moving through the built environment is a vital source of information for the planners and policy makers who shape our cities. Smart phone applications enable people to trace themselves through the city and these data can potentially be then aggregated and visualised to show hot spots and trajectories of macro urban movement. In this paper our aim is to develop procedures for cleaning, aggregating and visualising human movement data and translating this into policy relevant information. In conducting this research we explore using bicycle data collected from a smart phone application known as RiderLog. We focus on the RiderLog application initially in the context of Sydney, Australia and discuss the procedures and challenges in processing and cleaning this data before any analysis can be made. We then present some preliminary map results using the CartoDB online mapping platform where data are aggregated and visualised to show hot spots and trajectories of macro urban movement. We conclude the paper by highlighting some of the key challenges in working with such data and outline some next steps in processing the data and conducting higher volume and more extensive analysis.
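As a rough illustration of the cleaning-and-aggregation workflow described in this abstract, the sketch below bins cleaned GPS points into grid cells to highlight usage hot spots. The file name, column names, speed threshold and cell size are assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch: clean app-collected ride points, then aggregate them
# into ~100 m grid cells to reveal hot spots of bicycle movement.
import pandas as pd

points = pd.read_csv("riderlog_points.csv")  # hypothetical export: ride_id, lat, lon, speed_kmh

# Basic cleaning: drop incomplete fixes and implausible speeds
points = points.dropna(subset=["lat", "lon"])
points = points[(points["speed_kmh"] > 0) & (points["speed_kmh"] < 60)]

# Snap coordinates to a regular grid and count points per cell
cell = 0.001  # roughly 100 m in latitude
points["cell_lat"] = (points["lat"] / cell).round() * cell
points["cell_lon"] = (points["lon"] / cell).round() * cell
hotspots = (points.groupby(["cell_lat", "cell_lon"])
            .size()
            .reset_index(name="n_points")
            .sort_values("n_points", ascending=False))

# The aggregated table can then be uploaded to an online mapping platform
hotspots.to_csv("hotspots.csv", index=False)
```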
5

Henderson, Roger. "ASEG History Committee: A history of geophysical data and image processing in Australia." Preview 2020, no. 207 (July 3, 2020): 5. http://dx.doi.org/10.1080/14432471.2020.1800381.

6

Unwin, Elizabeth, James Codde, Louise Gill, Suzanne Stevens, and Timothy Nelson. "The WA Hospital Morbidity Data System: An Evaluation of its Performance and the Impact of Electronic Data Transfer." Health Information Management 26, no. 4 (December 1996): 189–92. http://dx.doi.org/10.1177/183335839702600407.

Abstract:
This paper evaluates the performance of the Hospital Morbidity Data System, maintained by the Health Statistics Branch (HSB) of the Health Department of Western Australia (WA). The time taken to process discharge summaries was compared in the first and second halves of 1995, using the number of weeks taken to process 90% of all discharges and the percentage of records processed within four weeks as indicators of throughput. Both the hospitals and the HSB showed improvements in timeliness during the second half of the year. The paper also examines the impact of a recently introduced electronic data transfer system for WA country public hospitals on the timeliness of morbidity data. The processing time of country hospital records by the HSB was reduced to a similar time as for metropolitan hospitals, but the processing time in the hospitals increased, resulting in little improvement in total processing time.
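The two throughput indicators named in this abstract can be computed directly from per-record dates; the sketch below is a minimal illustration, with file and field names assumed rather than taken from the WA system.

```python
# Sketch of the two timeliness indicators: weeks taken to process 90% of
# discharges, and the percentage of records processed within four weeks.
import pandas as pd

records = pd.read_csv("morbidity_records.csv",
                      parse_dates=["discharge_date", "processed_date"])
weeks = (records["processed_date"] - records["discharge_date"]).dt.days / 7.0

weeks_to_90pct = weeks.quantile(0.90)            # weeks to process 90% of discharges
pct_within_4_weeks = (weeks <= 4).mean() * 100   # % of records processed within four weeks

print(f"90% of discharges processed within {weeks_to_90pct:.1f} weeks")
print(f"{pct_within_4_weeks:.1f}% of records processed within four weeks")
```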
7

Spencer, G. A., D. F. Pridmore, and D. J. Isles. "Data integration of exploration data using colour space on an image processor." Exploration Geophysics 20, no. 2 (1989): 31. http://dx.doi.org/10.1071/eg989031.

Abstract:
Image processing in exploration has rapidly evolved into the field of data integration, whereby independent data sets which coincide in space are displayed concurrently. Interrelationships between data sets which may be crucial to exploration can thus be identified much more effectively than with conventional hard copy overlays. The use of perceptual colour space (hue, saturation and luminosity, HSL) provides an effective means for integrating raster data sets, as illustrated with the multi-spectral scanner and airborne geophysical data from the Kambalda area in Western Australia. The integration process must also cater for data in vector format, which is more appropriate for geological, topographic and cultural information, but to date, image processing systems have poorly captured and managed such data. As a consequence, the merging of vector data management software such as GIS (geographic information system) with existing advanced image enhancement packages is an area of active development in the exploration industry.
8

Teng, Keat Huat, Joe Zhou, Rao Yandapalli Hanumantha, Yingjie Feng, Zhengmin Zhang, and Loic Michel. "Simplify the variable-depth streamer data processing through pre-migration deghosting: a case study from NWS Australia Data." ASEG Extended Abstracts 2013, no. 1 (December 2013): 1–4. http://dx.doi.org/10.1071/aseg2013ab286.

9

Ticehurst, Catherine, Zheng-Shu Zhou, Eric Lehmann, Fang Yuan, Medhavy Thankappan, Ake Rosenqvist, Ben Lewis, and Matt Paget. "Building a SAR-Enabled Data Cube Capability in Australia Using SAR Analysis Ready Data." Data 4, no. 3 (July 15, 2019): 100. http://dx.doi.org/10.3390/data4030100.

Abstract:
A research alliance between the Commonwealth Scientific and Industrial Research Organization and Geoscience Australia was established in relation to Digital Earth Australia, to develop a Synthetic Aperture Radar (SAR)-enabled Data Cube capability for Australia. This project has been developing SAR analysis ready data (ARD) products, including normalized radar backscatter (gamma nought, γ0), eigenvector-based dual-polarization decomposition and interferometric coherence, all generated from the European Space Agency (ESA) Sentinel-1 interferometric wide swath mode data available on the Copernicus Australasia Regional Data Hub. These are produced using the open source ESA SNAP toolbox. The processing workflows are described, along with a comparison of the γ0 backscatter and interferometric coherence ARD produced using SNAP and the proprietary software GAMMA. This comparison also evaluates the effects on γ0 backscatter due to variations related to: Near- and far-range look angles; SNAP’s default Shuttle Radar Topography Mission (SRTM) DEM and a refined Australia-wide DEM; as well as terrain. The agreement between SNAP and GAMMA is generally good, but also presents some systematic geometric and radiometric differences. The difference between SNAP’s default SRTM DEM and the refined DEM showed a small geometric shift along the radar view direction. The systematic geometric and radiometric issues detected can however be expected to have negligible effects on analysis, provided products from the two processors and two DEMs are used separately and not mixed within the same analysis. The results lead to the conclusion that the SNAP toolbox is suitable for producing the Sentinel-1 ARD products.
10

Sangeeth L R, Silpa, Saji K. Mathew, and Vidyasagar Potdar. "Information Processing view of Electricity Demand Response Systems: A Comparative Study Between India and Australia." Pacific Asia Journal of the Association for Information Systems 12, no. 4 (June 30, 2020): 27–63. http://dx.doi.org/10.17705/1pais.12402.

Abstract:
Background: In recent years, demand response (DR) has gained increased attention from utilities, regulators, and market aggregators to meet the growing demands of electricity. The key aspect of a successful DR program is the effective processing of data and information to gain critical insights. This study aims to identify information processing needs and capacity that interact to improve energy DR effectiveness. To this end, organizational information processing theory (OIPT) is employed to understand the role of Information Systems (IS) resources in achieving desired DR program performance. This study also investigates how information processing for DR systems differs between developing (India) and developed (Australia) countries. Method: This work adopts a case study methodology to propose a theoretical framework using OIPT for information processing in DR systems. The study further employs a comparative case data analysis between Australian and Indian DR initiatives. Results: Our cross case analysis identifies variables of value creation in designing DR programs: pricing structure for demand side participation, renewable integration at supply side, reforms in the regulatory instruments, and emergent technology. This research posits that the degree of information processing capacity mediates the influence of information processing needs on energy DR effectiveness. Further, we develop five propositions on the interaction between task based information processing needs and capacity, and their influence on DR effectiveness. Conclusions: The study generates insights on the role of IS resources that can help stakeholders in the electricity value chain to take informed and intelligent decisions for improved performance of DR programs.
11

Dunne, Jarrod, and Greg Beresford. "Improving seismic data quality in the Gippsland Basin (Australia)." GEOPHYSICS 63, no. 5 (September 1998): 1496–506. http://dx.doi.org/10.1190/1.1444446.

Abstract:
Deep seismic exploration in the Gippsland Basin is hindered by strong noise below the Latrobe Group coal sequence. The reflectivity method provides a means for constructing detailed and accurate synthetic seismograms, often from little more than a partial sonic log. The noise contributions to the synthetics can then be interpreted using additional synthetics computed from variations upon the depth model and by exercising control over the wave types modeled. This approach revealed three types of persistent noise in progressively deeper parts of the subcoal image: (1) mode‐converted interbed multiples (generated within the coal sequence), (2) S-wave reflections and long‐period multiples (generated between the coal sequence and the Miocene carbonates), and (3) surface‐related multiples. The noise interpretation can also be performed upon semblance analyses of the elastic synthetics to guide a velocity analysis away from a well. This procedure helped to avoid picking the interformation long‐period multiples, whose stacking velocities were only 5 to 10% below those of the weak target zone primaries. An improved subcoal image was obtained by making full use of the versatile noise suppression offered by a τ-p domain processing stream. By separating the strong linear events at the far offsets, it is possible to stack a larger portion of the target zone reflections, provided hyperbolic velocity filtering (HVF) is applied to suppress the transform artifacts. Hyperbolic velocity filtering can be incorporated into a point‐source τ-p transform to suppress S-wave reflections and guided waves while preserving plane‐wave amplitudes to assist the subsequent deconvolution of the mode‐converted interbed multiples. Stacking in the τ-p domain is achieved using an elliptical moveout correction that reduces wavelet stretch and approximates the exact reflection traveltime better than NMO. Two regional seismic lines were reprocessed in this manner and cointerpreted with the modeling studies performed at nearby wells to avoid the noise events that still remained. Several new events appeared in the immediate target zone, passing the low‐frequency character expected following transmission through a coal sequence.
12

Strehz, Alexander, and Thomas Einfalt. "Precipitation Data Retrieval and Quality Assurance from Different Data Sources for the Namoi Catchment in Australia." Geomatics 1, no. 4 (October 28, 2021): 417–28. http://dx.doi.org/10.3390/geomatics1040024.

Abstract:
Within the Horizon 2020 Project WaterSENSE a modular approach was developed to provide different stakeholders with the required precipitation information. An operational high-quality rainfall grid was set up for the Namoi catchment in Australia based on rain gauge adjusted radar data. Data availability and processing considerations make it necessary to explore alternative precipitation approaches. The gauge adjusted radar data will serve as a benchmark for the alternative precipitation data. The two well established satellite-based precipitation datasets IMERG and GSMaP will be analyzed with the temporal and spatial requirements of the applications envisioned in WaterSENSE in mind. While first results appear promising, these datasets will need further refinements to meet the criteria of WaterSENSE, especially with respect to the spatial resolution. Inferring information from soil moisture derived from EO observations to increase the spatial detail of the existing satellite-based datasets is a promising approach that will be investigated along with other alternatives.
13

Nourollah, Hadi, and Javad Aliemrani. "Improved imaging of the Strzelecki Formation by the reprocessing of 3D seismic reflection data: onshore Gippsland Basin, Australia." APPEA Journal 55, no. 2 (2015): 465. http://dx.doi.org/10.1071/aj14100.

Abstract:
The Wombat 3D seismic survey was recorded by Lakes Oil in 2008 following the drilling of the Wombat–3 well. The survey was aimed at better identifying the structures inside the Strzelecki Formation and the underlying Rintoul Creek Sandstone, and at giving a better understanding of the provenance of the gas encountered in the first three Wombat wells and the oil encountered at depth in the Wombat–3 well. Seismic acquisition design and processing in this area are challenged by the presence of coal measures in the Latrobe Group, which overlies the zone of interest, absorbs most of the incident seismic energy and complicates seismic imaging through the generation of multiple reflection events. A novel approach, described as multi-line decomposition, modelling and synthesis, was developed to model critical imaging parameters and in particular the velocity model for midpoint stacking and migration. This approach is demonstrated through the reprocessing of the Wombat 3D seismic survey and is compared with a conventional 3D processing approach and original survey processing examples. The reprocessed seismic section provides a significantly improved image of the deeper structures, delineating a number of continuous reflectors in the Early Cretaceous formations. It also provides the opportunity to build a more accurate model of the basement.
14

Harrison, Christopher B., and Milovan Urosevic. "Seismic processing, inversion, and AVO for gold exploration — Case study from Western Australia." GEOPHYSICS 77, no. 5 (September 1, 2012): WC235—WC243. http://dx.doi.org/10.1190/geo2011-0506.1.

Abstract:
We investigate the potential of using high-resolution seismic methods for rock characterization and for targeting of gold deposits at the St. Ives gold camp. The application of seismic methods in hard-rock environments is challenged by complex structures, intrinsically low signal-to-noise ratio, regolith distortions, and access restrictions. If these issues can be addressed, then the unparalleled resolving power of reflection seismic can be used for mineral exploration. Appropriate spatial sampling of the wavefield combined with a survey geometry design and rigorous data processing to incorporate high fold and long offsets are necessary for creation of high-quality seismic images. In the hard-rock environment of Western Australia, accurate static corrections and multiphase velocity analysis are essential processing steps. This is followed by a rigorous quality control following each processing step. In such a case, we show that the role of reflection seismic could be lifted from mere identification of first-order structures to refined lithological analyses. Five deep boreholes with sonic logs and core sample test data were used to calibrate 2D seismic images. Although the seismic images were produced with relatively robust scaling, it was possible to achieve reasonably high seismic-log correlation across three of the tightly spaced boreholes using a single composite wavelet. Amplitude-versus-offset (AVO) analysis indicated that gold-bearing structures may be related to elevated AVO effect and increased reflectivity. Consequently, partial stack analysis and acoustic and elastic inversions were conducted. These results and impedance crossplots were then evaluated against known gold occurrences. While still in the preliminary stages, hard-rock seismic imaging, inversion, and the application of AVO techniques indicated significant potential for targeting mineral reserves.
15

Jendryke, Michael, Timo Balz, Houjun Jiang, Mingsheng Liao, and Uwe Stilla. "Using Open-Source Components to Process Interferometric TerraSAR-X Spotlight Data." International Journal of Antennas and Propagation 2013 (2013): 1–13. http://dx.doi.org/10.1155/2013/275635.

Abstract:
We address the processing of interferometric TerraSAR-X and TanDEM-X spotlight data. Processing steps necessary to derive interferograms at high spatial resolution from bi- and monostatic satellite images will be explained. The spotlight image mode is a beam steering technique focusing the antenna on a specific ground area. This results in a linear Doppler shift frequency in azimuth direction, which has to be matched to the master image. While shifting the interpolation kernel in azimuth during resampling, the frequency spectrum of the slave image is aligned to the master image. We show how to process bistatic TanDEM-X images and propose an integrated processing option for monostatic TerraSAR-X data in the Delft Object-oriented Radar Interferometric Software (DORIS). The paper focuses on the implementation of this algorithm for high-resolution spotlight InSAR in a public domain tool; hence, it becomes available to a larger research community. The results are presented for three test areas: Uluru in Australia, Las Vegas in the USA, and Lüneburg in Germany.
16

Zhou, Tian, Bart Nijssen, George J. Huffman, and Dennis P. Lettenmaier. "Evaluation of Real-Time Satellite Precipitation Data for Global Drought Monitoring." Journal of Hydrometeorology 15, no. 4 (July 30, 2014): 1651–60. http://dx.doi.org/10.1175/jhm-d-13-0128.1.

Abstract:
The Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) near-real-time (RT) data are considered less accurate than the TMPA research quality (RP) data because of the simplified data processing algorithm and the lack of gauge adjustments. However, for near-real-time hydrological applications, such as drought nowcasting, the RT data must play a key role given latency considerations and consistency is essential with products like RP, which have a long-term climatology. The authors used a bivariate test to examine the consistency between the monthly RT and RP precipitation estimates for 12 yr (2000–12) and found that, for over 75% of land cells globally, RT and RP were statistically consistent at 0.05 significance level. The inconsistent grid cells are spatially clustered in western North America, northern South America, central Africa, and most of Australia. The authors also show that RT generally increases with time relative to RP in northern South America and western Australia, while in western North America and eastern Australia, RT decreases relative to RP. In other areas such as the eastern part of North America, Eurasia, and the southern part of South America, the RT data are statistically consistent with the RP data and are appropriate for global- or macroscale hydrological applications.
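A per-cell consistency screen of this kind can be sketched as below. The paper applies a bivariate test; the paired Wilcoxon test here is a simplified stand-in, and the array shapes and file names are assumptions.

```python
# Simplified stand-in: test, cell by cell, whether monthly RT and RP estimates
# are statistically consistent at the 0.05 level.
import numpy as np
from scipy import stats

rt = np.load("tmpa_rt_monthly.npy")   # (n_months, n_lat, n_lon), assumed layout
rp = np.load("tmpa_rp_monthly.npy")

_, n_lat, n_lon = rt.shape
consistent = np.zeros((n_lat, n_lon), dtype=bool)
for i in range(n_lat):
    for j in range(n_lon):
        diff = rt[:, i, j] - rp[:, i, j]
        if np.allclose(diff, 0):
            consistent[i, j] = True      # identical series: trivially consistent
            continue
        _, p = stats.wilcoxon(rt[:, i, j], rp[:, i, j])
        consistent[i, j] = p >= 0.05     # failing to reject is treated as consistent

print(f"{consistent.mean() * 100:.1f}% of grid cells flagged as consistent")
```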
17

Carmichael, Gordon A. "Indigenous fertility in Australia: updating Alan Gray." Journal of Population Research 36, no. 4 (September 20, 2019): 283–317. http://dx.doi.org/10.1007/s12546-019-09233-w.

Abstract:
Although he was not the first scholar to investigate it, there is little question that the Ph.D. research of Alan Gray, completed in 1983, represented a landmark in the study of Indigenous fertility in Australia. Convinced that ‘Aboriginal’ fertility had fallen rapidly through the 1970s, Gray set out to document and explain the decline. Weaving through a maze of sub-optimal census data he produced a series of age-specific and total fertility rates, refined by three broad geographic location categories, for 5-year periods from 1956–1961 to 1976–1981. These he subsequently updated to also include 1981–1986 and the 10-year period 1986–1996 as new census children-ever-born data became available. He would doubtless have extended his series further had he lived to do so. For years his fertility estimates were graphed in the annual ABS publication Births Australia as the Bureau began publishing registration-based Indigenous fertility estimates from the late 1990s, but Indigenous birth registration data and fertility estimates based thereon remain to this day problematic in several respects. This paper summarises Alan Gray’s work, extends his Indigenous fertility estimates to the 2011–2016 intercensal period, and examines the results against registration-based estimates that have been subjected to (a) regular retrospective revision (in light of data processing flaws and substantial errors of closure in intercensal Indigenous population increments), and (b) the vagaries of significant late registration, and periodic registry efforts to clear backlogs of unregistered Indigenous births.
18

Heath, A. M., A. L. Culver, and C. W. Luxton. "Gathering good seismic data from the Otway Basin." Exploration Geophysics 20, no. 2 (1989): 247. http://dx.doi.org/10.1071/eg989247.

Abstract:
Cultus Petroleum N.L. began exploration in petroleum permit EPP 23 of the offshore Otway Basin in December 1987. The permit was sparsely explored, containing only 2 wells and poor quality seismic data. A regional study was made taking into account the shape of the basin and the characteristics of the major seismic sequences. A prospective trend was recognised, running roughly parallel to the present shelf edge of South Australia. A new seismic survey was orientated over this prospective trend. The parameters were designed to investigate the structural control of the prospects in the basin. To improve productivity during the survey, north-south lines had to be repositioned due to excessive swell noise on the cable. The new line locations were kept in accordance with the structural model. Field displays of the raw 240 channel data gave encouraging results. Processing results showed this survey to be the best quality in the area. An FK filter was designed on the full 240 channel records. Prior to wavelet processing, an instrument dephase was used to remove any influence of the recording system on the phase of the data. Close liaison was kept with the processing centre over the selection of stacking velocities and their relevance to the geological model. DMO was found to greatly improve the resolution of steeply dipping events and is now considered to be part of the standard processing sequence for Otway Basin data. Seismic data of a high enough quality for structural and stratigraphic interpretation can be obtained from this basin.
19

Hefti, J., S. Dewing, C. Jenkins, A. Arnold, and B. E. Korn. "IMPROVEMENTS IN SEISMIC IMAGING, IO JANSZ GAS FIELD NORTH WEST SHELF, AUSTRALIA." APPEA Journal 46, no. 1 (2006): 135. http://dx.doi.org/10.1071/aj05009.

Abstract:
The Io Jansz gas field is situated in the Carnarvon Basin on the North West Shelf of Australia. It is Australia’s largest gas field, estimated to hold over 20 TCF of gas reserves and covering an area of over 2000 km². Following a series of appraisal wells and a 3D seismic survey, this field is moving rapidly towards development. Image quality of the 3D provided significant uplift over existing 2D surveys in the area. Expectations for resolution and business targets have been met through careful planning and the provision of staged deliverables. Despite the exceptional data quality, a number of technical challenges were encountered that led to operational changes and adaptations by the project team. Source height statics and severe image distortion due to overburden are examples of some of the challenges addressed. Consideration of the exploration history of this field and its associated imaging gives insight into the improvements in image quality that can be realised by careful selection of acquisition and processing parameters, high levels of quality control (QC) and modern processing algorithms. The ultimate success of this project was achieved through close cooperation within interdisciplinary teams comprised of partner technical staff and the seismic acquisition and processing contractor.
20

Bell, Jon F., Peter J. Hall, Warwick E. Wilson, Robert J. Sault, Rick J. Smegal, Malcolm R. Smith, Willem van Straten, et al. "Base Band Data for Testing Interference Mitigation Algorithms." Publications of the Astronomical Society of Australia 18, no. 1 (2001): 105–13. http://dx.doi.org/10.1071/as01006.

Abstract:
Digital signal processing is one of many valuable tools for suppressing unwanted signals or interference. Building hardware processing engines seems to be the way to best implement some classes of interference suppression but is, unfortunately, expensive and time-consuming, especially if several mitigation techniques need to be compared. Simulations can be useful, but are not a substitute for real data. CSIRO’s Australia Telescope National Facility has recently commenced a ‘software radio telescope’ project designed to fill the gap between dedicated hardware processors and pure simulation. In this approach, real telescope data are recorded coherently, then processed offline. This paper summarises the current contents of a freely available database of base band recorded data that can be used to experiment with signal processing solutions. It includes data from the following systems: single dish, multi-feed receiver; single dish with reference antenna; and an array of six 22 m antennas with and without a reference antenna. Astronomical sources such as OH masers, pulsars and continuum sources subject to interfering signals were recorded. The interfering signals include signals from the US Global Positioning System (GPS) and its Russian equivalent (GLONASS), television, microwave links, a low-Earth-orbit satellite, various other transmitters, and signals leaking from local telescope systems with fast clocks. The data are available on compact disk, allowing use in general purpose computers or as input to laboratory hardware prototypes.
21

Urosevic, Milovan, Ganesh Bhat, and Marcos Hexsel Grochau. "Targeting nickel sulfide deposits from 3D seismic reflection data at Kambalda, Australia." GEOPHYSICS 77, no. 5 (September 1, 2012): WC123–WC132. http://dx.doi.org/10.1190/geo2011-0514.1.

Abstract:
The greenstone belts of the Yilgarn Craton, Western Australia, host numerous Archaean gold, nickel, and iron ore deposits. These deposits typically are found in complex geologic structures hidden by a deep, heterogeneous, and often conductive regolith profile. This added complexity limits the depth of penetration for the potential field methods, but at the same time opens new revenue possibilities through the application of seismic methods. To explore this opportunity, we acquired high-resolution, experimental, 3D seismic data over Lake Lefroy in Kambalda, Western Australia. The main objective was to map exceptionally complex, deep structures associated with Kambalda dome. Survey design used 3D ray tracing to improve the distribution of the common reflection points across ultramafic-basalt contacts which host numerous small, high-grade nickel sulfide deposits. A combination of small explosive sources, high-shot/receiver density, and exceptionally good coupling over the ultrasalty lake surface produced seismic data of very high quality. Processing focused on computation of accurate static and dynamic corrections, whereas imaging was helped by the existing geologic model. Advanced volumetric interpretation supported by seismic forward modeling was used to guide mapping of the main lithological interfaces and structures. Forward modeling was carried out using rock properties obtained from ultrasonic measurements and one borehole, drilled in the proximity of the 3D seismic volume. Using this information, geometric constraints based on the typical size of ore bodies found in this mine and a simple window-based seismic attribute, several new targets were proposed. Three of these targets subsequently have been drilled and new zones of mineralization were intercepted. The case study presented demonstrates that high-quality, high-resolution, 3D seismic data combined with volumetric seismic interpretation could become a primary methodology for exploration of deep, small, massive sulfide deposits distributed across the Kambalda area.
22

Wietfeldt, R., W. Van Straten, D. Del Rizzo, N. Bartel, W. Cannon, M. Bailes, J. Reynolds, and W. Wilson. "The S2 Baseband Processing System for Phase-coherent Pulsar Observations." International Astronomical Union Colloquium 160 (1996): 21–22. http://dx.doi.org/10.1017/s0252921100040926.

Abstract:
The phase-coherent recording of pulsar data and subsequent software dispersion removal provide a flexible way to reach the limits of high time resolution, useful for more precise pulse timing and the study of fast signal fluctuations within a pulse. Because of the huge data rate and lack of adequate recording and computing capabilities, this technique has been used mostly only for small pulsar data sets. In recent years, however, the development of very capable, reasonably inexpensive high-speed recording systems and computers has made feasible the notion of pulsar baseband recording and subsequent processing with a workstation/computer. In this paper we discuss the development of a phase-coherent baseband processing system for radio pulsar observations. This system is based on the S2 VLBI recorder developed at ISTS/York University in Toronto, Canada. We present preliminary first results for data from the Vela pulsar, obtained at Parkes, Australia, and processed at ISTS/York University, and discuss plans for future developments.
23

Ghaffariyan, M. R. "Comparing productivity-cost of roadside processing system and road side chipping system in Western Australia." Journal of Forest Science 59, No. 5 (May 30, 2013): 204–10. http://dx.doi.org/10.17221/81/2012-jfs.

Abstract:
This research compared roadside chipping and roadside processing systems. Two sites planted with Eucalyptus globulus were selected to study these harvesting systems. A time and motion study was applied to collect the data for both harvesting systems. The working cycles for each machine were recorded as well as the variables affecting the working productivity. Using the multiple regression method the appropriate models were developed. The results showed that the productivity of feller-buncher and processor was significantly affected by tree size. Productivity of skidders was dependent on extraction distance and load weight. Productivity for roadside processing was higher than for roadside chipping, which resulted in a lower unit cost. The unit cost (from stand to the mill) for roadside processing and roadside chipping averaged 22.68 AUD·t⁻¹ and 21.07 AUD·t⁻¹, respectively.
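The multiple-regression step mentioned in this abstract can be illustrated with a small sketch; the file layout, variable names and units below are assumptions, not the study's data.

```python
# Sketch: model skidder productivity from extraction distance and load weight
# by ordinary least squares, mirroring the regression modelling described above.
import numpy as np
import pandas as pd

cycles = pd.read_csv("skidder_cycles.csv")   # hypothetical time-study export
X = np.column_stack([np.ones(len(cycles)),               # intercept
                     cycles["extraction_distance_m"],
                     cycles["load_weight_t"]])
y = cycles["productivity_t_per_pmh"].to_numpy()

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, distance and load coefficients:", coef)
```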
24

Pearce, Christopher, Adam McLeod, Jon Patrick, Jason Ferrigi, Michael Bainbridge, Natalie Rinehart, and Anna Fragkoudi. "Coding and classifying GP data: the POLAR project." BMJ Health & Care Informatics 26, no. 1 (November 2019): e100009. http://dx.doi.org/10.1136/bmjhci-2019-100009.

Abstract:
Background: Data, particularly ‘big’ data are increasingly being used for research in health. Using data from electronic medical records optimally requires coded data, but not all systems produce coded data. Objective: To design a suitable, accurate method for converting large volumes of narrative diagnoses from Australian general practice records to codify them into SNOMED-CT-AU. Such codification will make them clinically useful for aggregation for population health and research purposes. Method: The developed method consisted of using natural language processing to automatically code the texts, followed by a manual process to correct codes and subsequent natural language processing re-computation. These steps were repeated for four iterations until 95% of the records were coded. The coded data were then aggregated into classes considered to be useful for population health analytics. Results: Coding the data effectively covered 95% of the corpus. Problems with the use of SNOMED CT-AU were identified and protocols for creating consistent coding were created. These protocols can be used to guide further development of SNOMED CT-AU (SCT). The coded values will be immensely useful for the development of population health analytics for Australia, and the lessons learnt applicable elsewhere.
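The iterative codify-review-recompute loop described in the Method section can be sketched schematically as below; the tiny lookup table stands in for the project's natural language processing classifier, and the records and SNOMED CT codes are illustrative only.

```python
# Self-contained toy version of the iterative coding loop: auto-code, measure
# coverage, apply manual corrections, and repeat until 95% of records are coded.
records = [
    {"text": "type 2 diabetes", "code": None},
    {"text": "asthma", "code": None},
    {"text": "chest pain ?cause", "code": None},
]
lexicon = {"type 2 diabetes": "44054006", "asthma": "195967001"}  # toy SNOMED CT map

target_coverage, coverage, round_no = 0.95, 0.0, 0
while coverage < target_coverage and round_no < 4:
    round_no += 1
    for rec in records:                       # automatic pass (stand-in for NLP)
        if rec["code"] is None:
            rec["code"] = lexicon.get(rec["text"].lower())
    coverage = sum(r["code"] is not None for r in records) / len(records)
    print(f"round {round_no}: {coverage:.0%} coded")
    # manual pass (simulated): a reviewer corrects an uncoded record, and the
    # correction is available to the next automatic pass
    lexicon.setdefault("chest pain ?cause", "29857009")
```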
25

Evans, B. J., G. A. Paterson, and S. E. Frey. "FAULT PLANE RESOLUTION USING THE LOW-FOLD 3D SEISMIC TECHNIQUE OVER WOODADA GAS FIELD, PERTH BASIN, WESTERN AUSTRALIA." APPEA Journal 27, no. 1 (1987): 289. http://dx.doi.org/10.1071/aj86023.

Abstract:
During August 1984, a conventional 2D seismic line and a single fold 3D seismic survey were recorded over the Woodada Gas Field, North Perth Basin, Western Australia. This survey was a joint venture between the Allied Geophysical Laboratories at the University of Houston and the Exploration Seismology Centre's Field Research Laboratory at the Western Australian Institute of Technology. Previous seismic data were so poor that there was confusion about fault orientation and structure in the survey area. In addition, the fault strike direction and extent were unknown at this location. Consequently, 3D seismic acquisition and processing techniques appeared highly applicable to this geological problem. In general, progressive development of seismic data acquisition methods has been towards higher channel, higher multifold 2D and 3D surveys. However, at the Allied Geophysical Laboratories, processing techniques for single-fold 3D data have been developed using model tank data. This processing technique, LO-FOLD 3D, was used to field trial the method, and to test its ability to define faulting between the gas producing well Indoon 1 and dry step-out well Woodada 9. Previous usage of the single-fold 3D survey method was to delineate reefal structures in the Michigan Basin. Beyond this, no published articles discuss the method. With single-fold data, velocity analysis and coherent noise are a problem. Consequently, 2D bin lines through the 3D volume of data were processed in order to improve the signal to noise ratios. The objective was to delineate the fault orientation in the Carynginia Formation, located between 1.3 and 1.5 seconds. Fault delineation was determined from 2D bin lines and time slices, and is interpreted to run diagonally between the two wells.
26

Williamson, P. E., and F. Kroh. "THE ROLE OF AMPLITUDE VERSUS OFFSET TECHNOLOGY IN PROMOTING OFFSHORE PETROLEUM EXPLORATION IN AUSTRALIA." APPEA Journal 47, no. 1 (2007): 163. http://dx.doi.org/10.1071/aj06009.

Abstract:
Amplitude versus offset (AVO) technology has proved itself useful in petroleum exploration in various parts of the world, particularly for gas exploration. To determine if modern AVO compliant processing could identify potential anomalies for exploration of open acreage offshore Australia, Geoscience Australia reprocessed parts of four publicly available long cable lines. These lines cover two 2006 acreage release areas on the Exmouth Plateau and in the Browse Basin on the North West Shelf. An earlier study has also been done on two publicly available long cable lines from Geoscience Australia’s Bremer Basin study and cover areas from the 2005 frontier acreage release on the southern margin. The preliminary results from these three reprocessing efforts produced AVO anomalies and were made publicly available to assist companies interested in assessing the acreage. The results of the studies and associated data are available from Geoscience Australia at the cost of transfer. The AVO data from the Exmouth Plateau show AVO anomalies including one that appears to be at the Jurassic level of the reservoir in the Jansz/Io supergiant gas field in adjacent acreage to the north. The AVO data from the Caswell Sub-basin of the Browse Basin show an AVO anomaly at or near the stratigraphic zone of the Brecknock South–1 gas discovery to the north. The geological settings of strata possibly relating to two AVO anomalies in the undrilled Bremer Basin are in the Early Cretaceous section, where lacustrine sandstones are known to occur. The AVO anomalies from the three studies are kilometres in length along the seismic lines. These preliminary results from Geoscience Australia and other AVO work that has been carried out by industry show promise that AVO compliant processing has value, particularly for gas exploration offshore Australia, and that publicly available long-cable data can be suitable for AVO analysis.
27

Bertone, Edoardo, Rodney A. Stewart, Hong Zhang, and Cameron Veal. "Data-driven recursive input–output multivariate statistical forecasting model: case of DO concentration prediction in Advancetown Lake, Australia." Journal of Hydroinformatics 17, no. 5 (June 6, 2015): 817–33. http://dx.doi.org/10.2166/hydro.2015.131.

Abstract:
A regression model integrating data pre-processing and transformation, input selection techniques and a data-driven statistical model facilitated accurate 7 day ahead time series forecasting of selected water quality parameters. A core feature of the modelling approach is a novel recursive input–output algorithm. The herein described model development procedure was applied to the case of a 7 day ahead dissolved oxygen (DO) concentration forecast for the upper hypolimnion of Advancetown Lake, Queensland, Australia. The DO was predicted with an R² > 0.8 and a normalised root mean squared error of 14.9% on a validation data set by using 10 inputs related to water temperature or pH. A key feature of the model is that it can handle nonlinear correlations, which was essential for this environmental forecasting problem. The pre-processing of the data revealed some relevant inputs that had only 6 days’ lag, and as a consequence, those predictors were in turn forecasted 1 day ahead using the same procedure. In this way, the targeted prediction horizon (i.e. 7 days) was preserved. The implemented approach can be applied to a wide range of time-series forecasting problems in the complex hydro-environment research area. The reliable DO forecasting tool can be used by reservoir operators to achieve more proactive and reliable water treatment management.
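The recursive input-output idea, where a predictor available only at a 6-day lag is itself forecast one day ahead so the 7-day horizon is preserved, can be sketched as follows. The file, column names and use of plain linear regression are assumptions; the paper's model is more elaborate.

```python
# Sketch of recursive forecasting: forecast tomorrow's water temperature first,
# then feed it into a 7-day-ahead dissolved oxygen (DO) model.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("advancetown_daily.csv", parse_dates=["date"]).set_index("date")

# One-day-ahead model for the predictor that is only relevant at a 6-day lag
temp_x = df[["water_temp"]].shift(1).dropna()
temp_model = LinearRegression().fit(temp_x, df["water_temp"].loc[temp_x.index])

# Seven-day-ahead DO model: temperature enters at a 6-day lag, pH at a 7-day lag
features = pd.DataFrame({
    "temp_lag6": df["water_temp"].shift(6),
    "ph_lag7": df["ph"].shift(7),
}).dropna()
do_model = LinearRegression().fit(features, df["do_mg_l"].loc[features.index])

# Recursive step: predict tomorrow's temperature, then the DO a week from now
temp_tomorrow = temp_model.predict([[df["water_temp"].iloc[-1]]])[0]
do_forecast = do_model.predict([[temp_tomorrow, df["ph"].iloc[-1]]])[0]
print(f"7-day-ahead DO forecast: {do_forecast:.2f} mg/L")
```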
28

Schow, Ronald L., Mary M. Whitaker, J. Anthony Seikel, Jeff E. Brockett, and Deborah M. Domitz Vieira. "Validity of the Multiple Auditory Processing Assessment–2: A Test of Auditory Processing Disorder." Language, Speech, and Hearing Services in Schools 51, no. 4 (October 2, 2020): 993–1006. http://dx.doi.org/10.1044/2020_lshss-20-00001.

Abstract:
Purpose A normative study using the Multiple Auditory Processing Assessment–2 (MAPA-2; Schow et al., 2018) was recently completed. With access to these data, the authors extend that work and support a definite construct for auditory processing disorder (APD). The goal here is to examine MAPA-2 reliability and validity (construct, content, and concurrent). Evidence for the APD construct is further buttressed by measures of sensitivity and specificity. Results of MAPA-2 testing on children diagnosed with learning disability (LD), attention-deficit/hyperactivity disorder (ADHD), and specific language impairment (SLI) are included. Method Normative data (previously published as the MAPA-2) allowing derivation of these findings included a representative sample of 748 children (53% girls) ages 7–14 years tested by 54 speech-language pathologists and audiologists in 27 U.S. states. The authors examined diagnostic accuracy based on the American Speech-Language-Hearing Association (2005) criteria (index test) for confirmed cases of APD. The index was also used to identify listening problems for three other diagnostic categories (LD, ADHD, and SLI). Validated questionnaire responses from parents and school personnel allowed incorporation of functional measures widely supported in APD diagnosis but unavailable with other normative and sensitivity/specificity studies. Results Reliability and validity were both satisfactory, and diagnostic accuracy for an APD group of 18 (28% female) compared to the remaining typical group of 625 yielded 89% sensitivity and 82% specificity. The remaining three groups (LD, ADHD, and SLI), where comorbidity was expected to be about 50%, had APD-type listening problems with a prevalence ranging from 52% to 65%. Conclusions Current results provide important evidence for the construct of APD. The MAPA-2 can be administered by an audiologist or speech-language pathologist. A similar diagnostic protocol in Australia yielded positive therapeutic gains. Further study is encouraged to determine if the present positive findings will be found in future research.
29

Wang, Jianzhou, Ling Xiao, and Jun Shi. "The Combination Forecasting of Electricity Price Based on Price Spikes Processing: A Case Study in South Australia." Abstract and Applied Analysis 2014 (2014): 1–12. http://dx.doi.org/10.1155/2014/172306.

Abstract:
Electricity price forecasting holds a very important position in the electricity market. Inaccurate price forecasting may cause energy waste and management chaos in the electricity market. However, electricity price forecasting has always been regarded as one of the largest challenges in the electricity market because it shows high volatility, which makes electricity price forecasting difficult. This paper proposes the use of artificial intelligence optimization combination forecasting models based on preprocessing data, called “chaos particles optimization (CPSO) weight-determined combination models.” These models allow for the weight of the combined model to take values of [-1, 1]. In the proposed models, the density-based spatial clustering of applications with noise (DBSCAN) algorithm is used to identify outliers, and the outliers are replaced by a new data-produced linear interpolation function. The proposed CPSO weight-determined combination models are then used to forecast the projected future electricity price. In this case study, the electricity price data of South Australia are simulated. The results indicate that, while the weight of the combined model takes values of [-1, 1], the proposed combination model can always provide adaptive, reliable, and comparatively accurate forecast results in comparison to traditional combination models.
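The pre-processing and combination steps described here can be illustrated with a short sketch. The series, DBSCAN parameters, component forecasts and fixed weights are placeholders; in the paper the weights are searched by chaos particle swarm optimisation and may lie anywhere in [-1, 1].

```python
# Sketch: flag price spikes with DBSCAN, replace them by linear interpolation,
# then form a weighted combination of two simple component forecasts.
import numpy as np
from sklearn.cluster import DBSCAN

prices = np.loadtxt("sa_spot_prices.csv", delimiter=",")   # 1-D half-hourly series (assumed)

# 1. Outlier identification: points labelled -1 (noise) by DBSCAN are spikes
labels = DBSCAN(eps=5.0, min_samples=5).fit_predict(prices.reshape(-1, 1))
spikes = labels == -1

# 2. Replace spikes by linear interpolation over the surrounding normal points
idx = np.arange(len(prices))
cleaned = prices.copy()
cleaned[spikes] = np.interp(idx[spikes], idx[~spikes], prices[~spikes])

# 3. Weighted combination of two stand-in forecasts (persistence and a
#    48-point moving average); weights are fixed here but may be negative
forecast_a = np.roll(cleaned, 1)
forecast_b = np.convolve(cleaned, np.ones(48) / 48, mode="same")
w_a, w_b = 0.9, 0.1
combined = w_a * forecast_a + w_b * forecast_b
```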
30

Zhou, Joe, Peter Chia, Jingyu Li, Henry Ng, Sergey Birdus, Keat Huat Teng, Ying Peng Phan, Jason Sun, and He Yi. "Unlocking the full potential of broadband data with advanced processing and imaging technology, a case study from NWS Australia." ASEG Extended Abstracts 2015, no. 1 (December 2015): 1–4. http://dx.doi.org/10.1071/aseg2015ab081.

31

Warner, R. D., F. R. Dunshea, D. Gutzke, J. Lau, and G. Kearney. "Factors influencing the incidence of high rigor temperature in beef carcasses in Australia." Animal Production Science 54, no. 4 (2014): 363. http://dx.doi.org/10.1071/an13455.

Abstract:
Beef carcasses undergoing rapid pH fall while the loin muscle temperature is still high are described as heat-shortened, heat-toughened or ‘high rigor temperature’ carcasses, with subsequent negative effects on quality traits. The aim of the study was to quantify the occurrence of high rigor temperature in beef carcasses across Australia and to identify the causative factors. Data was collected over 4–5 days at each of seven beef processing plants from 1512 beef carcasses. The beef carcasses were from both grass- and grain-fed cattle ranging in days on grain feeding from 0 (grass-fed) to 350 days and the category of cattle ranged from veal to ox and cow. Data collected on the day of slaughter included the duration of electrical inputs at the immobiliser, electrical stimulation and hide puller, longissimus muscle pH and temperature decline, hot carcass weight and P8 fat depth. At grading, ultimate pH, eye muscle area, wetness of the loin surface and colour score were also collected. The temperature at pH 6 was calculated and if it was >35°C, the carcass was defined as ‘high rigor temperature’. Modelling of the data was conducted using GLMM and REML. The occurrence of high rigor temperature across all seven beef processing plants was 74.6% ranging from 56 to 94% between beef processing plants. Increasing days in the feedlot and heavier carcass weights were highly correlated and both caused an increase in the predicted temperature at pH 6 and in the % high rigor temperature (P < 0.05 for both). Longer duration of electrical inputs at the hide puller, fatter grass-fed cattle and fatter male (castrate) carcasses had a higher temperature at pH 6 and higher % high rigor temperature. Modelling showed that if the time to reach pH 6 in the longissimus muscle was 65 v. 105 min, the % high rigor temperature carcasses reduced from 98 to 19% in grain-fed cattle and 93 to 7% in grass-fed cattle. Higher plasma insulin levels at slaughter were associated with a higher temperature at pH 6 (rigor temperature) (P < 0.001). In conclusion, in order to reduce the incidence of high rigor temperature in grain-fed beef carcasses, methods for identifying high rigor temperature carcasses will be required and while some management strategies can be implemented now, others require further research.
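The key indicator in this study, the loin temperature when muscle pH reaches 6, can be derived from each carcass's pH and temperature decline readings by interpolation; the sketch below uses made-up readings purely for illustration.

```python
# Sketch: interpolate the temperature at pH 6 from paired pH/temperature
# decline measurements and apply the study's >35 C "high rigor" definition.
import numpy as np

ph = np.array([6.9, 6.5, 6.2, 5.9, 5.6])          # longissimus pH decline (time-ordered)
temp = np.array([39.5, 38.0, 36.5, 34.0, 30.0])   # matching loin temperature, deg C

# pH decreases with time, so reverse both arrays to give np.interp ascending x
temp_at_ph6 = np.interp(6.0, ph[::-1], temp[::-1])
high_rigor = temp_at_ph6 > 35.0
print(f"temperature at pH 6: {temp_at_ph6:.1f} C, high rigor temperature: {high_rigor}")
```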
32

Tupper, Neil, Eric Matthews, Gareth Cooper, Andy Furniss, Tim Hicks, and Suzanne Hunt. "The Waitsia Field, onshore North Perth Basin, Western Australia." APPEA Journal 56, no. 1 (2016): 29. http://dx.doi.org/10.1071/aj15003.

Abstract:
The Waitsia Field represents a new commercial play for the onshore north Perth Basin with potential to deliver substantial reserves and production to the domestic gas market. The discovery was made in 2014 by deepening of the Senecio–3 appraisal well to evaluate secondary reservoir targets. The well successfully delineated the extent of the primary target in the Upper Permian Dongara and Wagina sandstones of the Senecio gas field but also encountered a combination of good-quality and tight gas pay in the underlying Lower Permian Kingia and High Cliff sandstones. The drilling of the Waitsia–1 and Waitsia–2 wells in 2015, and testing of Senecio-3 and Waitsia-1, confirmed the discovery of a large gas field with excellent flow characteristics. Wireline log and pressure data define a gross gas column in excess of 350 m trapped within a low-side fault closure that extends across 50 km². The occurrence of good-quality reservoir in the depth interval 3,000–3,800 m is diagenetically controlled with clay rims inhibiting quartz cementation and preserving excellent primary porosity. Development planning for Waitsia has commenced with the likelihood of an early production start-up utilising existing wells and gas processing facilities before ramp-up to full-field development. The dry gas will require minimal processing, and access to market is facilitated by the Dampier–Bunbury and Parmelia gas pipelines that pass directly above the field. The Waitsia Field is believed to be the largest conventional Australian onshore discovery for more than 30 years and provides impetus and incentive for continued exploration in mature and frontier basins. The presence of good-quality reservoir and effective fault seal was unexpected and emphasises the need to consider multiple geological scenarios and to test unorthodox ideas with the drill bit.
33

Robinson, Jo, Katrina Witt, Michelle Lamblin, Matthew J. Spittal, Greg Carter, Karin Verspoor, Andrew Page, et al. "Development of a Self-Harm Monitoring System for Victoria." International Journal of Environmental Research and Public Health 17, no. 24 (December 15, 2020): 9385. http://dx.doi.org/10.3390/ijerph17249385.

Abstract:
The prevention of suicide and suicide-related behaviour are key policy priorities in Australia and internationally. The World Health Organization has recommended that member states develop self-harm surveillance systems as part of their suicide prevention efforts. This is also a priority under Australia’s Fifth National Mental Health and Suicide Prevention Plan. The aim of this paper is to describe the development of a state-based self-harm monitoring system in Victoria, Australia. In this system, data on all self-harm presentations are collected from eight hospital emergency departments in Victoria. A natural language processing classifier that uses machine learning to identify episodes of self-harm is currently being developed. This uses the free-text triage case notes, together with certain structured data fields, contained within the metadata of the incoming records. Post-processing is undertaken to identify primary mechanism of injury, substances consumed (including alcohol, illicit drugs and pharmaceutical preparations) and presence of psychiatric disorders. This system will ultimately leverage routinely collected data in combination with advanced artificial intelligence methods to support robust community-wide monitoring of self-harm. Once fully operational, this system will provide accurate and timely information on all presentations to participating emergency departments for self-harm, thereby providing a useful indicator for Australia’s suicide prevention efforts.
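The machine-learning component described here can be illustrated with a generic text classifier; the toy notes, labels and model below are a stand-in and are not the Victorian system's classifier or data.

```python
# Illustrative stand-in: a bag-of-words classifier that flags triage notes
# suggestive of self-harm, trained on a few made-up examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

notes = [
    "patient presents with self inflicted laceration to forearm",
    "chest pain radiating to left arm, diaphoretic",
    "overdose of paracetamol, states intent to self harm",
    "fall from ladder, suspected fractured wrist",
]
is_self_harm = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(notes, is_self_harm)

print(model.predict(["took excess medication after argument, self harm intent stated"]))
```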
34

Mesibov, Robert. "An audit of some processing effects in aggregated occurrence records." ZooKeys 751 (April 20, 2018): 129–46. http://dx.doi.org/10.3897/zookeys.751.24791.

Abstract:
A total of ca 800,000 occurrence records from the Australian Museum (AM), Museums Victoria (MV) and the New Zealand Arthropod Collection (NZAC) were audited for changes in selected Darwin Core fields after processing by the Atlas of Living Australia (ALA; for AM and MV records) and the Global Biodiversity Information Facility (GBIF; for AM, MV and NZAC records). Formal taxon names in the genus- and species-groups were changed in 13–21% of AM and MV records, depending on dataset and aggregator. There was little agreement between the two aggregators on processed names, with names changed in two to three times as many records by one aggregator alone compared to records with names changed by both aggregators. The type status of specimen records did not change with name changes, resulting in confusion as to the name with which a type was associated. Data losses of up to 100% were found after processing in some fields, apparently due to programming errors. The taxonomic usefulness of occurrence records could be improved if aggregators included both original and the processed taxonomic data items for each record. It is recommended that end-users check original and processed records for data loss and name replacements after processing by aggregators.
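An audit of this kind reduces to comparing each Darwin Core field before and after aggregation, record by record; the sketch below counts scientificName changes, with file names assumed.

```python
# Sketch: join a provider's original occurrence records to the aggregator's
# processed copy on occurrenceID and count how often scientificName changed.
import pandas as pd

original = pd.read_csv("museum_original.csv")         # provider's published records
processed = pd.read_csv("aggregator_processed.csv")   # records served by the aggregator

merged = original.merge(processed, on="occurrenceID", suffixes=("_orig", "_proc"))
changed = (merged["scientificName_orig"].fillna("")
           != merged["scientificName_proc"].fillna(""))

print(f"{changed.mean() * 100:.1f}% of records had scientificName changed in processing")
```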
APA, Harvard, Vancouver, ISO, and other styles
35

Soler, Santiago R., and Leonardo Uieda. "Gradient-boosted equivalent sources." Geophysical Journal International 227, no. 3 (August 24, 2021): 1768–83. http://dx.doi.org/10.1093/gji/ggab297.

Full text
Abstract:
The equivalent source technique is a powerful and widely used method for processing gravity and magnetic data. Nevertheless, its major drawback is the large computational cost in terms of processing time and computer memory. We present two techniques for reducing the computational cost of equivalent source processing: block-averaging source locations and the gradient-boosted equivalent source algorithm. Through block-averaging, we reduce the number of source coefficients that must be estimated while retaining the minimum desired resolution in the final processed data. With the gradient-boosting method, we estimate the source coefficients in small batches along overlapping windows, allowing us to reduce the computer memory requirements arbitrarily to conform to the constraints of the available hardware. We show that the combination of block-averaging and gradient-boosted equivalent sources is capable of producing accurate interpolations through tests against synthetic data. Moreover, we demonstrate the feasibility of our method by gridding a gravity data set covering Australia with over 1.7 million observations using a modest personal computer.
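The block-averaging step can be illustrated independently of the full equivalent-source inversion: observation points falling in the same spatial block are replaced by one averaged source position. The NumPy sketch below is a minimal stand-in with an arbitrary block size and synthetic coordinates, not the authors' implementation:

```python
# Minimal sketch of block-averaging source locations: observation points that
# fall in the same spatial block are replaced by a single averaged source
# position. Arrays and block size are arbitrary illustrations.
import numpy as np

rng = np.random.default_rng(0)
easting = rng.uniform(0, 100_000, size=5000)   # metres
northing = rng.uniform(0, 100_000, size=5000)
block_size = 5_000.0                           # metres

# Integer block indices along each axis, combined into a single block id.
col = np.floor(easting / block_size).astype(int)
row = np.floor(northing / block_size).astype(int)
block_id = row * (col.max() + 1) + col

# Average coordinates within each block -> one equivalent source per block.
unique_ids, inverse = np.unique(block_id, return_inverse=True)
counts = np.bincount(inverse)
src_easting = np.bincount(inverse, weights=easting) / counts
src_northing = np.bincount(inverse, weights=northing) / counts

print(f"{easting.size} observations reduced to {src_easting.size} sources")
```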
APA, Harvard, Vancouver, ISO, and other styles
36

Simpson, C. J., J. R. Wilford, L. F. Macias, and R. J. Korsch. "SATELLITE DETECTION OF NATURAL HYDROCARBON SEEPAGE: PALM VALLEY GAS FIELD, AMADEUS BASIN, CENTRAL AUSTRALIA." APPEA Journal 29, no. 1 (1989): 196. http://dx.doi.org/10.1071/aj88019.

Full text
Abstract:
Digital image processing of advanced aircraft and Landsat Thematic Mapper (TM) satellite remotely sensed data over sandstones of the Palm Valley Gas Field, central Australia, showed a distinct colour anomaly about 6 km long by 1.5 km wide which is not obvious in visible wavelength imagery. Field inspection showed that the colour anomaly was characterised by different rock-weathering colour, a geobotanical anomaly, calcium carbonate precipitation within rock fractures, and different soil pH. Inorganic rock geochemistry indicates significant chemical differences in some major elements. A limited number of soil gas samples were analysed and within the remotely sensed colour anomaly some had above-threshold concentrations of methane, ethane, propane and butane. Preliminary processing of airborne magnetic and gamma spectrometric data over the anticline did not indicate any significant values that suggested abnormal development of magnetite or clay minerals within the colour anomaly. Carbon and oxygen isotope analyses on calcrete from within the colour anomaly suggest, somewhat inconclusively, that hydrocarbons have not contributed significantly to the formation of the calcium carbonate component of the calcrete. Consideration of all available information suggests that the colour anomaly detectable by aircraft and Landsat TM satellite remote sensing corresponds to a zone of surface alteration resulting from long-term seepage of hydrocarbon gases. This colour anomaly, the first of its type reported from Australia, was detected because of spectral reflectance differences resulting from a combination of increased soil carbonate and different geobotanical characteristics from those of the surrounding terrain.
APA, Harvard, Vancouver, ISO, and other styles
37

Birdus, Sergey, and Alexey Artyomov. "Removing fault shadow distortions from seismic images using depth-velocity modelling and pre-stack depth migration." APPEA Journal 52, no. 2 (2012): 700. http://dx.doi.org/10.1071/aj11114.

Full text
Abstract:
In many areas, fault shadows present a serious challenge to seismic imaging. The major part of this problem is caused by the different types of velocity variation associated with faults. Pre-stack depth migration with a sufficiently accurate velocity model successfully resolves this problem, and high-resolution tomographic depth-velocity modelling is the most important component of the solution. During depth processing of a number of real 3D seismic datasets with fault shadows from Australia and other regions, the following were noticed: the appearance of image distortions below the faults, and the convergence speed of the tomographic velocity inversion, depend on the acquisition direction; and tomographic modelling sometimes produces depth-velocity models that closely follow geology in some areas but contain non-geological-looking anomalies in others. In both cases, the depth migration delivers distortion-free images. If anisotropy is present in faulted areas, it creates additional image distortions and can require extra input data and processing effort. To examine these effects and optimise the depth-processing workflow, several 3D synthetic seismic datasets were created for different types of velocity anomalies associated with faults in isotropic and anisotropic media and for different acquisition directions. On synthetic and real data from Australia, different types of fault shadows are illustrated, and how they can be resolved depending on the acquisition direction is also shown. Some types of fault shadow are shown to require multi-azimuth illumination to guarantee their successful removal.
APA, Harvard, Vancouver, ISO, and other styles
38

Altıparmak Yılmaz, H. Merve, and Necati Demir. "Error Analysis: Approaches to Written Texts of Turks Living in the Sydney." International Education Studies 13, no. 2 (January 29, 2020): 104. http://dx.doi.org/10.5539/ies.v13n2p104.

Full text
Abstract:
The purpose of this study is to describe the errors made by Turks living in Sydney, Australia, in Turkish written texts. The mistakes identified in the texts were handled with the error analysis approach and evaluated according to their linguistic, cognitive processing, communicative, spelling and punctuation characteristics. The content analysis technique, one of the qualitative research methods, was used in the research. The study group consisted of forty-one people, aged between 10 and 25 years, living in Sydney, Australia in 2017. Participants were asked to create a text of at least 250 words by selecting any of the seven elective subjects in the written expression form. The texts were then examined one by one and the errors were analyzed under four headings: linguistic, cognitive processing, communicative, and spelling and punctuation. Analysis of the written expression texts identified 951 linguistic and cognitive processing, 343 communicative, 230 spelling and 178 punctuation errors. Analyzing written texts under these headings is expected to make mistakes easier to identify, to benefit the language teaching process and everyone involved in it, to help avoid mistakes through more efficient and goal-oriented work, and to save time.
APA, Harvard, Vancouver, ISO, and other styles
39

Geissler, Paul E. "Seismic reflection profiling for groundwater studies in Victoria, Australia." GEOPHYSICS 54, no. 1 (January 1989): 31–37. http://dx.doi.org/10.1190/1.1442574.

Full text
Abstract:
Experimental seismic reflection profiling was employed for groundwater studies in southeastern Australia. Equipment consisted of a simple engineering seismograph and tape recorder, and data reduction was carried out on a minicomputer using a graphics‐based processing system specifically written for the project. The investigation area is the site of a proposed induced groundwater recharge scheme in which surface water would be diverted to infiltrate aquifers outcropping several kilometers from a bore field which supplies up to half of the drinking water for the city of Geelong. The unconsolidated Tertiary aquifers of the region are known to be interrupted in places by steep normal and reverse faults. Since similar faulting had been inferred along the proposed recharge avenue, the objective of the seismic study was to verify, if possible, the assumption of aquifer continuity along the survey line. The reflection results reveal monoclinal folding in the upper unconsolidated sediments produced by recent movement on bedrock faults. The seismic study confirms that the aquifers are continuous between the proposed recharge and extraction areas despite structural complexity along the recharge avenue.
APA, Harvard, Vancouver, ISO, and other styles
40

Jin, Brian, Aditya Joshi, Ross Sparks, Stephen Wan, Cécile Paris, and C. Raina MacIntyre. "‘Watch the Flu’: A Tweet Monitoring Tool for Epidemic Intelligence of Influenza in Australia." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 09 (April 3, 2020): 13616–17. http://dx.doi.org/10.1609/aaai.v34i09.7095.

Full text
Abstract:
‘Watch The Flu’ is a tool that monitors tweets posted in Australia for symptoms of influenza. The tool is a unique combination of two areas of artificial intelligence: natural language processing and time series monitoring, in order to assist public health surveillance. Using a real-time data pipeline, it deploys a web-based dashboard for visual analysis, and sends out emails to a set of users when an outbreak is detected. We expect that the tool will assist public health experts with their decision-making for disease outbreaks, by providing them insights from social media.
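As a loose sketch of how simple text filtering and time-series monitoring can be combined in this way (not the actual 'Watch The Flu' pipeline; the keyword list, alert rule and data are invented), consider:

```python
# Loose illustration of combining simple text filtering with time-series
# monitoring; keyword list, threshold and data are invented for the example.
import numpy as np

SYMPTOMS = {"fever", "cough", "sore throat", "aches", "chills"}

def mentions_symptom(tweet: str) -> bool:
    text = tweet.lower()
    return any(term in text for term in SYMPTOMS)

def ewma_alerts(daily_counts, alpha=0.3, k=3.0):
    """Flag days whose count exceeds the EWMA baseline by more than
    k running-std units (std floored at 1 to avoid early false alarms)."""
    baseline, var, alerts = daily_counts[0], 0.0, []
    for count in daily_counts[1:]:
        resid = count - baseline
        alerts.append(resid > k * max(np.sqrt(var), 1.0))
        baseline = alpha * count + (1 - alpha) * baseline
        var = alpha * resid**2 + (1 - alpha) * var
    return alerts

tweets_today = ["home with a fever and chills", "great coffee in Sydney today"]
print(sum(mentions_symptom(t) for t in tweets_today), "symptom tweets today")
print(ewma_alerts([5, 6, 4, 5, 7, 25]))   # only the final-day spike should alert
```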
APA, Harvard, Vancouver, ISO, and other styles
41

Campbell, Ian C., Gina M. Enierga, Lilian Fuchshuber, and Kim R. James. "An evaluation of the two variable model for stream litter processing using data from southeastern Australia: How important is temperature?" SIL Proceedings, 1922-2010 25, no. 3 (January 1994): 1837–40. http://dx.doi.org/10.1080/03680770.1992.11900503.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Leggate, William, Robert L. McGavin, and Tom Lewis. "An assessment of native forests in Queensland for the potential supply of small-diameter, peeler logs for spindleless lathe rotary-veneer processing." BioResources 14, no. 4 (October 16, 2019): 9485–99. http://dx.doi.org/10.15376/biores.14.4.9485-9499.

Full text
Abstract:
Spindleless lathes have shown great potential for the efficient conversion of small native forest logs in Australia. However, a major impediment to the further commercial adoption of this processing approach for native forest small-diameter logs is the absence of reliable and available data on the quantities of logs possibly available and suitable for this purpose. This study, undertaken in hardwood and white cypress pine (Callitris glaucophylla) native forests and at sawmills in Queensland, Australia, demonstrated that there are potentially substantial quantities of logs suitable for spindleless lathe rotary veneer processing (up to 10.5 m³ per hectare of Crown native hardwood, 14 m³ per hectare of private forest hardwood and 75,000 m³ per year of Crown white cypress pine). However, access to and utilisation of these logs will depend on many factors, including accommodating Government policies and log supply agreements; potential alterations in the code of practice for native forest harvesting, silviculture, tree marking and sales practices; diversion of logs from other uses; and development of appropriate log specifications.
APA, Harvard, Vancouver, ISO, and other styles
43

West, Tracey, and Andrew Worthington. "The impact of major life events on household asset portfolio rebalancing." Studies in Economics and Finance 36, no. 3 (July 26, 2019): 334–47. http://dx.doi.org/10.1108/sef-11-2017-0318.

Full text
Abstract:
Purpose: This paper aims to model the asset portfolio rebalancing decisions of Australian households experiencing a severe life event shock. Design/methodology/approach: The paper uses household longitudinal data from the Household, Income, and Labour Dynamics in Australia (HILDA) survey since 2001. The major life events are serious illness or injury, death of a spouse, job dismissal or redundancy and separation from a spouse. The asset classes are bank accounts, cash investments, equities, superannuation (private pensions), life insurance, trust funds, owner-occupied housing, investor housing, business assets, vehicles and collectibles. The authors use both static and dynamic Tobit models to assess the impact and duration of impact of the shocks. Findings: Serious illness and injury, loss of employment, separation and spousal death cause households to rebalance portfolios in ways that can have detrimental effects on long-term wealth accumulation through poor market timing and the incurring of transaction costs. Research limitations/implications: The survey results are only available since 2001, and the wealth module from which the asset data are drawn is self-reported and not available every year. Practical implications: The findings are relevant to policymakers working on the ongoing retirement of the “baby boomer” generation and to financial planners guiding household investment decisions. Originality/value: Most research on shocks to household wealth concerns a narrower range of assets and only limited shocks. Also, this is one of the few studies to use a random effects model to allow for unspecified heterogeneity among households.
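The paper's Tobit models can be illustrated, in much-simplified form, by a pooled Tobit with left-censoring at zero fitted by maximum likelihood; the sketch below uses synthetic data and ignores the random-effects and dynamic specifications used by the authors:

```python
# Minimal pooled Tobit (left-censored at zero) fitted by maximum likelihood.
# Purely illustrative: the paper estimates random-effects static and dynamic
# Tobit models on HILDA data, which this sketch does not reproduce.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=(n, 1))
latent = 0.5 + 1.2 * x[:, 0] + rng.normal(scale=1.0, size=n)
y = np.clip(latent, 0.0, None)            # e.g. an asset share cannot be negative

def neg_loglik(params, y, X):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    censored = y <= 0
    ll = np.where(
        censored,
        stats.norm.logcdf(-xb / sigma),                    # P(latent <= 0)
        stats.norm.logpdf((y - xb) / sigma) - log_sigma,   # density of observed y
    )
    return -ll.sum()

X = np.column_stack([np.ones(n), x])
res = optimize.minimize(neg_loglik, x0=np.zeros(X.shape[1] + 1),
                        args=(y, X), method="BFGS")
print("intercept, slope, sigma:", res.x[0], res.x[1], np.exp(res.x[2]))
```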
APA, Harvard, Vancouver, ISO, and other styles
44

Gore, Joshua Andrew, and Stefan Peters. "Interactive web mapping of 90 Years of Fire History Across South Australia." Abstracts of the ICA 1 (July 15, 2019): 1–2. http://dx.doi.org/10.5194/ica-abs-1-97-2019.

Full text
Abstract:
Fire is a significant part of South Australian history and integral to the state’s ecosystems. Small prescribed fires are an essential part of ecosystem management and health. However, large, uncontrollable bushfires during South Australia’s hot, dry summers often cause loss of property, damage to the environment and fatalities. An awareness of this fire history enables effective management and encourages residents to better prepare for catastrophic fire events. The South Australian Country Fire Service provides data on the burn extents of more than 5500 prescribed and accidental fire events across the state from 1931 to 2018. This work presents the development of an open-source interactive web mapping tool, allowing residents, managers, and other map users to intuitively explore the state’s fire history without the need for advanced software and skills. This web map application aims to clearly show the state’s complex fire history at any zoom level, from the entire state to a medium-sized rural property. To clearly symbolise areas with extensive fire history, events are shown as both polygons and centroid points. Points are symbolised using scale-dependent clustering, with cluster symbols including counts. This communicates both the number of fires and the extent of area burnt for regions of interest at all zoom levels. Statistics for the state’s fire districts are also portrayed using a choropleth symbology. Popups provide further information on all point and polygon data.

All processing is done by the browser, with GeoJSON polygon geometry simplified and tiled using the geojson-vt library to ensure a responsive experience. Whilst this results in a large initial download and requires a mid-range PC for display, the approach means data can be freely provided on static hosting with very little pre-upload processing required. Fire data can thus easily be updated as new incidents occur.

To allow responsive filtering of the over 5500 fire features, an index system is implemented, using indices calculated both when data is loaded and on the fly in response to interface changes. The unique IDs given to features within the fire GeoJSON are assigned to their Leaflet layer representation. Lists of the IDs of all features returning true when toggles are set to true are created on data load using a combination of the filter and map functions. Similar lists are created on the fly based on the status of range inputs. Intersect, union and difference functions are implemented using the Set data type, allowing fast comparisons between filter results, visible, and not-visible features to determine the specific features to be shown or hidden.

This index system is utilised to provide a time series animation and a range of dual-handle slider and checkbox interface elements allowing comprehensive data exploration based on event type, size, season, and year. When data is filtered by individual decade, a textual historic summary of the decade is also displayed. The visualisation performs well on mid- to high-end desktop computers and thus demonstrates the potential client-side web technology has for providing comprehensive and accessible data exploration.
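The set-based filter index described above is language-agnostic even though the application runs client-side in JavaScript; a Python sketch of the same idea, with invented fire attributes, is:

```python
# Sketch of the set-based filter index described in the abstract: precompute
# ID sets per attribute value, then combine them with set operations to decide
# which features to show or hide. The feature attributes are invented.
fires = [
    {"id": 1, "type": "bushfire",   "year": 1983, "hectares": 12000},
    {"id": 2, "type": "prescribed", "year": 2015, "hectares": 40},
    {"id": 3, "type": "bushfire",   "year": 2015, "hectares": 300},
    {"id": 4, "type": "bushfire",   "year": 2005, "hectares": 9},
]

# Index built once when the data is loaded.
by_type = {}
for f in fires:
    by_type.setdefault(f["type"], set()).add(f["id"])

def ids_in_range(lo, hi, field):
    """Index rebuilt on the fly when a slider changes."""
    return {f["id"] for f in fires if lo <= f[field] <= hi}

# Interface state: bushfires only, years 2000-2018, at least 100 ha burnt.
matching = by_type["bushfire"] & ids_in_range(2000, 2018, "year") \
                               & ids_in_range(100, 10**9, "hectares")
currently_visible = {1, 2, 3}
to_hide = currently_visible - matching      # set difference
to_show = matching - currently_visible      # set difference the other way
print(sorted(matching), sorted(to_hide), sorted(to_show))
```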
APA, Harvard, Vancouver, ISO, and other styles
45

Krahe, Michelle A., Julie Toohey, Malcolm Wolski, Paul A. Scuffham, and Sheena Reilly. "Research data management in practice: Results from a cross-sectional survey of health and medical researchers from an academic institution in Australia." Health Information Management Journal 49, no. 2-3 (March 11, 2019): 108–16. http://dx.doi.org/10.1177/1833358319831318.

Full text
Abstract:
Background: Building or acquiring research data management (RDM) capacity is a major challenge for health and medical researchers and academic institutes alike. Considering that RDM practices influence the integrity and longevity of data, targeting RDM services and support in recognition of needs is especially valuable in health and medical research. Objective: This project sought to examine the current RDM practices of health and medical researchers from an academic institution in Australia. Method: A cross-sectional survey was used to collect information from a convenience sample of 81 members of a research institute (68 academic staff and 13 postgraduate students). A survey was constructed to assess selected data management tasks associated with the earlier stages of the research data life cycle. Results: Our study indicates that RDM tasks associated with creating, processing and analysis of data vary greatly among researchers and are likely influenced by their level of research experience and RDM practices within their immediate teams. Conclusion: Evaluating the data management practices of health and medical researchers, contextualised by tasks associated with the research data life cycle, is an effective way of shaping RDM services and support in this group. Implications: This study recognises that institutional strategies targeted at tasks associated with the creation, processing and analysis of data will strengthen researcher capacity, instil good research practice and, over time, improve health informatics and research data quality.
APA, Harvard, Vancouver, ISO, and other styles
46

Lee, I. K., J. C. Trinder, and A. Sowmya. "APPLICATION OF U-NET CONVOLUTIONAL NEURAL NETWORK TO BUSHFIRE MONITORING IN AUSTRALIA WITH SENTINEL-1/-2 DATA." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B1-2020 (August 6, 2020): 573–78. http://dx.doi.org/10.5194/isprs-archives-xliii-b1-2020-573-2020.

Full text
Abstract:
This paper aims to define a pipeline architecture for near real-time identification of bushfire impact areas using the Geoscience Australia Data Cube (AGDC). A series of catastrophic bushfires from late 2019 to early 2020 captured international attention with their scale of devastation across four of the most populous states of Australia: New South Wales, Queensland, Victoria and South Australia. The extraction of burned areas using multispectral Sentinel-2 observations is straightforward when no cloud or haze obstruction is present. Without clear-sky observations, precisely locating the bushfire-affected regions is difficult to achieve. Sentinel-1 C-band dual-polarized (VH/VV) Synthetic Aperture Radar (SAR) data is introduced to effectively elicit and analyse useful information based on backscattering coefficients, unaffected by adverse weather conditions and lack of sunlight. Burned vegetation results in significant volume scattering; the co-/cross-polarised response decreases due to leafless trees, as does coherence over fire-disturbed areas; and the two sensors acquired images at a shortened revisit time over the same affected areas; all of which provide discriminative features for identifying burnt areas. Moreover, applying the U-Net deep learning framework to train on recent and historical satellite data leads to an effective pre-trained segmentation model of burnt and non-burnt areas, enabling more timely emergency response, more efficient hazard reduction activities and evacuation planning during severe bushfire events. The advantages of this approach could have profound significance for a more robust, timely and accurate method of bushfire detection, utilising a scalable big data processing framework, to predict the bushfire footprint and support fire spread model development.
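For readers unfamiliar with the architecture, a bare-bones single-level U-Net-style segmenter is sketched below in PyTorch; the channel counts and the six-band input (for example, stacked Sentinel-2 bands with Sentinel-1 VH/VV backscatter) are assumptions and not the configuration trained in the paper:

```python
# Bare-bones single-level U-Net-style segmenter, for illustration only.
# Channel counts and the 6-band input are assumptions, not the paper's setup.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self, in_ch=6, n_classes=1):
        super().__init__()
        self.enc = conv_block(in_ch, 32)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(32, 64)
        self.up = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec = conv_block(64, 32)          # 32 upsampled + 32 skip channels
        self.head = nn.Conv2d(32, n_classes, kernel_size=1)

    def forward(self, x):
        e = self.enc(x)                        # skip-connection features
        b = self.bottleneck(self.pool(e))
        d = self.dec(torch.cat([self.up(b), e], dim=1))
        return self.head(d)                    # per-pixel burnt/unburnt logits

model = TinyUNet()
dummy = torch.randn(1, 6, 128, 128)            # one 128x128 tile, 6 bands
print(model(dummy).shape)                      # -> torch.Size([1, 1, 128, 128])
```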
APA, Harvard, Vancouver, ISO, and other styles
47

Jupp, DLB, JTO Kirk, and GP Harris. "Detection, identification and mapping of cyanobacteria — Using remote sensing to measure the optical quality of turbid inland waters." Marine and Freshwater Research 45, no. 5 (1994): 801. http://dx.doi.org/10.1071/mf9940801.

Full text
Abstract:
The advantages of airborne scanning for the detection, identification and mapping of algal species, cyanobacteria and associated water parameters (such as turbidity) can be realized if current research outcomes are developed into operational methods based on images with high spectral resolution. Evidence for this has become available through data obtained recently in Australia from the Compact Airborne Spectrographic Imager. This paper shows how pigments associated with cyanobacteria are detectable, even in the very turbid waters typical of eastern Australia. It demonstrates how, if the waterbodies and their constituents can be characterized by a programme of field and laboratory measurement, current processing techniques and models allow the concentrations of different constituents (algae and particles) in the photic zone to be estimated and mapped. The challenge for operational remote sensing of optical water quality in Australia (and throughout the world) is seen to have two components. One is to provide an effective characterization of the target inland and adjacent coastal waters and the other is to streamline the data analysis to provide maps of water properties in the time and cost frameworks required for operational use.
APA, Harvard, Vancouver, ISO, and other styles
48

Goncharov, Alexey, Nicholas Rawlinson, and Bruce Goleby. "Towards upgraded capability in offshore seismic studies with the new National Australian Pool of Ocean Bottom Seismographs." APPEA Journal 53, no. 2 (2013): 482. http://dx.doi.org/10.1071/aj12093.

Full text
Abstract:
In 2013, Australia, for the first time in its history, will obtain a national pool of ocean-bottom seismographs (OBSs) suitable for multi-scale experiments at sea and for onshore-offshore combined observations. Twenty broadband OBS instruments will be purchased for short- and long-term deployment (up to 12 months) to a maximum depth of 6 km. The instruments will be made available to Australian researchers through ANSIR, with only the costs of mobilisation and deployment to be met. It is anticipated that the OBS facility will greatly enhance the research capabilities of Australian scientists in the area of Earth imaging, offshore exploration, and natural hazard assessment. OBS experiments in Australia have been limited so far, with the only data set collected by Geoscience Australia in 1995–96 on a number of coincident reflection/refraction seismic transects across the northwestern Australian Margin. The main findings from that experiment will be reviewed in the context of recent OBS data processing and acquisition advances. The scope of the experiments with the new national Australian pool of OBSs will be presented, as well as practicalities and logistics of the OBS experiments.
APA, Harvard, Vancouver, ISO, and other styles
49

Kovacs, Eva M., Chris Roelfsema, James Udy, Simon Baltais, Mitchell Lyons, and Stuart Phinn. "Cloud Processing for Simultaneous Mapping of Seagrass Meadows in Optically Complex and Varied Water." Remote Sensing 14, no. 3 (January 27, 2022): 609. http://dx.doi.org/10.3390/rs14030609.

Full text
Abstract:
Improved development of remote sensing approaches to deliver timely and accurate measurements for environmental monitoring, particularly with respect to marine and estuarine environments, is a priority. We describe a machine learning, cloud-processing protocol for the simultaneous mapping of seagrass meadows in waters of variable quality across Moreton Bay, Australia. This method was adapted from a protocol developed for mapping coral reef areas. Georeferenced spot-check field-survey data were obtained across Moreton Bay, covering areas of differing water quality, and categorized into either substrate or ≥25% seagrass cover. These point data, together with coincident Landsat 8 OLI satellite imagery (30 m resolution; pulled directly from Google Earth Engine’s public archive) and a bathymetric layer (30 m resolution), were used to train a random forest classifier. The semiautomated machine learning algorithm was applied to map seagrass in shallow areas of variable water quality simultaneously, and a bay-wide map was created for Moreton Bay. The output benthic habitat map representing seagrass presence/absence was accurate (63%), as determined by validation with an independent data set.
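The classification step, a random forest over per-point band values and bathymetry trained on spot-check labels, can be mimicked locally with scikit-learn rather than Earth Engine; everything in the sketch below, including the feature set and the synthetic labels, is an illustrative assumption:

```python
# Illustrative local stand-in for the random-forest classification step:
# train on per-point band values plus depth, predict seagrass presence/absence.
# Features and data are assumptions; the paper runs this on Landsat 8 OLI
# imagery in Google Earth Engine rather than on a local table.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 400
# Hypothetical features at each field spot-check point:
# blue, green, red, NIR reflectance and bathymetric depth (m).
X = np.column_stack([
    rng.uniform(0.01, 0.15, n),   # blue
    rng.uniform(0.01, 0.20, n),   # green
    rng.uniform(0.01, 0.10, n),   # red
    rng.uniform(0.00, 0.05, n),   # NIR (mostly absorbed by water)
    rng.uniform(0.5, 15.0, n),    # depth
])
# Fake label: call it seagrass when green reflectance is modest and water shallow.
y = ((X[:, 1] < 0.10) & (X[:, 4] < 8.0)).astype(int)

rf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(rf, X, y, cv=5).mean())
rf.fit(X, y)
print("feature importances:", rf.feature_importances_.round(2))
```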
APA, Harvard, Vancouver, ISO, and other styles
50

Shen, Ruiyun. "Application of MODIS Data-Based Forest Fire Monitoring and Assessment." Highlights in Science, Engineering and Technology 17 (November 10, 2022): 86–90. http://dx.doi.org/10.54097/hset.v17i.2510.

Full text
Abstract:
Forest fires are uncontrollable fires that spread freely within forest land, causing significant harm and damage, and thus their monitoring and assessment are crucial. MODIS data have a wide range of applications to forest fires, but these have mainly been targeted at solving regional problems. This study reviews MODIS data technology and examples of its application to forest fires in Heilongjiang, Australia, Fujian Province, and the Daxinganling Mountains, confirming its potential for monitoring and assessing forest fires. MODIS images and fire products contribute significantly to the usefulness and accuracy of dynamic forest fire identification and monitoring and to the accurate determination of the ignition location, owing to their high resolution, excellent calibration, and positioning processing. MODIS and its corresponding product datasets can also be used to construct multiple vegetation and associated indicators to quantify changes in vegetated area and to analyze the damage caused by forest fires. It is therefore an ideal data source for monitoring and assessing forest fires.
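The vegetation-change indicators mentioned reduce to simple band arithmetic; as a generic sketch (synthetic arrays stand in for MODIS red and near-infrared bands, and the change threshold is arbitrary), an NDVI-difference example:

```python
# Generic band-arithmetic sketch of a pre/post-fire vegetation-change indicator
# (NDVI difference). Synthetic arrays stand in for MODIS red and NIR bands;
# the 0.2 change threshold is arbitrary and for illustration only.
import numpy as np

def ndvi(red, nir):
    return (nir - red) / np.clip(nir + red, 1e-6, None)

rng = np.random.default_rng(7)
shape = (100, 100)
red_pre, nir_pre = rng.uniform(0.05, 0.10, shape), rng.uniform(0.3, 0.5, shape)
red_post = red_pre + 0.02                # slight post-fire brightening in red
nir_post = nir_pre.copy()
nir_post[40:70, 40:70] *= 0.4            # simulated burn scar: NIR collapses

d_ndvi = ndvi(red_pre, nir_pre) - ndvi(red_post, nir_post)
burned = d_ndvi > 0.2                    # arbitrary change threshold
print(f"estimated burned fraction: {burned.mean():.2%}")
```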
APA, Harvard, Vancouver, ISO, and other styles