Academic literature on the topic 'Just-in-time systems Australia'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Just-in-time systems Australia.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Just-in-time systems Australia"

1

Hulugalle, N. R., T. B. Weaver, L. A. Finlay, and V. Heimoana. "Soil organic carbon concentrations and storage in irrigated cotton cropping systems sown on permanent beds in a Vertosol with restricted subsoil drainage." Crop and Pasture Science 64, no. 8 (2013): 799. http://dx.doi.org/10.1071/cp12374.

Abstract:
Long-term studies of soil organic carbon dynamics in two- and three-crop rotations in irrigated cotton (Gossypium hirsutum L.) based cropping systems under varying stubble management practices in Australian Vertosols are relatively few. Our objective was to quantify soil organic carbon dynamics during a 9-year period in four irrigated, cotton-based cropping systems sown on permanent beds in a Vertosol with restricted subsoil drainage near Narrabri in north-western New South Wales, Australia. The experimental treatments were: cotton–cotton (CC); cotton–vetch (Vicia villosa Roth. in 2002–06, Vicia benghalensis L. in 2007–11) (CV); cotton–wheat (Triticum aestivum L.), where wheat stubble was incorporated (CW); and cotton–wheat–vetch, where wheat stubble was retained as in-situ mulch (CWV). Vetch was terminated during or just before flowering by a combination of mowing and contact herbicides, and the residues were retained as in situ mulch. Estimates of carbon sequestered by above- and below-ground biomass inputs were in the order CWV >> CW = CV > CC. Carbon concentrations in the 0–1.2 m depth and carbon storage in the 0–0.3 and 0–1.2 m depths were similar among all cropping systems. Net carbon sequestration rates did not differ among cropping systems and did not change significantly with time in the 0–0.3 m depth, but net losses occurred in the 0–1.2 m depth. The discrepancy between measured and estimated values of sequestered carbon suggests that either the value of 5% used to estimate carbon sequestration from biomass inputs was an overestimate for this site, or post-sequestration losses may have been high. The latter has not been investigated in Australian Vertosols. Future research efforts should identify the cause and quantify the magnitude of these losses of organic carbon from soil.
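The discrepancy discussed above hinges on a simple proportional estimate: the carbon assumed to be sequestered is taken as a fixed fraction (5% in this study) of cumulative above- and below-ground biomass carbon inputs, and that estimate is then compared with the measured change in soil organic carbon stock. The minimal sketch below illustrates that style of comparison; all figures in it are hypothetical and are not the study's data.

```python
# Illustrative only: estimate soil C sequestration as a fixed fraction of
# cumulative biomass C inputs and compare it with a measured change in
# soil organic carbon (SOC) stock. All numbers below are hypothetical.

HUMIFICATION_FRACTION = 0.05  # fraction of biomass C assumed to enter SOC

# Hypothetical annual biomass C inputs (t C/ha/yr) over a 9-year period
annual_biomass_c_inputs = [4.2, 3.8, 5.1, 4.6, 4.9, 4.4, 5.0, 4.7, 4.5]

estimated_sequestration = HUMIFICATION_FRACTION * sum(annual_biomass_c_inputs)

# Hypothetical measured SOC stocks (t C/ha, 0-0.3 m) at start and end of period
soc_start, soc_end = 38.0, 38.4
measured_change = soc_end - soc_start

print(f"Estimated sequestration from inputs: {estimated_sequestration:.2f} t C/ha")
print(f"Measured change in SOC stock:        {measured_change:.2f} t C/ha")
print(f"Discrepancy (possible post-sequestration loss or overestimate): "
      f"{estimated_sequestration - measured_change:.2f} t C/ha")
```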
2

Zilberman, N. V., D. H. Roemmich, S. T. Gille, and J. Gilson. "Estimating the Velocity and Transport of Western Boundary Current Systems: A Case Study of the East Australian Current near Brisbane." Journal of Atmospheric and Oceanic Technology 35, no. 6 (June 2018): 1313–29. http://dx.doi.org/10.1175/jtech-d-17-0153.1.

Abstract:
Western boundary currents (WBCs) are highly variable narrow meandering jets, making assessment of their volume transports a complex task. The required high-resolution temporal and spatial measurements are available only at a limited number of sites. In this study a method is developed for improving estimates of the East Australian Current (EAC) mean transport and its low-frequency variability, using complementary modern datasets. The present calculation is a case study that will be extended to other subtropical WBCs. The method developed in this work will reduce uncertainties in estimates of the WBC volume transport and in the interannual mass and heat budgets of the meridional overturning circulations, improving our understanding of the response of WBCs to local and remote forcing on long time scales. High-resolution expendable bathythermograph (HR-XBT) profiles collected along a transect crossing the EAC system near Brisbane, Australia, are merged with coexisting profiles and parking-depth trajectories from Argo floats, and with altimetric sea surface height data. Using HR-XBT/Argo/altimetry data combined with Argo trajectory-based velocities at 1000 m, the 2004–15 mean poleward alongshore transport of the EAC is 19.5 ± 2.0 Sv (1 Sv ≡ 10⁶ m³ s⁻¹), of which 2.5 ± 0.5 Sv recirculate equatorward just offshore of the EAC. These transport estimates are consistent in their mean and variability with concurrent and nearly collocated moored observations at 27°S, and with earlier moored observations along 30°S. Geostrophic transport anomalies in the EAC system, including the EAC recirculation, show a standard deviation of ±3.1 Sv at interannual time scales between 2004 and 2015.
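For readers unfamiliar with the unit, the transports above are section-integrated velocities: 1 Sv ≡ 10⁶ m³ s⁻¹ of volume flux. The sketch below integrates an idealised, synthetic velocity section to a transport in sverdrups; it is not the paper's merged HR-XBT/Argo/altimetry calculation, just the basic operation behind such estimates.

```python
import numpy as np

# Illustrative only: integrate an idealised alongshore velocity section to a
# volume transport in sverdrups (1 Sv = 1e6 m^3/s). The velocity field below
# is synthetic and stands in for a merged observational product.
depths = np.linspace(0.0, 2000.0, 201)        # m, positive downward
offshore = np.linspace(0.0, 200e3, 401)       # m, distance from the coast
D, X = np.meshgrid(depths, offshore, indexing="ij")

# Surface-intensified poleward jet (negative = poleward), decaying with depth
# and with distance from a core located ~50 km offshore.
v = -1.3 * np.exp(-D / 300.0) * np.exp(-((X - 50e3) / 30e3) ** 2)   # m/s

# Transport = double integral of v over the section, then convert to Sv.
transport = np.trapz(np.trapz(v, offshore, axis=1), depths)          # m^3/s
print(f"Alongshore transport: {transport / 1e6:.1f} Sv (negative = poleward)")
```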
3

Nichols, Phillip, and Philip Cocks. "Use of bulk hybrid populations to select for adaptation to contrasting environments in subterranean clover." NZGA: Research and Practice Series 12 (January 1, 2006): 157–62. http://dx.doi.org/10.33584/rps.12.2006.3017.

Abstract:
Population changes were measured over 17 years within a highly variable bulk hybrid population of subterranean clover in a short and long growing season Mediterranean-type environment in Western Australia. Flowering time was used as an indicator of evolutionary change and was highly responsive to environment. Markedly different populations evolved, with rapid selection for early flowering at the short growing season site and later flowering at the long growing season site. The use of bulk hybrid populations is suggested as a low-input means of breeding and selecting annual pasture legumes adapted to target environments and farming systems. While adapted genotypes can be selected after just 3 seasons, further adaptive fine-tuning occurs with increased homozygosity. The success of the method hinges on the original parents containing genes for desirable characters, trial sites being representative of target environments, and trial management being representative of typical farm practice.
4

Sherwood, John E., Jim M. Bowler, Stephen P. Carey, John Hellstrom, Ian J. McNiven, Colin V. Murray-Wallace, John R. Prescott, et al. "The Moyjil site, south-west Victoria, Australia: chronology." Proceedings of the Royal Society of Victoria 130, no. 2 (2018): 32. http://dx.doi.org/10.1071/rs18005.

Abstract:
An unusual shell deposit at Moyjil (Point Ritchie), Warrnambool, in western Victoria, has previously been dated at 67±10 ka and has features suggesting a human origin. If human, the site would be one of Australia’s oldest, justifying a redetermination of age using amino acid racemisation (AAR) dating of Lunella undulata (syn. Turbo undulatus) opercula (the dominant shellfish present) and optically stimulated luminescence (OSL) of the host calcarenite. AAR dating of the shell bed and four Last Interglacial (LIG) beach deposits at Moyjil and Goose Lagoon, 30 km to the west, confirmed a LIG age. OSL analysis of the host sand revealed a complex mixing history, with a significant fraction (47%) of grains giving an early LIG age (120–125 ka) using a three-component mixing model. Shell deposition following the LIG sea-level maximum at 120–125 ka is consistent with stratigraphic evidence. A sand layer immediately below the shell deposit gave an age of ~240 ka (i.e. MIS 7) and appears to have been a source of older sand incorporated into the shell deposit. Younger ages (~60–80 ka) are due to bioturbation before calcrete finally sealed the deposit. Uranium/thorium methods were not applicable to L. undulata opercula or an otolith of the fish Argyrosomus hololepidotus because they failed to act as closed systems. A U–Th age of 103 ka for a calcrete sheet within the 240 ka sand indicates a later period of carbonate deposition. Calcium carbonate dripstone from a LIG wave-cut notch gave a U–Th age of 11–14 ka suggesting sediment cover created a cave-like environment at the notch at this time. The three dating techniques have collectively built a chronology spanning the periods before and after deposition of the shell bed, which occurred just after the LIG sea-level maximum (120–125 ka).
5

Sullivan, Clair, Ides Wong, Emily Adams, Magid Fahim, Jon Fraser, Gihan Ranatunga, Matthew Busato, and Keith McNeil. "Moving Faster than the COVID-19 Pandemic: The Rapid, Digital Transformation of a Public Health System." Applied Clinical Informatics 12, no. 02 (March 2021): 229–36. http://dx.doi.org/10.1055/s-0041-1725186.

Abstract:
Background: Queensland, Australia has been successful in containing the COVID-19 pandemic. Underpinning that response has been a highly effective virus containment strategy which relies on identification, isolation, and contact tracing of cases. The dramatic emergence of the COVID-19 pandemic rendered traditional paper-based systems for managing contact tracing no longer fit for purpose. A rapid digital transformation of the public health contact tracing system occurred to support this effort. Objectives: The objectives of the digital transformation were to shift legacy systems (paper or standalone electronic systems) to a digitally enabled public health system, where data are centered around the consumer rather than isolated databases. The objective of this paper is to outline this case study and detail the lessons learnt to inform and give confidence to others contemplating digitization of public health systems in response to the COVID-19 pandemic. Methods: This case study is set in Queensland, Australia. Universal health care is available. A multidisciplinary team was established consisting of clinical informaticians, developers, data strategists, and health information managers. An agile "pair-programming" approach was undertaken to application development and extensive change efforts were made to maximize adoption of the new digital workflows. Data governance and flows were changed to support rapid management of the pandemic. Results: The digital coronavirus application (DCOVA) is a web-based application that securely captures information about people required to quarantine and creates a multiagency secure database to support a successful containment strategy. Conclusion: Most of the literature surrounding digital transformation allows time for significant consultation, which was simply not possible under crisis conditions. Our observation is that staff was willing to adopt new digital systems because the reason for change (the COVID-19 pandemic) was clearly pressing. This case study highlights just how critical a unified purpose is to successful, rapid digital transformation.
6

Cave, Danielle, Karen Abbey, and Sandra Capra. "Food and Nutrition Champions in Residential Aged Care Homes Are Key for Sustainable Systems Change within Foodservices; Results from a Qualitative Study of Stakeholders." Nutrients 13, no. 10 (October 12, 2021): 3566. http://dx.doi.org/10.3390/nu13103566.

Abstract:
The role of foodservices in aged care is difficult to understand, and strategies to improve the nutritional care of residents are often unsustainable. In particular, food-first strategies such as food fortification are poorly executed in everyday practice and its execution relies upon the foodservice system in aged care homes. The aim of this study was to explore the perspective of staff on the role of foodservices in aged care and gauge the level of skills, education, access, time, and ability to deliver food fortification. Semi-structured interviews were conducted with foodservice managers, foodservice workers, dietitians, carers, and other managers who work in aged care homes across Australia. Participants were recruited purposively through email and through snowballing. Interviews (n = 21) were recorded, transcribed verbatim, and analyzed using inductive thematic analysis. Three themes and six sub-themes were identified. The three themes include the role of foodservices being more than just serving food, teamwork between all staff to champion nutrition, and workplace culture that values continuous improvement. These themes identify how staff perceive the role of foodservices in aged care and provide an important perspective on the long-term sustainability of food fortification strategies and how to improve current practice.
7

Ma, Zhenliang, Sicong Zhu, Haris N. Koutsopoulos, and Luis Ferreira. "Quantile Regression Analysis of Transit Travel Time Reliability with Automatic Vehicle Location and Farecard Data." Transportation Research Record: Journal of the Transportation Research Board 2652, no. 1 (January 2017): 19–29. http://dx.doi.org/10.3141/2652-03.

Abstract:
Transit agencies increasingly deploy planning strategies to improve service reliability and real-time operational control to mitigate the effects of travel time variability. The design of such strategies can benefit from a better understanding of the underlying causes of travel time variability. Despite a significant body of research on the topic, findings remain influenced by the approach used to analyze the data. Most studies use linear regression to characterize the relationship between travel time reliability and covariates in the context of central tendency. However, in many planning applications, the actual distribution of travel time and how it is affected by various factors is of interest, not just the conditional mean. This paper describes a quantile regression approach to analyzing the impacts of the underlying determinants on the distribution of travel times rather than its central tendency, using supply and demand data from automatic vehicle location and farecard systems collected in Brisbane, Australia. Case studies revealed that the quantile regression model provides more indicative information than does the conditional mean regression method. Moreover, most of the coefficients estimated from quantile regression are significantly different from the conditional mean–based regression model in terms of coefficient values, signs, and significance levels. The findings provide information related to the impacts of planning, operational, and environmental factors on speed and its variability. On the basis of this information, transit designers and planners can design targeted strategies to improve travel time reliability effectively and efficiently.
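As a concrete illustration of the contrast the abstract draws, the sketch below fits an ordinary least-squares (conditional-mean) model and quantile regressions at the median and 90th percentile using the statsmodels library on synthetic, heteroscedastic travel-time data. It is not the paper's model or its Brisbane dataset, just the general technique.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000

# Synthetic data: travel time grows with passenger load, and its *spread*
# grows too (heteroscedasticity), which is where quantile regression helps.
load = rng.uniform(0, 100, n)                        # boardings per trip (hypothetical)
peak = rng.integers(0, 2, n)                         # 1 = peak period
travel_time = (20 + 0.10 * load + 3 * peak
               + rng.normal(0, 1 + 0.05 * load, n))  # noise variance grows with load

df = pd.DataFrame({"travel_time": travel_time, "load": load, "peak": peak})

# Conditional-mean model (OLS)
ols = smf.ols("travel_time ~ load + peak", data=df).fit()

# Quantile regressions for the median and the 90th percentile
q50 = smf.quantreg("travel_time ~ load + peak", data=df).fit(q=0.50)
q90 = smf.quantreg("travel_time ~ load + peak", data=df).fit(q=0.90)

print("OLS (conditional mean):", ols.params.round(3).to_dict())
print("Quantile q=0.50:       ", q50.params.round(3).to_dict())
print("Quantile q=0.90:       ", q90.params.round(3).to_dict())
```

With data like these, the load coefficient at the 90th percentile exceeds the OLS coefficient, mirroring the paper's point that covariates can affect the tails of the travel-time distribution differently from its mean.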
8

Maass, Brigitte L., and Bruce C. Pengelly. "Tropical and subtropical forage germplasm conservation and science on their deathbed! 1. A journey to crisis." Outlook on Agriculture 48, no. 3 (September 2019): 198–209. http://dx.doi.org/10.1177/0030727019867961.

Abstract:
While interest in the potential of tropical and subtropical forage (TSTF) germplasm for improved livestock production commenced earlier, it was not until the 1950s and 1960s that plant collecting and research on diversity and utilization of grasses and legumes reached significant global momentum. The subsequent engagement in pasture and forage research by the Consultative Group on International Agricultural Research (CGIAR) centres, such as the International Center for Tropical Agriculture (CIAT) and International Livestock Centre for Africa (ILCA; 1974–1995)/International Livestock Research Institute (ILRI; since 1995) from the 1970s onwards, built on the advances made by national centres in Australia, the United States of America, Kenya and elsewhere. By 1990–2000, TSTFs were recognized for contributing to a range of commercial and smallholder livestock production systems in Latin America, Australia, Southeast Asia, South Asia and Africa. However, their use, the value of further research and the need to maintain the very large and diverse collections held in international and national genebanks were challenged by this time because of perceived environmental risks, questions about whether or not past achievements could be bettered and the high costs of maintaining genebanks. Since then, the decline in investment and the quality of conservation and curation has been a relatively rapid process and reached the crisis point of today in just 20–25 years. This article traces 70 years of expansion and then decline of plant collecting, conservation, research and commercialization of TSTFs as a new commodity and examines the reasons for the sharp changes that have taken place. In a second article (this issue), the argument is made for swift and drastic action to prevent critical germplasm from being lost, to enable genebanks to play their crucial and unique role in underpinning improving production and productivity in livestock systems and to provide key germplasm tools to achieve environmental benefits.
9

Gedeon, T. D. "Multimedia Information Compression Technologies." Journal of Advanced Computational Intelligence and Intelligent Informatics 4, no. 6 (November 20, 2000): 401–2. http://dx.doi.org/10.20965/jaciii.2000.p0401.

Abstract:
Introduction: We are drowning in data. What kinds of data? Text. Images. Sound. Numeric. Genome data. Text: Every day vast amounts of textual data are generated. This ranges from private corporate data, personal information, public and private government documents and so on. Much of this data needs to be accessed by many users for many tasks. For example, a corporate call centre needs fast access to documents at a semi-concept level to answer user requests. Another example: large litigations can involve 2 million documents, 200,000 of which are relevant, much fewer significant, and a handful pivotal. Techniques are desperately needed to automate the first few steps of this winnowing. Images: There are video cameras everywhere, trying to protect our safety in car parks, public places, even some lifts. There are huge and ever-growing still and video archives of all aspects of our modern world. Accessing and indexing this data is a huge research enterprise. Much indexing is done manually. Sound: Often in concert with video in multi-media recordings. But what did the Prime Minister say on the 1st of November about the Republic? Did he sound like he meant it? These are currently not easily answered queries except if carried out by an expert human investigator. These kinds of queries will need to be commonplace to access sound data in humanly meaningful ways. Numeric: Our industries generate vast amounts of valuable numeric data. In the petroleum industry geologic knowledge must be integrated with data from wells: laboratory core analysis data and on-site well logs, with seismic data generated from controlled explosions and dispersed recording devices. Then there is GIS data collected from satellites and so on. In the service industry, the stock exchange generates large amounts of hard-to-analyse data vital to the wellbeing of Australian companies. Genome data: The human genome project is almost complete. Researchers are finding genes by a mix of laboratory work and computerised database searches (e.g. as reported in the Weekend Australian 30 October). This is just the first step; the next will be sequencing of a number of individuals, and of course there are currently over 100 whole-genome sequencing projects on other species. Fast genome sequencing is just around the corner. We will soon be drowning in this kind of data also. Multimedia data: Includes all of audio, text, graphics, images, video, animation, music. More data! What Is The Real Problem? Manual extraction of information from any large corpus is time consuming and expensive, requiring specialised experience in the material. Even worse, beyond a certain point it is incredibly boring, and hence error prone. Human intelligence is best suited to dealing with information, as distinct from data! A Solution: The development of automated systems for information extraction, and for the synthesis of the extracted information into humanly useful information resources. To avoid drowning in the ever-increasing flow of multi-modal electronic information available, automated tools are required to reduce the cognitive load on users. Steps Towards a Solution: The key step towards a solution is the notion of information compression, being the compression of data to yield an information-rich(er) resource. This is distinct from data compression, which is merely the efficient storage of data. Further, the information compression must work on multi-modal complex data, exemplified by multimedia data.
Some of the techniques for doing this kind of information compression exist in a scattered way in areas such as fuzzy systems and image analysis. We have identified a nascent field, which we can coalesce in an intensive short workshop. The first Australia-Japan Joint Workshop on Applications of Soft and Intelligent Computing to Multimodal and Multimedia Information Compression Technologies was held at Murdoch University in Perth, Western Australia, from 29 March to 5 April 2000. This special issue contains selected papers from the workshop.
10

McKenzie, R., and C. Seago. "Assessment of real losses in potable water distribution systems: some recent developments." Water Supply 5, no. 1 (March 1, 2005): 33–40. http://dx.doi.org/10.2166/ws.2005.0005.

Abstract:
Considerable progress has been made over the past 10 years in the assessment and benchmarking of real losses in potable water distribution systems. Most of the advances have been based on the burst and background estimate (BABE) methodology, which was first developed in the mid-1990s by the UK water industry and has since been widely accepted and used in many parts of the world. Since the original BABE methodology was developed, several other key concepts have been added to the ever-growing list of water demand management tools. In particular, the infrastructure leakage index (ILI) and unavoidable annual real losses (UARL) introduced by A. Lambert, and the fixed area variable area discharge (FAVAD) theory by J. May, are now recognised as key “tools of the trade” in any water demand management assessment. One of the first main developments where the above-mentioned concepts were applied in practice to benchmark leakage was in South Africa, where the local Water Research Commission supported the production of the BENCHLEAK Model. This was basically the first comprehensive model to assess real losses in potable water distribution systems using the UARL and ILI concepts. The model was developed by one of the authors together with A. Lambert, and was soon followed by similar developments in Australia (BENCHLOSS) and New Zealand (BENCHLOSSNZ). Both models incorporated additions and enhancements to the original South African model, and were tailored to suit the local conditions in line with the clients' requirements. Similar developments took place in parallel by various leakage specialists, most notably in Brazil, Malaysia and Cyprus, to mention just a few of the similar initiatives. Each time a new model was developed, certain improvements were made and the “science” of leakage management and benchmarking was enhanced. Through the use of the different models and from discussions with various researchers from around the world, it has become clear that there is a genuine need for such models, and they are being readily accepted by clients in most areas. The discussions have also raised many questions concerning the derivation of the terms used to calculate the UARL and the ILI, and, to address these concerns, a specialist group was created through the IWA to investigate the various issues. This paper will highlight the progress that has been made to date with regard to the key issues that have been raised by the task-team members, and recommendations based on the feedback that has been received from around the world. The paper will also present some of the results that have been obtained from different parts of the world to highlight both the progress and the problems associated with the assessment of real losses. The paper will conclude with a short description of several new models that have been developed and are in use, which demonstrate the latest improvements to an ongoing process to assess and benchmark real losses in water distribution systems.
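For context, the two benchmarking indicators mentioned are conventionally computed as follows: unavoidable annual real losses via the widely quoted IWA relationship UARL (litres/day) = (18 × Lm + 0.8 × Nc + 25 × Lp) × P, where Lm is mains length (km), Nc is the number of service connections, Lp is the length of private pipe between property boundary and meter (km), and P is the average operating pressure (m); and the infrastructure leakage index as ILI = CARL / UARL, where CARL is the current annual real losses. The sketch below applies these formulas to hypothetical system figures; it is not output from BENCHLEAK, BENCHLOSS or any of the models discussed.

```python
def uarl_litres_per_day(mains_km, n_connections, private_pipe_km, pressure_m):
    """Unavoidable annual real losses, using the commonly quoted IWA formula (L/day)."""
    return (18.0 * mains_km + 0.8 * n_connections + 25.0 * private_pipe_km) * pressure_m

def ili(carl_litres_per_day, uarl_litres_per_day_value):
    """Infrastructure leakage index = current annual real losses / UARL."""
    return carl_litres_per_day / uarl_litres_per_day_value

# Hypothetical small distribution system
uarl = uarl_litres_per_day(mains_km=500, n_connections=20_000,
                           private_pipe_km=150, pressure_m=50)
carl = 2_500_000  # measured/estimated real losses, litres per day (hypothetical)

print(f"UARL: {uarl:,.0f} L/day")
print(f"ILI:  {ili(carl, uarl):.2f}")
```

An ILI close to 1 indicates losses near the technical minimum for that infrastructure; higher values flag increasing scope for leakage reduction.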

Book chapters on the topic "Just-in-time systems Australia"

1

Britnell, Mark. "Australia—golden soil and wealth for toil." In Human: Solving the global workforce crisis in healthcare, 104–11. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198836520.003.0013.

Abstract:
Australia has set a new world record by enjoying 27 consecutive years of economic growth. It is on the right side of the world at just the right time in history, as Asia rises. It consistently ranks highly in the OECD Better Life Index which looks at the level of well-being in society. Indeed, the title of this chapter takes some of the lyrics out of the Australian national anthem, Advance Australia Fair. Its healthcare staff are well paid and looked after and clinical facilities are often good, but Australia’s workforce challenges are shaped by the vastness of its land and the enduring inequalities in health outcomes of its Aboriginal and Torres Strait Islander people. In this chapter, Mark Britnell takes a closer look at the Australian healthcare system and how it affects the country as a whole.
2

Samuel, Delyth, and Danny Samson. "Government Insurer Enters the Brave New World." In IT Outsourcing, 1379–90. IGI Global, 2010. http://dx.doi.org/10.4018/978-1-60566-770-6.ch085.

Abstract:
Governments provide a wide range of services, and the digital economy provides both threats and opportunities in this sector. The Transport Accident Commission (TAC) is a compulsory, government owned and operated insurance scheme for third-party, no-fault liability insurance for transport accident victims, operated in Victoria, Australia. E-business has now been widely used in all sectors from small business (Loane, McNaughton, & Bell, 2004) to emerging economies (Li & Chang, 2004), and in very different industry sectors (Cagno, Di Giulio, & Trucco, 2004; Golden, Hughes, & Gallagher, 2003). Major steps forward and applications have occurred in retailing (Leonard & Cronan, 2003; Mackay, Altmann, & McMichael, 2003; Starr, 2003). Applications need to be highly customized as the business-to-consumer (B2C) and business-to-business (B2B) environments are very different, and requirements of industries such as retailing and mining, and indeed government, differ substantially (Carter, 2003; He & Lung, 2002; Rotondaro, 2002). Government provides a particularly different environment for e-business applications because government services are often delivered in monopoly circumstances, with no real profit motive behind them. At the height of the technology boom in October 1999, Tony Marxsen joined the TAC as head of IT to develop a new IT outsourcing contract for the organization as the current 5-year contract was due to end in July 2000. He quickly realized that the TAC IT systems were out of date, lacked IT process integration, and were constraining improvement in business processes, and that no significant investments had been made for some time. Renewing or redesigning the outsourcing contract, the basis for which he had been employed, would only be a short-term solution. The problem was that the cost of new infrastructure would be high, and return on technology investment would mainly be realized from redesigned business processes enabled by the new technology. Tony wanted to propose a business transformation, with process changes as well as significant investment in IT infrastructure. Together, these would take the TAC from 1970s technology into the 21st century. The problem was that their (investments in such transformation) payoffs are not easily and quickly achieved. Their value does not come from installing the technology; it comes from changing both operating and management processes—perhaps operating and managing cultures too. (Ross & Beath, 2002, p. 53) Tony knew he would have to win the support of the board and senior management, but he could not immediately give them a concrete business case for the investment. He also knew that any infrastructure investment had to be linked with a major process-improvement initiative from the start to avoid the double investment of building new applications to support old processes, and then undertaking major modifications or even replacement when the need for improvement became obvious to the board and management team. He compared investing in IT infrastructure to rewiring and replumbing your house: as far as visitors are concerned, there’s no visible difference, everything’s behind the walls, but as the owner you get the benefits of things like cheaper electricity and water bills because of efficiencies in the new redesigned systems. 
The problem is convincing people that they will get these results in the future, but that they need to hand over the money now, when there’s no hard evidence for the benefits they’ll get, just a bunch of assumptions and no guarantees. It’s a big ask for any Board. (Marxsen, personal communication, September 4, 2003) Tony knew that the first hurdle he would have to overcome would be getting the board to agree to give him the opportunity to put together a team to develop a business case for the board’s further consideration.

Conference papers on the topic "Just-in-time systems Australia"

1

Barbosa, Fábio C. "High Speed Rail Technology: Increased Mobility With Efficient Capacity Allocation and Improved Environmental Performance." In 2018 Joint Rail Conference. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/jrc2018-6137.

Abstract:
The increasing movement of people and products caused by modern economic dynamics has burdened transportation systems. Both industrialized and developing countries have faced transportation problems in urbanized regions and in their major intercity corridors. Regional and highway congestion have become a chronic problem, causing longer travel times, economic inefficiencies, deterioration of the environment and quality of life. Congestion problems are also occurring at airports and air corridors, with similar negative effects. In the medium distance travel market (from 160 up to 800 km), too far to drive and too short to fly, High Speed Rail (HSR) technology has emerged as a modern transportation system, as it is the most efficient means for transporting large passenger volumes with high speed, reliability, safety, passenger comfort and environmental performance. An HSR system's feasibility will depend on its capacity to generate social benefits (i.e. increased mobility rates, reduced congestion, capacity increase and reduced environmental costs), to be balanced with the high construction, maintenance and operational costs. So, it is essential to select HSR corridors with strong passenger demands to maximize these benefits. The first HSR line was Japan's Shinkansen service, a dedicated HSR system between Tokyo and Osaka, launched in 1964, which is currently the most heavily loaded HSR corridor in the world. France took the next step, launching the Train à Grande Vitesse (TGV) in 1981, with a dedicated line with shared-use segments in urban areas, running between Paris and Lyon. Germany joined the venture in the early 1990s with the Inter City Express (ICE), based on a coordinated program of improvements to existing rail infrastructure, and Spain followed in 1992 with the Alta Velocidad Espanola (AVE), built on dedicated greenfield lines. Since then, these systems have continuously expanded their networks. Currently, many countries are evaluating the construction of new HSR lines, with the European Commission deeming the expansion of the Trans European Network a priority. The United Kingdom, for example, has just awarded construction contracts for the so-called HS2, an expanded HSR line linking London to the north of the country. China, with its dynamic economic development, launched its HSR network in 2007, has accelerated its expansion, and currently holds the largest HSR network. The United States, which currently operates high-speed trainsets in an operationally restricted corridor (the so-called Northeast Corridor (NEC), linking Washington, New York and Boston), has also entered the high-speed rail world with the launch of the Californian HSR project, currently under construction and aimed at linking the Los Angeles and San Francisco mega regions, the ongoing studies for the Texas HSR project to connect Dallas to Houston under a wholly private funding model, and studies for a medium- to long-term NEC upgrade for HSR. Australia and Brazil are also seeking to design and launch their first HSR services, in a time-consuming process in which a deep discussion about social feasibility and affordability is under way.
This work presents an overview of HSR technology worldwide, with an assessment of the main technical, operational and economic features of Asian and European HSR systems, followed by a snapshot of the general guidelines applied to some planned HSR projects, highlighting their demand attraction potential and estimated costs, as well as their projected economic and environmental benefits.
2

el Mouhandiz, Abdel-Ali, and Job Bokhorst. "Analysis and Offshore Support for the Float-Over of a 24,250mT Topsides on the North West Shelf." In ASME 2013 32nd International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/omae2013-10376.

Abstract:
In April 2012 Heerema Marine Contractors (HMC) successfully performed the float-over of the North Rankin B topsides. With a weight of 24,250 mT, it is the heaviest deck installed in a single piece in the harsh offshore environment of Australia's North West Shelf to date. During years of preparation, extensive hydrodynamic analyses were undertaken to establish design loads, optimize the float-over equipment and ultimately determine operability limits for the installation. As the float-over analysis contains various components with non-linear characteristics (e.g. fenders, leg mating units, mooring lines), a non-linear problem-solving approach was used. The selected analysis tools use an efficient time-step integration enabling a large number of simulations to be run. The strong non-linearity of the system requires a Monte Carlo simulation to obtain statistically reliable results. The numerical model included purpose-built modules for the float-over equipment (such as LMUs) taking into account relevant details and project-specific geometries. Available data from model tests was used in calibration and validation of the model. Towards the offshore execution phase, the limiting combinations of wind sea and swell waves were identified. For these combinations the system response remained (just) within its 'Not To Exceed' values. During the offshore operation, prior to the actual float-over, the spectral wave conditions, wind and current were constantly monitored to validate the forecasts. The barge motions were also measured and compared against results from the numerical models. This was used in support of the final decision to proceed with the float-over operation.
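The analysis approach described (time-domain simulation of a non-linear system, repeated over many random realisations in a Monte Carlo loop to obtain statistically reliable response extremes) can be sketched in miniature as follows. This is a toy single-degree-of-freedom surrogate with a one-sided, fender-like stiffness and entirely hypothetical parameters; it is not HMC's float-over model or its calibrated data.

```python
import numpy as np

def peak_contact_load(seed, duration=300.0, dt=0.01):
    """Toy 1-DOF float-over surrogate: a moored mass with a one-sided,
    fender-like contact stiffness, excited by random-phase irregular forcing.
    Returns the peak contact load for one realisation (all values hypothetical)."""
    rng = np.random.default_rng(seed)
    m, c, k_moor = 1.0e6, 5.0e4, 5.0e4        # mass (kg), damping, mooring stiffness
    k_fender, gap = 2.0e6, 0.05               # fender stiffness (N/m), gap (m)

    # Random-phase sum of sinusoids standing in for an irregular sea state.
    freqs = np.linspace(0.05, 0.25, 40)                       # Hz
    amps = 2.0e5 * np.exp(-((freqs - 0.10) / 0.05) ** 2)      # N
    phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
    t = np.arange(0.0, duration, dt)
    force = (amps[:, None] * np.sin(2 * np.pi * freqs[:, None] * t
                                    + phases[:, None])).sum(axis=0)

    x = v = peak = 0.0
    for f in force:                            # semi-implicit Euler time stepping
        contact = k_fender * max(x - gap, 0.0)          # non-linear one-sided force
        a = (f - c * v - k_moor * x - contact) / m
        v += a * dt
        x += v * dt
        peak = max(peak, contact)
    return peak

# Monte Carlo over random seeds to build statistics of the non-linear response,
# then compare against a hypothetical 'Not To Exceed' load.
peaks = np.array([peak_contact_load(s) for s in range(100)])
nte = 1.0e6  # hypothetical Not-To-Exceed contact load (N)
print(f"Mean peak contact load: {peaks.mean():.3e} N")
print(f"99th percentile:        {np.percentile(peaks, 99):.3e} N")
print(f"P(exceed NTE):          {(peaks > nte).mean():.1%}")
```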
