Academic literature on the topic 'Friday 13th risk modelling'



Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Friday 13th risk modelling.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of an academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Friday 13th risk modelling"

1. Abdul-Halim, Nadiya, and Kenneth R. Davey. "A Friday 13th risk assessment of failure of ultraviolet irradiation for potable water in turbulent flow." Food Control 50 (April 2015): 770–77. http://dx.doi.org/10.1016/j.foodcont.2014.10.036.

2. Davey, K. R. "A novel Friday 13th risk assessment of fuel-to-steam efficiency of a coal-fired boiler." Chemical Engineering Science 127 (May 2015): 133–42. http://dx.doi.org/10.1016/j.ces.2015.01.031.

3. Maslina, Maslina, and Bima Dhevrando. "Analisis Keselamatan Lalu-Lintas Jl. Soekarno Hatta Balikpapan" [Traffic Safety Analysis of Jl. Soekarno Hatta, Balikpapan]. INFO-TEKNIK 20, no. 1 (June 20, 2020): 1. http://dx.doi.org/10.20527/infotek.v20i1.6951.

Abstract:
Soekarno Hatta Road is one of the arterial roads between Balikpapan and Samarinda, with a length of 119 kilometres. It is a provincial road carrying both heavy and light vehicles, and it has a high accident rate. This research aimed to identify the factors behind, and the characteristics of, accidents along Soekarno Hatta Road from kilometre zero to kilometre 13. The research began with a road survey and the collection of secondary data, including road accident records for the past three years (2015–2017). The data were analysed by calculating accident numbers with the EAN (Equivalent Accident Number) formula and applying the UCL (Upper Control Limit) method to identify areas with a high accident risk (black spots). Accident anatomy records were then examined for the identified area. It can be concluded that the number of accidents on Soekarno Hatta Road from kilometre zero to kilometre 13 by the EAN method, i.e. more than 84, exceeds the UCL value of 43.3. The accidents typically occurred on weekday mornings (Monday to Friday) and involved two motorcycles and passengers who were male students aged fifteen to twenty.
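The black-spot screening described in this abstract (severity-weighted accident counts compared against an upper control limit) can be sketched in a few lines. The severity weights and the Poisson-style control limit below are common textbook choices, not necessarily the exact formulas used in the paper, and the segment data are invented:

```python
# Sketch of black-spot screening with EAN and a simple upper control limit.
# Severity weights and the UCL form are assumptions (guidelines differ).
import math

# Accident counts per road segment: (fatal, serious injury, light injury, property damage)
segments = {
    "km 0-1": (2, 3, 8, 5),
    "km 1-2": (0, 1, 2, 4),
    "km 2-3": (4, 6, 10, 7),
}

WEIGHTS = (12, 6, 3, 1)  # assumed severity weights

def ean(counts):
    """Equivalent Accident Number: severity-weighted accident total."""
    return sum(w * c for w, c in zip(WEIGHTS, counts))

eans = {seg: ean(c) for seg, c in segments.items()}
lam = sum(eans.values()) / len(eans)        # mean EAN across segments
ucl = lam + 2.576 * math.sqrt(lam)          # Poisson-style limit, z = 2.576 (~99 %)

black_spots = [seg for seg, v in eans.items() if v > ucl]
print(eans, round(ucl, 1), black_spots)
```

A segment whose EAN exceeds the control limit is flagged as a black spot, mirroring the paper's comparison of an EAN above 84 against a UCL of 43.3.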
4. Pennings, Jeroen L. A., Sandra Imholz, Ilse Zutt, Maria P. H. Koster, Jacqueline E. Siljee, Annemieke de Vries, Peter C. J. I. Schielen, and Wendy Rodenburg. "Predictive Performance of a Seven-Plex Antibody Array in Prenatal Screening for Down Syndrome." Disease Markers 2015 (2015): 1–7. http://dx.doi.org/10.1155/2015/519851.

Abstract:
We evaluated the use of multiplex antibody array methodology for simultaneous measurement of serum protein markers for first trimester screening of Down Syndrome (DS) and other pregnancy outcomes such as preeclampsia. For this purpose, we constructed an antibody array for indirect (“sandwich”) measurement of seven serum proteins: pregnancy-associated plasma protein-A (PAPP-A), free beta subunit of human chorionic gonadotropin (fβ-hCG), alpha-fetoprotein (AFP), angiopoietin-like 3 (ANGPTL3), epidermal growth factor (EGF), insulin-like growth factor 2 (IGFII), and superoxide dismutase 1 (SOD1). This array was tested using 170 DS cases and 510 matched controls drawn during the 8th–13th weeks of pregnancy. Data were used for prediction modelling and compared to previously obtained AutoDELFIA immunoassay data for PAPP-A and fβ-hCG. PAPP-A and fβ-hCG serum concentrations obtained using antibody arrays were highly correlated with AutoDELFIA data. Moreover, DS prediction modelling using (log-MoMmed) antibody array and AutoDELFIA data gave comparable results. Of the other markers, AFP and IGFII showed significant changes in concentration, although adding these markers to a prediction model based on prior risk, PAPP-A and fβ-hCG did not improve the predictive performance. We conclude that implementation of antibody arrays in a prenatal screening setting is feasible but will require additional first trimester screening markers.
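The risk-combination step behind screening models like the one evaluated above can be sketched as follows: markers are converted to multiples of the median (MoM = measured level / gestation-specific median), and an age-related prior risk is updated with a likelihood ratio. All medians, means, SDs and the prior below are illustrative assumptions, not the paper's fitted values:

```python
# Sketch of prior-risk adjustment from log10-MoM marker values, assuming
# independent Gaussian log-MoM distributions per marker (illustrative only).
import math

def gaussian_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def posterior_risk(prior, log_moms, affected, unaffected):
    """Update a prior risk with a likelihood ratio from log10-MoM markers."""
    lr = 1.0
    for x, (mu_a, sd_a), (mu_u, sd_u) in zip(log_moms, affected, unaffected):
        lr *= gaussian_pdf(x, mu_a, sd_a) / gaussian_pdf(x, mu_u, sd_u)
    odds = prior / (1 - prior) * lr
    return odds / (1 + odds)

# Low PAPP-A and high free beta-hCG are the classic first-trimester DS pattern.
log_moms = [math.log10(0.5), math.log10(1.9)]        # [PAPP-A, fb-hCG] MoMs
risk = posterior_risk(
    prior=1 / 250,                                   # assumed age-related prior
    log_moms=log_moms,
    affected=[(-0.35, 0.30), (0.30, 0.28)],          # assumed (mean, SD) of log10 MoM
    unaffected=[(0.0, 0.25), (0.0, 0.25)],
)
print(f"adjusted risk is about 1 in {round(1 / risk)}")
```

With this marker pattern the posterior risk comes out well above the prior, which is the mechanism by which PAPP-A and fβ-hCG drive the predictive performance discussed in the abstract.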
5. Todd, Oliver M., Chris Wilkinson, Matthew Hale, Nee Ling Wong, Marlous Hall, James P. Sheppard, Richard J. McManus, et al. "Is the association between blood pressure and mortality in older adults different with frailty? A systematic review and meta-analysis." Age and Ageing 48, no. 5 (June 5, 2019): 627–35. http://dx.doi.org/10.1093/ageing/afz072.

Abstract:
Objective: to investigate whether the association between blood pressure and clinical outcomes differs in older adults with and without frailty, using observational studies. Methods: MEDLINE, EMBASE and CINAHL were searched from 1st January 2000 to 13th June 2018 (PROSPERO CRD42017081635). We included all observational studies reporting clinical outcomes in community-dwelling older adults with an average age over 65 years, with and without treatment, that measured blood pressure and frailty using validated methods. Two independent reviewers evaluated study quality and risk of bias using the ROBANS tool. We used generic inverse-variance modelling to pool risks of all-cause mortality adjusted for age and sex. Results: nine observational studies involving 21,906 older adults were included, comparing all-cause mortality over a mean of six years. Fixed-effects meta-analysis of six studies demonstrated that in people with frailty there was no mortality difference associated with systolic blood pressure <140 mm Hg compared to systolic blood pressure >140 mm Hg (HR 1.02, 95% CI 0.90 to 1.16). In the absence of frailty, systolic blood pressure <140 mm Hg was associated with a lower risk of death compared to systolic blood pressure >140 mm Hg (HR 0.86, 95% CI 0.77 to 0.96). Conclusions: evidence from observational studies demonstrates no mortality difference for older people with frailty whose systolic blood pressure is <140 mm Hg, compared to those with a systolic blood pressure >140 mm Hg. Current evidence fails to capture the complexities of blood pressure measurement and the association with non-fatal outcomes.
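The generic inverse-variance, fixed-effects pooling named in this abstract can be sketched directly: each study's log hazard ratio is weighted by the inverse of its variance, with the standard error recovered from the 95% confidence interval. The study values below are made up for illustration:

```python
# Fixed-effects, inverse-variance pooling of hazard ratios (illustrative data).
import math

def pool_hazard_ratios(studies):
    """studies: list of (HR, ci_lower, ci_upper); returns pooled HR and 95% CI."""
    num = den = 0.0
    for hr, lo, hi in studies:
        log_hr = math.log(hr)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE from the 95% CI width
        w = 1 / se ** 2                                   # inverse-variance weight
        num += w * log_hr
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

hr, lo, hi = pool_hazard_ratios([(0.95, 0.80, 1.13),
                                 (1.10, 0.92, 1.32),
                                 (1.01, 0.85, 1.20)])
print(round(hr, 2), round(lo, 2), round(hi, 2))
```

A pooled interval straddling 1.0, as here, corresponds to the "no mortality difference" finding reported for the frail subgroup (HR 1.02, 95% CI 0.90 to 1.16).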
6. Moreira de Sousa, A., and R. Capucho. "Local preparedness for Mass Gatherings in Northern Portugal." European Journal of Public Health 30, Supplement_5 (September 1, 2020). http://dx.doi.org/10.1093/eurpub/ckaa166.613.

Abstract:
Issue: lack of preparedness for mass gatherings and their possible impact on public health at the local and national level. Description of the problem: following an outbreak at a Mass Gathering (MG) in the northern region in 2018, the Public Health Unit (PHU) of Alto Tâmega and Barroso, in collaboration with the Northern Region Public Health Department and the local government of Montalegre, implemented a risk assessment tool for new MGs to mitigate the identified risks. Although Montalegre's MGs are of considerable importance to the region's local economy, they are also high-risk events for local and international health, owing to the high number of foreign visitors at these events and the proximity to the Spanish border. Results: the first in-the-field collaboration was the Forest Soul Gathering, a biennial transformational festival held from 17th to 21st of July 2019, with more than 1,000 participants from more than 20 countries. The collaboration with the local government continued throughout the year with several working meetings on mass gatherings and risk management. On Friday 6th, 2020, the PHU provided a risk assessment to the local government on the event Friday 13th (40,000 participants) and classified the risk as major due to the COVID-19 epidemic. In the local civil defence meeting it was decided unanimously to cancel the event, taking into account the PHU's risk assessment. Lessons: there has been an increase in MGs in Portugal. This requires close collaboration between the organisers, the health sector, and local authorities to mitigate the risks of each event. This can serve as an example of how to move from theory to practice in public health risk prevention and institutional cooperation, being one of the first of its kind in Portugal. In the end, it allowed a fast response to the threat of COVID-19 and protection of the public health of the community of Montalegre.
Key messages: proper preparation with partners for a public health emergency is vital to contain an outbreak. Local public health departments need to implement their emergency protocols and risk assessment procedures before a public health emergency.
7. King, Ben. "It's a Scream." M/C Journal 1, no. 5 (December 1, 1998). http://dx.doi.org/10.5204/mcj.1733.

Abstract:
Why do so many horror films feature the young, pretty and prosperous at the business end of a carving knife? A few examples include Scream 2 (1998), I Know What You Did Last Summer (1997), Scream (1997), and The Hand That Rocks The Cradle (1992). In fact, the propensity for Hollywood to portray the narcissistic bourgeoisie being deprived of their pretensions has been around since Murnau sent a real estate agent to a vampire's house in 1922. But there are fundamental differences between horror films like Nosferatu (1922) or Psycho (1961) and the films mentioned above. The purpose of this essay is to suggest that in recent years Hollywood horror narratives have moved away from the tradition of legitimising violence for the viewer who wishes to participate in a world of aggression without feelings of remorse or guilt (Tudor), in favour of attending to the fears associated with a struggling middle class and dwindling American Puritanism. This feature of the modern horror narrative involves identical characterisation of both the victims and the stalkers: they are young, affluent, attractive, and completely desensitised to trauma though hyper-sensitive to materialism and mass media flippancy. In a modern sub-genre of the horror film, defined by Barry Keith Grant as 'yuppie horror' (288), we are seeing narrative representations of economic success and physical beauty involved in the time-honoured murderous passage from Order->Disorder->Order. Exaggerated portrayals of economic and physical superiority are a staple of the horror genre -- they help to establish a veneer of safety which exists only to be shattered. The distinguishing feature of films such as Wes Craven's Scream is that the killers are not hideous misfits; they are in fact equal in beauty and social stature to their victims.
The other quality which defines the yuppie horror is a visual and narrative attention to material wealth and contrived suburban perfection and the ineffectuality of this world at preventing the cathartic violent acts from occurring. In both Scream and Pacific Heights typical symbols of post-modern affluence such as cars, wide screen televisions and plush interior design get destroyed during the bloody process of re-establishing a tenuous order. Prior to the unfolding of this crucial aspect of the plot, important relations and similarities in lifestyle are established between the victims' way of life and that of the killer(s). This is a dramatic shift away from the old school tactic of gradually revealing a dark past which emphatically distances the heroes from the stalkers in a way that preserves the sanctity of the American suburban dream defined by films such as Halloween, Friday the 13th, or Nightmare on Elm Street. The modern horror relies on the audience's understanding that the killers occupy the same exaggeratedly cosy space that the victims do. In most cases the means through which films such as Scream 'address the anxieties of an affluent culture in a period of prolonged recession' (Grant 280) involves the young and beautiful being stripped of their material shelter not by blue collar hicks or monsters but by other yuppies turned playfully psychotic. This revamping of the horror genre plays on strong, new concerns about capitalist ideology and media culture, and informs the audience about what effect this ideology is having on contemporary Western emotional life. The 'playfulness' mentioned above operates on various levels in most films of the genre; typically the yuppie-killers simply make it obvious they are enjoying a kind of selfish revelry in a rare immaterial act. 
Scream, on the other hand, is the best example of a new movement in the yuppie horror sub-genre which maintains a discreet distance from traditional horror via an unnerving joviality which pervades the script, performances and look of the film. The film is simultaneously satirical and diegetically faithful to the genre it debunks. Scream involves well off high school students treating the advent of mass murder in their leafy town as an opportunity to playfully act out clichéd roles which they also fulfil as legitimate victims. One perky cynic remarks: 'I see myself as sort of a young Meg Ryan, but with my luck I'll get Tori Spelling'. The film makes continual references to other films of the genre including those made by Craven himself. Scream has a narrative quality akin to the grim pleasures pursued by Patrick Bateman in the notorious novel by Brett Easton Ellis, American Psycho (1991). In the same manner as Ellis's psycho fetishises his possessions to disavow (justify?) the horrifying brutality of his favourite pastime of indiscriminate slaying, so too do both the victims and the killers of Scream fetishise horror films and media representations of thrill killing. Make no mistake, Scream is a horror film and extremely gory. Its appeal depends on its self-referential and dichotomous relationship with the viewer who is encouraged to reject the conventions of horror via the playfulness of its tone, as well as be horrified by the frequent disembowelling of innocents. In this way, the film cheats us: there is something transcendental about the graphic violence which makes it impossible for Scream to detach itself from the conventions of the horror genre. The playful behaviour of both the protagonists and the director is a very dark message that illustrates the vanishing potential of film to resolve tensions between conscious and unconscious attitudes towards media saturation and trash culture. 
Extremely violent representations of affluent American society during a period of both economic and moral recession in Scream promote the notion that the sanctimonious, puritanical institutions of the middle class are at risk of being exposed due to the desensitising nature of television media, personified in the film by the aggressive and bloodthirsty reporter played by Courteney Cox. It is partially her jocular disavowal of the threat that makes Scream such an interesting film, much more so than similar representations of media in Oliver Stone's Natural Born Killers (1996), due to Cox's clever intertextual link to yuppie heaven in the huge television sitcom, Friends. This idea of symbolising or disguising threats to the American Way has always been a driving force in Hollywood production. The Western is perhaps the most conspicuous example, where staunchly defended pastoral values serve to undermine a perceived social threat posed by the industrial revolution (Wark 10). Other examples include the textualisation of a 'red menace' from Mars in SF films to reinforce Cold War paranoia, and the use of the musical during the thirties to distract audiences from the harsh realities of the Depression. Horror films have traditionally drawn on trauma from the stalker's childhood which is commemorated in the act of killing, and according to revisionist Freudian criticism this representation acts on the predominantly adolescent viewers' voyeuristic desires for psychosexual empowerment over childhood (Tudor 130). The advent of the yuppie horror has corrupted this crucial distinction between the killer and the victim, due to the killer's participation in the same affluent and material world which dominates their lives. This materialism includes the media and their dangerously superficial retelling of tragic events.
The anxieties encoded in Scream and its spin-offs activate, through the violence adopted by psychologically identical characters, a new regression similar to the Freudian one mentioned above. The crucial difference is that the trauma stems from a desensitisation to media representation of real events, ultimately realised in the apparent emotional stability of the affluent and beautiful who playfully slaughter the inhabitants of their own, false world. References Grant, B. K. "Rich and Strange: The Yuppie Horror Film." Contemporary Hollywood Cinema. Eds. Steve Neale and Murray Smith. New York: Routledge, 1988. Tudor, A. Monsters and Mad Scientists: A Cultural History of the Monster Movie. Oxford: Basil Blackwell, 1989. Wark, M. "Technofear 2." 21·C 8 (1992): 10.

Dissertations / Theses on the topic "Friday 13th risk modelling"

1. Abdul-Halim, Nadiya. "Quantitative Fr 13 Failure Modelling of UV Irradiation for Potable Water Production – Demonstrated with Escherichia coli." Thesis, 2017. http://hdl.handle.net/2440/119334.

Abstract:
Steady-state ultraviolet (UV) irradiation for potable water production is becoming an important global alternative to traditional disinfection by chlorination. Failure of UV to reduce the number of viable contaminant pathogens, however, can lead to enduring health legacies (with or without fatalities). To better understand the vulnerability of UV operations to failure, the probabilistic Fr 13 risk framework of Davey and co-workers is applied for the first time in this thesis. Fr 13 is predicated on underlying chemical engineering unit-operations. It is based on the hypothesis that naturally occurring, chance (stochastic) fluctuations about the value of 'set' process parameters can unexpectedly combine and accumulate in one direction and leverage significant change across a binary 'failure–not failure' boundary. Process failures can result from the accumulation of these fluctuations within an apparently steady-state process itself. That is to say, even with good design and operation of plant, there can be unexpected (surprise and sudden) occasional failures without 'human error' or 'faulty fittings'. Importantly, the impact of these naturally occurring random fluctuations is not accounted for explicitly in traditional chemical engineering. Here, the Fr 13 risk framework is applied for the first time to quantitatively assess operations of logically increasing complexity, namely: a laminar flow-through UV reactor; turbulent flow in a concentric annular reactor, both with and without suspended solids present (Davey, Abdul-Halim and Lewis, 2012; Davey and Abdul-Halim, 2013; Abdul-Halim and Davey, 2015; 2016); and a two-step 'global' risk model of combined rapid sand-filtration and UV irradiation (SF-UV) (Abdul-Halim and Davey, 2017). The work is illustrated with extensive independent data for the survival of viable Escherichia coli, a pathogenic species of faecal bacteria widely used as an indicator for health risk.
A logical and step-wise approach was implemented as a research strategy. UV reactor unit-operations models are first synthesized and developed. A failure factor is defined in terms of the design reduction and actual reduction in viable E. coli contaminants. UV reactor operation is simulated using a refined Monte Carlo (with Latin Hypercube) sampling of UV lamp intensity (I), suspended solids concentration [conc] and water flow (Q). A preliminary Fr 13 failure simulation of a single UV reactor unit-operation (one-step), developed for both simplified laminar flow and turbulent flow models, showed vulnerability to failure with unwanted survival of E. coli of, respectively, 0.4 % and 16 %, averaged over the long term, of all apparently successful steady-state continuous operations. A practical tolerance, as a design margin of safety, of 10 % was assumed. Results from applied 'second-tier' studies, undertaken to assess re-design to improve UV operation reliability and safety and to reduce vulnerability to Fr 13 failure, showed that any increased costs to improve control and reduce fluctuations in raw feed-water flow, together with reductions in UV lamp fluence, would be readily justified. The Fr 13 analysis was shown to be an advance on alternate risk assessments because it produced all possible practical UV outcomes, including failures. A more developed and practically realistic model for UV irradiation for potable water production was then synthesized to investigate the impact of the presence of suspended solids (SS) (median particle size 23 μm), acting as UV shielding and UV absorbing agents, on overall UV efficacy. This showed that some 32.1 % and 43.7 %, respectively, of apparently successful operations could unexpectedly fail over the long term due to the combined impact of random fluctuations in feed-water flow (Q), lamp intensity (I0) and shielding and absorption of UV by SS [conc].
This translated to four (4) failures each calendar month (the comparison rate without suspended solids was two (2) failures per month). Results highlighted that the efficacy of UV irradiation decreased with the presence of SS to a 1-log10 reduction, compared with a 4.35-log10 reduction without solids present in the raw feed-water. An unexpected outcome was that UV failure is highly significantly dependent on naturally occurring fluctuations in the raw feed-water flow, and not on fluctuations in the concentration of solids in the feed-water. It was found that the initial presence of solids significantly reduced the practically achievable reductions in viable bacterial contaminants in the annular reactor, but that fluctuations in the concentration of solids in the feed-water did not meaningfully impact the overall vulnerability of UV efficacy. This finding pointed to the need for a pre-treatment to remove suspended solids prior to the UV reactor, and for improved control of feed-water flow to reduce fluctuations. The original synthesis was therefore extended for the first time to include a rapid sand-filter (SF) for pre-treatment of the raw feed-water flow to the UV reactor, and a Fr 13 risk assessment of both the SF alone and the sequential, integrated rapid sand-filtration and UV reactor (SF-UV). For the global two-step SF-UV, results showed a vulnerability to failure of some 40.4 % of overall operations over the long term with a safety margin (tolerance) of 10 %. Pre-treatment with SF removed SS with a mean 1-log10 reduction (90 %). Subsequently, the overall removal of viable E. coli from the integrated SF-UV reactor was a 3-log10 reduction (99.9 %). This is because the efficacy of UV light to penetrate and inactivate viable E. coli, and other pathogens, is not inhibited by SS in the UV reactor. This showed that the physical removal of E. coli was accomplished by a properly functioning SF, with disinfection subsequently done by UV irradiation to inactivate viable E. coli in the water. Because the Regulatory standard for potable water is a 4-log10 reduction, it was concluded that flocculation and sedimentation prior to SF were needed to exploit these findings. Flocculation is a mixing process to increase particle size from submicroscopic microfloc to visible suspended particles prior to sedimentation and SF. This research will aid understanding of factors that contribute to UV failure and increase confidence in UV operations. It is original, and not incremental, work. Findings will be of immediate interest to risk analysts, water processors and designers of UV reactors for potable water production.
Thesis (Ph.D.) -- University of Adelaide, School of Chemical Engineering & Advanced Materials, 2017
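The core Fr 13 idea in this thesis, that random fluctuations about 'set' process parameters occasionally combine to push an apparently steady process across a failure boundary, can be illustrated with a minimal Monte Carlo sketch. The dose model, the fluctuation spreads and the set-point values below are assumptions for illustration (the thesis uses refined Monte Carlo with Latin Hypercube sampling and detailed reactor models); only the 10 % tolerance is taken from the abstract:

```python
# Minimal Monte Carlo sketch of Fr 13 failure: fluctuations in lamp
# intensity (I) and feed-water flow (Q) occasionally combine so that UV
# dose, and hence log reduction, falls short of design by more than the
# tolerance. Distributions and the dose model are illustrative assumptions.
import random

random.seed(1)

I_SET, Q_SET = 100.0, 10.0      # arbitrary set-point units
DESIGN_LOG_RED = 4.0            # design log10 reduction at set points
TOLERANCE = 0.10                # 10 % design margin of safety, as in the thesis

def log_reduction(i, q):
    # UV dose scales with intensity over flow; reduction scales with dose.
    return DESIGN_LOG_RED * (i / I_SET) * (Q_SET / q)

failures = 0
N = 100_000
for _ in range(N):
    i = random.gauss(I_SET, 8.0)     # assumed fluctuation spreads
    q = random.gauss(Q_SET, 1.0)
    actual = log_reduction(i, q)
    p = (DESIGN_LOG_RED - actual) / DESIGN_LOG_RED   # failure factor
    if p > TOLERANCE:                # shortfall beyond the design margin
        failures += 1

print(f"vulnerability to failure: {100 * failures / N:.1f} % of runs")
```

Even though both parameters fluctuate symmetrically about their set points, a non-trivial fraction of runs fails, which is the qualitative phenomenon the Fr 13 framework quantifies.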
2. Chu, James Yick Gay. "Synthesis and experimental validation of a new probabilistic strategy to minimize heat transfers used in conditioning of dry air in buildings with fluctuating ambient and room occupancy." Thesis, 2017. http://hdl.handle.net/2440/114256.

Abstract:
Steady-state unit-operations are used globally in chemical engineering. Advantages include ease of control and uniform product quality. Nonetheless, there will be naturally occurring, random (stochastic) fluctuations about any steady-state 'set' value of a process parameter. Traditional chemical engineering does not explicitly take account of these. This is because, generally, fluctuation in one parameter appears to be offset by change in another, with the process outcome remaining apparently steady. However, Davey and co-workers (e.g. Davey et al., 2015; Davey, 2015 a; Zou and Davey, 2016; Abdul-Halim and Davey, 2016; Chandrakash and Davey, 2017 a) have shown these naturally occurring fluctuations can accumulate and combine unexpectedly to leverage significant impact and thereby make apparently well-running processes vulnerable to sudden and surprise failure. They developed a probabilistic and quantitative risk framework they titled Fr 13 (Friday 13th) to underscore the nature of these events. Significantly, the framework can be used in 'second-tier' studies for re-design to reduce vulnerability to failure. Here, this framework is applied for the first time to show how naturally occurring fluctuations in peak ambient temperature (T₀) and occupancy (room traffic flows) (Lᴛ) can impact heat transfers for the conditioning of room air. The conditioning of air in large buildings, including hotels and hospitals, is globally important (Anon., 2012 a). The overarching aim is to quantitatively 'use' these fluctuations to develop a strategy for minimum energy. A justification is that methods permitting quantitative determination of reliable conditioning strategies can lead to better energy use, with potential savings, together with reductions in greenhouse gases (GHG). Oddly, many buildings do not appear to have a quantitative strategy to minimize conditioning heat transfers. Widespread default practice is simply to use an on-off strategy, i.e. conditioning on when the room is occupied and off when unoccupied. One alternative is an on-only strategy, i.e. leaving the conditioner to run continuously. A logical and stepwise combined theoretical-and-experimental approach was used as a research strategy. A search of the literature showed that work had generally focused on discrete, deterministic aspects and not on mathematically rigorous developments to minimise overall whole-of-building conditioning heat transfers. A preliminary steady-state convective model was therefore synthesized for the first time for conditioning air in a (hotel) room (4.5 x 5.0 x 2.5, m) in dry, S-E Australia during summer (20 ≤ T₀ ≤ 40, °C) to an auto-set room bulk temperature of 22 °C. This was solved using traditional, deterministic methods to show the alternative on-only strategy would use less electrical energy than the default on-off for Lᴛ > 36 % (Chu et al., 2016). Findings underscored the importance of the thermal capacitance of a building. The model was solved again using the probabilistic Fr 13 framework, in which distributions to mimic fluctuations in T₀ and Lᴛ were (reasonably) assumed and a new energy risk factor (p) was synthesized such that all p > 0 characterized a failure in the applied energy strategy (Chu and Davey, 2015). Predictions showed on-only would use less energy on 86.6 % of summer days. Practically, this meant that a continuous on-only strategy would be expected to fail on only 12 of the 90 days of summer, averaged over the long term. It was concluded the Fr 13 framework was an advance over the traditional, deterministic method because all conditioning scenarios that can practically exist are simulated. It was acknowledged, however, that: 1) a more realistic model was needed to account for radiative heat transfers, and; 2) to improve predictive accuracy, local distributions for T₀ and Lᴛ were needed.
To address these: 1) the model was extended mathematically to account for radiative transfers from ambient to the room interior, and; 2) distributions were carefully defined based on extensive historical data for S-E Australia from, respectively, the Bureau of Meteorology (BoM) (Essendon Airport) and Clarion Suites Gateway Hotel (CSGH) (Melbourne), a large (85 x 2-room suites) commercial hotel (latitude -37.819708, longitude 144.959936), for T₀ and Lᴛ over 541 summer days (Dec. 2009 to Feb. 2015) (Chu and Davey, 2017 a). Predictions showed that radiative heat transfers were significant and highlighted that for Lᴛ ≥ 70 %, that is, all commercially viable occupancies, the on-only conditioning strategy would be expected to use less energy. Because findings predicted meaningful savings with the on-only strategy, 'proof-of-concept' experiments were carried out for the first time as a controlled trial in-situ in CSGH over 10 (2 x 5 contiguous) days of summer with 24.2 ≤ T₀ ≤ 40.5, °C and 13.3 ≤ Lᴛ ≤ 100, %. Independent invoices (Origin Energy Ltd, or Simply Energy Ltd, Australia) (at 30 min intervals from nationally registered 'smart' power meters) for geometrically identical control and treated suites showed a mean saving of 18.9 % (AUD $2.23 per suite per day) with the on-only strategy, with a concomitant 20.7 % reduction (12.2 kg CO₂-e) in GHG. It was concluded that because findings supported model predictions, and because robust experimental SOPs had been established and agreed with CSGH, a large-scale validation test of energy strategies should be undertaken in the hotel. Commercial-scale testing over 77 contiguous days of summer (Jan. to Mar., 2016) was then carried out for the first time in two dimensionally identical 2-room suites, with the same fit-out and (S-E) aspect, together with identical air-conditioners (8.1 kW) and nationally registered meters to automatically transmit contiguous (24-7) electrical use (at 30 min intervals) (n = 3,696).
Each suite (10.164 x 9.675, m floor plan) was auto-set to a bulk air temperature of 22 °C (Chu and Davey, 2017 b). In the treated suite the air-conditioner was operated on-only, whilst in the control it was left to wide-spread industry practice of on-off. The suites had (standard) single-glazed pane windows with heat-attenuating (fabric) internal curtains. Peak ambient ranged from 17.8 ≤ T₀ ≤ 39.1, °C. There were 32 days with recorded rainfall. The overall occupancy Lᴛ of both suites was almost identical at 69.7 and 71.2, % respectively for the treated and control suite. Importantly, this coincided with a typical business period for the CSGH hotel. Based on independent electrical invoices, results showed the treated suite used less energy on 47 days (61 %) of the experimental period, and significantly, GHG was reduced by 12 %. An actual reduction in electrical energy costs of AUD $0.75 per day (9 %) averaged over the period was demonstrated for the treated suite. It was concluded therefore that experimental findings directly confirmed the strategy hypothesis that continuous on-only conditioning will use less energy. Although the hypothesis appeared generalizable, and adaptable to a range of room geometries, it was acknowledged that a drawback was that extrapolation of results could not be reliably done because actual energy used would be impacted by seasons. The in-situ commercial-scale experimental study was therefore extended to encompass four consecutive seasons. The research aim was to provide sufficient experimental evidence (n = 13,008) to reliably test the generalizability of the on-only hypothesis (Chu and Davey, 2017 c). Ambient peak ranged from 9.8 ≤ T₀ ≤ 40.5, °C, with rainfall on 169 days (62 %). Overall, Lᴛ was almost identical at 71.9 and 71.7, % respectively, for the treated and control suite. Results based on independent electrical energy invoices showed the on-only strategy used less energy on 147 days (54 %) than the on-off. 
An overall mean energy saving of 2.68 kWh per suite per day (9.2 %) (i.e. AUD $0.58 or 8.0 %), with a concomitant reduction in indirect GHG of 3.16 kg CO₂-e, was demonstrated. Extrapolated to the 85 x 2-room suites of the hotel, this amounted to a real saving of AUD $18,006 per annum – plus credit certificates that could be used to increase savings. Overall, it was concluded therefore that the on-only conditioning hypothesis is generalizable to all seasons, and that there appears no barrier to adaptation to a range of room geometries. Highly significantly, the methodology could be readily applied to existing buildings without capital outlays or increases in maintenance. A total of five (5) summative research presentations of results and findings were made to the General Manager and support staff of CSGH over the period to July 2017 inclusive (see Appendix I), which maintained active industry-engagement for the study. To apply these new findings, the synthesis of a computational algorithm in the form of a novel App (Anon., 2012b; Davey, 2015b) was carried out for the first time (Chu and Davey, 2017d). The aim was to demonstrate an App that could be used practically to minimize energy in conditioning of dry air in buildings that must maintain an auto-set temperature despite the impact of fluctuations in T₀ and Lᴛ. The App was synthesized from the extensive experimental commercial-scale data and was applied to compute energy for both strategies from independently forecast T₀ and Lᴛ. Practical performance of the App was shown to be dependent on the accuracy of locally forecast T₀ and Lᴛ. Overall results predicted a saving of 2.62 kWh per 2-room suite per day ($47,870 per annum for CSGH) where accuracy of forecast T₀ is 77 % and Lᴛ is 99 %, averaged over the long term. A concomitant benefit was a predicted reduction in greenhouse emissions of 3.1 kg CO₂-e per day. The App appears generalizable – and importantly it is not limited by any underlying heat-model.
Its predictive accuracy can be refined with accumulation of experimental data for a range of geo-locations and building-types to make it globally applicable. It was concluded that the App is a useful new tool to minimize energy transfers in conditioning of room dry air in large buildings – and could be readily developed commercially. Importantly, it can be applied without capital outlays or additional maintenance cost, and at both design and analysis stages. This research is original and not incremental work. Results of this research will be of immediate benefit to risk analysts, heat-design engineers, and owners and operators of large buildings.
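The headline figures above follow from simple arithmetic. As an illustration only (assuming linear scaling across suites and days; the variable names are mine, not the thesis's), the reported per-suite daily savings can be extrapolated to the whole hotel:

```python
# Sanity-check extrapolation of the reported all-seasons savings
# (2.68 kWh, AUD $0.58 and 3.16 kg CO2-e per 2-room suite per day)
# to the 85-suite hotel. Linear scaling is assumed.

DAILY_SAVING_AUD = 0.58   # mean cost saving per suite per day (reported)
DAILY_SAVING_KWH = 2.68   # mean energy saving per suite per day (reported)
DAILY_GHG_KG = 3.16       # GHG reduction per suite per day (reported)
N_SUITES = 85
DAYS_PER_YEAR = 365

annual_saving_aud = DAILY_SAVING_AUD * N_SUITES * DAYS_PER_YEAR
annual_saving_kwh = DAILY_SAVING_KWH * N_SUITES * DAYS_PER_YEAR
annual_ghg_tonnes = DAILY_GHG_KG * N_SUITES * DAYS_PER_YEAR / 1000.0

print(f"annual cost saving ≈ AUD ${annual_saving_aud:,.0f}")   # close to the reported AUD $18,006
print(f"annual energy saving ≈ {annual_saving_kwh:,.0f} kWh")
print(f"annual GHG reduction ≈ {annual_ghg_tonnes:.1f} t CO2-e")
```

The small gap to the reported AUD $18,006 presumably reflects rounding of the per-day figure.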
Thesis (Ph.D.) -- University of Adelaide, School of Chemical Engineering, 2018
3

Chandrakash, Saravanan. "A new risk analysis of clean-in-place (CIP) milk processing." Thesis, 2012. http://hdl.handle.net/2440/76140.

Abstract:
The food and pharmaceutical industries are generally a nation’s largest manufacturing sector – and, importantly, one of the most stable. Clean-In-Place (CIP)² is a ubiquitous process in milk processing as thorough cleaning of wet surfaces of equipment is an essential part of daily operations. Faulty cleaning can have serious consequences as milk acts as an excellent substrate in which unwanted micro-organisms can grow and multiply rapidly. Davey & Cerf (2003) introduced the notion of Friday 13th Syndrome³, i.e. the unexpected failure of a well-operated process plant, through novel application of Uncertainty Failure Modelling (Davey, 2010; 2011). They showed that failure cannot always be put down to human error or faulty fittings but could be the result of stochastic changes inside the system itself. In this study a novel CIP failure model based on the methodology of Davey and co-workers is developed for the first time using the published models of Bird & Fryer (1991), Bird (1992), Xin (2003) and Xin, Chen & Ozkan (2004). The aim was to gain insight into conditions that may lead to unexpected failure of an otherwise well-operated CIP plant. CIP failure is defined as failure to remove proteinaceous deposits on wet surfaces in the auto-set cleaning time. The simplified two-stage model of Bird & Fryer (1991) and Bird (1992) was initially investigated. This model requires input of the thickness of the deposit (δ = 0.00015 m) and the temperature and Re of the cleaning solution (1.0-wt% NaOH). The deposit is considered as two layers: an upper layer of swelled deposit which can be removed (xδ) by the shear from the circulating cleaning solution and a lower layer (yδ) that is not yet removable. The output parameters of particular interest are the rate of deposit removal (R) and the total cleaning time (t_T) needed to remove the deposit. The more elaborate three-stage model of Xin (2003) and Xin, Chen & Ozkan (2004) is based on a polymer dissolution process.
This model requires input values of the temperature of the cleaning solution (T), the critical mass of the deposit (m_c) and the cleaning rate (R_m). The output parameters of particular interest are the rate of removal during the swelling and uniform stage (R_SU), the rate of removal during the decay stage (R_D) and the total cleaning time needed to remove the deposit (t_T). The two CIP models are appropriately formatted and simulations used to validate them as a unit-operation. A risk factor (p) together with a practical process tolerance is defined in terms of the auto-set CIP time to remove a specified deposit and the actual cleaning time as affected by stochastic changes within the system (t_T'). This is computationally convenient as it can be articulated so that all values p > 0 highlight an unwanted outcome, i.e. a CIP failure. Simulations for the continuous CIP unit-operation are carried out using a Microsoft Excel™ spreadsheet with the add-in @Risk™ (pronounced ‘at risk’) version 5.7 (Palisade Corporation) with some 100,000⁴ iterations from Monte Carlo sampling of input parameters. A refined Latin Hypercube sampling is used because ‘pure’ Monte Carlo samplings can both over- and under-sample from various parts of a distribution. Values of the input parameters took one of two forms. The first was the traditional Single Value Assessment (SVA) as defined by Davey (2011) in which a single, ‘best guess’ or mean value of the parameter is used. The output therefore is a single value. The alternative form was a Monte Carlo Assessment (MCA) (Davey, 2011) in which the ‘best guess’ values take the form of a probability distribution around the mean value. Many thousands of randomly sampled values for each input parameter are obtained using Monte Carlo sampling. Generally, in QRA the input parameters take the form of a distribution of values.
The output therefore is a distribution of values, with each assigned a probability of actually occurring. The values of all inputs are carefully chosen for a realistic simulation of CIP. Results reveal that a continuous CIP unit-operation is actually a mix of successful cleaning operations along with unsuccessful ones, and that these can tip unexpectedly. For example, for the unit-operations model of Bird & Fryer (1991) and Bird (1992), failure to remove a proteinaceous milk deposit (δ = 0.00015 m) can occur unexpectedly in 1.0% of all operations when a tolerance of 6% is allowed on the specified auto-set cleaning time (t_T = 914 s) with a cleaning solution temperature of 60 °C. Using the Xin, Chen & Ozkan (2004) model as the underlying unit-operation, some 1.9% of operations at a nominal mid-range cleaning solution temperature of 75 °C could fail with a tolerance of 2% on the auto-set CIP time (t_T = 448 s). Extensive comparisons of the effect of the structure of the two CIP unit-operations models on predictions at similar operating conditions, i.e. 2% tolerance on the auto-set clean time (~ 656 s) and 1%-sd in the nominal mean temperature of the NaOH cleaning solution at 65 °C, highlighted that the underlying vulnerability to failure of the simplified model of Bird & Fryer (1991) and Bird (1992) was 1.8 times that of the more elaborate model of Xin (2003) and Xin, Chen & Ozkan (2004). The failure analysis presented in this thesis represents a significant advance over traditional analysis in that all possible practical scenarios that could exist operationally are computed, and rigorous quantitative evidence is produced to show that a continuous CIP plant is actually a mix of failed cleaning operations together with successful ones. This insight is not available from traditional methods (with or without sensitivity analysis). Better design and operating decisions can therefore be made because the engineer has a picture of all possible outcomes.
The quantitative approach and insight presented here can be used to test re-designs to reduce cleaning failure through changes to the plant, including improved temperature and auto-set time control methods. ² See Appendix A for a definition of some important terms used in this research. ³ Unexpected (unanticipated) failure in plant or product of a well-operated, well-regulated unit-operation. ⁴ Experience with the models highlighted that stable output values would be obtained with 100,000 iterations (or CIP ‘scenarios’).
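The MCA idea described above can be sketched in a few lines: sample the stochastic inputs, compute the actual cleaning time t_T', and flag a failure whenever the risk factor p > 0. The published removal models of Bird & Fryer (1991) and Xin, Chen & Ozkan (2004) are not reproduced here; the `cleaning_time()` function below is a hypothetical stand-in that merely makes cleaning slower as the solution temperature falls:

```python
import random

# Illustrative Monte Carlo Assessment (MCA) of CIP failure.
# Values for T_SET, the 1 %-sd, T_AUTO and the 2 % tolerance follow the
# abstract; the linear temperature dependence in cleaning_time() is an
# assumed placeholder, not a published model.

random.seed(1)

T_SET = 65.0          # nominal mean cleaning-solution temperature, deg C
SD = 0.01 * T_SET     # 1 %-sd on temperature
T_AUTO = 656.0        # auto-set cleaning time, s
TOL = 0.02            # 2 % process tolerance on the auto-set time

def cleaning_time(temp_c):
    """Hypothetical model: cleaning slows ~1.5 % per deg C below nominal."""
    return T_AUTO * (1.0 - 0.015 * (temp_c - T_SET))

def risk_factor(t_actual):
    """p > 0 signals a CIP failure (actual time exceeds tolerated time)."""
    return (t_actual - (1.0 + TOL) * T_AUTO) / T_AUTO

n = 100_000
failures = sum(
    1 for _ in range(n)
    if risk_factor(cleaning_time(random.gauss(T_SET, SD))) > 0
)
print(f"simulated failure rate: {100 * failures / n:.2f} % of operations")
```

With these assumed numbers the failure fraction lands in the low single-digit percent range, the same order as the 1.0–1.9 % figures reported in the abstract; the point of the exercise is only that a nominally well-operated plant still produces a tail of failed cleans.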
Thesis (M.Eng.Sc.) -- University of Adelaide, School of Chemical Engineering, 2012

Book chapters on the topic "Friday 13th risk modelling"

1

Isiker, Murat, Umut Ugurlu, and Oktay Tas. "Investigation of the Calendar Effect." In Recent Applications of Financial Risk Modelling and Portfolio Management, 47–67. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-5083-0.ch003.

Abstract:
This chapter aims to examine calendar anomalies in selected sample countries by using a second-order stochastic dominance (SSD) approach. Day-of-the-week and month-of-the-year effects are analysed for a group of 5 developed and 5 developing country indexes to estimate efficient (inefficient) weekdays and months for the period between 1988 and 2016. Then, a back-testing procedure is applied for each sample country to compare the performance of index returns for 2017-2019 with the strategy arising from the estimation results. Findings suggest that Monday and Friday returns are inefficient and efficient, respectively, in all developing countries, whereas different results were obtained for developed ones. In the monthly analysis, December returns were found efficient in 8 indexes including the S&P 500. However, October is inefficient for all indexes. The positive January effect seems to have disappeared in most cases. Back-testing results indicate that in a bearish market condition the SSD strategy outperforms index returns in general for daily and monthly comparisons.
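The SSD criterion underlying the chapter can be illustrated with a standard empirical test: for equal-size samples, one sample second-order stochastically dominates another when every partial sum of its ascending-sorted values is at least the corresponding partial sum of the other's. A minimal sketch (the toy returns are invented, not the chapter's index data):

```python
from itertools import accumulate

def ssd_dominates(a, b):
    """Empirical SSD test for equal-size samples: a dominates b when every
    partial sum of a's ascending-sorted values is >= b's, with at least
    one strict inequality."""
    if len(a) != len(b):
        raise ValueError("samples must be the same size")
    ca = list(accumulate(sorted(a)))
    cb = list(accumulate(sorted(b)))
    return all(x >= y for x, y in zip(ca, cb)) and \
           any(x > y for x, y in zip(ca, cb))

# Toy daily returns (%): "friday" is "monday" shifted up, so it dominates.
monday = [-1.2, -0.4, 0.1, 0.3, 0.8]
friday = [r + 0.5 for r in monday]

print(ssd_dominates(friday, monday))  # True
print(ssd_dominates(monday, friday))  # False
```

Any risk-averse investor with an increasing concave utility prefers the dominating sample, which is what makes SSD a natural screen for "efficient" versus "inefficient" weekdays.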

Conference papers on the topic "Friday 13th risk modelling"

1

Dumitriu, Ramona. "Stock Prices Behavior Before and After Friday the 13th." In International Conference Risk in Contemporary Economy. Dunarea de Jos University of Galati, Romania Faculty of Economics and Business Administration, 2019. http://dx.doi.org/10.35219/rce206705323.

2

Kempster, PM, MP Peirson, S. Williams, and DJ McKeon. "P188 Modelling to mitigate: risk factors for hospital acquired pneumonia." In British Thoracic Society Winter Meeting, Wednesday 17 to Friday 19 February 2021, Programme and Abstracts. BMJ Publishing Group Ltd and British Thoracic Society, 2021. http://dx.doi.org/10.1136/thorax-2020-btsabstracts.333.

3

Miranskyy, A., N. Madhavji, M. Davison, and M. Reesor. "Modelling assumptions and requirements in the context of project risk." In 13th IEEE International Conference on Requirements Engineering (RE'05). IEEE, 2005. http://dx.doi.org/10.1109/re.2005.44.

4

Krey, Mike, Bettina Harriehausen, and Matthias Knoll. "Approach to the Classification of Information Technology Governance, Risk and Compliance Frameworks." In 2011 UkSim 13th International Conference on Computer Modelling and Simulation (UKSim). IEEE, 2011. http://dx.doi.org/10.1109/uksim.2011.73.

5

Ayello, Francois, Guanlan Liu, Yonghe Yang, and Ning Cui. "Probabilistic Digital Twins for Transmission Pipelines." In 2020 13th International Pipeline Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/ipc2020-9240.

Abstract:
Digitalization in the oil and gas industry has led to the formation of digital twins. Digital twins bring the physical and virtual worlds closer as data is transmitted seamlessly between real-time sensors, databases and models. The strength of the digital twin concept is the interconnectivity of data and models. Any model can use any combination of inputs (e.g. operator-owned data sets and sensors, third-party databases such as soil composition or weather data, results from other models such as flow assurance, threat modelling or risk modelling). Consequently, the result of one model may become the input of another. This strength is also a weakness, as uncertain (or missing) data will introduce great uncertainty and may lead to wrong results. Worst-case scenarios have been used to solve this issue, without success. This paper presents a new concept: probabilistic digital twins for pipelines. Probabilistic digital twins do not lose uncertainty as results pass from one model to another, thus providing greater confidence in the final results. This publication reviews the probabilistic digital twin concept and demonstrates how it can be implemented using gas pipeline data from West Pipeline Company, CNPC.
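The core idea, that uncertainty must survive the hand-off between chained models, can be sketched by passing samples rather than single values. Both models below are hypothetical placeholders (a corrosion-rate model feeding a remaining-life model), not those of the paper:

```python
import random

# Sketch of the probabilistic digital-twin idea: model A's output is kept
# as a distribution (a sample), and model B consumes every sample rather
# than a single "best guess". All numbers are illustrative assumptions.

random.seed(7)

def corrosion_rate_mm_per_yr():
    """Model A: uncertain corrosion rate (hypothetical, truncated normal)."""
    return max(0.01, random.gauss(0.25, 0.08))

def remaining_life_yr(wall_mm, rate):
    """Model B: years until wall loss reaches 50 % of nominal wall."""
    return 0.5 * wall_mm / rate

wall = 9.5  # nominal wall thickness, mm (assumed)
samples = [remaining_life_yr(wall, corrosion_rate_mm_per_yr())
           for _ in range(20_000)]

point_estimate = remaining_life_yr(wall, 0.25)   # deterministic chain
p10 = sorted(samples)[len(samples) // 10]        # conservative percentile

print(f"deterministic life: {point_estimate:.1f} yr")
print(f"P10 probabilistic : {p10:.1f} yr (risk the point estimate hides)")
```

The deterministic chain reports a single comfortable number, while the probabilistic chain exposes the lower tail that drives integrity decisions, which is exactly the loss the worst-case workaround fails to fix.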
6

Dinet, Jérôme. "“Would You be Friends with a Robot?”: The Impact of Perceived Autonomy and Perceived Risk." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1002306.

Abstract:
This paper aims to investigate the impact of perceived autonomy and perceived risk on attitudes and opinions about two assistive robots (Paro© and Asimo©), as factors explaining the probability of becoming “friends” with a robot. The worldwide population of elderly people is growing rapidly and in the coming decades the proportion of older people in the developed countries will change significantly. This demographic shift will create a huge increase in demand for domestic and health-care robotics systems. But the spread of robots in everyday life, particularly for purposes of healthcare, already gives rise to questions about acceptability and moral and legal responsibility. A robotics system can be powerful and useful, but that is no reason why it is usable and/or desirable and, in the end, accepted. It is still unclear how well these new “faux-people” will be accepted by society, for they raise fundamental questions about what it means to be human, especially at home or in a nursing house. METHOD. In a large online survey conducted in France, 2,783 participants (936 adolescents with a mean age of 12.2 years; 1,077 adults with a mean age of 33.4 years; and 770 seniors with a mean age of 71.3 years) were asked to complete three questionnaires: (1) the DOSPERT scale (for Domain-Specific Risk-Taking; Blais & Weber, 2006) to assess risk attitude and perception of risks for our participants; (2) the revised version of the FQUA-R scale (for Friendship Quality-Revised; Thien, Razak & Jamil, 2012) to assess close relationships and potential friendship with a robot; (3) the PAS (for Perception of Autonomy Scale; Lombard & Dinet, 2015) to assess positive and negative attitudes towards the autonomy of robots.
Each participant was asked to complete the three questionnaires twice: before and after viewing two videos showing the two assistive robots (Paro© and Asimo©) interacting with human people. In one of the videos, a young woman interacts with the robotic baby-seal Paro© and gives many explanations about its interest for elderly people (“Paro© gives kindness”, “it allows an attachment to be created”). Moreover, we can see an elderly woman who caresses Paro©. In the other video, several physical characteristics of Asimo© are presented (size, weight) and the robot performs several tasks by interacting with a young woman (Asimo© walks, runs, plays football, opens a bottle, serves a glass, etc.). RESULTS AND DISCUSSION. For the two robots, structural equation modelling was used to determine the relationships between all the variables. Results mainly showed that (i) perceived risk is mainly and significantly explained by attitudes about risks in the health and social domains, whatever the gender and the age, and (ii) perceived autonomy has a direct and positive effect on friendship quality. In other words, our results tend to confirm that our three factors (perceived risk, perceived autonomy and friendship) are strongly interrelated and should be integrated in studies investigating the acceptability of assistive robots, and that these three factors have a different impact according to the physical appearance of the robot (human-like for Asimo© or animal-like shape for Paro©). Industrial and theoretical perspectives are discussed.
7

Markopoulos, Evangelos, Emmanuel Querrec, and Mika Luimula. "A strategic partner selection decision-making support methodology in the business modelling phase for startups in the pre-incubation phase." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1001529.

Abstract:
Partner choice is an important element for any business throughout its lifecycle. It is even more strategic in early startup stages, when the business model is set in the pre-incubation phase. Entrepreneurs are confronted with decisions on which partners to choose. Those strategic decisions on which partners to commit to, and on defining their roles, can be made more or less formally, with the risk of relying on “gut feelings” when there is complex data to be taken into consideration and when there are pressure, constraints and limited resources and no proper methodology for the entrepreneur to base the decision on. Confronted with such a situation, it is worth building a decision-making support methodology for strategic partner choice in the business modelling phase of a startup's pre-incubation phase. This can offer support to the entrepreneur and anchor their leadership in a more formal approach to decision-making. This research presents a methodological framework that can support early startups, while still in the pre-incubation phase, to select the most suitable strategic business partner(s) and develop, based on that, their business operations, management, development and commercialization models. The methodology offers an initial approach which allows an entrepreneur to make a more formal investigation and be assisted in the decision-making process of choosing partners and defining their roles and contribution in the strategy of the start-up. Specifically, the methodology intends to provide support on selecting the most relevant and feasible data types that need to be collected for effective partner evaluation and selection.
Furthermore, it provides a data collection mechanism and algorithm, a partner evaluation procedure, support on identifying the strategic intent or need from a specific partner, the analysis of the potential partner based on the partnership needs, a scoring tableau based on several parameters per partner selection criterion and, finally, the calculation of the potential partner’s score. The research conducted evaluated twenty-one potential partners for a VR training startup that intends to operate in the following months and is currently at the partnerships establishment phase. The partners that have been analyzed derive from eight professional sectors related to the start-up, from five countries, and with more than fifty unique activities that cover the fourteen key parameters of the partner evaluation methodology. The paper presents the overall methodological approach in stages and the procedure (steps) of each stage. It indicates the goal setting approach, the evaluation of the partner’s activities, the partner’s evaluation scorecard, the computation of the scoring process and the visualization of the scoring results in tables and charts that create a partner evaluation dashboard for effective comparison of partners in total or on specific partnership requirements as set in the partnership strategy and objectives. It must be noted that the proposed methodology is not an optimal tool but more of a heuristic exploratory tool. Further research has been scheduled to extend the testing of the methodology with more cases, to increase the number of partner evaluation parameters and to link several of the related parameter metrics with sources that can provide more subjective values.
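The scorecard computation described in the abstract reduces to a weighted sum per candidate. A minimal sketch, in which the criteria, weights and candidate ratings are all hypothetical illustrations rather than the paper's fourteen parameters:

```python
# Toy partner-scoring step: each candidate is rated per criterion,
# weights encode the partnership strategy, and the weighted sum ranks
# the candidates. Criteria, weights and candidates are invented.

weights = {"sector_fit": 0.35, "geography": 0.15,
           "activities": 0.30, "maturity": 0.20}

candidates = {
    "Partner A": {"sector_fit": 4, "geography": 5, "activities": 3, "maturity": 4},
    "Partner B": {"sector_fit": 5, "geography": 3, "activities": 4, "maturity": 3},
    "Partner C": {"sector_fit": 2, "geography": 4, "activities": 2, "maturity": 5},
}

def score(ratings):
    """Weighted sum of 1-5 ratings; weights sum to 1."""
    return sum(weights[c] * r for c, r in ratings.items())

ranked = sorted(candidates.items(), key=lambda kv: score(kv[1]), reverse=True)
for name, ratings in ranked:
    print(f"{name}: {score(ratings):.2f}")
```

As the abstract notes, such a tableau is a heuristic exploratory aid, not an optimiser: the ranking is only as good as the chosen weights.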
8

Smith, Shawn, Alex M. Fraser, and Mari Shironishi. "Recommendations for Jet Fire Model Selection When Performing Consequence Assessments of Onshore Natural Gas Pipelines and Facilities." In 2020 13th International Pipeline Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/ipc2020-9483.

Abstract:
The thermal radiation from a jet fire is the dominant hazard resulting from accidental natural gas releases from onshore pipelines or facilities. To assess the consequences to both individuals and equipment, we require models to estimate the incident radiation from the jet fire to the surroundings. Simpler models with shorter implementation and run times are more viable for use in a full probabilistic risk assessment, in which the number of scenarios assessed could number in the millions. However, the level of accuracy within these models must be considered to ensure a reasonably conservative estimate is produced. A review and comparison of semi-empirical models from the literature was performed and used to develop a decision tree that recommends the most computationally efficient jet fire modelling approaches based on the release scenario, while maintaining reasonable conservatism. Options for both vertical and non-vertical releases are presented, as well as corrections for lift-off, wind, and buoyancy. Additionally, an efficient algorithm from the area of computer graphics was adapted and applied to a weighted multiple point source jet fire model to account for the reduction in incident radiation to a receptor due to topography or structures partially obstructing the view of the jet fire.
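The weighted multiple point source approach mentioned above distributes the flame's radiated power among points along the flame axis and sums their inverse-square contributions at the receptor. A minimal sketch; the triangular weight ramp, flame length, radiant fraction and unit transmissivity are illustrative assumptions, not values from the paper:

```python
import math

# Weighted multiple-point-source sketch: Q_rad is split among n points
# on a vertical flame axis with weights summing to 1; incident flux at
# the receptor is the sum of inverse-square contributions (kW/m^2).

def incident_flux(q_rad_kw, flame_len_m, receptor_xyz, n=20, tau=1.0):
    # Triangular weights peaking mid-flame (a common simple choice),
    # normalised to sum to 1.
    raw = [min(i + 1, n - i) for i in range(n)]
    s = sum(raw)
    weights = [w / s for w in raw]

    rx, ry, rz = receptor_xyz
    flux = 0.0
    for i, w in enumerate(weights):
        z = flame_len_m * (i + 0.5) / n       # point height on the flame axis
        d2 = rx**2 + ry**2 + (rz - z)**2      # squared distance to receptor
        flux += tau * w * q_rad_kw / (4.0 * math.pi * d2)
    return flux

# Assumed example: 100 MW release, 0.2 radiant fraction, 30 m flame,
# receptor 40 m away at 1.5 m height.
q_rad = 0.2 * 100_000.0  # kW radiated
print(f"{incident_flux(q_rad, 30.0, (40.0, 0.0, 1.5)):.2f} kW/m^2")
```

Obstruction by topography or structures, as handled in the paper, would enter here by zeroing (or attenuating) the contribution of any point source whose line of sight to the receptor is blocked.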
9

Holliday, Chris, Andy Young, Terri Funk, and Carrie Murray. "The North Saskatchewan River Valley Landslide: Slope and Pipeline Condition Monitoring." In 2020 13th International Pipeline Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/ipc2020-9532.

Abstract:
Following a loss-of-containment incident in July 2016 on a 16-inch diameter pipeline on the south slope of the North Saskatchewan River located in Saskatchewan, Canada, Husky completed extensive studies to understand and learn from the failure. The cause of the incident was ground movement resulting from a landslide complex on the slope involving two deep-seated compound basal shear slides as well as a near-surface translational slide in heavily over-consolidated marine clays of the Upper Cretaceous Lea Park Formation. One aspect of the studies has been to undertake structural analysis of the pipeline response to the loading imposed by the ground movement, to minimize the potential for a similar occurrence in the future and determine the integrity of the pipeline at the time of the assessment. Given the scale and complexity of the landslide, slope stabilization measures were not practical to implement, so repeat ILI using caliper and inertial measurement unit (IMU) technology, in addition to a robust monitoring program, was implemented. Real-time monitoring of ground movements, pipe strain and precipitation levels provided a monitoring and early-warning system, combined with documented risk thresholds that identified when to proactively shut in the pipeline. The methodology and findings of the slope monitoring and structural analysis that were undertaken to examine the robustness of the pipeline to withstand future landslide movement are presented herein. The work involved modelling of the pipeline history on the slope, including loads that had accumulated in the original pipeline sections, based on historical ILI results and slope monitoring. The pipeline orientation was parallel with the ground movement in the landslide complex, so the development of axial strain in the pipeline was the dominant load component, which is particularly damaging in the compression zone. The work provided recommendations and a technical basis to continue safe operation of the pipeline with consideration of continuing ground movement, and assisted the operator with decisions over the long-term strategy for the pipeline.
10

Dinovitzer, Aaron, Sanjay Tiku, and Mark Piazza. "Dent Assessment and Management: API Recommended Practice 1183." In 2020 13th International Pipeline Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/ipc2020-9724.

Abstract:
Pipeline dents can develop from the pipe resting on rock, a third-party machinery strike, or rock strikes during backfilling, amongst other causes. The long-term integrity of a dented pipeline segment is a complex function of a variety of parameters including pipe geometry, indenter shape, dent depth, indenter support, secondary features, and pipeline operating pressure history at and following indentation. In order to estimate the safe remaining operating life of a dented pipeline, all of these factors must be considered, and guidelines for this assessment are not available. US DOT regulations (49 CFR 192 and 195) include dent repair and remediation criteria broadly based upon dent depth, dent location (top or bottom side), pressure cycling (liquid or gas), and dent interaction with secondary features (welds, corrosion, cracks). The criteria defined above are simple to use; however, they may not direct maintenance to higher-risk dent features and may be overly conservative or, in some cases, unconservative. PRCI, USDOT, CEPA and other full-scale testing, finite element modelling and engineering model development research has been completed to evaluate the integrity of pipeline dents. These results have demonstrated trends and limits in dent behavior and life that can improve on the existing codified and traditional treatment of dents. With these research results a guideline for dent management can be developed to support operators in developing and implementing their pipeline integrity management programs. This paper provides an overview of the newly developed API recommended practice for the assessment and management of dents (RP 1183). The RP considers dent formation strain, failure pressure and fatigue limit states, including the effects of coincident features (i.e. welds, corrosion, cracks and gouges). This paper will focus on how pipeline operators can derive value from this step change in integrity management for dents. The paper describes the basis for the dent screening and integrity assessment tools included in the RP. The RP provides well-founded techniques for engineering assessment that may be used to determine the significance of dent features, whether remedial actions are required and when these actions should be taken.