Academic literature on the topic "Beyond worst-case analysis"

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic "Beyond worst-case analysis."

Next to every source in the list of references, there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Beyond worst-case analysis"

1

Roughgarden, Tim. "Beyond worst-case analysis." Communications of the ACM 62, no. 3 (February 21, 2019): 88–96. http://dx.doi.org/10.1145/3232535.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Manthey, Bodo, and Heiko Röglin. "Smoothed Analysis: Analysis of Algorithms Beyond Worst Case." it - Information Technology 53, no. 6 (December 2011): 280–86. http://dx.doi.org/10.1524/itit.2011.0654.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Kirner, Raimund, Jens Knoop, Adrian Prantl, Markus Schordan, and Albrecht Kadlec. "Beyond loop bounds: comparing annotation languages for worst-case execution time analysis." Software & Systems Modeling 10, no. 3 (April 9, 2010): 411–37. http://dx.doi.org/10.1007/s10270-010-0161-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Arnestad, Håvard Kjellmo, Gábor Geréb, Tor Inge Birkenes Lønmo, Jan Egil Kirkebø, Andreas Austeng, and Sven Peter Näsholm. "Worst-case analysis of array beampatterns using interval arithmetic." Journal of the Acoustical Society of America 153, no. 6 (June 1, 2023): 3312. http://dx.doi.org/10.1121/10.0019715.

Full text
Abstract:
Over the past decade, interval arithmetic (IA) has been used to determine tolerance bounds of phased-array beampatterns. IA only requires that the errors of the array elements are bounded and can provide reliable beampattern bounds even when a statistical model is missing. However, previous research has not explored the use of IA to find the error realizations responsible for achieving specific bounds. In this study, the capabilities of IA are extended by introducing the concept of “backtracking,” which provides a direct way of addressing how specific bounds can be attained. Backtracking allows for the recovery of the specific error realization and corresponding beampattern, enabling the study and verification of which errors result in the worst-case array performance in terms of the peak sidelobe level (PSLL). Moreover, IA is made applicable to a wider range of arrays by adding support for arbitrary array geometries with directive elements and mutual coupling in addition to element amplitude, phase, and positioning errors. Last, a simple formula for approximate bounds of uniformly bounded errors is derived and numerically verified. This formula gives insights into how array size and apodization cannot reduce the worst-case PSLL beyond a certain limit.
APA, Harvard, Vancouver, ISO, and other styles
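The interval-arithmetic idea described in the abstract above can be illustrated with a toy computation. This is a loose sketch under assumed conditions (a uniform linear array with half-wavelength spacing, amplitude-only errors, and a simple triangle-inequality bound), not the authors' algorithm:

```python
import numpy as np

# Element n contributes w_n * (1 + d_n) * exp(j*pi*n*u), with |d_n| <= eps.
N, eps = 16, 0.05                        # hypothetical array size and error bound
w = np.ones(N)                           # uniform apodization (assumption)
u = np.linspace(-1, 1, 801)              # sine-space angles
steer = np.exp(1j * np.pi * np.outer(u, np.arange(N)))
af_nom = np.abs(steer @ w)               # nominal beampattern magnitude

# Triangle inequality: the perturbation of the sum is at most eps * sum(|w_n|).
slack = eps * np.sum(np.abs(w))
upper = af_nom + slack                   # guaranteed upper bound
lower = np.maximum(af_nom - slack, 0.0)  # magnitude cannot go negative

# Any admissible error realization must stay inside [lower, upper]:
rng = np.random.default_rng(0)
d = rng.uniform(-eps, eps, N)
af_real = np.abs(steer @ (w * (1 + d)))
assert np.all(af_real <= upper + 1e-9) and np.all(af_real >= lower - 1e-9)
```

The bounds hold for every error realization within the stated limits, which is the appeal of interval-style analysis when no statistical error model is available.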
5

Mitzenmacher, Michael, and Sergei Vassilvitskii. "Algorithms with predictions." Communications of the ACM 65, no. 7 (July 2022): 33–35. http://dx.doi.org/10.1145/3528087.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Pape, Marieke, Steven Kuijper, Pauline A. J. Vissers, Geert-Jan Creemers, Hanneke W. M. Van Laarhoven, and Rob Verhoeven. "Beyond median overall survival: Estimating multiple survival scenarios in patients with metastatic esophagogastric cancer." Journal of Clinical Oncology 40, no. 4_suppl (February 1, 2022): 261. http://dx.doi.org/10.1200/jco.2022.40.4_suppl.261.

Full text
Abstract:
Background: Recent clinical trials of novel systemic therapies showed improved survival of patients with metastatic esophageal cancer (EC) and gastric cancer (GC). Survival improvements observed in clinical trials might be unrepresentative of the total population, as the percentage of patients who participate in clinical trials is limited and more than half of all patients receive best supportive care (BSC). The aim of our study is to assess the best-case, typical, and worst-case survival scenarios in patients with metastatic esophagogastric cancer. Methods: We selected patients with metastatic EC (including junction) or GC diagnosed in 2006-2019 from the Netherlands Cancer Registry. Survival scenarios were calculated based on percentiles of the survival curve stratified by tumor location and treatment (tumor-directed therapy or BSC). Survival scenarios were calculated for the 10th (best-case), 25th (upper-typical), 75th (lower-typical), and 90th (worst-case) percentiles. Linear trend analysis was performed to test whether changes in survival over the diagnosis years were significant. Results: We identified 12,739 patients with EC and 6,833 patients with GC. The percentage of patients receiving tumor-directed therapy increased from 34% to 47% and from 30% to 45% for patients with EC and GC, respectively. The median survival remained unchanged for patients with EC (5.0 months) and improved slightly for patients with GC (3.1 to 3.7 months; P=0.006). For patients with EC, survival of the best-case scenario improved (17.4 to 22.8 months; P=0.001), whereas the other scenarios remained unchanged: upper-typical 11.2 to 11.7 (P=0.11), lower-typical 2.1 to 2.0 (P=0.10), and worst-case 0.9 to 0.8 months (P=0.22). For patients with GC, survival improved for the best-case (13.1 to 19.5; P=0.005) and upper-typical scenarios (6.7 to 10.6 months; P=0.002), whereas the lower-typical (1.2 to 1.4 months; P=0.87) and worst-case (0.6 to 0.6 months; P=0.60) remained unchanged.
For patients with EC receiving tumor-directed therapy, survival in all scenarios remained unchanged, while for patients receiving BSC survival decreased: best-case 11.8 to 9.8 (P=0.005), upper-typical 6.0 to 5.0 (P=0.002), lower-typical 1.4 to 1.0 (P=0.003), and worst-case 0.7 to 0.5 months (P=0.03). For patients with GC receiving tumor-directed therapy, survival improved for all scenarios: best-case 19.8 to 30.4 (P=0.005), upper-typical 6.4 to 10.3 (P=0.002), lower-typical 3.6 to 5.4 (P<0.001), and worst-case 1.4 to 2.6 months (P<0.001); for patients receiving BSC, survival in all scenarios remained unchanged. Conclusions: The proportion of patients with EC and GC receiving tumor-directed therapy increased over time. Although survival improvements were not observed across all scenarios, an increase in survival was observed in certain subgroups of patients.
APA, Harvard, Vancouver, ISO, and other styles
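The percentile-based scenario method described in the abstract above is straightforward to illustrate. The sketch below uses synthetic, uncensored survival times (the study itself works from registry survival curves, which require censoring-aware estimators such as Kaplan-Meier); the cohort and its exponential model are purely hypothetical:

```python
import numpy as np

# Hypothetical cohort of uncensored survival times in months (illustration only).
rng = np.random.default_rng(42)
survival_months = rng.exponential(scale=5.0, size=10_000)

# The "10th percentile of the survival curve" is the time still reached by the
# longest-surviving 10% of patients, i.e. the 90th percentile of survival times.
scenarios = {
    "best-case":     np.percentile(survival_months, 90),  # 10th pct of the curve
    "upper-typical": np.percentile(survival_months, 75),  # 25th pct of the curve
    "median":        np.percentile(survival_months, 50),
    "lower-typical": np.percentile(survival_months, 25),  # 75th pct of the curve
    "worst-case":    np.percentile(survival_months, 10),  # 90th pct of the curve
}
for name, months in scenarios.items():
    print(f"{name}: {months:.1f} months")
```

Reporting several percentiles instead of the median alone is what lets the study distinguish improvements for long-term survivors from stagnation in the worst-case scenario.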
7

Liu, S. C., S. J. Hu, and T. C. Woo. "Tolerance Analysis for Sheet Metal Assemblies." Journal of Mechanical Design 118, no. 1 (March 1, 1996): 62–67. http://dx.doi.org/10.1115/1.2826857.

Full text
Abstract:
Traditional tolerance analyses such as the worst case methods and the statistical methods are applicable to rigid body assemblies. However, for flexible sheet metal assemblies, the traditional methods are not adequate: the components can deform, changing the dimensions during assembly. This paper evaluates the effects of deformation on component tolerances using linear mechanics. Two basic configurations, assembly in series and assembly in parallel, are investigated using analytical methods. Assembly sequences and multiple joints beyond the basic configurations are further examined using numerical methods (with finite element analysis). These findings constitute a new methodology for the tolerancing of deformable parts.
APA, Harvard, Vancouver, ISO, and other styles
8

Söderlund, Ellinor Susanne, and Natalia B. Stambulova. "In a Football Bubble and Beyond." Scandinavian Journal of Sport and Exercise Psychology 3 (June 14, 2021): 13–23. http://dx.doi.org/10.7146/sjsep.v3i.121756.

Full text
Abstract:
The objectives of this study were: (1) to explore cultural transition pathways of Swedish professional football players relocated to another European country, and (2) to identify shared themes in their transition narratives. We interviewed three professional players who in their early twenties relocated to Italy, Turkey, and Switzerland, and then analyzed their stories using holistic and categorical analyses following the narrative oriented inquiry (NOI) model (Hiles & Čermák, 2008). The holistic analysis resulted in three core narratives (i.e., re-tellings of the participants’ stories) entitled: Preparing for the worst-case scenario and saved by dedication to football; Showing interest in the host culture and carrying responsibility as a foreign player; and A step for personal development: from homesickness to being hungry for more. The categorical analysis resulted in 12 shared themes from the players’ stories arranged around the three phases of the cultural transition model (Ryba et al., 2016). In the pre-transition phase, all the participants were established players searching for new professional opportunities. In the acute cultural adaptation phase, they all prioritized adjustment in football (e.g., fitting into the team, performing). In the socio-cultural adaptation phase, they broadened their perspectives and realized that finding a meaningful life outside of football was just as important for functioning and feeling satisfied as football success.
APA, Harvard, Vancouver, ISO, and other styles
9

Xu, Chenyang, and Benjamin Moseley. "Learning-Augmented Algorithms for Online Steiner Tree." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 8 (June 28, 2022): 8744–52. http://dx.doi.org/10.1609/aaai.v36i8.20854.

Full text
Abstract:
This paper considers the recently popular beyond-worst-case algorithm analysis model which integrates machine-learned predictions with online algorithm design. We consider the online Steiner tree problem in this model for both directed and undirected graphs. Steiner tree is known to have strong lower bounds in the online setting, and any algorithm’s worst-case guarantee is far from desirable. This paper considers algorithms that predict which terminals arrive online. The predictions may be incorrect, and the algorithms’ performance is parameterized by the number of incorrectly predicted terminals. These guarantees ensure that algorithms break through the online lower bounds with good predictions and that the competitive ratio gracefully degrades as the prediction error grows. We then observe that the theory is predictive of what occurs empirically: on graphs where terminals are drawn from a distribution, the new online algorithms have strong performance even with modestly correct predictions.
APA, Harvard, Vancouver, ISO, and other styles
10

Lucarelli, Giorgio, Benjamin Moseley, Nguyen Kim Thang, Abhinav Srivastav, and Denis Trystram. "Online Non-preemptive Scheduling on Unrelated Machines with Rejections." ACM Transactions on Parallel Computing 8, no. 2 (June 30, 2021): 1–22. http://dx.doi.org/10.1145/3460880.

Full text
Abstract:
When a computer system schedules jobs there is typically a significant cost associated with preempting a job during execution. This cost can be incurred from the expensive task of saving the memory’s state or from loading data into and out of memory. Thus, it is desirable to schedule jobs non-preemptively to avoid the costs of preemption. There is a need for non-preemptive system schedulers for desktops, servers, and data centers. Despite this need, there is a gap between theory and practice. Indeed, few non-preemptive online schedulers are known to have strong theoretical guarantees. This gap is likely due to strong lower bounds on any online algorithm for popular objectives. Indeed, typical worst-case analysis approaches, and even resource-augmented approaches such as speed augmentation, result in all algorithms having poor performance guarantees. This article considers online non-preemptive scheduling problems in the worst-case rejection model where the algorithm is allowed to reject a small fraction of jobs. By rejecting only a few jobs, this article shows that the strong lower bounds can be circumvented. This approach can be used to discover algorithmic scheduling policies with desirable worst-case guarantees. Specifically, the article presents algorithms for the following three objectives: minimizing the total flow-time, minimizing the total weighted flow-time plus energy where energy is a convex function, and minimizing the total energy under the deadline constraints. The algorithms for the first two problems have a small constant competitive ratio while rejecting only a constant fraction of jobs. For the last problem, we present a constant competitive ratio without rejection. Beyond specific results, the article asserts that alternative models beyond speed augmentation should be explored to aid in the discovery of good schedulers in the face of the requirement of being online and non-preemptive.
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Beyond worst-case analysis"

1

Etscheid, Michael [Author]. "Beyond Worst-Case Analysis of Max-Cut and Local Search / Michael Etscheid." Bonn: Universitäts- und Landesbibliothek Bonn, 2018. http://d-nb.info/1167857003/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Beyond worst-case analysis"

1

Gkaragkounis, Emmanouil Vasileios Vlatakis. Beyond Worst-Case Analysis of Optimization in the Era of Machine Learning. [New York, N.Y.?]: [publisher not identified], 2022.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Roughgarden, Tim, and Daniel A. Spielman. Beyond Worst-Case Analysis. University of Cambridge ESOL Examinations, 2021.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Roughgarden, Tim, ed. Beyond the Worst-Case Analysis of Algorithms. Cambridge University Press, 2020. http://dx.doi.org/10.1017/9781108637435.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Roughgarden, Tim. Beyond the Worst-Case Analysis of Algorithms. Cambridge University Press, 2020.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Beyond worst-case analysis"

1

Babaioff, Moshe, Ruty Mundel, and Noam Nisan. "Beyond Pigouvian Taxes: A Worst Case Analysis." In Web and Internet Economics, 226–43. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-94676-0_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Thakurta, Abhradeep. "Beyond Worst Case Sensitivity in Private Data Analysis." In Encyclopedia of Algorithms, 192–99. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4939-2864-4_547.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Thakurta, Abhradeep. "Beyond Worst Case Sensitivity in Private Data Analysis." In Encyclopedia of Algorithms, 1–8. Boston, MA: Springer US, 2014. http://dx.doi.org/10.1007/978-3-642-27848-8_547-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Kamali, Shahin, and Helen Xu. "Beyond Worst-case Analysis of Multicore Caching Strategies." In Symposium on Algorithmic Principles of Computer Systems (APOCS), 1–15. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2021. http://dx.doi.org/10.1137/1.9781611976489.1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Xia, Lirong, and Weiqiang Zheng. "Beyond the Worst Case: Semi-random Complexity Analysis of Winner Determination." In Web and Internet Economics, 330–47. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-22832-2_19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Reghenzani, Federico. "Beyond the Traditional Analyses and Resource Management in Real-Time Systems." In Special Topics in Information Technology, 67–77. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-85918-3_6.

Full text
Abstract:
The difficulties in estimating the Worst-Case Execution Time (WCET) of applications limit the use of modern computing architectures in real-time systems. Critical embedded systems require the tasks of hard real-time applications to meet their deadlines, and formal proofs of the validity of this condition are usually required by certification authorities. In the last decade, researchers proposed the use of probabilistic measurement-based methods to estimate the WCET instead of traditional static methods. In this chapter, we summarize recent theoretical and quantitative results on the use of probabilistic approaches to estimate the WCET presented in the PhD thesis of the author, including possible exploitation scenarios, open challenges, and future directions.
APA, Harvard, Vancouver, ISO, and other styles
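As a rough illustration of the measurement-based probabilistic approach summarized above, the sketch below fits a Gumbel distribution to block maxima of synthetic execution-time measurements and reads off a high quantile as a pWCET estimate. The trace, block size, and moment-based fit are all assumptions made here for illustration; production MBPTA methods use more careful fitting and independence/representativity checks:

```python
import numpy as np

# Synthetic execution-time trace in microseconds (illustration only).
rng = np.random.default_rng(1)
exec_times = 100 + rng.gamma(shape=2.0, scale=3.0, size=50_000)

# Block maxima: split the trace into blocks and keep each block's maximum.
block = 100
maxima = exec_times[: len(exec_times) // block * block].reshape(-1, block).max(axis=1)

# Method-of-moments Gumbel fit: beta = std*sqrt(6)/pi, mu = mean - gamma*beta.
euler_gamma = 0.5772156649
beta = maxima.std() * np.sqrt(6) / np.pi
mu = maxima.mean() - euler_gamma * beta

def pwcet(p):
    """Execution time exceeded by a block with probability p (Gumbel quantile)."""
    return mu - beta * np.log(-np.log(1 - p))

print(f"pWCET at 1e-6 exceedance probability: {pwcet(1e-6):.1f} us")
```

The estimate extrapolates beyond the largest observed measurement, which is exactly why the validity conditions of extreme-value theory matter so much in this line of work.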
7

Karlin, Anna R., and Elias Koutsoupias. "Beyond Competitive Analysis." In Beyond the Worst-Case Analysis of Algorithms, 529–46. Cambridge University Press, 2020. http://dx.doi.org/10.1017/9781108637435.031.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Roughgarden, Tim. "Distributional Analysis." In Beyond the Worst-Case Analysis of Algorithms, 167–88. Cambridge University Press, 2020. http://dx.doi.org/10.1017/9781108637435.012.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Manthey, Bodo. "Smoothed Analysis of Local Search." In Beyond the Worst-Case Analysis of Algorithms, 285–308. Cambridge University Press, 2020. http://dx.doi.org/10.1017/9781108637435.018.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Barbay, Jérémy. "From Adaptive Analysis to Instance Optimality." In Beyond the Worst-Case Analysis of Algorithms, 52–71. Cambridge University Press, 2020. http://dx.doi.org/10.1017/9781108637435.005.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Beyond worst-case analysis"

1

Vishkin, Uzi. "Beyond worst-case analysis." In PPoPP '22: 27th ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3528425.3529105.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ngo, Hung Q., Dung T. Nguyen, Christopher Re, and Atri Rudra. "Beyond worst-case analysis for joins with minesweeper." In SIGMOD/PODS'14: International Conference on Management of Data. New York, NY, USA: ACM, 2014. http://dx.doi.org/10.1145/2594538.2594547.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ergür, Alperen, Josué Tonelli-Cueto, and Elias Tsigaridas. "Beyond Worst-Case Analysis for Root Isolation Algorithms." In ISSAC '22: International Symposium on Symbolic and Algebraic Computation. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3476446.3535475.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hardt, Moritz, and Aaron Roth. "Beyond worst-case analysis in private singular vector computation." In the 45th annual ACM symposium. New York, New York, USA: ACM Press, 2013. http://dx.doi.org/10.1145/2488608.2488650.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Iqbal, Javeria, Iftikhar Ahmad, and Asadullah Shah. "Disparity between Theory & Practice: Beyond the Worst-Case Competitive Analysis." In 2018 IEEE 5th International Conference on Engineering Technologies and Applied Sciences (ICETAS). IEEE, 2018. http://dx.doi.org/10.1109/icetas.2018.8629111.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Dereziński, Michał, Rajiv Khanna, and Michael W. Mahoney. "Improved Guarantees and a Multiple-descent Curve for Column Subset Selection and the Nystrom Method (Extended Abstract)." In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/647.

Full text
Abstract:
The Column Subset Selection Problem (CSSP) and the Nystrom method are among the leading tools for constructing interpretable low-rank approximations of large datasets by selecting a small but representative set of features or instances. A fundamental question in this area is: what is the cost of this interpretability, i.e., how well can a data subset of size k compete with the best rank k approximation? We develop techniques which exploit spectral properties of the data matrix to obtain improved approximation guarantees which go beyond the standard worst-case analysis. Our approach leads to significantly better bounds for datasets with known rates of singular value decay, e.g., polynomial or exponential decay. Our analysis also reveals an intriguing phenomenon: the cost of interpretability as a function of k may exhibit multiple peaks and valleys, which we call a multiple-descent curve. A lower bound we establish shows that this behavior is not an artifact of our analysis, but rather it is an inherent property of the CSSP and Nystrom tasks. Finally, using the example of a radial basis function (RBF) kernel, we show that both our improved bounds and the multiple-descent curve can be observed on real datasets simply by varying the RBF parameter.
APA, Harvard, Vancouver, ISO, and other styles
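The "cost of interpretability" question posed in the abstract above can be checked numerically on a small matrix. In this sketch the k columns are chosen uniformly at random purely for illustration (the cited work analyzes much stronger selection schemes); by the Eckart-Young theorem the SVD error lower-bounds any rank-k projection:

```python
import numpy as np

# Random matrix with a decaying spectrum (hypothetical data, illustration only).
rng = np.random.default_rng(7)
A = rng.standard_normal((60, 40)) @ np.diag(0.8 ** np.arange(40))

k = 8
# Best rank-k approximation from the truncated SVD.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
best_rank_k = (U[:, :k] * s[:k]) @ Vt[:k]
err_svd = np.linalg.norm(A - best_rank_k)

# CSSP-style approximation: project A onto the span of k selected columns.
cols = rng.choice(A.shape[1], size=k, replace=False)
C = A[:, cols]
proj = C @ np.linalg.pinv(C) @ A
err_cssp = np.linalg.norm(A - proj)

# The SVD error is optimal, so the column-subset error can only be larger.
assert err_cssp >= err_svd - 1e-9
print(f"rank-{k} SVD error: {err_svd:.3f}, column-subset error: {err_cssp:.3f}")
```

The ratio of the two errors as a function of k is the quantity whose peaks and valleys the cited paper names a multiple-descent curve.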
7

Seevam, Patricia, Julia Race, Martin Downie, Julian Barnett, and Russell Cooper. "Capturing Carbon Dioxide: The Feasibility of Re-Using Existing Pipeline Infrastructure to Transport Anthropogenic CO2." In 2010 8th International Pipeline Conference. ASMEDC, 2010. http://dx.doi.org/10.1115/ipc2010-31564.

Full text
Abstract:
Climate change has been attributed to greenhouse gases, with carbon dioxide (CO2) being the main contributor. Sixty to seventy percent of carbon dioxide emissions originate from fossil fuel power plants. Power companies in the UK, along with oil and gas field operators, are proposing to capture this anthropogenic CO2 and either store it in depleted reservoirs or saline aquifers (carbon capture and storage, CCS), or use it for ‘Enhanced Oil Recovery’ (EOR) in depleting oil and gas fields. This would involve extensive onshore and offshore pipeline systems. The decline of oil and gas production of reservoirs beyond economic feasibility will require the decommissioning of onshore and offshore facilities post-production. This creates a possible opportunity for re-using existing pipeline infrastructure. Conversions of pipelines from natural gas service to CO2 service for EOR have been done in the United States. However, the differing sources of CO2 and the differing requirements for EOR and CCS play a significant part in allowing the re-use of existing infrastructure. The effect of compositions, the phase of transportation, the original pipeline specifications, and also the pipeline route require major studies prior to allowing re-use. This paper will first review the requirements for specifying the purity of the CO2 for CCS and highlight the implications that the presence of impurities and the current water specifications for pipelines have on the phase diagram and the associated physical properties of the CO2 stream. A ‘best’ and a ‘worst’ case impurity specification will be identified. Then an analysis of the impact, and subsequent validation, of equations of state on the phase modelling of anthropogenic CO2, based on available experimental data, is presented. A case study involving an existing 300 km gas pipeline in the National Transmission System (NTS) in the UK is then modelled to demonstrate the feasibility of using this pipeline to transport anthropogenic CO2.
The various issues involved for the selected ‘best’ and ‘worst’ case specifications are also covered. This is followed by an investigation of the options for transport in the ‘gas’ and ‘supercritical’ phases, and an identification of the limitations on re-using pipeline infrastructure for CCS.
APA, Harvard, Vancouver, ISO, and other styles
8

Lugner, Robert, Maximilian Inderst, Gerald Sequeira, Kilian Schneider, and Thomas Brandmeier. "Collision Prediction for Irreversible Pre-Crash Safety Measures." In FISITA World Congress 2021. FISITA, 2021. http://dx.doi.org/10.46720/f2020-pif-033.

Full text
Abstract:
The precise and reliable prediction of vehicle movements based on information from environmental sensors such as radar, camera or LiDAR is an essential constituent of future pre-crash safety functions triggering irreversible actuators. These predictions require novel motion models that go beyond the state of the art and the comparatively low requirements regarding accuracy in today's AEB systems. Because of this, methods for existing driver assistance systems are not suitable for the prediction of crash parameters in inevitable crash situations and the subsequent activation of irreversible systems with ASIL D classification like prospective smart airbags. The proposed highly accurate inevitability model allows the integration of a specially adapted motion model for worst-case assessment within the physical limits. The focus is on integration into pre-crash functionality with near-field environment models in order to provide reliable information on the expected crash constellation to subsequent crash severity estimation. An adapted single-track model describes the nonholonomic vehicle behavior. Physically possible trajectories for crash avoidance are calculated using the Kamm circle as the limitation for the vehicle accelerations. This model is modified to minimize the dependency on the unavailable and hence only roughly estimated vehicle parameters of the collision partner. Innovative concepts for crash severity estimation require detailed information on the expected collision parameters (location, angle, velocities) as well as the estimated times to the inevitable collision. Depending on the application and the traffic scenario, this information can then be categorized into collision classes. These classes can be defined according to standard crash test scenarios (ODB40, AZT, etc.). Particularly interesting examples for the inevitability model are critical traffic situations with crossing vehicles.
This paper focuses on the effects of different scenario constellations in the urban to rural speed range, on the inevitability model, and the subsequent crash severity prediction. Therefore, heat maps are presented as a suitable tool for the evaluation of the scenarios. They provide a simple illustration of how critical the unavoidable collision is as well as the functional requirements of pre-crash systems in different situations. The proposed model is the groundwork for further developments and investigations towards a holistic methodology for pre-crash safety systems. The prototype application in a CARISSMA research vehicle is currently in preparation. In addition, the presented methodology will also be extended to investigations of sensor requirements and system tolerances. This enables the comprehensive analysis of sensor systems and algorithms, which are essential components of future irreversible safety systems.
APA, Harvard, Vancouver, ISO, and other styles
9

Wilkowski, G., B. Brust, T. Zhang, G. Hattery, S. Kalyanam, D. J. Shim, E. Kurth, et al. "Robust LBB Analyses for Atucha II Nuclear Plant." In ASME 2011 Pressure Vessels and Piping Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/pvp2011-57939.

Full text
Abstract:
The Atucha II nuclear power plant is a unique pressurized heavy water reactor being constructed in Argentina. The original plant design was by KWU in the 1970s using the then-German methodology of break preclusion, which assumed that the largest break-opening area would be 10 percent of the cross-sectional area of the largest pipe diameter. That philosophy was used for the design of the emergency core cooling system in the 1970s. The plant construction was halted for several decades, but a recent need for power was the driver for restarting the construction. The construction is progressing with initial start-up in 2011. Since 10 percent of the cross-sectional area is a smaller ECCS design requirement than the normally assumed double-ended guillotine break, the safety evaluation of the plant for beyond-design-basis seismic loading was a regulatory requirement. This overview paper describes a Robust LBB Evaluation that was conducted in great detail to assess the safety aspects of the piping system under beyond-design-basis seismic loading and the implications to the ECCS. Key aspects involved:
• Static and dynamic material property testing,
• Determination of weld residual stresses,
• Determination of crack sizes that might evolve by worst-case SCC growth rates under weld residual stresses and normal operating stresses,
• Determination of leakage rates as a function of time with the upper-bounding crack growth rates,
• Development of seismic hazard curves for the site,
• Development of FE models of the containment building and primary NSSS system within the building,
• Determination of normal operating stresses, SSE stresses, and 10⁻⁶ seismic stresses using worst-case soil foundation assumptions,
• Evaluation of flaw behavior for circumferential cracks using the shapes from the natural crack growth,
• Evaluation of margins on the critical flaw size and times to leakage, and
• Standard LBB analyses, as well as Transition Break Size evaluations.
The key result from this effort was that even with all the normal operating plus 10⁻⁶ seismic event loading, the pipe system behaved more like it was displacement-controlled than load-controlled. The displacement-controlled behavior made the pipe much more flaw tolerant, and it was found that a DEGB was not possible because the flaw could never reach the critical flaw size without greatly surpassing the leakage and water make-up capacity of the plant. Since there are many details in this multi-year effort, only the key points are summarized in this paper, while other details will be the topics of other papers.
APA, Harvard, Vancouver, ISO, and other styles