Academic literature on the topic 'Best-Estimate code'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Best-Estimate code.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Best-Estimate code"

1. Prošek, Andrej, and Borut Mavko. "RELAP5/MOD3.3 Best Estimate Analyses for Human Reliability Analysis." Science and Technology of Nuclear Installations 2010 (2010): 1–12. http://dx.doi.org/10.1155/2010/797193.

Abstract:
In conventional probabilistic safety assessment (PSA), a conservative approach was used to estimate the success criteria time windows of operator actions. The current PSA standard recommends the use of best-estimate codes. The purpose of the study was to estimate the operator action success criteria time windows in scenarios in which human actions supplement safety system actuations, as needed for an updated human reliability analysis (HRA). The RELAP5/MOD3.3 best-estimate thermal-hydraulic computer code and a qualified RELAP5 input model representing a Westinghouse-type two-loop pressurized water reactor were used for the calculations. The results of the deterministic safety analysis were examined to determine the latest time at which the operator action can be performed while still satisfying the safety criteria. The results showed that uncertainty analysis of the realistic calculation is generally not needed for human reliability analysis when additional time is available and/or the event is not a significant contributor to the risk.
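The core computation this abstract describes, finding the latest time an operator action can be performed while still satisfying a safety criterion, can be sketched as a bisection over the action time. This is an illustrative sketch only: `peak_clad_temp` is a hypothetical monotone surrogate standing in for a full RELAP5/MOD3.3 run, and the 1204 °C limit is assumed for the example.

```python
def latest_action_time(peak_clad_temp, t_lo, t_hi, limit, tol=1.0):
    """Bisect for the latest operator-action time t such that the safety
    criterion peak_clad_temp(t) <= limit still holds. Assumes the criterion
    is met at t_lo and violated at t_hi."""
    assert peak_clad_temp(t_lo) <= limit and peak_clad_temp(t_hi) > limit
    while t_hi - t_lo > tol:
        mid = 0.5 * (t_lo + t_hi)
        if peak_clad_temp(mid) <= limit:
            t_lo = mid          # criterion still satisfied: try acting later
        else:
            t_hi = mid          # criterion violated: pull the action earlier
    return t_lo

# Toy surrogate in place of a thermal-hydraulic code run: peak cladding
# temperature grows linearly once the action is delayed past 600 s.
pct = lambda t: 900.0 + 2.0 * max(0.0, t - 600.0)
t_max = latest_action_time(pct, 0.0, 3600.0, limit=1204.0)
```

In a real analysis each `peak_clad_temp` evaluation is a full transient calculation, so the logarithmic number of bisection steps is what makes the search affordable.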
2. Gonfiotti, Bruno, Michela Angelucci, Bradut-Eugen Ghidersa, Xue Zhou Jin, Mihaela Ionescu-Bujor, Sandro Paci, and Robert Stieglitz. "Best-Estimate for System Codes (BeSYC): A New Software to Perform Best-Estimate Plus Uncertainty Analyses with Thermal-Hydraulic and Safety System Codes for Both Fusion and Fission Scenarios." Applied Sciences 12, no. 1 (December 29, 2021): 311. http://dx.doi.org/10.3390/app12010311.

Abstract:
The development and validation of old and new software under relevant DEMO reactor conditions have been pursued in recent years within the EUROfusion Consortium. The aim was to use, where possible, software already validated for fission reactors and to fill the gaps with new ad hoc software. As a contribution to this effort, the Karlsruhe Institute of Technology (KIT) developed and tested a novel software tool to apply the Best-Estimate Model Calibration and Prediction through Experimental Data Assimilation methodology to the system codes RELAP5-3D, MELCOR 1.8.6, and MELCOR 2.2. This software, called Best-estimate for SYstem Codes (BeSYC), is developed as a MATLAB app. The application is in charge of applying the mathematical framework of the methodology, writing and executing the code runs the methodology requires, and printing the obtained results. The main goal of BeSYC is to wrap the methodology in software usable by any user through a simple graphical user interface. Albeit developed in the fusion research context, BeSYC can be applied to any reactor/scenario type supported by the specific system code. The goals of BeSYC, the mathematical framework, the main characteristics, and the performed verification and validation activities are described in this paper.
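The calibration step at the heart of such a data-assimilation methodology is, in its linearized form, a covariance-weighted least-squares update of the code parameters from experimental data. The sketch below is a generic illustration of that update, not BeSYC's actual implementation; the function name and toy numbers are assumptions.

```python
import numpy as np

def calibrate(alpha0, C_alpha, S, y_meas, y_calc, C_meas):
    """One linearized data-assimilation update of model parameters.

    alpha0 : prior best-estimate parameters, shape (n,)
    C_alpha: prior parameter covariance, shape (n, n)
    S      : sensitivity matrix d y / d alpha at alpha0, shape (m, n)
    y_meas : experimentally measured responses, shape (m,)
    y_calc : code-calculated responses at alpha0, shape (m,)
    C_meas : measurement covariance, shape (m, m)
    Returns the calibrated parameters and their reduced covariance.
    """
    K = C_alpha @ S.T @ np.linalg.inv(S @ C_alpha @ S.T + C_meas)
    alpha_post = alpha0 + K @ (y_meas - y_calc)
    C_post = (np.eye(len(alpha0)) - K @ S) @ C_alpha
    return alpha_post, C_post

# Toy example: one parameter, one measured response, model y = 2*alpha.
a, C = calibrate(np.array([1.0]), np.array([[0.25]]),
                 np.array([[2.0]]), np.array([2.4]),
                 np.array([2.0]), np.array([[0.04]]))
```

The update pulls the parameter toward the value that reproduces the measurement, and the posterior covariance is always smaller than the prior, which is the "calibration" the methodology's name refers to.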
3. Takeuchi, K., and M. Y. Young. "Assessment of flooding in a best estimate thermal hydraulic code (WCOBRA/TRAC)." Nuclear Engineering and Design 186, no. 1-2 (November 1998): 225–55. http://dx.doi.org/10.1016/s0029-5493(98)00224-6.
4. Cross, A. W., D. P. DiVincenzo, and B. M. Terhal. "A comparative code study for quantum fault tolerance." Quantum Information and Computation 9, no. 7&8 (July 2009): 541–72. http://dx.doi.org/10.26421/qic9.7-8-1.

Abstract:
We study a comprehensive list of quantum codes as candidates for codes used at the physical level in a fault-tolerant code architecture. Using the Aliferis-Gottesman-Preskill (AGP) ex-Rec method we calculate the pseudo-threshold for these codes against depolarizing noise at various levels of overhead. We estimate the logical noise rate as a function of overhead at a physical error rate of $p_0=1 \times 10^{-4}$. The Bacon-Shor codes and the Golay code are the best performers in our study.
5. Furuya, Masahiro, Yoshihisa Nishi, and Nobuyuki Ueda. "S083034 Predictability of Best-Estimate Code, TRACE for Flashing-Induced Density Wave Oscillations." Proceedings of Mechanical Engineering Congress, Japan 2012 (2012): _S083034–1—_S083034–5. http://dx.doi.org/10.1299/jsmemecj.2012._s083034-1.
6. Carlos, S., F. Sanchez-Saez, and S. Martorell. "Use of TRACE best estimate code to analyze spent fuel storage pools safety." Progress in Nuclear Energy 77 (November 2014): 224–38. http://dx.doi.org/10.1016/j.pnucene.2014.07.008.
7. Vinai, Paolo, Rafael Macian-Juan, and Rakesh Chawla. "A statistical methodology for quantification of uncertainty in best estimate code physical models." Annals of Nuclear Energy 34, no. 8 (August 2007): 628–40. http://dx.doi.org/10.1016/j.anucene.2007.03.003.
8. Petruzzi, Alessandro, Francesco D'Auria, Tomislav Bajs, Francesc Reventos, and Yassin Hassan. "International Course to Support Nuclear Licensing by User Training in the Areas of Scaling, Uncertainty, and 3D Thermal-Hydraulics/Neutron-Kinetics Coupled Codes: 3D S.UN.COP Seminars." Science and Technology of Nuclear Installations 2008 (2008): 1–16. http://dx.doi.org/10.1155/2008/874023.

Abstract:
Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers, vendors, and research organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the “user effect” and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification represent an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of user effect. In addition, this paper presents the organization and the main features of the 3D S.UN.COP (scaling, uncertainty, and 3D coupled code calculations) seminars during which particular emphasis is given to the areas of the scaling, uncertainty, and 3D coupled code analysis.
9. Vileiniskis, V., and A. Kaliatka. "Best estimate analysis of PHEBUS FPT1 experiment bundle phase using ASTEC code ICARE module." Kerntechnik 76, no. 4 (August 2011): 254–60. http://dx.doi.org/10.3139/124.110158.
10. Bousbia-Salah, Anis, and Francesco D'Auria. "Use of coupled code technique for Best Estimate safety analysis of nuclear power plants." Progress in Nuclear Energy 49, no. 1 (January 2007): 1–13. http://dx.doi.org/10.1016/j.pnucene.2006.10.002.

Dissertations / Theses on the topic "Best-Estimate code"

1. Bersano, Andrea. "Analysis of natural circulation and passive systems phenomenology in nuclear plants." Doctoral thesis, Politecnico di Torino, 2020. http://hdl.handle.net/11583/2829632.
2. Pericas, Raimon. "Contribution to the validation of best estimate plus uncertainties coupled codes for the analysis of NK-TH nuclear transients." Doctoral thesis, Universitat Politècnica de Catalunya, 2015. http://hdl.handle.net/10803/308137.

Abstract:
The calculations supporting the operating licenses of nuclear reactors are usually performed with conservative methods. This conservatism, often excessive, limits the industry's real capacity to increase energy production from nuclear power plants. Best-estimate calculations are currently the most advanced tool for the study and analysis of hypothetical accident scenarios. This technique is superior to the older methodology, in which safety margins were established by experts using very conservative operating hypotheses and assumptions. The best-estimate-plus-uncertainties methodology can provide a solution in terms of increased nuclear energy production without compromising safety margins. This thesis presents a comparison between the best-estimate-plus-uncertainties methodology and the traditional conservative methodology in the field of coupled three-dimensional neutron-kinetic and thermal-hydraulic calculations. Within the framework of safety analysis using system codes, coupled three-dimensional neutron-kinetic/thermal-hydraulic calculations are also the most advanced tools, and they are particularly suitable for transients involving asymmetric boundary conditions and return-to-criticality scenarios. The best-estimate-plus-uncertainties methodology has been applied successfully for the first time within the framework of the present study. This group of new methods requires a new set of calculation tools as well as criteria for their validation. The thesis analyzes the existing tools and improves some of them to allow more accurate and reliable use. The scenarios of interest are those that require coupling between three-dimensional neutron-kinetics codes and thermal-hydraulics codes. The first improvement is a methodology for creating a cross-section library valid at any point in the life cycle of the reactor studied. The second improvement establishes an interface between the control equations and the movement of the neutron-absorbing rods. The analysis of a main-steam-line-break scenario in a pressurized water reactor nuclear power plant, included in this thesis, exercises the generated logic and applies it to cases close to the future innovative licensing calculations.
3. Basualdo Perelló, Joaquín Rubén. "Development of a Coupled Neutronics/Thermal-Hydraulics/Fuel Thermo-Mechanics Multiphysics Tool for Best-Estimate PWR Core Simulations." Doctoral thesis, supervised by R. Stieglitz. Karlsruhe: KIT-Bibliothek, 2020. http://d-nb.info/1220359068/34.

Book chapters on the topic "Best-Estimate code"

1. Lee, Myeong-Soo, In-Yong Seo, Yo-Han Kim, Yong-Kwan Lee, and Jae-Seung Suh. "Transient Test of a NSSS Thermal-Hydraulic Module for the Nuclear Power Plant Simulator Using a Best-Estimate Code, RETRAN." In Lecture Notes in Computer Science, 529–35. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/978-3-540-30585-9_59.
2. Aggarwal, Ritu, and Suneet Kumar. "Missing Value Imputation and Estimation Methods for Arrhythmia Feature Selection Classification Using Machine Learning Algorithms." In Machine Learning Methods for Engineering Application Development, 145–63. Bentham Science Publishers, 2022. http://dx.doi.org/10.2174/9879815079180122010013.

Abstract:
Electrocardiogram (ECG) signal analysis makes it difficult to classify cardiac arrhythmia using machine learning methods. ECG datasets normally come with multiple missing values, caused by faults or distortion. In data mining, missing-value imputation is the biggest task in data preprocessing: the medical datasets would be left incomplete if the missing values and incomplete cases were simply removed from the original database. To produce a good-quality dataset for better analysis of clinical trials, a suitable missing-value imputation method is used. In this paper, we explore different machine-learning techniques for computing missing values in the electrocardiogram dataset. To estimate the missing imputation values, the collected data contain feature dimensions with their attributes. The experiments to compute the missing values in the dataset are carried out using four feature-selection methods and imputation methods. Results are shown for combined features using IG (information gain) and GA (genetic algorithm) with different machine-learning classifiers: NB (naïve Bayes), KNN (k-nearest neighbor), MLP (multilayer perceptron), and RF (random forest). GA and IG are the most suitable methods for obtaining results on lower-dimensional datasets, as measured by RMSE (root mean square error), and they efficiently produce the best results for missing values. The four classifiers are used to analyze the impact of the imputation methods. The best results for missing rates of 10% to 40% are obtained by NB, at 0.657, 0.6541, 0.66, 0.657, and 0.657, as computed by RMSE, meaning that the error is efficiently reduced by the naïve Bayes classifier.
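As a minimal illustration of the kind of experiment this abstract describes, the sketch below imputes a missing value by column-mean substitution and scores it with RMSE. The data are toy values, not the arrhythmia dataset used in the chapter, and mean imputation is only the simplest baseline among the methods such studies compare.

```python
import numpy as np

def mean_impute(X):
    """Replace NaNs in each column with that column's observed mean."""
    X = X.copy()
    col_means = np.nanmean(X, axis=0)          # per-column mean, ignoring NaNs
    rows, cols = np.where(np.isnan(X))
    X[rows, cols] = np.take(col_means, cols)   # fill each NaN from its column
    return X

def rmse(a, b):
    """Root mean square error between imputed and true values."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Toy feature matrix with one simulated missing value (25% rate in column 1).
truth = np.array([[1.0, 10.0], [2.0, 12.0], [3.0, 14.0], [4.0, 16.0]])
X = truth.copy()
X[1, 1] = np.nan
imputed = mean_impute(X)
err = rmse(imputed[1, 1:2], truth[1, 1:2])     # error on the imputed cell only
```

Evaluating RMSE only on the cells that were actually imputed, as above, is what lets different imputation methods be ranked at a given missing rate.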
3. Woodruff, Todd E. "Optic Nerve Examination." In Glaucoma. Oxford University Press, 2012. http://dx.doi.org/10.1093/oso/9780199757084.003.0009.

Abstract:
• Small field of view but increased magnification
• Significant degradation of the view from media opacities or small pupil
• Lacks ability to perform stereoscopic examination
• Useful when slit-lamp exam not possible
• Most commonly used system
• Variable magnification and field of view
• Reasonable view through small pupil
• Good stereoscopic view
• Less degradation of view from media opacities than direct ophthalmoscope
• The magnification and stereopsis obtained with a slit-lamp system is generally superior to that of a handheld or headlamp-based system.
• Poor magnification, with wide field of view
• Least degradation of view from media opacities
• Value of stereopsis limited by poor magnification
• Fair to poor with small pupils
• May be useful for bedside exam or in the operating room
• The vertical diameter of the optic nerve can be easily estimated during a slit-lamp exam by the following method:
1. A thin slit beam is focused on the nerve through the lens of choice.
2. The vertical length of the beam is adjusted to match the height of the disc.
3. The length of the beam is read off the slit-lamp beam scale.
4. The scale reading is adjusted by the correction factor of the specific lens.
• This estimation is reasonably accurate, but it tends to underestimate disc size in high levels of myopia, and overestimate in high hyperopia.
• One can also use the scale projection of the direct ophthalmoscope to estimate nerve head size, with no correction factor needed. The small light cone of the direct ophthalmoscope subtends an angle of about 5 degrees, about the same size as an average optic disc, and can give a quick estimate of relative size.
• Stereo-photographs are the most helpful but are more difficult to obtain in a reproducible manner.
• Cameras with a prism-based fixed-angle method of taking simultaneous stereo-photos tend to produce more consistent results, but are more expensive.
• While non-mydriatic cameras exist, images suffer when the pupil is smaller than 4 mm, and dilation is commonly employed to obtain the best images.
4. Verschuur, Gerrit L. "Offering Odds on Impact." In Impact! Oxford University Press, 1996. http://dx.doi.org/10.1093/oso/9780195101058.003.0016.

Abstract:
There is no doubt that the earth continues to be struck by objects from space. Most of the impactors are very tiny, such as those that produce common meteor trails, and major collisions no longer happen very often. But if a large object, a half kilometer across say, were to strike our planet, the consequences would be devastating. In 1989 an asteroid large enough to bring civilization to the brink of total destruction missed earth by 6 hours and this close encounter in itself should be enough to give us food for thought. There will be other close shaves in the years to come, but no one can predict just when or how close. Only time will tell. Fortunately there are several dedicated groups of astronomers around the world searching for near-earth asteroids (NEAs) in order to catalog their existence and figure orbits lest any should be on a collision course. As a result of their efforts, crucial data are being obtained that will allow the probability of impact to be more accurately estimated, even if only in a statistical sense. The best anyone can do, or will ever be able to do, is to offer odds on the chance of collisions. Odds on comet impact, in the form of estimates of the period between such events, have been published for two centuries. Each generation no doubt felt that the latest estimates were superior to those that went before. For example, in 1861 James Watson, in A Popular Treatise on Comets, said that “it has been found by actual calculation, from the theory of probabilities, that if the nucleus of a comet having a diameter equal to only one fourth part of that of the earth...the probability of receiving a shock from it, is only one in two hundred and eighty-one millions.” This estimate was also quoted by Thomas Dick in 1840 who, in turn, credited it to Francois Arago for calculating this around 1800.
5. Glatt, Stephen J., Stephen V. Faraone, and Ming T. Tsuang. "How Common is Schizophrenia?" In Schizophrenia. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198813774.003.0010.

Abstract:
To this point, we have been providing consensus descriptions of schizophrenia, what it is and what it is not, and describing the means by which it is detected and diagnosed. In this and later chapters, we present the evidence about the causal factors, treatments, and outcomes of schizophrenia from scientific studies. Such studies sometimes find results that differ from each other due to differences in the methods used or the types of patients that are studied. Random differences in measurement between studies also lead to discrepancies, which is perfectly normal. How, then, can we come to firm conclusions in the presence of variable results from different studies? Our approach as scientists, and as authors trying to distil the facts, is to always rely on the preponderance of evidence, or the best estimate that can be made when putting all the evidence together. Thus, as we present the facts moving forward, we will base our claims on the largest studies available, since these usually give more reliable results than small studies. Whenever possible, we will present the results of analyses that put the results of other studies together using a formal statistical method called 'meta-analysis'. Thus, instead of comparing and contrasting the results from two or more studies, we will let the reader know the overall result found when all studies were pooled together. In some instances, however, it is instructive to compare and contrast studies because each study tells us something different and uniquely important, and we will point this out when doing so. In this chapter, we describe the epidemiology of schizophrenia. Epidemiology is a branch of science concerned with the distribution and determinants of illness in the population, and the transmission of illness within families. Two important epidemiologic measures of disease burden in society are prevalence and incidence. The prevalence of schizophrenia (i.e., the number of affected individuals in the population) has been estimated at least 60 times in 30 different countries. The prevalence estimates seen in these studies are very consistent, despite cultural differences between samples and the different methods used and timeframes sampled in the studies.
6. Lund, Steve P., and Larry V. Benson. "A comparison of western Great Basin paleoclimate records for the last 3000 yr: Evidence for multidecadal- to millennial-scale drought." In From Saline to Freshwater: The Diversity of Western Lakes in Space and Time, 183–99. Geological Society of America, 2021. http://dx.doi.org/10.1130/2019.2536(11).

Abstract:
This paper summarizes the hydrological variability in eastern California (central Sierra Nevada) for the past 3000 yr based on three distinct paleoclimate proxies, δ18O, total inorganic carbon (TIC), and magnetic susceptibility (chi). These proxies, which are recorded in lake sediments of Pyramid Lake and Walker Lake, Nevada, and Mono Lake and Owens Lake, California, indicate lake-level changes that are mostly due to variations in Sierra Nevada snowpack and rainfall. We evaluated lake-level changes in the four Great Basin lake systems with regard to sediment-core locations and lake-basin morphologies, to the extent that these two factors influence the paleoclimate proxy records. We documented the strengths and weaknesses of each proxy and argue that a systematic study of all three proxies together significantly enhances our ability to characterize the regional pattern, chronology, and resolution of hydrological variability. We used paleomagnetic secular variation (PSV) to develop paleomagnetic chronostratigraphies for all four lakes. We previously published PSV records for three of the lakes (Mono, Owens, Pyramid) and developed a new PSV record herein for Walker Lake. We show that our PSV chronostratigraphies are almost identical to previously established radiocarbon-based chronologies, but that there are differences of 20–200 yr in individual age records. In addition, we used eight of the PSV inclination features to provide isochrons that permit exacting correlations between lake records. We also evaluated the temporal resolution of our proxies. Most can document decadal-scale variability over the past 1000 yr, multidecadal-scale variability for the past 2000 yr, and centennial-scale variability between 2000 and 3000 yr ago. Comparisons among our proxies show a strong coherence in the pattern of lake-level variability for all four lakes. Pyramid Lake and Walker Lake have the longest and highest-resolution records. The δ18O and TIC records yield the same pattern of lake-level variability; however, TIC may allow a somewhat higher-frequency resolution. It is not clear, however, which proxy best estimates the absolute amplitude of lake-level variability. Chi is the only available proxy that records lake-level variability in all four lakes prior to 2000 yr ago, and it shows consistent evidence of a large multicentennial period of drought. TIC, chi, and δ18O are integrative proxies in that they display the cumulative record of hydrologic variability in each lake basin. Tree-ring estimations of hydrological variability, by contrast, are incremental proxies that estimate annual variability. We compared our integrated proxies with tree-ring incremental proxies and found a strong correspondence among the two groups of proxies if the tree-ring proxies are smoothed to decadal or multidecadal averages. Together, these results indicate a common pattern of wet/dry variability in California (Sierra Nevada snowpack/rainfall) extending from a few years (notable only in the tree-ring data) to perhaps 1000 yr. Notable hydrologic variability has occurred at all time scales and should continue into the future.

Conference papers on the topic "Best-Estimate code"

1. Kaliatka, Tadas, Eugenijus Ušpuras, and Virginijus Vileiniškis. "Best Estimate Analysis of PHEBUS FPT1 Test Using RELAP/SCDAPSIM Code." In 2012 20th International Conference on Nuclear Engineering and the ASME 2012 Power Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/icone20-power2012-54693.

Abstract:
The PHEBUS-FP program is an outstanding example of an international cooperative research program yielding valuable data for validating severe accident analysis computer codes. The main objective of the PHEBUS FPT1 experiment was to study the processes in an overheated reactor core, the release of fission products, and their subsequent transport and deposition under conditions representative of a severe accident in a pressurised water reactor. The FPT1 test can be divided into bundle degradation, aerosol, washing, and chemistry phases. The objective of this article is the best estimate analysis of the bundle degradation phase. The GRS (Germany) best estimate method, with the statistical tool SUSA used for uncertainty and sensitivity analysis of the calculation results, and the RELAP/SCDAPSIM code, designed to predict the behaviour of reactor systems during severe accident conditions, were used for the simulation of this test. The RELAP/SCDAPSIM calculation results were compared with the experimental measurements and with the calculation results obtained with the ICARE module of the ASTEC V2 code. The performed analysis demonstrated that the best estimate method, employing the RELAP/SCDAPSIM and SUSA codes, is capable of modelling the main severe accident phenomena in the fuel bundle during the overheating and melting of the reactor core.
2. Kerner, Alexander, Anselm Schaefer, and Kan Chen. "Parameter Adaptation Techniques for the Best-Estimate Thermal-Hydraulic Code ATHLET." In 17th International Conference on Nuclear Engineering. ASMEDC, 2009. http://dx.doi.org/10.1115/icone17-75705.

Abstract:
Parameter adaptation can be used for the automated adjustment of a real-time simulation according to measured data from the simulated plant. It can prevent model uncertainties from causing an increasing deviation between simulation results and real plant behaviour. The objective is to ensure that the simulation corresponds with reality even over a longer period. The present investigation deals with the application of parameter adaptation techniques to best-estimate simulation. This opens a range of new applications, including the use of best-estimate codes for improving process diagnostics, signal validation, and the man-machine interface. The adaptation of ATHLET poses several challenges, among them non-linear dependencies within the equation system to be solved, the interference of a sophisticated time-integration technique, and the identification and proper consideration of those code uncertainties that are most influential with regard to deviations between simulation and real plant behaviour. To overcome these problems, a set of new techniques has been developed:
- A refined method for the definition of the parameters to be adapted, based on consideration of each parameter's uncertainty and sensitivity to the results; statistical sensitivity analysis is used to validate and improve parameters obtained by expert judgement.
- Different parameter adaptation techniques (e.g., based on standard controllers or parameter estimation techniques) are applied to produce feasible solutions for the large systems to be modeled.
- To prevent the non-reversible propagation of simulation inaccuracies, decomposition techniques are applied, ensuring that such temporary inaccuracies do not spread over subsystem boundaries.
Results of adaptation based on these techniques are shown for an experiment performed at the PKL III test facility. It turns out that adaptive simulation is feasible for such facilities and that it offers possibilities to improve the prediction of experiments by code calculations.
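A minimal sketch of controller-based parameter adaptation, the simplest of the technique families the abstract mentions: a proportional update nudges a model parameter until the simulated output tracks the measurement. Here `simulate` is a hypothetical stand-in for a best-estimate code step, and the toy model and gain are assumptions, not anything from ATHLET.

```python
def adapt_parameter(simulate, measured, theta, gain=0.1, steps=50):
    """Controller-style parameter adaptation: repeatedly nudge the model
    parameter theta so that simulate(theta) tracks the plant measurement."""
    for _ in range(steps):
        error = measured - simulate(theta)
        theta += gain * error          # proportional update toward the plant
    return theta

# Toy model: simulated output is 3*theta; the plant measurement of 6.0
# corresponds to a true parameter value of 2.0.
model = lambda th: 3.0 * th
theta_hat = adapt_parameter(model, measured=6.0, theta=0.0, gain=0.1)
```

For this linear toy the update is a contraction (each step multiplies the remaining error by 0.7), so the parameter converges to the plant-consistent value; real codes add the non-linearities and stability concerns the abstract discusses.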
3. Dupleac, Daniel, Roxana-Mihaela Nistor-Vlad, Chris Allison, Judith Hohorst, and Marina Perez-Ferragut. "Development of the ASYST VER 3.x LWR/HPWR Best Estimate Integral Code." In 2021 10th International Conference on ENERGY and ENVIRONMENT (CIEM). IEEE, 2021. http://dx.doi.org/10.1109/ciem52821.2021.9614801.
4. Lee, Seok-Ho, Mun-Soo Kim, and Han-Gon Kim. "Development of Best-Estimate ECCS Evaluation Methodology for APR1400." In 17th International Conference on Nuclear Engineering. ASMEDC, 2009. http://dx.doi.org/10.1115/icone17-75879.

Abstract:
Advanced Power Reactor 1400 (APR1400) is an evolutionary Pressurized Water Reactor (PWR) equipped with such advanced features as Direct Vessel Injection (DVI), the Fluidic Device (FD) in the Safety Injection Tank (SIT), and the In-containment Refueling Water Storage Tank (IRWST) in the Emergency Core Cooling System (ECCS). To verify the performance of these advanced features, a more realistic performance evaluation methodology is desired, since existing methodologies use overly conservative assumptions that bias the results against these features. In this study, therefore, a best estimate evaluation methodology for the APR1400 ECCS under a large-break loss-of-coolant accident (LBLOCA) is developed, targeting the operating license of the Shin Kori 3&4 nuclear power plants (SKN 3&4), the first commercial APR1400 plants. For this purpose, a variety of existing best estimate evaluation methodologies were reviewed. As a result of this review, the methodology named KREM was selected for this study. KREM is based on RELAP5/MOD3.1K and has been used for Korean operating plants since 2002, when it was first approved by the Korean regulator. For this study, RELAP5/MOD3.3 (Patch 3), the latest version of the RELAP series, was selected, since it can appropriately simulate the multi-dimensional phenomena relevant to the APR1400 design characteristics. To quantify the code accuracy, analyses covering experimental data were performed for 36 kinds of separate effect tests (SETs) and integral effect tests (IETs). The uncertainty in the peak cladding temperature (PCT) of the APR1400 was evaluated preliminarily. Based on the preliminary calculation, final uncertainty quantification and bias evaluation were performed to obtain the licensing PCT for the Shin-Kori 3&4 plants, and the result shows that the LBLOCA licensing acceptance criteria are well satisfied.
APA, Harvard, Vancouver, ISO, and other styles
5

Sabotinov, Luben, Abhishek Srivastava, and Pierre Probst. "Best Estimate Simulation and Uncertainty Analysis of LB LOCA for KudanKulam VVER-1000 NPP." In 17th International Conference on Nuclear Engineering. ASMEDC, 2009. http://dx.doi.org/10.1115/icone17-76033.

Full text
Abstract:
In present-day accident analysis of Nuclear Power Plants (NPPs), international licensing practice accepts several options for demonstrating safety: use of conservative computer codes with conservative assumptions; best-estimate codes combined with conservative assumptions and conservative input data; and best-estimate codes with realistic assumptions and input data, associated with an uncertainty evaluation of the results. The last option is particularly attractive because it allows a more precise prediction of safety margins with respect to safety criteria and their future use for power uprating. Best-estimate simulation with uncertainty analysis constitutes the framework of the present study, which applies the latest version of the French best-estimate computer code CATHARE 2 to predict the thermal-hydraulic phenomena in the Indian KudanKulam Nuclear Power Plant (KK NPP) with VVER-1000 reactors during a LB LOCA, and to evaluate uncertainty along with sensitivity studies using the IRSN methodology. The paper first describes the modeling aspects of the LB LOCA with CATHARE and then presents the basic results. It highlights the use of the SUNSET statistical tool developed by IRSN for sampling, management of multiple CATHARE runs, and post-processing for uncertainty and sensitivity evaluation. The paper also deals with the difficulties associated with the selection of input uncertainties and code applicability, and discusses the challenges in uncertainty evaluation.
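The sampling step that a tool like SUNSET automates can be illustrated with a minimal sketch of stratified (Latin hypercube) sampling of input uncertainties; the parameter names and ranges below are hypothetical illustrations, not values from the paper:

```python
import random

def latin_hypercube(n_runs, param_ranges, seed=0):
    """Draw one Latin hypercube sample per code run: each parameter's
    range is split into n_runs equal strata, and every stratum is
    sampled exactly once, with the strata shuffled per parameter."""
    rng = random.Random(seed)
    samples = [{} for _ in range(n_runs)]
    for name, (lo, hi) in param_ranges.items():
        strata = list(range(n_runs))
        rng.shuffle(strata)  # decorrelate parameters from one another
        for run, s in zip(samples, strata):
            u = (s + rng.random()) / n_runs  # uniform point inside stratum s
            run[name] = lo + u * (hi - lo)
    return samples

# Hypothetical input uncertainties (illustrative names and ranges only):
ranges = {"break_discharge_coeff": (0.8, 1.2),
          "fuel_conductivity_mult": (0.9, 1.1)}
runs = latin_hypercube(59, ranges)  # one parameter set per code run
```

Each dictionary in `runs` would then be written into one CATHARE input deck, with the outputs collected afterwards for the statistical post-processing.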
APA, Harvard, Vancouver, ISO, and other styles
6

Cheng-Cheng, Deng, Liu Wei-Li, Yang Jun, and Wu Qiao. "Analysis and Validation of Wilks Nonparametric Uncertainty Method for Best-Estimate Calculations in Nuclear Safety." In 2017 25th International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/icone25-66655.

Full text
Abstract:
Traditional safety analysis of nuclear power plants is usually performed with conservative methods. However, conservative methods bring problems: the calculation results deviate considerably from the true physical process, and it is not certain that they are fully conservative. In 1988, the U.S. Nuclear Regulatory Commission (NRC) amended the Emergency Core Cooling System (ECCS) licensing rules contained in 10CFR50.46 to permit Best-Estimate Plus Uncertainty (BEPU) analysis as an alternative to conservative methods in nuclear safety analysis. Best-estimate codes are used to perform more realistic calculations, and quantitative uncertainty analysis is required as a necessary complement to the best-estimate calculation. To support the revised ECCS regulation, the Code Scaling, Applicability and Uncertainty (CSAU) methodology was developed by the NRC as a basic framework for uncertainty analysis. In the original CSAU methodology, the response surface method is applied to perform quantitative uncertainty calculations; however, constructing an appropriate response surface demands substantial computational cost. Recently, the Wilks nonparametric statistical method has been widely used in the nuclear industry for uncertainty quantification due to its simplicity and efficiency. However, the accuracy and stability of the confidence results obtained with the Wilks method are still debated in actual applications. In this paper, the Wilks nonparametric uncertainty method was studied through statistical theory and then applied in the uncertainty analysis of a loss of flow accident for a simple pressurized water reactor. The best-estimate code RELAP5/MOD3.4 was employed to build a model of the simple nuclear plant and carry out uncertainty calculations.
To improve calculation efficiency, the thermal-hydraulic code RELAP5 and the statistical code DAKOTA were bridged to perform integrated uncertainty propagation, including sampling of uncertain input parameters, multiple batched code runs, and data extraction after the calculations. The BEPU analysis of the loss of flow accident was demonstrated on this integrated uncertainty calculation platform. The output database of the figure of merit (Peak Cladding Temperature, PCT) was then obtained for uncertainty evaluation. The statistical distribution of the PCT results was fitted, and its 95% quantile value was taken as the reference point. Moreover, the probability density distribution and the corresponding mean and variance of the 95/95 PCT results using the Wilks uncertainty method at different orders are presented. Thus, the accuracy and stability of the confidence results using the Wilks method at different orders were analyzed and validated. The results indicate that, as the order increases, the 95/95 PCT results become more accurate, with a smaller degree of conservatism, and more stable, with smaller dispersion, which is consistent with Wallis' theoretical analysis. Therefore, as computing resources permit, the Wilks method at higher order is suggested for uncertainty quantification, so as to achieve confidence results with sufficient accuracy and stability. This study provides a theoretical statistical basis and practical guidance for best-estimate nuclear safety analysis.
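The order-dependent run counts underlying such a study follow directly from the standard one-sided Wilks formula; a minimal sketch (not code from the paper) computing the minimum number of code runs N for a 95/95 statement at order k, where the k-th largest sampled PCT serves as the tolerance bound:

```python
from math import comb

def wilks_min_runs(gamma=0.95, beta=0.95, order=1):
    """Smallest N such that the order-th largest of N sampled outputs
    bounds the gamma-quantile with confidence beta (one-sided Wilks):
    confidence(N) = 1 - sum_{i=0}^{order-1} C(N,i) (1-gamma)^i gamma^(N-i)."""
    n = order
    while True:
        conf = 1.0 - sum(comb(n, i) * (1 - gamma) ** i * gamma ** (n - i)
                         for i in range(order))
        if conf >= beta:
            return n
        n += 1

# Run counts for first-, second-, and third-order 95/95 statements:
print([wilks_min_runs(order=k) for k in (1, 2, 3)])  # → [59, 93, 124]
```

The growing run count with order is the trade-off the abstract describes: more code runs buy a less conservative and more stable 95/95 PCT estimate.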
APA, Harvard, Vancouver, ISO, and other styles
7

Saha, P., T. K. Das, A. Chanda, and S. Ray. "Best-Estimate Analysis of Loss-of-Coolant Accident in Pressurized Heavy Water Reactors Using Microcomputers." In ASME 1992 International Computers in Engineering Conference and Exposition. American Society of Mechanical Engineers, 1992. http://dx.doi.org/10.1115/cie1992-0074.

Full text
Abstract:
Abstract The present paper discusses the development of computer software, or a code, for best-estimate analysis of Loss-of-Coolant Accidents (LOCA) in Pressurized Heavy Water Reactor (PHWR) systems. The formulation is comparable to that of the U.S., Canadian, and French LOCA codes, namely TRAC, RELAP5, ATHENA, CATHARE, etc. However, the present software has been developed on microcomputers, namely PC-XT and AT, whereas the other codes were developed and are primarily used on mainframes such as the CDC-7600, CYBER-176, CRAY, etc.
APA, Harvard, Vancouver, ISO, and other styles
8

Tian, Xinlu, Jianping Jing, Shaoxin Zhuang, and Haiying Chen. "Application of Best-Estimate Plus Uncertainty Analysis Method in Nuclear Safety Evaluation." In 2021 28th International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/icone28-64393.

Full text
Abstract:
Abstract Most computer codes for nuclear power plant safety analysis were developed based on conservative methods, which lead to unnecessarily large margins and greater difficulty in nuclear power plant design and improvement. Using the best-estimate plus uncertainty (BEPU) analysis method may reduce or eliminate these unnecessary limits. The NRC has used best-estimate codes as a basis for safety reviews of the AP-series nuclear power plants, driving the development of BEPU methods in independent audit calculations. It has become an international trend to use BEPU in accident analysis and safety review of nuclear power plants. At present, best-estimate calculation methods are used worldwide to better understand accident sequences in water-cooled reactors, and uncertainty assessment is the necessary complement to the best-estimate method. The research and application of the BEPU method are becoming more and more extensive. The best-estimate method combines a best-estimate code with realistic initial and boundary conditions and accurately describes the current safety margin. It requires consideration of all available information and data in each impact analysis, including the best-estimate model, the uncertainty calculated in the transient, and the uncertainty of the input parameters. Compared with the conservative evaluation method, the BEPU analysis method quantifies the gap between the calculated result and the real value through uncertainty analysis, and makes a more reasonable evaluation of the safety margin. In the nuclear power plant safety review process, the National Nuclear Safety Administration (NNSA) has adopted the BEPU analysis method to review LOCA accidents. In this paper, four best-estimate plus uncertainty analysis methods are introduced, and the sources of uncertainty and the uncertainty statistical methods are described.
Then, the application of the BEPU analysis method in nuclear safety evaluation is discussed.
APA, Harvard, Vancouver, ISO, and other styles
9

Shi, Hao, Qi Cai, Yuqing Chen, and Lizhi Jiang. "Uncertainty Analysis of SB-LOCA in AP1000 Nuclear Power Plant." In 2017 25th International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/icone25-66586.

Full text
Abstract:
Best estimate plus uncertainty (BEPU) methods can be used to eliminate over-conservatism and gain more safety margin in the analysis of thermal-hydraulic transients at a nuclear power plant. Based on the best-estimate thermal-hydraulic system code RELAP5/MOD3.2, the BEPU method proposed by GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) is presented, together with an application to a small break loss of coolant accident (SB-LOCA) on a best-estimate analysis model of the AP1000 Nuclear Power Plant. Based on the results of the uncertainty calculations, the dispersion bands of the maximum cladding temperature and the core outlet void fraction are displayed and assessed.
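In the GRS approach, a dispersion band for an output such as the maximum cladding temperature is typically read off directly from the sampled runs: for a two-sided 95/95 statement at the lowest order, the sample minimum and maximum of the required number of runs bound the band. A minimal sketch under these assumptions (the run-count formula is the standard two-sided Wilks result, not code from the paper):

```python
def two_sided_min_runs(gamma=0.95, beta=0.95):
    """Smallest N such that [min, max] of N sampled outputs covers the
    central gamma-content interval with confidence beta (two-sided Wilks):
    confidence(N) = 1 - N*gamma^(N-1) + (N-1)*gamma^N."""
    n = 2
    while 1.0 - n * gamma ** (n - 1) + (n - 1) * gamma ** n < beta:
        n += 1
    return n

def dispersion_band(outputs):
    """Two-sided tolerance band from the sampled figure-of-merit values."""
    return min(outputs), max(outputs)

n = two_sided_min_runs()  # → 93 code runs for a two-sided 95/95 band
```

With the 93 sampled maximum cladding temperatures in hand, `dispersion_band` gives the lower and upper bounds that are then compared against the safety criterion.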
APA, Harvard, Vancouver, ISO, and other styles
10

Yang, Jung-Hua, Jong-Rong Wang, Hao-Tzu Lin, and Chunkuan Shih. "Best Estimate Analysis of Maanshan PWR LBLOCA by TRACE Coupling With DAKOTA." In 2014 22nd International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/icone22-30817.

Full text
Abstract:
This research focuses on the Large Break Loss of Coolant Accident (LBLOCA) analysis of the Maanshan power plant with the coupled TRACE-DAKOTA codes. Within the acceptance criteria for Loss of Coolant Accidents (LOCAs), there are two accepted analysis methods: the conservative methodology and the best-estimate methodology. In contrast to the conservative methodology, the best-estimate methodology uses realistic input data with uncertainties to quantify the limiting values, i.e., the Peak Cladding Temperature (PCT), for LOCA analysis. With the conservative methodology, the calculated PCTCM (PCT calculated by conservative methodology) of the Maanshan power plant LBLOCA is 1422 K. On the other hand, six key parameters are taken into account in the uncertainty analysis in this study. In the PCT95/95 (PCT at the 95/95 confidence level and probability) calculation, the PCT95/95 is 1369 K, lower than the PCTCM (1422 K). In addition, the partial rank correlation coefficients between the input parameters and the PCT indicate that the accumulator pressure is the most sensitive parameter in this study.
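The parameter ranking described above relies on partial rank correlation coefficients (PRCC), which measure the monotonic influence of each input on the PCT after discounting the other inputs. A minimal numpy sketch of the standard PRCC computation; the synthetic data and variable names are illustrative, not the paper's:

```python
import numpy as np

def prcc(X, y):
    """Partial rank correlation of each column of X with y: rank-transform
    everything, then correlate the residuals left after regressing out the
    remaining columns (ordinary least squares with an intercept)."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    Xr = np.column_stack([rank(X[:, j]) for j in range(X.shape[1])])
    yr = rank(y)
    out = []
    for j in range(Xr.shape[1]):
        others = np.column_stack([np.ones(len(yr)),
                                  np.delete(Xr, j, axis=1)])
        # residuals of x_j and y after removing the other inputs
        rx = Xr[:, j] - others @ np.linalg.lstsq(others, Xr[:, j], rcond=None)[0]
        ry = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
        out.append(float(np.corrcoef(rx, ry)[0, 1]))
    return out

rng = np.random.default_rng(0)
X = rng.uniform(size=(59, 2))          # two sampled input parameters
y = 5.0 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(scale=0.1, size=59)
coeffs = prcc(X, y)                    # the first input dominates
```

Applied to the sampled inputs and the resulting PCT values, the coefficient largest in magnitude identifies the most sensitive parameter, as the study reports for accumulator pressure.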
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Best-Estimate code"

1

Suh, Kune Y. Modifications for the development of the MAAP-DOE code: Volume 7: A best-estimate correlation of in-vessel fission product release for severe accident analyses WBS 3.4.9. Office of Scientific and Technical Information (OSTI), January 1989. http://dx.doi.org/10.2172/6300747.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Nin Pratt, Alejandro, and Héctor Valdés Conroy. After the Boom: Agriculture in Latin America and the Caribbean. Inter-American Development Bank, December 2020. http://dx.doi.org/10.18235/0002955.

Full text
Abstract:
The convergence of a favorable macroeconomic environment and high prices of primary commodities between 2000 and 2011 contributed to the best performance of agriculture in Latin America and the Caribbean (LAC) since the 1980s, with steady growth of total factor productivity (TFP) and output per worker and a reduction in the use of input per worker. The end of the upward phase of the commodity cycle in 2011 together with less favorable external markets and a deterioration of the policy environment in several countries, motivates us to revisit the situation of agriculture in LAC in recent years to analyze how these changes have affected its performance. This study applies a framework that uses index numbers together with data envelopment analysis (DEA) to estimate levels of productivity and efficiency, incorporating technical change together with technical (TE) and environmental efficiency (EE) into the decomposition of TFP. The EE index adjusts the TFP measure for pollution, treating GHG emissions as a by-product of the desired crop or livestock outputs. TFP and efficiency of crop and livestock sub-sectors was calculated for 24 LAC countries from 2000 to 2016. Our results show that the period of fast agricultural growth in LAC, driven by technical change and resource reallocation, transformed agriculture in the region leaving it in a better position to cope with the more unfavorable regional macroeconomic environment and the less dynamic global markets observed after 2011.
APA, Harvard, Vancouver, ISO, and other styles
3

Alchanatis, Victor, Stephen W. Searcy, Moshe Meron, W. Lee, G. Y. Li, and A. Ben Porath. Prediction of Nitrogen Stress Using Reflectance Techniques. United States Department of Agriculture, November 2001. http://dx.doi.org/10.32747/2001.7580664.bard.

Full text
Abstract:
Commercial agriculture has come under increasing pressure to reduce nitrogen fertilizer inputs in order to minimize potential nonpoint source pollution of ground and surface waters. This has resulted in increased interest in site specific fertilizer management. One way to solve pollution problems would be to determine crop nutrient needs in real time, using remote detection, and regulating fertilizer dispensed by an applicator. By detecting actual plant needs, only the additional nitrogen necessary to optimize production would be supplied. This research aimed to develop techniques for real time assessment of nitrogen status of corn using a mobile sensor with the potential to regulate nitrogen application based on data from that sensor. Specifically, the research first attempted to determine the system parameters necessary to optimize reflectance spectra of corn plants as a function of growth stage, chlorophyll and nitrogen status. In addition to that, an adaptable, multispectral sensor and the signal processing algorithm to provide real time, in-field assessment of corn nitrogen status was developed. Spectral characteristics of corn leaves reflectance were investigated in order to estimate the nitrogen status of the plants, using a commercial laboratory spectrometer. Statistical models relating leaf N and reflectance spectra were developed for both greenhouse and field plots. A basis was established for assessing nitrogen status using spectral reflectance from plant canopies. The combined effect of variety and N treatment was studied by measuring the reflectance of three varieties of different leaf characteristic color and five different N treatments. The variety effect on the reflectance at 552 nm was not significant (a = 0.01), while canonical discriminant analysis showed promising results for distinguishing different variety and N treatment, using spectral reflectance. 
Ambient illumination was found inappropriate for reliable, one-beam spectral reflectance measurement of the plant canopy due to the strong spectral lines of sunlight; therefore, artificial light was used. For in-field N status measurement, a dark chamber was constructed to house the sensor along with artificial illumination. Two different approaches were tested: (i) use of spatially scattered artificial light, and (ii) use of a collimated artificial light beam. It was found that the collimated beam, along with a proper design of the sensor-beam geometry, yielded the best results in terms of reducing noise due to variable background and maintaining the same distance from the sensor to the sampled point of the canopy. A multispectral sensor assembly, based on a linear variable filter, was designed, constructed, and tested. The sensor assembly combined two sensors to cover the range of 400 to 1100 nm, a mounting frame, and a field data acquisition system. Using the mobile dark chamber and the developed sensor, as well as an off-the-shelf sensor, the in-field nitrogen status of the plant canopy was measured. Statistical analysis of the acquired in-field data showed that the nitrogen status of the corn leaves can be predicted with a SEP (Standard Error of Prediction) of 0.27%. The stage of maturity of the crop affected the relationship between the reflectance spectrum and the nitrogen status of the leaves. Specifically, the best prediction results were obtained when a separate model was used for each maturity stage. In-field assessment of the nitrogen status of corn leaves was successfully carried out by non-contact measurement of the reflectance spectrum. This technology is now mature enough to be incorporated in field implements for on-line control of fertilizer application.
APA, Harvard, Vancouver, ISO, and other styles
4

NUMERICAL STUDY ON SHEAR BEHAVIOUR OF ENHANCED C-CHANNELS IN STEEL-UHPC-STEEL SANDWICH STRUCTURES. The Hong Kong Institute of Steel Construction, September 2021. http://dx.doi.org/10.18057/ijasc.2021.17.3.4.

Full text
Abstract:
This paper first developed a three-dimensional (3D) finite element model (FEM) for enhanced C-channels (ECs) in steel-UHPC-steel sandwich structures (SUSSSs). The FEM was validated against 12 push-out tests on ECs with ultra-high performance concrete (UHPC). With the validated FEM, in-depth parametric studies were performed on the shear behaviour of ECs with UHPC. The investigated parameters included the bolt-hole gap (a), grade (M) and diameter (d) of the bolt, core strength (fc), length of the C-channel (Lc), and the prestressing force ratio on the bolt (ρ). Under shear forces, the ECs in UHPC exhibited successive fractures of bolts and C-channels. Increasing the bolt-hole gap within 0-2 mm does not reduce the ultimate shear resistance but greatly improves the slip capacity of ECs. Increasing the grade and diameter of the bolts improves the shear resistance and ductility of ECs by increasing the PB/PC ratio (shear strength of the bolt to that of the C-channel). Increasing the core strength increases the shear resistance but reduces the ductility of ECs due to the reduced PB/PC ratio. ECs with an Lc value of 50 mm offer the best ductility. Prestressing force acting on the bolts reduces the shear strength and ductility of ECs with UHPC. Analytical models were proposed to estimate the ultimate shear resistance and shear-slip behaviour of ECs with UHPC. Extensive validation of these models against the 12 tests and 31 FEM analysis cases proved that they give reasonable evaluations of the shear behaviour of ECs with UHPC.
APA, Harvard, Vancouver, ISO, and other styles