The regulation and management of nuclear safety


Abstract

Deterministic and probabilistic safety assessments are both important aspects of the 'defense in depth' strategy that is globally adopted today. Advances in computational capacity, together with the development of superior algorithms and codes for probabilistic safety assessment, have significantly improved risk assessment, preparedness, and event response. Innovative approaches, such as two-phase Monte Carlo simulations and the use of second-order probabilities with nested iterations, produce better quantification of mixed aleatory and epistemic uncertainty.


Introduction

Radioactive materials are in wide use, and their applications are growing, especially in domains such as medical diagnosis and treatment, industrial research, and power generation. However, safety continues to be a major concern because of the scale of damage that could be caused by radioactive contamination or an uncontrolled nuclear reaction. The Nuclear Energy Agency (NEA) continuously strives to update guidelines pertaining to the safety regulation of nuclear installations and their operation. Although 'defense in depth' has been the main strategy adopted by all nuclear countries over the last couple of decades, there has been a considerable trend toward including 'Probabilistic Safety Analysis' (PSA) among the regulatory requirements for nuclear plants (NEA 2007).

Today, in most countries, deterministic design and probabilistic assessment have become key components in achieving compliance with international nuclear safety regulations. With the phenomenal growth in modern computational capability, PSA has also evolved significantly, providing nuclear scientists with better evaluation tools and risk metrics. As part of periodic safety reviews, PSA is useful in identifying design and operational improvements that could translate into lowered risk. Decoupling of uncertainty and containment of damage are important safety goals in nuclear installations.

This paper focuses on deterministic and probabilistic safety analysis and discusses the role these techniques play in better risk assessment, disaster prevention, preparedness, event response, and damage control in the event of an incident beyond the design basis. In particular, events such as station blackout (SBO), hydraulic studies, flood modeling, and correlated hazards are discussed using examples.

The Deterministic Approach

Deterministic safety analysis includes the 'defense in depth' and 'leak-tight barriers' approaches, which ensure successive stages of preventive measures to limit the damage due to possible equipment failures and human errors. Deterministic safety analysis covers anticipated operational deviations, 'Design Basis Accidents' (DBAs), and 'Beyond Design Basis Accidents' (BDBAs) (IAEA 2009). Over the years, improvements in computational capacity have moved deterministic safety analysis from purely conservative estimates of anticipated operational deviations towards more realistic modeling of complicated nuclear processes and thermal-hydraulic events. For instance, the 'two-phase flow' models that are currently gaining popularity are beginning to provide more accurate modeling of thermal-hydraulic events in the water piping systems connected to nuclear reactors, allowing nuclear scientists to achieve a more realistic evaluation of uncertainty. With the best-estimate approach, it is now possible to obtain more accurate simulations of operational occurrences; such an approach is, for instance, highly effective in the evaluation of 'loss of coolant accidents' (IAEA 2009). Performing a best-estimate analysis involves the use of best-estimate codes with realistic data, along with an evaluation of the uncertainties pertaining to both the models used and the input data. This approach contributes to identifying the safety parameters that are most relevant to plant operation.
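
The text does not name a particular uncertainty-propagation method, but one technique widely used in best-estimate-plus-uncertainty analyses is Wilks' order-statistics approach, which fixes how many best-estimate code runs are needed to claim a tolerance bound on a safety parameter. A minimal sketch in Python (the function name and defaults are illustrative):

```python
import math

def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest number of code runs n such that the largest of n independent
    runs bounds the 'coverage' quantile of the output with the requested
    confidence (first-order, one-sided Wilks formula)."""
    # Condition: 1 - coverage**n >= confidence
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_sample_size())  # 59 runs for a one-sided 95%/95% bound
```

For the familiar one-sided 95%/95% statement this gives 59 code runs, which is why that number appears so often in best-estimate analyses.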

Deterministic safety analysis, also known as 'accident analysis', is one of the important tools for assessing and fixing problems, and periodic analysis is essential to ensure the safety of nuclear systems. It is possible for design-basis safety to be inadvertently overridden during the operational stages of the reactor. One well-known example of design-basis safety being invalidated is the Fermi-1 reactor incident of 1966. In this case, an unexpected and initially inexplicable blockage of a fuel sub-assembly resulted in fuel melting and the triggering of radiation alarms, although the damage was contained within the reactor; the control systems in place worked to limit any further damage. Incident analysis later revealed that the design basis had been overridden by the late-stage addition of six Zircaloy plates used to direct the sodium (Na) coolant flow. One of these Zircaloy plates detached from the section liners and obstructed the fuel assemblies. Because the plates were not in the original design basis, they escaped the quality control applied to the design. From the point of view of deterministic safety analysis, however, the incident provided valuable input to design specifications: future fuel assemblies incorporated multiple fuel inlets to avoid the complete blockage and fuel starvation seen in Fermi-1 (Ragheb 2010).

Probabilistic Safety Analysis

Probabilistic Safety Assessment (PSA), also referred to as 'Probabilistic Risk Assessment' (PRA), is currently part of the regulatory safety assessment of nuclear plants in most countries. PSA provides plant managers with a more comprehensive risk assessment and a better interpretation of operational events. It enables plant operators to achieve a broader risk perception, to prioritize risks based on their probabilistic values, to perform better sensitivity testing, and to provide a broader range of countermeasures to these probabilistic events. In a nutshell, PSA enables plant managers to operate with better knowledge and anticipation by providing a 'risk-informed, integrated decision-making process' (NEA 2007).

PSA results can be used as a complement to deterministic analysis to improve the overall safety of a nuclear plant. One of the earliest applications of probabilistic models in the UK was in the design of Advanced Gas-cooled Reactors: the Hartlepool and Heysham 1 reactors were among the first in the UK to incorporate probabilistic analysis alongside deterministic safety evaluation. Today, PSA is mandatory in the UK for all nuclear plants, and a PSA review is conducted periodically for every plant to make sure that its design conforms to the safety standards that keep the risk to staff and the public 'as low as reasonably practicable' (ALARP) (NEA 2007).

The inadequacy of current regulations with regard to safety review, and of PSAs in particular, was brought to the fore by the Fukushima disaster. The Fukushima nuclear plant in Japan is a well-known case of the failure to properly assess external correlated hazards in the PSA. This type of assessment is done by applying statistical correlation to data on the site's external conditions. The extensive incident report on the Fukushima disaster concluded that the current regulatory requirements in many respects, including flooding studies, flood-modeling simulations, and station blackout requirements, need to be reviewed and improved. In the Fukushima case, the prolonged station blackout, the instrumentation failure, and the consequent core meltdown yielded many useful conclusions. In particular, the boiling water reactors at the site suffered from design failures; moreover, during the power outage the operators could not vent the accumulating hydrogen gas, which eventually caused the explosions. The probability of hydrogen accumulation and explosion during a station blackout event therefore assumes greater significance for future PSA studies. Furthermore, the Fukushima incident also highlighted that interactions between multiple units were left out of the PSA's purview. Future PSA assessments should also consider correlated hazards such as 'mechanical overload', 'submergence', 'powerful winds', and 'seismic hazard', and the potential combined effect of these (Lyubarskiy et al. 2012).
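
The report does not prescribe how the correlation between external hazards should be modelled; a common choice, sketched below under purely illustrative assumptions, is to sample the hazards jointly from a correlated distribution and then screen the samples against hypothetical design thresholds:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative log-space means and standard deviations for two external
# hazards at the same site: flood height [m] and peak ground acceleration [g].
mu = np.array([1.0, -1.5])
sigma = np.array([0.4, 0.6])
rho = 0.5  # assumed correlation between the two hazards

cov = np.array([[sigma[0] ** 2,              rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1],  sigma[1] ** 2]])

# Correlated sampling: bivariate normal in log-space, then exponentiate
samples = np.exp(rng.multivariate_normal(mu, cov, size=100_000))
flood, pga = samples[:, 0], samples[:, 1]

# Hypothetical screening criterion: both hazards exceed their design values
p_both = np.mean((flood > 5.0) & (pga > 0.4))
print(f"P(flood > 5 m and PGA > 0.4 g) ~ {p_both:.2e}")
```

Ignoring the correlation (treating the two hazards as independent) would understate the probability of the combined event, which is exactly the kind of gap the Fukushima lessons-learned report points to.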

Level 1 PSA (Monte Carlo simulations)

Both deterministic and probabilistic analyses are based on our current knowledge of the behavior of processes and their parameters under normal conditions; because that knowledge is incomplete, it introduces epistemic uncertainty. The following figure illustrates these uncertainties.

Fig: Curves showing continuous epistemic distribution (Rao et al. 2008)

The standard practice for quantifying epistemic uncertainties is Monte Carlo simulation. To model aleatory uncertainties, several techniques such as 'Reliability Block Diagrams' (RBD), 'Fault Tree Analysis' (FTA), and 'Event Tree Analysis' are used. The problem with using FTA to simulate the reliability and availability of engineering systems is that it ignores variables pertaining to time to repair and time to failure, which are sources of aleatory uncertainty. To overcome these limitations and to reduce the uncertainty component in the risk assessment, some improvements have been suggested. One of them is two-phase Monte Carlo simulation, which samples both the epistemic and the aleatory variables (Rao et al. 2008). A better approximation of the uncertainties is possible by separating the aleatory uncertainties from the epistemic uncertainties; Rao et al. (2008) illustrate such a two-phase Monte Carlo simulation with separate treatment of the epistemic and the aleatory variables.

An approach that provides better uncertainty quantification for events involving both aleatory and epistemic uncertainties is second-order probability analysis, which treats the aleatory and epistemic variables individually in nested iterations (Eldred and Swiler 2009).
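
Both the two-phase simulation of Rao et al. (2008) and the second-order treatment described by Eldred and Swiler (2009) reduce, in their simplest form, to a nested Monte Carlo loop: an outer loop over epistemic realisations of the model parameters and an inner loop over aleatory variability for each realisation. The sketch below illustrates the idea for a single component whose failure rate is known only up to a lognormal epistemic distribution; the component model and all numbers are our own, not taken from either paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N_EPISTEMIC = 200      # outer loop: epistemic realisations of the failure rate
N_ALEATORY = 5_000     # inner loop: aleatory samples per realisation
MISSION_TIME = 8760.0  # hours (one year of operation)

# Epistemic uncertainty: the failure rate is only known to within a
# lognormal distribution (median 1e-4 per hour; values are illustrative).
lam_samples = rng.lognormal(mean=np.log(1e-4), sigma=0.7, size=N_EPISTEMIC)

failure_probs = []
for lam in lam_samples:                               # outer (epistemic) loop
    # Inner (aleatory) loop: random times to failure at a fixed rate
    t_fail = rng.exponential(scale=1.0 / lam, size=N_ALEATORY)
    failure_probs.append(np.mean(t_fail < MISSION_TIME))

failure_probs = np.array(failure_probs)
print(f"mean P(failure over mission) = {failure_probs.mean():.3f}")
print(f"5th-95th epistemic percentiles: {np.percentile(failure_probs, [5, 95])}")
```

The spread of the outer-loop results (for example, the 5th-95th percentile band printed above) is the second-order, epistemic statement about the aleatory failure probability.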

Level 2 PSA (ASTEC)

One of the important aspects of the defense-in-depth strategy is to consider the possibility of 'Severe Accidents' (SAs). These accidents fall outside the design basis and usually have a powerful impact on the environment. Analyzing them requires comprehensive knowledge of critical processes such as the containment system, core melting, core-concrete interaction, and fuel flow mechanics. Very often the results of the analysis are affected by many factors, such as computer-code-related uncertainties, data uncertainties, plant-based uncertainties, and, last but not least, user expertise.

One of the well-known simulation codes, also approved by ARPNET, is ASTEC. Primarily used for Level 2 PSA, ASTEC provides a comprehensive simulation of the complete phenomenology of nuclear accidents, giving researchers detailed stage-specific simulations for any part of the reactor. Several models that are part of the ASTEC code, including the core degradation model and the fuel and aerosol hydraulics model, provide for a thorough investigation of the incident (IRSN 2012).

The simulations can also illustrate core meltdown and corium formation, the hydraulics of aerosols, and related phenomena (IRSN 2012).

While the ASTEC code is widely recognized and approved as an industry standard for PSA simulations, there are still problems that are reactor specific. For instance, the SOPHAEROS simulation (an ASTEC module used to model the transfer of fission products inside the reactor) produces deviations for CANDU-type reactors. This is ascribed to the differences between the model geometry used in SOPHAEROS and the geometry of CANDU reactors. Apostol et al. (2011) present graphs showing the differences in the deposits of Cs, Sr, and I as a function of feeder radius.

They also tabulate the differences between the results obtained using a Taylor expansion (M1) and the direct SOPHAEROS code-based calculation (Apostol et al. 2011).

This difference is due to the use of average feeder dimensions in the SOPHAEROS code. The aberration could be corrected by creating a structure with Md diameters and Ml lengths (Apostol et al. 2011).
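
The text does not give the actual SOPHAEROS deposition model, but the effect of averaging the feeder geometry can be illustrated with a hypothetical response that varies nonlinearly with feeder radius: evaluating it at a single average radius biases the result, and a second-order Taylor correction about the mean recovers most of the feeder-by-feeder average.

```python
import numpy as np

# Hypothetical stand-in for a deposition response that varies nonlinearly
# with feeder radius r (the real SOPHAEROS model is not given in the text).
def deposition(r):
    return 1.0 / r  # e.g. something proportional to surface-to-volume ratio

# Illustrative population of feeder radii (metres)
radii = np.array([0.025, 0.028, 0.030, 0.033, 0.035, 0.040])

exact = deposition(radii).mean()        # feeder-by-feeder average
averaged = deposition(radii.mean())     # single "average feeder" geometry
# Second-order Taylor correction about the mean radius:
#   E[f(r)] ~ f(r_bar) + 0.5 * f''(r_bar) * Var(r),  with f''(r) = 2 / r**3
r_bar, var_r = radii.mean(), radii.var()
corrected = averaged + 0.5 * (2.0 / r_bar ** 3) * var_r

print(f"feeder-by-feeder: {exact:.3f}")
print(f"average feeder:   {averaged:.3f}")
print(f"Taylor-corrected: {corrected:.3f}")
```

The average-feeder value underestimates the true mean because the response is convex in r; the Taylor term restores most of the difference, which is the spirit of the M1 correction discussed by Apostol et al.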

Fault Tree Analysis

Fault tree analysis is a simple logical method for identifying the relationships between the various component parts of a system and how their failures affect its overall functioning. It is an important tool for the probabilistic models used in the study of nuclear plant safety. The following illustration shows a fault tree.

Fig: Fault Tree: Injection Failure (Manely 2003)

Though fault trees are very effective in probabilistic risk assessment, they can become difficult for human analysts to interpret, particularly when the trees are large and there are many logical levels between the top event and the lower-level initiating events (Epstein 2011).
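
As an illustration of how a fault tree is evaluated quantitatively, the sketch below computes the top-event probability of a small, made-up injection-failure tree with independent basic events; the event names and probabilities are hypothetical, not taken from the figure's source.

```python
# Minimal fault-tree evaluation, assuming independent basic events.
# Hypothetical tree:
#   TOP = OR(pump_train_fails, valve_fails)
#   pump_train_fails = AND(pump_a_fails, pump_b_fails)

def p_and(*probs):
    """AND gate: all inputs must fail (independence assumed)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def p_or(*probs):
    """OR gate: at least one input fails (independence assumed)."""
    result = 1.0
    for p in probs:
        result *= (1.0 - p)
    return 1.0 - result

pump_a = 1e-3   # illustrative basic-event probabilities
pump_b = 1e-3
valve = 5e-4

top = p_or(p_and(pump_a, pump_b), valve)
print(f"P(top event) = {top:.3e}")   # ~5.01e-04
```

Real plant models contain thousands of basic events and are evaluated from minimal cut sets by dedicated PSA software rather than by hand, which is precisely why very deep trees become hard for analysts to review.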

Conclusion

Nuclear power is one of the primary energy sources for many countries. The fast-depleting carbon-based energy sources and their costly global-warming footprint have accelerated the shift towards nuclear energy. However, issues about the safety of the design and operation of nuclear plants cause significant concern among the public. In particular, the history of nuclear accidents, such as Three Mile Island, the Chernobyl disaster, and the more recent Fukushima incident, has prompted more stringent regulatory requirements and safety-testing procedures.

The Fermi-1 incident clearly indicates how design-basis safety scrutiny can be inadvertently circumvented, resulting in serious problems. Deterministic and probabilistic safety assessments are both important aspects of the 'defense in depth' strategy that is globally adopted today. Advances in computational capacity, along with the development of superior algorithms and codes for probabilistic safety assessment, have significantly improved risk assessment, preparedness, and event response. Fault tree and event tree analysis, together with state-of-the-art software simulations such as ASTEC, provide better safety review and valid input for future plant designs or for alterations to current designs. Innovative approaches, such as two-phase Monte Carlo simulations, and newer methods that apply second-order probabilities with nested iterations, produce better quantification of mixed aleatory and epistemic uncertainty. The increasing complexity of current models in itself contributes to model uncertainty, as was discussed in the case of the SOPHAEROS module; however, by applying innovative solutions (a Taylor expansion, in this case) these discrepancies between models can be resolved.

Bibliography

Apostol, M., Leca, A., Constantin, M. & Prisecaru, I. (2011), Dealing with Uncertainties in Nuclear Safety Analysis (Part II), U.P.B. Sci. Bull., Series C, Vol. 73, Iss. 4.

Eldred, M.S. & Swiler, L.P. (2009), Efficient Algorithms for Mixed Aleatory-Epistemic Uncertainty Quantification with Application to Radiation-Hardened Electronics, Part I: Algorithms and Benchmark Results, Sandia Report.

Epstein, W. (2011), What's Wrong with the Fault Tree Linking Approach for Complex PRA Models, viewed 13 Nov 2012, <woody.com/papers/whats-wrong-with-the-fault-tree/>.

IAEA (2009), Deterministic Safety Analysis for Nuclear Power Plants, Safety Guide.

IRSN (2012), Enhancing Nuclear Safety: The ASTEC Software Package, viewed 13 Nov 2012.

Lyubarskiy, A., Kuzmina, I. & El-Shanawany, M. (2012), Potential Areas for Enhancement of the PSA Methodology Based on Lessons Learned from the Fukushima Accident, IAEA.

Manely, D. (2003), Nuclear Safety and Reliability, viewed 13 Nov 2012.

NEA (2007), Use and Development of Probabilistic Safety Assessment, OECD, viewed 13 Nov 2012, <www.oecd-nea.org/nsd/docs/2007/csni-r2007-12.pdf>.

Ragheb, M. (2010), The Fermi-1 Fuel Meltdown Incident, viewed 12 Nov 2012, <https://netfiles.uiuc.edu/mragheb/www/NPRE%20457%20CSE%20462%20Safety%20Analysis%20of%20Nuclear%20Reactor%20Systems/>.

Rao, K.D., Kushwaha, H.S., Verma, A.K. & Srividya, A. (2008), Quantification of Epistemic and Aleatory Uncertainties in Level-1 Probabilistic Safety Assessment Studies, Indian Institute of Technology, Mumbai.
