MAM-B - Special Session: Data Quality (Centennial Ballroom 300B, 09:30 - 11:30)
|
Chair(s): Jeff Chapman
|
MAM-B.1
09:30 Adapting the Data Quality Process to the Needs of Radiological Emergency Response. EM Becker*, Pacific Northwest National Laboratory; J Mosser, Environmental Protection Agency
Abstract: The data quality process results in defensible decisions that can be made more confidently than those based on data with high statistical variance or unknown collection parameters. Data quality is regularly applied in the context of large environmental remediation and decontamination and decommissioning projects, where decisions about the radiological state of an area are made only after extensive analysis and quality assessment of data collected over the course of years. Emergency response presents an entirely different set of challenges with respect to data quality. Emergency response is a fast-paced, rapidly evolving application where decisions, and therefore conclusions about the data, must be reached quickly in order to avoid adverse effects for the population affected by an incident. Data quality remains important in this context, since collecting data of sufficient quality is a critical step toward making traceable and defensible decisions. It is important to consider, then, how to adapt the existing data quality process to the needs of emergency response without compromising the process itself. This adaptation can be done through the application of overarching data quality principles and an understanding of the limitations of emergency response. This application of data quality requires a phased approach, in which data acceptance criteria are less stringent during the early phase, when little data has been collected, and become more stringent as the response continues, more data is collected, and more resources are brought to bear. Efforts across multiple Federal agencies have begun to support better implementation of the data quality process and principles in emergency response, from establishing data quality objectives to data quality assessment. One of these efforts is currently being funded by the U.S. Department of Homeland Security (DHS) Science and Technology Directorate (S&T). The results and ongoing discussions from these efforts will be presented.
|
MAM-B.2
09:45 Upper Tolerance Limits for Radiological Decision Making. M Obiri*, Pacific Northwest National Laboratory; LN Newburn, Pacific Northwest National Laboratory; DK Fagan, Pacific Northwest National Laboratory
Abstract: Cleanup decisions based on dose assessment criteria typically utilize comparisons between observed mean concentrations and action limits based on a risk model or background value (NUREG-1575, NUREG-1757, NUREG-1505). However, there are situations where decisions should be based on comparing the upper tail of a distribution, rather than a mean, to an action limit. Examples include: 1) in decommissioning, when the site is classified as a MARSSIM Class III area (no prior knowledge that contamination is present) (NUREG-1575); 2) in consequence management, when determining whether an area is contaminated; and 3) in the event of a fuel transportation accident, when it is important to verify that no radioactivity has been released. In these cases, as well as others, the relevant determination is whether some high percentage of an area or population of individuals is below an action limit.
Upper tolerance limits (UTLs) are upper confidence limits on an upper quantile of a distribution. In this presentation, we will discuss the general decision frameworks to which UTLs apply as well as some considerations when formulating action limits and estimates based on data.
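As a concrete illustration of the idea (not taken from the presentation), the sketch below computes a one-sided UTL in two standard ways: a normal-theory limit of the form xbar + k*s with the tolerance factor k taken from the noncentral t distribution, and a distribution-free limit based on an order statistic. The sample data, 95% coverage, and 95% confidence level are illustrative assumptions.

```python
# Illustrative sketch only: two standard one-sided upper tolerance limit (UTL)
# calculations, i.e., upper confidence limits on an upper quantile.
import numpy as np
from scipy import stats

def normal_utl(x, coverage=0.95, confidence=0.95):
    """Normal-theory UTL: xbar + k*s, with k from the noncentral t distribution."""
    x = np.asarray(x, dtype=float)
    n = x.size
    delta = stats.norm.ppf(coverage) * np.sqrt(n)             # noncentrality parameter
    k = stats.nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)
    return x.mean() + k * x.std(ddof=1)

def nonparametric_utl(x, coverage=0.95, confidence=0.95):
    """Distribution-free UTL: smallest order statistic x_(r) such that
    P(x_(r) >= coverage quantile) = BinomCDF(r-1; n, coverage) >= confidence."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    for r in range(1, n + 1):
        if stats.binom.cdf(r - 1, n, coverage) >= confidence:
            return x[r - 1]
    return None  # sample too small for the requested coverage/confidence

rng = np.random.default_rng(1)
conc = rng.lognormal(mean=0.0, sigma=0.5, size=60)            # hypothetical concentrations
print("normal-theory UTL:", normal_utl(conc))
print("nonparametric UTL:", nonparametric_utl(conc))
```

In a decision framework like those described above, the survey unit or population would be judged acceptable if the computed UTL falls below the action limit.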
|
MAM-B.3
10:00 Statistical methods for subsurface decommissioning. JC Huckett, Pacific Northwest National Laboratory; ZD Weller*, Pacific Northwest National Laboratory; DK Fagan, Pacific Northwest National Laboratory; CD Johnson, Pacific Northwest National Laboratory
Abstract: Visual Sample Plan (VSP) software includes tools that implement the guidance outlined in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM) (NUREG-1575) for data collection and statistical analysis to demonstrate compliance in final status surveys for decommissioning projects. Users can apply VSP to potentially contaminated surface soil as well as to areas where unexploded ordnance may exist. New modules and capabilities are continuously added based on users’ needs, including current research and development of tools to demonstrate that suspected contaminants in subsurface soil at a site comply with release criteria for radiation dose- or risk-based regulations. As in surface soil cases, primary survey goals for the subsurface include reaching conclusions via hypothesis testing about whether decommissioning efforts led to sufficiently reduced radiation levels. Sufficient data and statistical modeling at this stage must also allow the principal responsible party (and regulator(s)) to identify areas with potential residual contamination for further investigation.
Whereas a large portion of a site’s area is accessible for scanning surface soils, subsurface analyses introduce challenges due to the difficulty and cost of accessing the subsurface. Addressing these challenges requires leveraging additional data sources, including qualitative information (e.g., subject matter expertise, historical site assessments) in combination with quantitative data of varying resolutions and sources (e.g., soil contamination, groundwater characteristics, contaminant plumes, and fate and transport models), as well as data from sampling and analysis performed during decommissioning, to inform final site survey sampling. In this presentation, we outline requirements for data sets resulting from merging disparate sources and present the statistical methods recommended for future VSP releases along with the remaining challenges.
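As a minimal sketch of the kind of one-sample compliance test referenced above, the code below applies a MARSSIM-style Sign test to hypothetical subsurface core samples; the sample values, the DCGL of 5 pCi/g, and the error rate are illustrative assumptions, not values from the presentation.

```python
# Illustrative sketch only: MARSSIM-style Sign test (Scenario A), where the null
# hypothesis is that residual activity exceeds the release criterion (DCGL).
import numpy as np
from scipy import stats

def sign_test_release(measurements, dcgl, alpha=0.05):
    """Reject H0 (unit releasable) only if enough measurements fall below the DCGL."""
    x = np.asarray(measurements, dtype=float)
    n = x.size
    s_plus = int(np.sum(x < dcgl))                # measurements below the release criterion
    # Under H0 (median at the DCGL), S+ ~ Binomial(n, 0.5); reject for large S+.
    p_value = stats.binom.sf(s_plus - 1, n, 0.5)  # P(S+ >= observed)
    return s_plus, p_value, p_value <= alpha

core_samples_pci_g = [2.1, 3.4, 1.8, 2.9, 4.0, 2.2, 3.1, 2.6,
                      1.5, 3.7, 2.8, 2.0, 3.3, 2.4, 1.9]      # hypothetical results
s_plus, p, passes = sign_test_release(core_samples_pci_g, dcgl=5.0)
print(f"S+ = {s_plus}, p = {p:.4g}, survey unit passes: {passes}")
```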
|
MAM-B.4
10:15 Statistical methods to analyze continuously collected data. DK Fagan*, Pacific Northwest National Laboratory; MO Obiri, Pacific Northwest National Laboratory; LN Newburn, Pacific Northwest National Laboratory; AL Bunn, Pacific Northwest National Laboratory; J Huckett, Pacific Northwest National Laboratory
Abstract: U.S. federal guidance currently establishes the minimum requirements and necessary conditions for conducting radiological surveys to support decommissioning decisions (NUREG-1575 Rev 1, NUREG-1505 Rev 1, NUREG-1757 Vol 2, Rev 1). Several of the statistical methodologies of the Multiagency Radiation Survey and Site Investigation Manual (MARSSIM) (NUREG-1575 Rev 1, NUREG-1505 Rev 1) are nonparametric because data collected for the purposes of a final status survey often cannot be assumed to be realizations from a parameterized model (e.g., a normal probability distribution that can be fully specified using the mean and variance). While nonparametric methods relax such parametric assumptions, the assumption of independence persists. Continuously sampled data collected via walk-over or autonomous vehicle scans result in observations at discrete points that are a function of the contamination at surrounding points (e.g., a moving average). One feature of such data is that they are auto- or serially correlated, thus violating the independence assumption required to implement the statistical methodologies of MARSSIM. Ignoring such correlation typically leads to underestimating the uncertainty and inflating false detection error rates (i.e., falsely concluding contamination is significantly higher than established action limits).
This presentation identifies two methods to account for autocorrelation in the statistical analysis of continuously collected survey data and to properly control the statistical decision error rates. One method effectively removes the autocorrelation via a time-series approach so that traditional MARSSIM approaches can be used on the resulting data set. The other method accounts for the correlation explicitly in the statistical model. Both methods provide a pathway for comparing observed survey data to action limits with reliable significance tests and robust decision-making.
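To make the first idea concrete, here is a minimal sketch (not the authors' implementation) that simulates an autocorrelated scan record, estimates the lag-1 autocorrelation, prewhitens the series with an AR(1) filter so that independence-based methods could then be applied, and reports the effective sample size implied by the correlation. The AR(1) form and all numeric values are assumptions.

```python
# Illustrative sketch only: AR(1) prewhitening of a continuously collected scan
# record and the effective-sample-size correction implied by the autocorrelation.
import numpy as np

rng = np.random.default_rng(42)

# Simulate an autocorrelated count-rate record (AR(1) around a mean level).
n, rho, mean_cpm = 2000, 0.7, 300.0
noise = rng.normal(0.0, 20.0, size=n)
scan = np.empty(n)
scan[0] = mean_cpm + noise[0]
for t in range(1, n):
    scan[t] = mean_cpm + rho * (scan[t - 1] - mean_cpm) + noise[t]

# Estimate the lag-1 autocorrelation from the data.
d = scan - scan.mean()
rho_hat = np.sum(d[1:] * d[:-1]) / np.sum(d ** 2)

# Method 1: prewhiten so that methods assuming independence can be applied.
prewhitened = scan[1:] - rho_hat * scan[:-1]
resid_rho = np.corrcoef(prewhitened[1:], prewhitened[:-1])[0, 1]

# The same correlation also shrinks the effective sample size: n_eff = n(1-rho)/(1+rho).
n_eff = n * (1.0 - rho_hat) / (1.0 + rho_hat)

print(f"estimated lag-1 autocorrelation: {rho_hat:.2f}")
print(f"lag-1 autocorrelation after prewhitening: {resid_rho:.2f}")
print(f"nominal n = {n}, effective n = {n_eff:.0f}")
```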
|
MAM-B.5
10:30 Data quality assessment of continuously collected survey data. AL Bunn, Pacific Northwest National Laboratory; TA Ikenberry*, Pacific Northwest National Laboratory; DK Fagan, Pacific Northwest National Laboratory; LL Newburn, Pacific Northwest National Laboratory
Abstract: Derived Concentration Guideline Levels (DCGLs) are radionuclide-specific concentration limits used to guide decommissioning activities for a site so that it meets the radiological release criteria for license termination (NUREG-1575, NUREG-1505, NUREG-1757). In determining the detector system and parameters for radiological surveys, DCGLs are converted to minimum detectable concentrations (MDCs) with subsequent determination of action levels (cpm). MDC conversions are based on the index of sensitivity, surveyor efficiency, survey parameters (e.g., scan velocity and detector height), and ratios that describe the exposure rate to concentration and concentration to detector count relationships. Action levels are established based on MDCs and statistical decision error probabilities.
Historically, surveys have been conducted with surveyor vigilance; that is, the surveyor is responsible for distinguishing the detector response to background from the response to background plus signal. Differences are identified in real time, with the surveyor adjusting behavior to investigate potential areas of residual contamination during the survey (i.e., human factors). Licensees have modernized the methods used to perform radiological surveys since MARSSIM (NUREG-1575) was first published. Current practice often involves conducting surveys without surveyor vigilance by pairing survey instrumentation and data capture tools (GPS, GIS) with autonomous vehicles, towing vehicles, or humans, removing the responsibility to investigate possible residual contamination in real time. As a result, data quality assessment (DQA) to determine whether collected data are consistent with the sampling and analysis plan is more important than ever. This presentation describes an approach that converts survey data (cpm) into concentrations (pCi/g) for evaluating whether continuous survey data collected without vigilance achieved the planned data quality objectives.
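The sketch below walks through a MARSSIM-style scan MDC calculation and the reverse cpm-to-pCi/g conversion referred to above. It is not the presenters' tool; every numeric input (background rate, index of sensitivity, surveyor efficiency, scan parameters, and conversion ratios) is an assumed placeholder.

```python
# Illustrative sketch only: MARSSIM-style scan MDC calculation and the reverse
# count-rate-to-concentration conversion used in data quality assessment.
import math

background_cpm   = 3000.0   # detector background count rate (assumed)
scan_speed_m_s   = 0.5      # scan speed (assumed)
detector_width_m = 0.05     # effective detector width over a small source (assumed)
d_prime          = 1.38     # index of sensitivity for the chosen error rates (assumed)
surveyor_eff     = 0.5      # surveyor efficiency p (assumed)
cpm_per_uR_h     = 900.0    # detector count rate per unit exposure rate (assumed)
uR_h_per_pCi_g   = 0.9      # exposure rate per unit soil concentration (assumed)

# Observation interval: time the detector spends over a small area of elevated activity.
interval_s = detector_width_m / scan_speed_m_s
b_i = background_cpm * interval_s / 60.0                  # background counts in the interval

# Minimum detectable count rate for the ideal observer, then corrected for surveyor efficiency.
mdcr_cpm = d_prime * math.sqrt(b_i) * (60.0 / interval_s)
mdcr_surveyor_cpm = mdcr_cpm / math.sqrt(surveyor_eff)

# Scan MDC in concentration units via the exposure-rate and concentration ratios.
scan_mdc_pci_g = mdcr_surveyor_cpm / (cpm_per_uR_h * uR_h_per_pCi_g)
print(f"scan MDC: {scan_mdc_pci_g:.2f} pCi/g")

# DQA direction: convert an observed net count rate back to a concentration for
# comparison against the DCGL.
observed_net_cpm = 1200.0                                 # hypothetical net reading
concentration_pci_g = observed_net_cpm / (cpm_per_uR_h * uR_h_per_pCi_g)
print(f"observed reading: {concentration_pci_g:.2f} pCi/g")
```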
|
MAM-B.6
10:45 In an Age of Misinformation and Disinformation: Yes, Data Quality Still Matters. JA Chapman*, NNSA; R Mieskoski, NNSA
Abstract: The U.S. Department of Energy, working with its interagency partners, has effectively implemented the approach to data quality (DQ) objectives and assessment, particularly in nuclear facility decontamination and decommissioning. These DQ practices have been developed over the past three decades, for example in establishing measurement performance for Nondestructive Assay (NDA) systems used to measure TRU waste dispositioned to WIPP, and in the D&D of the K-25 gaseous diffusion plant. These projects are executed over the course of many years, the radiological source term is well bounded, and project results are communicated to all stakeholders in a judicious fashion. In a radiological emergency, however, the source term characterization and release parameters are not initially known, atmospheric release models are initial guesses based on judgment, and early measurement results are likely to be fraught with error and mistakes. In short, it takes some time to truly assess the magnitude and scale of the radiological hazard. While officials and experts are estimating the extent and magnitude of the release, public speculation may generate any number of ill-informed theories in the absence of informed sources. Miscommunication and misunderstanding are compounded by this new and evolving age of misinformation and disinformation. In this paper, we will explore what preparatory measures can be taken to assure that data quality is sufficient to support critical decision makers and that a plan to counter misinformation and disinformation is in place.
|
MAM-B.7
11:00 Panel
|