TAM-B - Emergency Response | Centennial Ballroom 300B | 09:30 - 11:45
|
Chair(s): Steve Sugarman
|
|
TAM-B.1
09:30 The Importance of Effective and Understandable Communication of Radiation-Related Information SL Sugarman*, SummitET
Abstract: Words have meaning, and subtle shifts in the language we use can have a large impact on the message being delivered and on the perceptions of the receiving audience. This is especially important when speaking about topics that may cause anxiety, such as radiation. While various basic concepts may seem elementary to a health physicist, there are often subtle yet extremely important distinctions, for example contamination vs. exposure, that can be primary drivers in emergency response and related communications. The importance of effective communication cannot be overstated. Radiation professionals work with something that few people understand, and radiation can be a scary word. A lack of knowledge of how radiation works can lead people to make decisions they would not otherwise have made had they been more aware of the true nature of the potential hazard. It can be difficult to simplify a complicated topic into terms that are easily understood while maintaining factual integrity. The implications of effective communication are far reaching, whether helping an individual who has radiation-related concerns about an anticipated medical procedure or affecting the public's willingness to accept emergency management recommendations during an incident involving radioactive materials. Understanding the emotional status of your audiences and tailoring messages to address their fears and perceptions is important; strong emotions are not likely to be overcome simply by providing facts. Remember that the people you are talking to may have fears and preconceptions that you, as a radiation professional, overcame long ago, and your empathy when dealing with a situation will likely go as far as, or further than, the facts you provide.
|
TAM-B.2
09:45 The Radiation Field Training Simulator (RaFTS): Reducing Dose by Simulating Sources GK White*, Lawrence Livermore National Laboratory, Livermore, USA; SA Kreek, Lawrence Livermore National Laboratory, Livermore, USA; WM Dunlop, Lawrence Livermore National Laboratory, Livermore, USA; JD Oakgrove, Lawrence Livermore National Laboratory, Livermore, USA; DE Bower, Lawrence Livermore National Laboratory, Livermore, USA; DG Trombino, Lawrence Livermore National Laboratory, Livermore, USA; EK Swanberg, Lawrence Livermore National Laboratory, Livermore, USA; SD Pike, Argon Electronics (UK) Limited, Luton, UK; JN King, Argon Electronics (UK) Limited, Luton, UK
Abstract: To prepare for real-world emergencies involving high-radiation-dose hazards or unknown radiation hazards, first responders need to practice in those environments. However, training with actual hazardous high-radiation sources involves tremendous logistical difficulty, expense, and unnecessary radiation exposure. In general, current approaches to training avoid high-radiation sources. Frequently, such training uses event controllers who tell participants what their instruments should be reading, or it relies on simulated instruments with preprogrammed readings; this underprepares emergency responders for the complexities of such hazards, because instruments behave differently in high-hazard environments. Those in charge of training recognize the inadequacies of these approaches and want something better. The Radiation Field Training Simulator (RaFTS) was developed to meet that need.
Current training simulators are limited in a variety of ways: they do not adequately reproduce the operational realities of training with actual responder equipment, nor do they provide the most realistic and scientifically sound scenarios to train against. To address this need, Lawrence Livermore National Laboratory (LLNL) developed a next-generation training capability, the Radiation Field Training Simulator (RaFTS), for which we were awarded an R&D 100 Award by R&D Magazine in 2017 and a TechConnect Defense Innovation Award in 2018. Three US patents protect LLNL's intellectual property. RaFTS has been granted a three-year US Department of Energy (DOE) Technology Commercialization Fund award for LLNL to work with the UK's Argon Electronics Limited and the Tennessee-based ORTEC, among other detector manufacturers, to commercialize RaFTS. Our project was selected for a 2021 FLC National Excellence in Technology Transfer award. This paper describes our past work and then discusses the next generation of RaFTS, which will meet and hopefully exceed the expectations of the emergency responder community.
|
TAM-B.3
10:00 Final Results from Nuclear Accident Simulation Study Comparing 2017 vs. 1992 Protective Action Guidelines MD McMahon*, Tulane University
Abstract: In 2020 the Louisiana Department of Environmental Quality (LDEQ) aligned its Radiological Emergency Planning and Response (REPR) program with the 2017 Environmental Protection Agency (EPA) Protective Action Guides (PAG) Manual, adding a one-year-old child thyroid dose evacuation threshold in lieu of distributing potassium iodide (KI) to the public. We previously presented the initial results of a study that performed a set of nuclear power plant accident simulations for a wide variety of accident and weather conditions. Because nuclear power plants will continue to align with the 1992 PAG Manual for the foreseeable future, the study compared the dose assessment results using the 2017 versus the 1992 PAG Manual at the critical points of the site boundary, 5 miles downwind, and 10 miles downwind. The completed results of this study for the River Bend Station Nuclear Power Plant, a boiling water reactor (BWR), are now presented, using a combination of the RASCAL software package (provided by the NRC) and URI, a software system used by Entergy that combines the basic RASCAL dose modeling methodology with plant-specific input. The focus is on potential accidents having release pathways with little filtration; critical thresholds for otherwise identical accidents, derived from accident parameters such as containment leak rates or activity release rates detected by stack monitors, are tabulated and compared for the 2017 and 1992 methodologies, respectively.
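As a minimal illustration of the kind of threshold comparison performed at each critical point, the sketch below checks a projected total effective dose equivalent (TEDE) and a one-year-old child thyroid dose against evacuation thresholds; the function, the dose values, and the default thresholds are hypothetical placeholders, not the actual PAG criteria or study results.

    # Hypothetical sketch of a protective-action threshold check; thresholds
    # and dose projections below are placeholders, not actual PAG values.
    def evacuation_recommended(tede_rem: float, child_thyroid_rem: float,
                               tede_limit: float = 1.0,
                               thyroid_limit: float = 5.0) -> bool:
        """Flag evacuation if either projected dose exceeds its threshold."""
        return tede_rem >= tede_limit or child_thyroid_rem >= thyroid_limit

    # Evaluate the same hypothetical accident at the study's critical points.
    for point, tede, thyroid in [("site boundary", 1.4, 7.2),
                                 ("5 mi downwind", 0.6, 3.1),
                                 ("10 mi downwind", 0.2, 1.0)]:
        action = "evacuate" if evacuation_recommended(tede, thyroid) else "monitor"
        print(f"{point}: {action}")

Under the 2017-style logic adopted by LDEQ, the child thyroid check replaces a KI-distribution decision; varying the thresholds passed to such a function is how a 1992-vs-2017 comparison would be tabulated.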
|
TAM-B.4
10:15 Electromagnetic Reliability Effects Probability (EMREP) Tool Strength and Stress Testing MT Bak*, DoD Defense Threat Reduction Agency
Abstract: Geomagnetic disturbances (GMD) and electromagnetic pulses (EMP) from high-altitude nuclear detonations can affect large geographic areas and disrupt, degrade, or damage defense and civilian technologies and critical infrastructure components and systems. The threat of possible nuclear attacks and the rapidly increasing dependence on digital electronics heighten the need to foster sustainable, efficient, and cost-effective approaches to improving our resilience to the effects of EMPs.
EMREP (Electromagnetic Reliability and Effects Predictions) is the premier Department of Defense (DoD) EMP system reliability tool; it compares component and system strengths to EMP stresses and combines the results to predict reliability for items exposed to various types of hostile EMP scenarios. EMREP is a unique test-based EMP prediction tool: its database uses real test data for coupling and effects, it is continually updated with new test data, and it is independently verified and validated. Stakeholders use EMREP as a decision support tool, for contingency/emergency management planning support, and for consequence management.
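EMREP's actual reliability models are built from test data, but the strength-vs-stress comparison it performs conceptually resembles the classical stress-strength interference calculation sketched below; the normality assumption and all parameter values are illustrative assumptions, not EMREP internals.

    # Illustrative stress-strength reliability sketch (not EMREP's model).
    # Assumes independent, normally distributed strength and EMP stress.
    from math import sqrt
    from scipy.stats import norm

    def survival_probability(mu_strength, sd_strength, mu_stress, sd_stress):
        """P(strength > stress) for independent normal strength and stress."""
        margin = mu_strength - mu_stress
        spread = sqrt(sd_strength**2 + sd_stress**2)
        return norm.cdf(margin / spread)

    # Hypothetical component: damage threshold tested at 55 +/- 8 kV/m,
    # scenario stress of 40 +/- 10 kV/m (all numbers invented).
    print(f"Predicted survival probability: "
          f"{survival_probability(55, 8, 40, 10):.3f}")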
EMREP is a Defense Threat Reduction Agency (DTRA) produced tool. It leverages decades of experience in EMP environment calculations, system testing, proven statistical methods, and sound engineering judgment. EMREP uses both legacy and new test data to ascertain the reliability and survivability of technologies, from smartphones to medical equipment to vehicles, that are critical to our Nation's security and economic prosperity. DTRA has a multi-year test campaign that tests DoD mission-critical assets such as vehicles, communications equipment, commercial off-the-shelf (COTS) information technology (IT), and medical equipment to assess their resistance to and survivability in an EMP threat environment.
|
TAM-B.5
10:30 Challenges in Translation of Biodosimetry Diagnostics to the Field: Prediction of Total Body and Partial Body Exposures to Radiation Using Plasma Proteomic Expression Profiles M Sproull*, NIH/NCI/ROB; U Shankavaram, NIH/NCI/ROB; K Camphausen, NIH/NCI/ROB
Abstract: Purpose: Our work has centered on the development of new dose prediction models for unknown received radiation dose using a proteomic approach. Using murine models to characterize novel biomarkers of radiation exposure, we have developed models for dose prediction for both total body and partial body radiation exposures, including organ-specific exposures.
Methods: In our latest work, female C57BL6 mice received either a total body exposure of 2, 3.5, or 8 Gy, with blood collection at days 1, 3, and 7 post-exposure, or an organ-specific partial body exposure to only the brain, gut, or lung, at doses of 2 or 8 Gy to the lung or gut and 2, 8, or 16 Gy to the brain, using a Pantak X-ray source. For the partial body exposures, blood samples were collected at days 1, 3, and 7 post-exposure for lung and brain and at days 3, 7, and 14 post-exposure for gut. All samples were analyzed for 1,310 protein analytes using the SomaLogic SOMAscan assay, a highly multiplexed aptamer-based proteomics screening platform. Radiation exposure prediction algorithms were then constructed using these aggregated SOMAscan data. For this study series, predictive models were developed using several treatment group combinations: control vs. TBI only; control vs. all pooled irradiated samples, which included TBI and all organ-specific partial exposures; and control vs. each individual organ-specific partial body exposure separately (brain, gut, and lung). For each model, predictive algorithms were generated using both linear discriminant analysis (LDA) and recursive partitioning tree (RPART) statistical methodologies, and separate murine sample cohorts were used for the training and testing of each model.
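As a minimal sketch of this train/test workflow, the snippet below fits LDA and a decision tree (standing in for R's RPART) to randomly generated data shaped like the study design; the feature values and cohort sizes are invented, and scikit-learn's implementations are assumptions, not the authors' actual pipeline.

    # Illustrative two-classifier sketch; data are random placeholders shaped
    # like the study design (1,310 SOMAscan analytes per sample).
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(60, 1310))    # training cohort
    y_train = rng.integers(0, 2, size=60)    # 0 = control, 1 = irradiated
    X_test = rng.normal(size=(30, 1310))     # separate testing cohort
    y_test = rng.integers(0, 2, size=30)

    for name, model in [("LDA", LinearDiscriminantAnalysis()),
                        ("RPART-style tree", DecisionTreeClassifier(max_depth=3))]:
        model.fit(X_train, y_train)
        print(f"{name} predictive accuracy: {model.score(X_test, y_test):.2f}")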
Results: Though the models that included TBI had relatively high overall predictive accuracies of 85% and 89%, respectively, the predictive accuracy of the control vs. organ-specific exposures model was only 59%, and all the models that included partial body exposures performed poorly at differentiating control samples from partial body exposure samples. This reflects the challenge of using biomarkers to differentiate between samples from unirradiated animals and samples from animals receiving radiation exposures to relatively small percentages of body mass.
Conclusion: Overall, these findings show that proteomic profiling can be a useful tool for developing predictive algorithms of radiation exposure, but development of algorithms for field diagnostics should take these inherent biological limitations into account. Our findings indicate that these dose prediction models have potential utility for mass population screening; however, as most field exposures will be inherently heterogeneous, further research to support more robust biodosimetry algorithms is needed.
|
TAM-B.6
10:45 Gamma Radioactivity Detection Limits and Dose Assessment in Artificial Human Urine Using Sodium-Iodide and High-Purity Germanium Detectors AG Burn*, Wadsworth Center, NYS Department of Health; DK Haines, Wadsworth Center, NYS Department of Health; AJ Khan, Wadsworth Center, NYS Department of Health; MA Torres, Wadsworth Center, NYS Department of Health; SA Faye, Wadsworth Center, NYS Department of Health; CA Costello, Bureau of Environmental Radiation Protection, NYS Department of Health; TJ Hoffman, Wadsworth Center, NYS Department of Health; TM Semkow, Wadsworth Center, NYS Department of Health
Abstract: Rapid detection of radioactivity in urine samples is critical for effective radiological emergency response. Since sample throughput is important during an emergency, sample count times are relatively short, which can make it difficult to confidently distinguish signal from noise. Additionally, many urine samples from a radiological emergency will likely not have appreciable radioactivity. The goal of this study was to investigate the performance of several gamma detectors for urine bioassay screening. To achieve this goal, unspiked artificial urine samples were measured for ten minutes per sample on four different gamma detectors: an 80% relative efficiency high-purity Ge (HPGe) detector in standard shielding, a 102% relative efficiency low-background HPGe detector equipped with a muon shield, an HPGe well detector, and a 4" x 4" NaI well detector. The measured gamma spectra were analyzed in two ways: 1) for the 364-keV peak region of I-131 and the 662-keV peak region of Cs-137, and 2) for the total counts in the full energy spectrum (50-2048 keV). These two radionuclides were chosen since they are among the prominent gamma-emitting radionuclides of interest in a radiological emergency. The detectors were calibrated for both peak and total efficiencies using NIST-traceable solutions of the radionuclides in the geometries studied. The results were analyzed using the principles of signal detection theory according to Currie's formalism, extended with complete uncertainty propagation. This enabled calculation of decision levels as well as detection limits expressed in Bq/L of urine, the latter referred to as the minimum detectable activity (MDA). The sodium-iodide well detector had the highest efficiency for I-131 and Cs-137. The HPGe well detector had the lowest peak MDA value, while the sodium-iodide well detector had the lowest total MDA. The annual limit on intake (ALI) for I-131 was often exceeded at 30 days post-intake for the HPGe detectors, while the calculated Cs-137 intakes were always below the Cs-137 ALI. The results are interpreted from a risk assessment perspective, along with their applicability to a large-scale radiological emergency response.
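For reference, a minimal statement of Currie's decision level and detection limit under Poisson counting statistics, from which an MDA in Bq/L follows; the study's formalism, with complete uncertainty propagation, is more elaborate than this textbook form.

    L_C = k\sqrt{2B}, \qquad
    L_D = k^2 + 2L_C, \qquad
    \mathrm{MDA} = \frac{L_D}{\varepsilon \, p_\gamma \, t \, V}

Here B is the background count in the region of interest, k = 1.645 for 5% false-positive and false-negative rates, \varepsilon is the detection efficiency, p_\gamma the gamma emission probability, t the count time, and V the urine volume in liters.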
|
TAM-B.7
11:00 Impact Analysis of Age-Based Demographic Data and FGR 15 on Mortality Estimations in HPAC JT Dant*, Applied Research Associates; IA Castillo, Defense Threat Reduction Agency; CG Nye, Applied Research Associates
Abstract: A nuclear detonation (NUDET) near the ground will cause catastrophic damage and generate large amounts of fallout that can be transported tens to hundreds of kilometers from ground zero. Traditionally, consequence assessment modeling tools have been based on effects data from healthy, young adults, since they were developed to focus primarily on nuclear battlefield survival. Radiation response can vary significantly across demographic groups (age, sex, etc.), and demographic distributions can fluctuate greatly depending on geographic location. The assumption that a population consists of healthy, young adults therefore leads to inaccurate casualty estimations when applied to a general population. The Defense Threat Reduction Agency's (DTRA) Hazard Prediction and Assessment Capability (HPAC) software is used to model the hazardous environments generated by a NUDET. The current dose conversion factors (DCFs) in HPAC were developed using methods and models consistent with Federal Guidance Report (FGR) 12. These DCFs were calculated for an adult anthropomorphic model of the body derived from the International Commission on Radiological Protection (ICRP) Reference Man data. Since the publication of FGR 12, there have been substantial improvements in computational capabilities, radiation protection databases, anthropomorphic computational phantoms, and ICRP guidance. FGR 15 updates and expands on FGR 12, with the most significant change being the addition of age-specific phantoms for the newborn, 1-year-old, 5-year-old, 10-year-old, and 15-year-old age groups. A previous analysis investigated the effect of demographic dose-modifying factors for radiation lethality for various nuclear detonation scenarios modeled with HPAC paired with age-based demographic data. The present analysis derives separate mortality estimates using the updated DCFs from FGR 15 and the same population distribution, to quantify the impact of the newer guidance.
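To illustrate why age-specific DCFs matter for mortality estimation, the sketch below applies per-age-group DCFs to a common fallout field; every DCF value, the deposition, and the exposure time are invented placeholders, not FGR 15 data or HPAC output.

    # Hypothetical application of age-specific dose conversion factors (DCFs).
    # All numbers below are placeholders, not FGR 15 values.
    dcf_sv_per_bq_s = {            # ground-shine DCF, (Sv/s)/(Bq/m^2)
        "newborn": 1.8e-15, "1-yr": 1.6e-15, "5-yr": 1.4e-15,
        "10-yr": 1.3e-15, "15-yr": 1.2e-15, "adult": 1.1e-15,
    }
    deposition_bq_m2 = 5.0e6       # hypothetical fallout deposition
    exposure_s = 48 * 3600         # 48 hours of exposure

    for age, dcf in dcf_sv_per_bq_s.items():
        dose_msv = dcf * deposition_bq_m2 * exposure_s * 1e3
        print(f"{age:>8}: {dose_msv:.2f} mSv")

Weighting such per-group doses by the local age distribution, rather than assuming an all-adult population, is what shifts the aggregate mortality estimate.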
|
TAM-B.8
11:15 The Advisory Team for Environment, Food and Health G Chen*, U.S. EPA
Abstract: The Advisory Team for Environment, Food and Health is a US federal interagency group of subject matter experts in radiological health and related matters. The mission of the Advisory Team is to provide protective action recommendations to decision makers at all levels following accidents or incidents that result in the release of radioactive material to the environment, from the early phase through the late phase. Working with the Department of Energy-led Federal Radiological Monitoring and Assessment Center (FRMAC) and other technical and regulatory organizations, the Advisory Team provides coordinated advice and recommendations on the environment, food, water, and health, both human and animal. The Advisory Team comprises representatives from four agencies: the Environmental Protection Agency, the US Department of Agriculture, the Centers for Disease Control and Prevention, and the Food and Drug Administration. A major function of the Advisory Team is to interpret published guidance on radiation protection, known as Protective Action Guides or PAGs, in the context of the specific contamination or exposure scenario. These guidance documents include protective action guides from the EPA for the general public and emergency workers; guidance from the FDA on acceptable levels of contamination in food and animal feed; guidance from the FDA on the use of potassium iodide (KI) for protection of the thyroid following releases of radioactive iodine; and guidance from the Department of Homeland Security on long-term recovery following incidents involving radiological dispersal devices (RDDs) or improvised nuclear devices (INDs). Advisory Team operations are discussed in the context of stakeholder interactions and national response doctrine.
|
TAM-B.9
11:30 RadResponder Network – A Quick Walkthrough with the Newest Updates G Chen*, U.S. EPA
Abstract: In the past, there were tools that federal agencies used to input and share radiological data. However, those tools were not easy to use, and the output data were not easily shareable across agencies. Most importantly, they were not available to radiation emergency organizations at all levels (state, tribal, and local). During a radiological emergency, it is essential that there be a common tool that is easy to use and accessible to all organizations nationwide for sharing radiological data. Together, the Federal Emergency Management Agency (FEMA), the Department of Energy (DOE) National Nuclear Security Administration (NNSA), and the Environmental Protection Agency (EPA) created the RadResponder Network. Development of the RadResponder Network began in 2012, and it aims to become the national standard and Whole Community solution for the management of radiological data.
Since 2012, the RadResponder Network has grown from 300 registered organizations and 1,000 registered users to over 1,970 organizations and 11,000 users today. RadResponder is an ongoing project that is constantly improved with new functions and enhancements based on input and suggestions from the community. The purpose of this presentation is to inform the community about the technical enhancements and new functions that have been added to the RadResponder Network since the 2021 Annual HPS Meeting.
|