Program name: Scale
Package ID: CCC-0785/03
Operating systems: Mac OS, Linux-based PC, PC Windows, UNIX workstation
Scale is a comprehensive modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). Scale provides a comprehensive, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. For over 30 years, regulators, licensees, and research institutions around the world have used Scale for safety analysis and design. Scale provides a 'plug-and-play' framework with 89 computational modules, including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. Scale includes current nuclear data libraries and problem-dependent processing tools for continuous-energy and multigroup neutronics calculations, multigroup coupled neutron-gamma calculations, and activation and decay calculations. Scale includes unique capabilities for automated variance reduction in shielding calculations, as well as sensitivity and uncertainty analysis. Scale's graphical user interfaces assist with accurate system modeling and convenient access to desired results. See the developers' website and the Scale 6 electronic notebook for news on Scale, updates, and tips on running the code.
Material Input and Problem-Dependent Cross-Section Data:
A foundation of Scale is MIPLIB (Material Information Processor Library). The purpose of MIPLIB is to allow users to specify materials using easily remembered and easily recognizable keywords that are associated with mixtures, elements, and nuclides provided in the Scale Standard Composition Library. MIPLIB also uses other keywords and simple geometry input specifications to prepare input for the modules that perform the problem-dependent cross-section processing. Even when performing multigroup calculations, Scale begins with continuous-energy cross-section data and generates problem-dependent multigroup data based on a pointwise spectrum generated with the CENTRM (Continuous Energy Transport Module) and PMC (Produce Multigroup Cross Sections) modules. A keyword supplied by the user selects the cross-section library from a standard set provided in Scale or designates the reference to a user-supplied library.
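The pointwise-to-multigroup step performed by CENTRM and PMC can be illustrated with a generic flux-weighted group collapse: each multigroup cross section is the pointwise cross section weighted by the pointwise flux over that group. The Python sketch below uses invented data and is not Scale or PMC code; real processing integrates over a fine energy mesh with many additional effects.

```python
# Flux-weighted group collapse, the principle behind Scale's problem-dependent
# multigroup processing:
#   sigma_g = sum(sigma(E) * phi(E)) / sum(phi(E))  over points in group g.
# All numbers are invented for illustration; this is not Scale code.

def collapse_to_multigroup(energies, sigma, phi, group_bounds):
    """Return flux-weighted multigroup cross sections.

    group_bounds: descending energy boundaries, length = n_groups + 1.
    """
    groups = []
    for g in range(len(group_bounds) - 1):
        hi, lo = group_bounds[g], group_bounds[g + 1]
        num = den = 0.0
        for E, s, p in zip(energies, sigma, phi):
            if lo <= E < hi:
                num += s * p
                den += p
        groups.append(num / den if den > 0.0 else 0.0)
    return groups

# Toy pointwise data: a 1/v-like absorber in a softening spectrum.
energies = [10.0, 5.0, 2.0, 1.0, 0.5, 0.1]   # eV, descending
sigma    = [1.0, 1.4, 2.2, 3.1, 4.4, 9.9]    # barns
phi      = [0.2, 0.4, 0.9, 1.0, 0.8, 0.3]    # relative flux

two_group = collapse_to_multigroup(energies, sigma, phi, [20.0, 1.0, 0.0])
```

A different spectrum phi yields different group constants, which is why the collapsed data are problem dependent rather than fixed library values.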
Criticality Safety Analysis:
The CSAS (Criticality Safety Analysis Sequence) control module provides for the calculation of the neutron multiplication factor of a system. Computational sequences accessible through CSAS provide automated problem-dependent processing of cross-section data and enable general analysis of a one-dimensional (1D) system model using deterministic transport with XSDRNPM or three-dimensional (3D) Monte Carlo transport solution using KENO V.a. CSAS also provides the capability to search on geometry spacing or nuclide concentrations, and provides problem-dependent cross-section processing without subsequent transport solutions for use in executing stand-alone functional modules. CSAS6 is a separate criticality control module that provides automated problem-dependent cross-section processing and Monte Carlo criticality calculations via the KENO-VI functional module that uses the Scale Generalized Geometry Package (SGGP). The Scale Material Optimization and Replacement Sequence (SMORES) is a Scale control module developed for 1D eigenvalue calculations to perform system criticality optimization. The STARBUCS (Standardized Analysis of Reactivity for Burnup Credit using Scale) control module has been developed to automate the generation of spatially varying nuclide compositions in a spent fuel assembly, and to apply the spent fuel compositions in a 3D Monte Carlo analysis of the system using KENO, primarily to assist in performing criticality safety assessments of transport and storage casks that apply burnup credit. The KMART (Keno Module for Activity-Reaction Rate Tabulation) module produces reaction rates and group collapsed data from KENO. The USLSTATS (Upper Subcritical Limit Statistics) tool provides trending analysis for bias assessment.
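The Monte Carlo eigenvalue estimation that KENO performs can be illustrated with a deliberately tiny analog model: in an infinite homogeneous medium, k-infinity equals nu*Sigma_f/Sigma_a, and an analog random walk recovers it. This is a generic textbook sketch with invented cross sections, not the KENO algorithm or any Scale data.

```python
import random

# Analog Monte Carlo estimate of k-infinity for an infinite homogeneous
# medium.  Each history is followed through scatters until it is absorbed;
# a fission absorption scores NU secondaries.
# Analytically, k_inf = NU * SIG_F / (SIG_C + SIG_F).

SIG_S, SIG_C, SIG_F = 2.0, 0.25, 0.25   # macroscopic scatter/capture/fission
NU = 2.5                                 # mean neutrons per fission (invented)
SIG_T = SIG_S + SIG_C + SIG_F
K_ANALYTIC = NU * SIG_F / (SIG_C + SIG_F)   # = 1.25

def run_histories(n, rng):
    score = 0.0
    for _ in range(n):
        while True:
            xi = rng.random() * SIG_T
            if xi < SIG_S:
                continue                  # scatter: keep following the neutron
            if xi < SIG_S + SIG_F:
                score += NU               # absorption by fission
            break                         # any absorption ends the history
    return score / n

rng = random.Random(12345)
k_inf = run_histories(200_000, rng)
```

Real criticality codes add geometry tracking, continuous-energy physics, and fission-source iteration between generations, but the statistical core is the same tallying of fission production per source neutron.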
Radiation Shielding Analysis:
The MAVRIC (Monaco with Automated Variance Reduction Using Importance Calculations) fixed-source radiation transport sequence is designed to apply the multigroup fixed-source Monte Carlo code Monaco to solve problems that are too challenging for standard, unbiased Monte Carlo methods. The intention of the sequence is to calculate fluxes and dose rates with low uncertainties in reasonable times even for deep penetration problems. MAVRIC is based on the CADIS (Consistent Adjoint Driven Importance Sampling) methodology, which uses an importance map and biased source that are designed to work together. MAVRIC generates problem-dependent cross-section data and then automatically performs a coarse mesh, 3D discrete ordinates transport calculation using Denovo to determine the adjoint flux as a function of position and energy, and to apply the information to optimize the shielding calculation in Monaco. The SAS1 (Shielding Analysis Sequence No. 1) control module provides general 1D deterministic shielding capabilities, and QADS (Quick and Dirty Shielding) provides for 3D point-kernel shielding analysis.
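The variance-reduction idea behind CADIS, sampling from a biased distribution while carrying a compensating statistical weight so the estimator stays unbiased, can be demonstrated on the simplest deep-penetration problem: uncollided transmission through a thick slab. This is a generic importance-sampling sketch, not the MAVRIC/Monaco implementation.

```python
import math
import random

# Importance sampling for a rare event: estimate the uncollided transmission
# P(x > T) = exp(-SIG * T) through a thick slab, where analog sampling of
# the path length x ~ Exp(SIG) almost never reaches the far side.
# Generic demonstration only; not Scale code.

SIG, T = 1.0, 15.0
P_TRUE = math.exp(-SIG * T)              # about 3.1e-7

def biased_estimate(n, sig_b, rng):
    """Sample path lengths from Exp(sig_b) instead of Exp(SIG), weighting
    each transmitted sample by w = f(x)/g(x) to keep the estimator unbiased."""
    total = 0.0
    for _ in range(n):
        x = -math.log(1.0 - rng.random()) / sig_b
        if x > T:
            # w = (SIG * exp(-SIG*x)) / (sig_b * exp(-sig_b*x))
            total += (SIG / sig_b) * math.exp((sig_b - SIG) * x)
    return total / n

rng = random.Random(7)
p_est = biased_estimate(50_000, 0.1, rng)   # stretch the mean free path 10x
```

With the biased density, a few percent relative uncertainty is reached with tens of thousands of samples, whereas an analog run of the same size would typically score zero transmitted particles; CADIS automates the analogous biasing in space and energy using a deterministic adjoint solution.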
Depletion, Decay, and Radioactive Source Term Analysis:
The ORIGEN (Oak Ridge Isotope Generation) code applies a matrix exponential expansion model to calculate time-dependent concentrations, activities, and radiation source terms for a large number of isotopes simultaneously generated or depleted by neutron transmutation, fission, and radioactive decay. Provisions are made to include continuous nuclide feed rates and continuous chemical removal rates that can be described with rate constants for application to reprocessing or other systems that involve nuclide removal or feed. ORIGEN includes the ability to utilize multigroup cross sections processed from standard ENDF/B evaluations. Within Scale, transport codes can be used to model user-defined systems, and the COUPLE code can be applied to calculate problem-dependent neutron-spectrum-weighted cross sections that are representative of conditions within any given reactor or fuel assembly, and convert these cross sections into a library that can be used by ORIGEN. Time-dependent cross-section libraries may be produced that reflect fuel composition variations during irradiation. An alternative sequence for depletion/decay calculations is ORIGEN-ARP, which interpolates pre-generated ORIGEN cross-section libraries versus enrichment, burnup, and moderator density.
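The matrix exponential method at the core of ORIGEN solves the coupled depletion/decay equations dN/dt = AN as N(t) = exp(At)N(0). The sketch below applies a pure-Python truncated Taylor series to a hypothetical two-member decay chain; ORIGEN's transition matrices span thousands of nuclides and use far more robust numerics, so this is an illustration of the principle only.

```python
import math

# Matrix exponential solution of dN/dt = A N  =>  N(t) = exp(A t) N(0),
# shown for an invented parent -> daughter decay chain.

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, terms=40):
    """exp(A) via truncated Taylor series (adequate for small, well-scaled A)."""
    n = len(A)
    result = [[float(i == j) for j in range(n)] for i in range(n)]   # identity
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = [[x / k for x in row] for row in mat_mul(term, A)]    # A^k / k!
        result = [[r + t for r, t in zip(rr, tt)]
                  for rr, tt in zip(result, term)]
    return result

lam1, lam2, t = 0.3, 0.05, 5.0     # decay constants (1/s) and time (s), invented
A = [[-lam1, 0.0],
     [ lam1, -lam2]]               # parent decays into daughter
E = expm([[a * t for a in row] for row in A])
N0 = [100.0, 0.0]                  # initial atoms: all parent
N = [E[0][0] * N0[0] + E[0][1] * N0[1],
     E[1][0] * N0[0] + E[1][1] * N0[1]]
```

For this chain the result can be checked against the analytic Bateman solution: the parent follows exp(-lam1*t) and the daughter follows lam1/(lam2-lam1) times the difference of the two exponentials.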
The TRITON (Transport Rigor Implemented with Time-Dependent Operation for Neutronic Depletion) control module provides flexible capabilities to meet the challenges of modern reactor designs by providing 1D pin-cell depletion capabilities using XSDRNPM, two-dimensional (2D) lattice physics capabilities using the NEWT 2D flexible mesh discrete ordinates code, or 3D Monte Carlo depletion using KENO. With each neutron transport option in TRITON, depletion and decay calculations are conducted with ORIGEN. Additionally, TRITON can produce assembly-averaged few-group cross sections for use in core simulators. Improved resonance self-shielding treatment for nonuniform lattices can be achieved through use of the MCDancoff (Monte Carlo Dancoff) code that generates Dancoff factors for generalized 3D geometries.
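TRITON's alternation between a transport solve and an ORIGEN depletion step can be caricatured in a few lines: hold the flux fixed over each burn step at its beginning-of-step value, deplete with it, then recompute the flux. The one-group model, constant-power normalization, and all numbers below are invented for illustration; real TRITON couples XSDRNPM, NEWT, or KENO with ORIGEN.

```python
import math

# Minimal sketch of transport/depletion coupling (invented model, not TRITON).

def one_group_flux(n_fuel, power, sigma_f, energy_per_fission=1.0):
    # Constant-power normalization: phi = P / (E_f * sigma_f * N_fuel)
    return power / (energy_per_fission * sigma_f * n_fuel)

def deplete_step(n_fuel, phi, sigma_a, dt):
    # dN/dt = -sigma_a * phi * N, with phi frozen over the step
    return n_fuel * math.exp(-sigma_a * phi * dt)

n = 1.0                        # relative fissile number density
sigma_f, sigma_a = 0.04, 0.05  # one-group cross sections (invented)
power, dt = 1.0, 0.5           # relative power and burn-step length
history = [n]
for _ in range(4):
    phi = one_group_flux(n, power, sigma_f)   # "transport" solve
    n = deplete_step(n, phi, sigma_a, dt)     # "depletion" solve
    history.append(n)
```

Because the flux must rise as the fissile inventory burns out at constant power, the two solves feed each other; this feedback is why production codes re-evaluate self-shielded cross sections and the flux at each step rather than depleting with fixed data.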
Sensitivity and Uncertainty Analysis:
TSUNAMI-1D and -3D (Tools for Sensitivity and Uncertainty Analysis Methodology Implementation) are Scale control modules that facilitate the application of adjoint-based sensitivity and uncertainty analysis theory to criticality safety analysis. Additionally, a TSUNAMI-2D eigenvalue sensitivity analysis capability is available through the TRITON control module. TRITON also provides a generalized perturbation theory capability for 1D and 2D analysis that computes sensitivities and uncertainties for reactor responses such as reaction rate and flux ratios as well as homogenized few-group cross sections. TSAR (Tool for Sensitivity Analysis of Reactivity) provides sensitivity coefficients for reactivity differences, and TSUNAMI-IP (TSUNAMI Indices and Parameters) and TSURFER (Tool for Sensitivity and Uncertainty Analysis of Response Functions Using Experimental Results) provide code and data validation capabilities based on sensitivity and uncertainty data.
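Once sensitivity coefficients S and a relative covariance matrix C are in hand, the propagated relative variance of the response is the quadratic form S^T C S, often called the sandwich rule. The two-parameter numbers below are hypothetical; real TSUNAMI sensitivity profiles and Scale covariance libraries span many nuclide-reaction pairs and energy groups.

```python
import math

# First-order uncertainty propagation by the sandwich rule:
#   (relative variance of k) = S^T C S,
# where S holds relative sensitivities (dk/k per dsigma/sigma) and C is the
# relative covariance matrix of the data.  Hypothetical numbers throughout.

def sandwich(S, C):
    n = len(S)
    return sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))

S = [0.30, -0.12]              # sensitivities to two hypothetical reactions
C = [[0.0004, 0.0001],         # relative covariance: variances on the
     [0.0001, 0.0009]]         # diagonal, a positive correlation off it

rel_var = sandwich(S, C)
rel_unc_pct = 100.0 * math.sqrt(rel_var)   # percent uncertainty in the response
```

Note that the off-diagonal correlation combined with sensitivities of opposite sign reduces the total uncertainty below the uncorrelated sum, which is exactly the kind of effect the covariance libraries exist to capture.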
Nuclear Data:
The cross-section data provided with Scale include comprehensive continuous-energy neutron and multigroup neutron and coupled neutron-gamma data based on ENDF/B-VI.8 and ENDF/B-VII.0. Additional ENDF/B-V multigroup neutron libraries are also available. The comprehensive ORIGEN data libraries are based on ENDF/B-VII and JEFF-3.0/A and include nuclear decay data, neutron reaction cross sections, neutron-induced fission product yields, delayed gamma-ray emission data, and neutron emission data. The photon yield data libraries are based on the most recent Evaluated Nuclear Structure Data File (ENSDF) nuclear structure evaluations. The libraries used by ORIGEN can be coupled directly with detailed problem-dependent physics calculations to obtain self-shielded problem-dependent cross sections based on the most recent evaluations of ENDF/B-VII. Scale also contains a comprehensive library of neutron cross-section-covariance data for use in sensitivity and uncertainty analysis.
Graphical User Interfaces:
Scale includes a number of graphical user interfaces to provide convenient means of generating input, executing Scale, and visualizing models and data. GeeWiz (Graphically Enhanced Editing Wizard) is a Windows user interface that provides a control center for setup, execution, and viewing results for most of Scale's computational sequences including CSAS, MAVRIC, TRITON, and TSUNAMI. GeeWiz is coupled with the KENO3D interactive visualization program for Windows for solid-body rendering of KENO geometry models. The ORIGEN-ARP user interface for Windows provides for rapid problem setup and plotting of results for spent fuel characterization. The Javapeno (Java Plots Especially Nice Output) multiplatform interface provides 2D and 3D plotting of cross-section and cross-section-covariance data, multigroup fluxes and reaction rates from KENO and KMART, sensitivity data from TSUNAMI, and pointwise fluxes from CENTRM. The MeshView multiplatform interface produces 2D contour views of mesh data and mesh results from Monaco and KENO, and ChartPlot provides for energy-dependent plots of Monaco results. The ExSITE tool provides a dynamic multiplatform interface for the sensitivity and uncertainty analysis tools TSUNAMI-IP, TSURFER, and TSAR. The USLSTATS multiplatform interface allows for trending analysis with integrated plotting, and VIBE (Validation Interpretation and Bias Estimation) assists with interpretation of sensitivity data and couples with the DICE database from the International Criticality Safety Benchmark Evaluation Program. Additionally, several codes provide HTML-formatted output, in addition to the standard text output, to provide convenient navigation through the computed results using most common Web browsers with interactive color-coded output and integrated data visualization tools.
The Scale system consists of easy-to-use analytical sequences, which are automated through control modules to perform the necessary data processing and manipulation of well-established computer codes, referred to as functional modules. Computations with Scale are typically characterized by the type of analysis to be performed (e.g., criticality, shielding, or lattice physics) and the geometric complexity of the system being analyzed. The user then prepares a single set of input in terms of easily visualized engineering parameters specified in a simplified, free-form format. The analytical sequence is defined by this single input specification. The Scale control modules use this information to derive additional parameters and prepare input for each of the functional modules necessary to achieve the desired results, especially with the Scale radiation transport codes that employ discrete ordinates, Monte Carlo, or hybrid methods.
Running the full set of sample problems takes approximately 12 to 24 hours, depending on the speed of the machine. Running times for individual cases are extremely problem dependent and depend heavily on the sequence used and the cross-section library selected: they range from less than one minute for a simple 1D criticality or depletion/decay problem to several hours for a complex 3D shielding, sensitivity/uncertainty, or 3D Monte Carlo depletion case.
DATA LIBRARIES INCLUDED:
Scale Standard Composition Library
44-group cross sections based on ENDF/B-V
238-group cross sections based on ENDF/B-V
238-group cross sections based on ENDF/B-VI
238-group cross sections based on ENDF/B-VII
27n, 19g coupled cross sections based on ENDF/B-VII
200n, 47g coupled cross sections based on ENDF/B-VI
200n, 47g coupled cross sections based on ENDF/B-VII
ENDF/B-V continuous energy cross sections for CENTRM
ENDF/B-VI continuous energy cross sections for KENO and CENTRM
ENDF/B-VII continuous energy cross sections for KENO and CENTRM
Albedos and weighting functions for use by KENO
Various cross-section, decay, and yield libraries for ORIGEN-S
ORIGEN-ARP basic cross-section libraries:
Westinghouse CE 14x14, 16x16
Westinghouse 14x14, 15x15, 17x17, 17x17 OFA (Optimized Fuel Assembly)
GE 7x7, 8x8, 9x9, 10x10
ATRIUM-9 (9x9), ATRIUM-10 (10x10)
SVEA-64 (8x8), SVEA-100 (10x10)
VVER-440 flat enrichment (1.6% - 3.6%)
VVER-440 profiled enrichment, average 3.82%
VVER-440 profiled enrichment, average 4.25%
VVER-440 profiled enrichment, average 4.38%
CANDU 28- and 37-element bundles (previously released as RSICC data package DLC-210)
AGR (Advanced Gas Cooled Reactor)
Mixed oxide (MOX) fuel: 8x8, 9x9-1, 9x9-9, 10x10, 14x14, 15x15, 16x16, 17x17, 18x18
Scale 6.1 includes binary executable files for 32- and 64-bit Linux on AMD and Intel chipsets, 64-bit Linux for HP Itanium, Mac OS X Intel, and 32- and 64-bit Windows XP, Vista, and 7.
Minimum requirements: 2 GB RAM per CPU, 30 GB of disk space + additional space to store output results
Recommended requirements: 4 GB RAM per CPU, 30 GB of disk space + 100 GB of scratch space + additional space to store output results
Production requirement for large models: 64 GB RAM, 30 GB of disk space + 500 GB of scratch space + additional space to store output results
Package ID: CCC-0785/03
Computer languages: C, C++, Fortran 90, Fortran 95
Included with the distribution are Windows executables (both 64- and 32-bit binaries) that were created using the Intel ifort Fortran compiler and icl C/C++ compiler for Windows, version 11.1, on a 64-bit Intel Core 2 Duo under Windows Vista. Linux (both 64- and 32-bit binaries) and Mac OS X executables created on the systems listed below are also included. This version was tested on the following systems:
AMD Opteron running Red Hat Enterprise Linux 4 with Intel ifort, icc, and icpc Fortran/C/C++ compilers, version 11.1
HP Itanium running Red Hat Enterprise Linux 5 with Intel ifort, icc, and icpc Fortran/C/C++ compilers, version 11.1
Intel Mac OS X with Intel ifort, icc, and icpc Fortran/C/C++ compilers, version 11.1
Windows Vista Service Pack 1
If the user chooses to compile executables, note that the xkba module of Denovo depends on several GNU General Public License (GPL) open-source packages. Many of these packages support options that are not part of the Scale 6.1 release (parallelism, parallel I/O, high-end visualization support, etc.). The GPL software required to build Denovo for Scale 6.1 is not included in the installation package but is available for download.
For the Scale 6.1 build process and configuration options, please see the PDF readme distributed on the first DVD.
Note: Executables require the Java runtime environment for installation.
Keywords: Monte Carlo method, burnup, calculations, complex geometry, continuous energy, criticality, cross sections, depletion, discrete ordinates, dose, gamma ray source, isotope inventory, multigroup, neutron, plotting, reactor physics, sensitivity analysis, spent fuel characterization, uncertainty analysis, validation.