Summer School CEA-EDF-INRIA 2011
June 27–July 8, 2011, CEA Cadarache, Saint-Paul-lez-Durance, France
Uncertainty Quantification for Numerical Model Validation
Organized by: Bertrand Iooss (EDF R&D, France) and Marc Sancandi (CEA-CESTA, France)
Contribution of François Hemez (Los Alamos National Laboratory, U.S.A.)
Synopsis:
The module “Uncertainty Treatment in Computer Experiments” introduces techniques used to quantify prediction uncertainty: propagating uncertainty through a numerical simulation, assessing how much prediction uncertainty results, and understanding which phenomenology controls it (“mesh discretization, parameter variability, interaction effects?”). The module is broken up into five lectures of 1.5 hours each: 1) Verification, 2) Sensitivity Analysis, 3) Sampling, 4) Test-analysis Correlation, and 5) End-to-end Example. The objective is to familiarize students with techniques used to verify the performance of computer codes; assess the numerical uncertainty of discrete solutions; design computer experiments; perform analysis-of-variance and effect-screening studies; develop fast-running statistical emulators; propagate model parameter uncertainty through numerical simulations; compare measurements to predictions; and calibrate parameters. For simplicity, the application examples emphasize Engineering Mechanics, even though the techniques presented are general-purpose and can therefore be applied to any numerical simulation. The material is extracted from a graduate-level course on the Verification and Validation (V&V) of computational models taught at the University of California, San Diego. The concepts learned during the first four lectures are practiced during hands-on training sessions.
Short Bio:
François Hemez has been a Technical Staff Member at Los Alamos National Laboratory since 1997. Prior to joining Los Alamos, he was a research associate at the French National Center for Scientific Research (CNRS) from 1994 to 1997. François earned a Ph.D. from the University of Colorado in 1993 and graduated from Ecole Centrale Paris, France, in 1989. At Los Alamos, François spent seven years in the Engineering Division, one of them as leader of the Validation Methods team. In 2005, he joined X-Division for nuclear weapons design. François managed the verification project of the Advanced Scientific Computing (ASC) program for two years. He currently manages a $4M-per-year project to assess the predictive capability of ASC codes. His research interests revolve around the development of methods for Verification and Validation (V&V), uncertainty quantification and decision-making, and their application to engineering, wind energy, and weapon physics projects. He developed and taught the first-ever graduate-level course offered in a U.S. university in the discipline of V&V (University of California, San Diego, 2006). François received the Junior Research Award of the European Association of Structural Dynamics in 2005; two U.S. Department of Energy Defense Program Awards of Excellence for applying V&V to programmatic work at Los Alamos in 2006; and the D.J. DeMichele Award of the Society for Experimental Mechanics in 2010. He has authored 300+ technical publications or reports (including 23 peer-reviewed papers) and given 120+ invited lectures or short courses since 1994.
Tentative Outline of Lectures:
1) Code and Solution Verification (1½ hours)
- Code verification
- How to define benchmark problems
- Method of Manufactured Solutions (MMS)
- The concepts of Modified Equation Analysis (MEA), consistency, and convergence
- Truncation error and the asymptotic convergence of numerical solutions
- Richardson extrapolation and solution verification
- The Grid Convergence Index
- Quantification of solution uncertainty
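The Richardson-extrapolation and Grid Convergence Index (GCI) steps listed above can be sketched in a few lines. This is a minimal illustration, not the course material: it assumes three solutions on systematically refined grids with refinement ratio r, and uses an invented toy "code" (a trapezoid-rule integral, second-order accurate) whose exact answer is known. The safety factor Fs = 1.25 is a commonly recommended value for three-grid studies.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of convergence p from three grid levels (ratio r)."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def richardson(f_fine, f_medium, r, p):
    """Richardson extrapolation: estimate of the mesh-converged value."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)

def gci_fine(f_fine, f_medium, r, p, Fs=1.25):
    """Grid Convergence Index on the fine grid (relative uncertainty band)."""
    return Fs * abs((f_fine - f_medium) / f_fine) / (r**p - 1.0)

# Toy 'code' being verified: trapezoid-rule integral of sin(x) on [0, pi].
# The exact answer is 2, so the numerical uncertainty estimates can be checked.
def trap(n):
    h = math.pi / n
    return h * sum(0.5 * (math.sin(i * h) + math.sin((i + 1) * h)) for i in range(n))

f3, f2, f1 = trap(10), trap(20), trap(40)   # coarse, medium, fine (r = 2)
p = observed_order(f3, f2, f1, 2.0)         # near 2: a second-order scheme
f_star = richardson(f1, f2, 2.0, p)         # mesh-converged estimate, near 2
print(p, f_star, gci_fine(f1, f2, 2.0, p))
```

The same three-grid recipe applies unchanged when f1, f2, f3 come from an expensive simulation instead of a toy integral; only the asymptotic-range check becomes more delicate.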
2) Design-of-Experiments, Sensitivity Analysis, and Metamodeling (1½ hours)
- Description of modeling uncertainty and lack-of-knowledge
- Principles of the design of (physical or computer) experiments
- Full-factorial and fractional-factorial designs
- Orthogonal arrays, central composite designs
- 2^N and 2^(N−k) designs, statistical aliasing
- Rationale for effect screening (“where is an observed variability coming from?”)
- The concept of effect screening using a design of computer experiments
- Local sensitivity analysis as a “primitive” screening technique
- The analysis-of-variance (ANOVA)
- Main-effect and total-effect sensitivity indices
- The concept of metamodeling using a design of computer experiments
- Polynomial emulators
- Kriging emulators
- Estimation of the quality of statistical emulators
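As a small illustration of the 2^N designs and main-effect screening covered in this lecture, the sketch below builds a full-factorial design in coded −1/+1 units and estimates each factor's main effect as the difference of mean responses between its high and low levels. The three-factor response function is invented for the example (it is not one of the course homeworks).

```python
from itertools import product

def full_factorial(n_factors):
    """All 2^N corner points of a full-factorial design, coded -1/+1."""
    return list(product([-1.0, 1.0], repeat=n_factors))

def main_effects(design, responses):
    """Main effect of each factor: mean response at +1 minus mean at -1."""
    n_factors = len(design[0])
    effects = []
    for j in range(n_factors):
        hi = [y for x, y in zip(design, responses) if x[j] > 0]
        lo = [y for x, y in zip(design, responses) if x[j] < 0]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical response: y = 3*x1 + 0.5*x2 + x1*x2, with x3 inert.
design = full_factorial(3)
y = [3 * x1 + 0.5 * x2 + x1 * x2 for (x1, x2, x3) in design]
eff = main_effects(design, y)
print(eff)  # x1 dominates, x3 screens out as negligible
```

Because the design is orthogonal, the x1*x2 interaction averages out of both main effects; separating interactions from main effects is exactly what the ANOVA and total-effect indices above address.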
3) Propagation of Probabilistic Uncertainty (1½ hours)
- Sampling methods for the forward propagation of (probabilistic) uncertainty
- Monte Carlo and quasi-Monte Carlo sampling
- Stratified sampling (Latin Hypercube Sampling)
- Convergence of statistical estimates
- Sampling methods for the inverse propagation of uncertainty
- The Metropolis-Hastings algorithm and Markov chain Monte Carlo (MCMC)
- Fast probability integrators for reliability analysis
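The forward-propagation and stratified-sampling items above can be illustrated with a bare-bones Latin Hypercube sampler: each input dimension is split into equal-probability strata, one point is drawn per stratum, and the strata are randomly paired across dimensions. The model pushed through the samples here is an assumed toy function, y = x1² + x2 with independent uniform inputs on [0, 1], chosen so the exact output mean (1/3 + 1/2 = 5/6) is known.

```python
import random

def latin_hypercube(n_samples, n_dims, rng=random.Random(0)):
    """LHS on the unit hypercube: one point per equal-probability stratum
    in each dimension, with strata randomly paired across dimensions."""
    columns = []
    for _ in range(n_dims):
        # One uniform draw inside each of the n strata, then shuffle the strata.
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    return [tuple(col[i] for col in columns) for i in range(n_samples)]

# Forward propagation through the toy model y = x1**2 + x2.
pts = latin_hypercube(200, 2)
y = [x1**2 + x2 for x1, x2 in pts]
mean_y = sum(y) / len(y)
print(mean_y)  # stratification keeps this close to the exact mean 5/6
```

Replacing `latin_hypercube` with plain `rng.random()` draws gives crude Monte Carlo; comparing the scatter of the two mean estimates over repeated runs is a quick way to see the variance reduction the lecture discusses.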
4) Test-analysis Correlation and Model Calibration (1½ hours)
- The concepts of response features and correlation metrics
- Advantages and limitations of the “viewgraph norm”
- Statistical tests used to account for probabilistic uncertainty
- Principal component decomposition-based metrics
- Parameter calibration (“what is it? what are the dangers?”)
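To make the calibration item concrete, here is a deliberately simple sketch: a hypothetical one-parameter model (linear-spring deflection y = x/k), synthetic "measurements", and a grid search for the stiffness k that minimizes the sum-of-squares test/analysis misfit. Everything in it (model form, data values, grid) is invented for illustration.

```python
def model(k, x):
    """Hypothetical one-parameter model: deflection of a linear spring."""
    return x / k

def calibrate(loads, measurements, k_grid):
    """Pick the stiffness k on the grid minimizing the sum-of-squares misfit."""
    def misfit(k):
        return sum((model(k, x) - d) ** 2 for x, d in zip(loads, measurements))
    return min(k_grid, key=misfit)

# Synthetic 'measurements': generated from k = 2.0 with small perturbations.
loads = [1.0, 2.0, 3.0]
data = [0.51, 1.02, 1.49]
k_hat = calibrate(loads, data, [1.0 + 0.01 * i for i in range(200)])
print(k_hat)  # recovers a value near the true k = 2.0
```

Note that the calibrated k silently absorbs whatever bias or model-form error sits in the data, which is precisely the danger flagged in the last bullet above: a well-fitted parameter is not necessarily a physically meaningful one.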
5) An End-to-end Example of Verification and Validation (1½ hours)
- Introduction of an engineering example of transient dynamics simulation
- Verification of the finite element software
- Design and execution of computer experiments (predictions)
- Design of physical experiments (measurements)
- Down-selection of the statistically significant effects
- Small-scale validation experiments and the reduction of parameter uncertainty
- Uncertainty propagation and final test-analysis correlation
Tentative Outline of Training Sessions:
1) Code Verification (1½ hours)
- Verify the performance of a simple finite element model. Analyze a 1D beam-bending problem and compare the discrete solutions, obtained by varying the number of finite elements, to the exact (analytical) solution (Homework 03).
- Derive analytically the modified equation of a 1D advection-diffusion equation. Use the results to assess the behavior of the truncation error (Homework 04).
2) Solution Verification (1½ hours)
- Analyze the asymptotic convergence of discrete solutions obtained by refining a computational mesh. Estimate the order of convergence of the numerical method, the Grid Convergence Index, and the level of prediction uncertainty. Application to the Fourier approximation of a discontinuous function (Homework 05) or the finite element approximation of a frame structure (Homework 06).
3) Design-of-Experiments, Sensitivity Analysis, and Metamodeling (1½ hours)
- Propagate a design-of-experiments through a finite element analysis. Use the simulation results to perform an analysis-of-variance and screen the statistically significant effects. Develop a fast-running polynomial emulator of the finite element model. Application to a vibrating mass-spring system (Homework 10) or the scaled model of a three-story frame building (Homework TBD).
4) Propagation of Probabilistic Uncertainty and Model Calibration (1½ hours)
- Use sampling techniques to propagate uncertainty forward through a finite element model. Quantify the prediction uncertainty and compare the statistics of predictions to those of physical measurements (test-analysis correlation) (Homework TBD).
- Calibrate parameters to improve the overall goodness-of-fit of the model. Application to a vibrating mass-spring system (Homework 11) or the scaled model of a three-story frame building (Homework TBD).
Version 1.0 — December 15, 2010
