
Workshop: Sensitivity, Error and Uncertainty Quantification for Atomic, Plasma, and Material Data

REPORT

Basic Meeting Information

Location and dates: The meeting was held at Stony Brook University, 7-9 November 2015, using the new conference facilities at the Institute for Advanced Computational Science. Seventeen invited speakers and as many additional attendees participated in the meeting. Each invited speaker had 45 minutes for his/her presentation, with 15 minutes of discussion after the talk. A special poster session was organized on November 8. Meeting conclusions and further plans were discussed with all participants.

Workshop web pages: http://www.iacs.stonybrook.edu/uq/pages/workshop
IACS website: http://www.iacs.stonybrook.edu/

Meeting Chair: Predrag Krstic
Meeting Co-Chairs: Robert Harrison and Richard Archibald
Workshop Administrators: Sarena Romano and Lynn Allopenna

Scientific Committee:
Richard K. Archibald, Oak Ridge National Laboratory
Bastiaan J. Braams, International Atomic Energy Agency
Gordon W. F. Drake, University of Windsor
Predrag Krstic, Stony Brook University (Committee chair)
Robert J. Harrison, Stony Brook University/Brookhaven National Laboratory
Petr Plechac, University of Delaware
Daren Stotler, Princeton Plasma Physics Laboratory

Contacts:
P. Krstic ([email protected], 865-603-2970)
R. Harrison ([email protected], 865-274-8544)
S. Romano ([email protected])
L. Allopenna ([email protected])

Audio-visual equipment: Javier Dominguez ([email protected])

Financial Assistance for this conference was provided by the National Science Foundation, Grant Number PHY-1560572, and by the Institute for Advanced Computational Science of Stony Brook University.


MOTIVATION

The motivation for the workshop arose from the increasing need to move predictive simulation methodologies from investigations in basic science to robust application within design and engineering processes, including consistent integration with data from observation or experiment. Anticipated to be the first in a series, the workshop focused on theoretical and computational data relevant to fusion and astrophysical plasmas and material design, where modeling codes rely largely on theoretical atomic, molecular, optical and material-interface data whose accuracy requires critical assessment. Attendees included researchers in computational, plasma, material, atomic, molecular and optical physics and chemistry, along with mathematicians and computational scientists focused on the mathematical and computational foundations of both epistemic and aleatoric uncertainty quantification (UQ). A primary goal and vision behind the workshop was to initiate multidisciplinary activity that extends established and novel UQ techniques to new or so far nonstandard areas of application, by bringing data users and producers into contact with the already well-developed mathematical and computational formalisms of UQ.

Uncertainty of all types, including the sensitivity of plasma and material modeling to data quality, is strongly amplified by the multiscale character of the underlying physics and chemistry, which spans nanometers and femtoseconds to meters and years. Evaluation, selection and recommendation of theoretically and computationally obtained atomic, molecular and material data for use in modeling nuclear fusion and astrophysical plasmas, as well as in the synthesis and design of new materials, depend critically on data validation, verification, and uncertainty analysis and quantification. The choice of theoretical method, its accuracy, and the time needed to calculate the desired data are strongly conditioned by the sensitivity of the plasma or material model to the data uncertainty.

Extending the UQ methods developed in the mathematical and computational sciences to theoretical and computer-simulation research in material, atomic and plasma physics has been a long-awaited development in the respective scientific communities. Supporting interactions between these communities is crucial and will increase the quality of the data and other scientific information they disseminate. Building collaboration across these disciplines will ensure that recent mathematical and computational advances in UQ can be effectively utilized in atomic, plasma and material data science.

The science of UQ has undergone significant development in the mathematical and computational sciences within the last decade, as evidenced by publications and by a number of scientific meetings devoted to the subject (for example, the series of SIAM conferences on UQ, the latest being UQ14, 2014, Savannah, GA, https://www.siam.org/meetings/uq14). Although UQ has been applied successfully in climate research as well as in material characterization, the analysis of uncertainties in theoretical physics and chemistry, in particular the analysis of sensitivity and errors of theoretical atomic, material and interfacial data for astrophysical and nuclear fusion plasma applications, has so far had minimal impact on the development of UQ science in these areas. This is visible in the conclusions of the recent Joint IAEA-ITAMP Technical Meeting on Uncertainty Assessment for Theoretical Atomic and Molecular Scattering Data, Cambridge, MA, July 2014, https://www-amdis.iaea.org/meetings/ITAMP.

Expected workshop outcomes included establishing a research agenda in this space and the germination of a multidisciplinary research community. This report summarizes the workshop: the agenda, conference organization, participants, activities, a summary of survey data, presentations and discussions, conclusions and future plans, and the budget.


CONFERENCE PRESENTATIONS

First Day:

Roger Ghanem (Polynomial Chaos as a Comprehensive Modeling Tool for Complex Systems) presented a comprehensive framework, based on polynomial chaos, for quantifying uncertainty in complex systems. The framework is built upon high-dimensional polynomial approximations chosen to provide optimal representations of the particular probability distribution of the complex system. Adapted-basis approaches and reduction of the null space between the input and output of complex systems make very high-dimensional problems tractable. He demonstrated the framework on the design of composite cars, focusing on the material properties of various composites along with the simulation of manufacturing, and on modeling subsurface landscapes of the Gulf Coast given the limited information available from boreholes.
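
For readers outside the UQ community, the generic polynomial chaos representation underlying such frameworks can be sketched as follows (textbook form only; Ghanem's adapted-basis construction refines this and is not reproduced here). A quantity of interest u of random inputs \xi with density \rho is expanded in polynomials orthonormal with respect to \rho:

u(\xi) \;\approx\; \sum_{k=0}^{P} c_k\,\Psi_k(\xi), \qquad
\mathbb{E}[\Psi_j \Psi_k] = \int \Psi_j(\xi)\,\Psi_k(\xi)\,\rho(\xi)\,d\xi = \delta_{jk},

so that

c_k = \mathbb{E}[\,u(\xi)\,\Psi_k(\xi)\,], \qquad
\mathbb{E}[u] \approx c_0, \qquad \operatorname{Var}[u] \approx \sum_{k=1}^{P} c_k^2 .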

Petr Plechac (Information-Theoretic Tools for Uncertainty Quantification of High Dimensional Stochastic Models) considered complicated, high-dimensional stochastic systems of the kind that arise in biology, reaction kinetics and materials science. Representing rare events in these systems by the naive approach of direct simulation, under all possible uncertain inputs and parameters, is not possible even on the largest computing platforms. However, using path-dependent and risk-sensitive functionals, along with multi-resolution techniques, error estimation and UQ can be obtained. Tight bounds on rare events were demonstrated using non-equilibrium statistics, Fisher information and goal-oriented quantities to characterize the statistics of defined observables. This was demonstrated on reaction networks for ethanol production.
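
As an illustration of the kind of bound referred to here, a standard goal-oriented information inequality from this line of work (quoted in its generic form as an aid to the reader, not transcribed from the talk) controls the bias of an observable f between a baseline model P and an uncertain alternative Q through the relative entropy R(Q||P):

\mathbb{E}_Q[f] - \mathbb{E}_P[f] \;\le\; \inf_{c>0}\left\{ \frac{1}{c}\,\log \mathbb{E}_P\!\left[e^{\,c\,(f-\mathbb{E}_P[f])}\right] \;+\; \frac{1}{c}\,R(Q\,\|\,P) \right\},

with the corresponding lower bound obtained by applying the same inequality to -f.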

Robert Moser (Reliability and Uncertainty in the Simulation of Tokamak Plasmas) focused on the challenging problem of developing robust UQ for the computational modeling of tokamak plasmas. The key difficulty is that only a handful of high-fidelity simulations can be performed each year at the leading supercomputing centers. This restriction requires advanced mathematical treatment of the up-scaling of micro-physics in order to provide the most information for the design, control and operation of current and future tokamak facilities. The march toward ever larger tokamak facilities is spurred by the promise of greater energy output, but it creates problems that push the limits of even future exascale computing systems. The presentation showed that effective and reliable extrapolative predictions are possible for tokamak systems, providing rigorous methods to validate and build predictive assessments of these systems.

Udo von Toussaint (The Vlasov-Poisson Plasma Model with Uncertain Inputs: A Bayesian Modeling Approach) applied a Galerkin framework with spectral expansion to represent the random noise processes in the Vlasov-Poisson model of electrostatic plasma. Systematic UQ in plasma physics was historically limited to parameter scans, which is now considered inadequate. In both intrusive and nonintrusive spectral methods the expansion coefficients can be computed by collocation, but collocation does not tolerate much noise or very high dimension. For Gaussian-distributed noise, Hermite polynomials are the appropriate orthonormal basis, and in a nonintrusive approach the coefficients are derived from collocation points. Design of the sampling space is very challenging: it must fill the space to explore correlations and variation while staying within physically valid regions, and automation can push codes beyond the range for which they were tested and designed. Simulators model physical processes and can yield new insights; an emulator, a surrogate for a simulator, seeks a statistical representation, often assuming an underlying Gaussian process (with its N^3 training cost).
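
A minimal sketch of the nonintrusive Hermite-chaos construction described above, for a hypothetical scalar simulator with a single standard-normal input (the simulator function below is invented for illustration and is of course not the Vlasov-Poisson code):

import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def simulator(xi):
    # Hypothetical scalar response to one standard-normal input xi.
    return np.exp(0.3 * xi) + 0.1 * xi**2

order = 6
nodes, weights = hermegauss(32)            # Gauss-Hermite rule for weight exp(-x^2/2)
weights = weights / np.sqrt(2.0 * np.pi)   # normalize so the rule computes E[.] under N(0,1)

# Nonintrusive projection: c_k = E[u(xi) He_k(xi)] / k!  (probabilists' Hermite basis)
u_at_nodes = simulator(nodes)
coeffs = [np.sum(weights * u_at_nodes * hermeval(nodes, [0.0] * k + [1.0]))
          / math.factorial(k) for k in range(order + 1)]

# Orthogonality, E[He_j He_k] = k! delta_jk, gives the surrogate mean and variance:
mean = coeffs[0]
variance = sum(math.factorial(k) * coeffs[k]**2 for k in range(1, order + 1))
print(mean, variance)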


Panagiotis Angelikopoulos (Bayesian Uncertainty Quantification for Molecular Dynamics (MD) Simulations) focused on the challenging MD problem of liquid water simulations, using the high-performance computing framework known as 4U. Within the MD community there are several orders of magnitude of disagreement between predictions and measurements of water flow through carbon-nanotube membranes, which makes the problem an ideal candidate for UQ methods in MD simulations. The talk demonstrated that the hierarchical Bayesian model developed and implemented in 4U can calibrate and weight model selection across a collection of MD simulations, and that with this model the uncertainties in MD-derived properties, such as water contact angle and water transport, can be quantified and calibrated against experimental results.
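
The model-weighting step mentioned here follows the generic Bayesian model-selection recipe (stated in textbook form; the hierarchical structure used in the talk elaborates on this). Each candidate force-field model M_i is weighted by its evidence given the calibration data D:

p(M_i \mid D) \;=\; \frac{p(D \mid M_i)\,p(M_i)}{\sum_j p(D \mid M_j)\,p(M_j)},
\qquad
p(D \mid M_i) \;=\; \int p(D \mid \theta, M_i)\,p(\theta \mid M_i)\,d\theta,

and predictions of observables are then averaged over the models with these posterior weights.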

Daniel Savin (Astrochemistry from the First Stars to the Origins of Organic Chemistry) talked about the importance of interstellar chemistry in the evolution of the universe, from the formation of the first stars to the origins of life. He discussed the chain of chemical reactions leading to H2 formation, and the uncertainties in the reactions relevant to the formation of the first stars as well as in those responsible for the origin of life (such as gas-phase reactions of C with H3+, which initiated the synthesis of complex organic molecules, and reactions of O with H3+, which lead to the formation of water). Uncertainties in these data have hindered our understanding of the universe's evolution, in particular of the pathway toward life. He stressed the importance of chemical reaction networks, which would benefit from more systematic UQ and from additional experimental data.

Francesco Rizzi (Uncertainty Quantification in Molecular Dynamics Simulations: Forward and Inverse Problem) explained that one of the major factors in UQ for molecular dynamics (MD) simulations is the particular potential function used to compute atomic forces. First-principles calculations in MD are cost prohibitive, so model reduction in the approximation of the potential function can make complex, large biomolecule simulations tractable. The talk presented a Bayesian regression approach using polynomial chaos expansions to build a framework that isolates the impact of parametric uncertainty and of molecular-noise models in MD simulations, and demonstrated its suitability for predicting the major target observables in a variety of MD simulations.

Sophie Blondel (Uncertainty Quantification Effort within PSI-SciDAC) talked about uncertainties in plasma-surface interactions in fusion tokamaks, in particular their multiscale character in space and time, which extends over 10 orders of magnitude. Since classical molecular dynamics cannot handle long time scales, especially at the experimentally relevant low particle fluxes, models appropriate for the various time scales, and for the hand-offs between them, must be introduced to span the range from atomistic to continuum. Sophie described the development and benchmarking of Xolotl, a new continuum advection-reaction-diffusion cluster-dynamics code that simulates the divertor surface response to fusion-relevant plasma exposure, with an initial focus on tungsten exposed to 100 eV helium plasma. A big question remained unanswered: how do we robustly couple atomistic and continuum models, and how do we construct such models with some rigor and connect them to existing UQ mathematical tools (like those of Plechac)? The quantification of uncertainties currently concentrates on the diffusion factors coming from the different potential functions, which will be propagated through Xolotl in a later phase.
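
For orientation only, the kind of continuum update such a cluster-dynamics code advances in time can be caricatured by a single-species 1D advection-reaction-diffusion step (all names, coefficients and boundary values below are invented for illustration; this is not Xolotl's equation set or numerics):

import numpy as np

# Toy explicit step for one cluster concentration c(x, t):
#   dc/dt = D d2c/dx2 - v dc/dx - k c^2        (all coefficient values are made up)
nx, dx, dt = 200, 1.0e-9, 1.0e-9       # grid points, spacing [m], time step [s]
D, v, k = 1.0e-10, 5.0e-2, 1.0e-20     # diffusivity, drift velocity, recombination rate
c = np.zeros(nx)
c[0] = 1.0e25                          # incoming flux represented as a fixed boundary value

for step in range(1000):
    lap = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    grad = (np.roll(c, -1) - np.roll(c, 1)) / (2 * dx)
    c = c + dt * (D * lap - v * grad - k * c**2)
    c[0], c[-1] = 1.0e25, 0.0          # simple Dirichlet boundaries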

Dimitrios Giannakis (Data-Driven Spectral Decomposition and Forecasting of Ergodic Dynamical Systems) talked about a method particularly valuable for weather and atmospheric modeling: for a given set of time-ordered observations assumed to derive from a vector-valued function, provide a nonparametric forecasting model whose computational cost is reduced by dimension reduction of the observation dataset. The talk approached this problem using operator theory and demonstrated that the underlying physics of weather systems can be represented by a smooth, orthonormal basis of simple harmonic oscillators determined directly from observations using diffusion-map algorithms. Using the time evolution of the derived basis, nonparametric forecasting models can be constructed for arbitrary probability densities and observations. The end result is a data-driven forecasting model for weather without modeling or even knowing the underlying equations of motion.
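
A bare-bones version of the diffusion-map step mentioned above looks like the following (the generic algorithm applied to a toy time series; the kernel, time-delay embedding and normalization used in the actual work are more elaborate):

import numpy as np

def diffusion_map_basis(X, epsilon, n_modes):
    """X: (n_samples, n_features) observations; return leading eigenpairs of the diffusion matrix."""
    d2 = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)  # pairwise squared distances
    K = np.exp(-d2 / epsilon)                                 # Gaussian kernel
    P = K / K.sum(axis=1, keepdims=True)                      # row-normalize into a Markov matrix
    eigvals, eigvecs = np.linalg.eig(P)
    order = np.argsort(-eigvals.real)[:n_modes]               # leading modes form a data-adapted basis
    return eigvals.real[order], eigvecs.real[:, order]

# Usage on a toy time series of noisy circular dynamics:
t = np.linspace(0.0, 20 * np.pi, 500)
X = np.c_[np.cos(t), np.sin(t)] + 0.05 * np.random.default_rng(0).normal(size=(500, 2))
lam, phi = diffusion_map_basis(X, epsilon=0.5, n_modes=5)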

Second Day:

Choong-Seock Chang (Telescoping UQ Method for Extreme Scale Kinetic Simulation of Magnetic Fusion Plasma) talked about a new trial method, "telescoping" UQ, applied in the gyrokinetic code XGC1. Conventional UQ is performed on a reduced-size problem, the UQ scalability is calibrated against experimental results, and the calibrated UQ is then used to make predictions for future problems (ITER in the fusion case). However, there are limits: reduced-fidelity simulations can lose entire physical phenomena. Current ITER simulations take several months, about 5 per year, so the traditional statistical approach relying on a large number of simulations cannot be used. How, then, do we do UQ if we cannot get statistics? Fortunately, the so-called "first principles" equations tend not to have a large number of input parameters, which could provide a means for simplifying this otherwise insurmountable task. Uncertain input data include atomic cross sections and material-interaction data (recycling and sputtering). Reducing the 6D plasma simulation to 5D by averaging over the gyromotion, a justified first-principles approximation, enables a 100x increase in the time step. In addition, this is a highly multiscale problem: gyromotion occurs on the scale of 10^-9 s, turbulence on 10^-3 s, and machine operation on the order of 100 s.

Jean-Paul Allain (Challenges and Strategies to Experimental Validation of Multi-Scale Nuclear Fusion PMI Computational Modeling) described the plasma-material interface (PMI) as a key region of the device, since material can be emitted both atomistically (evaporation, sputtering, etc.) and macroscopically (i.e., during disruptions or edge-localized modes). There are critical knowledge gaps concerning the strong coupling between plasma and materials, which have to be bridged by computational models. Fusion core performance has a sensitive, and not yet fully understood, dependence on the wall surface; for example, recent JET-ILW experiments surprisingly show much worse confinement than with carbon walls. Physical and chemical sputtering are very important, as are surface chemistry and composition, including the self-organization of surface structures. Due to the complexity of the processes, it is necessary to isolate phenomena and understand the effect of each without coupling to the others. However, models alone cannot reach the level of maturity needed for predictive understanding without multiple validation steps, and understanding of the morphological and topographical evolution of the plasma-material interface is nascent. A multiscale approach to validation, via controlled micro-experiments, would provide more control and instrumentation than is possible in a full device and is in the spirit of isolating phenomena. The limiting step in this approach depends to a large degree on the sophistication and fidelity of the surface-response codes; another limiting step is the large uncertainty inherent in many of the experimental measurements involved in PMI.

Kody J.H. Law (Multilevel Sequential Monte Carlo Samplers) talked about the prevalent use of the Monte Carlo (MC) method in UQ. Because of the slow convergence of plain MC, multilevel MC (MLMC) has provided a reduction of computational cost for problems that admit a hierarchy of approximation levels. This talk presented the development of multilevel sequential Monte Carlo samplers (MLSMC), which provide a significant further reduction in cost compared to MLMC or sequential MC. Specifically, the talk demonstrated that by using MLSMC for the inverse problem associated with subsurface modeling, the cost-to-error ratio can be made asymptotically the same as for a scalar random variable; under specific circumstances the optimal cost of MLSMC is of the same order as the cost of a single simulation at the finest level.
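
The multilevel idea underlying both MLMC and MLSMC rests on the standard telescoping identity (stated generically here; the MLSMC samplers of the talk add sequential importance sampling and resampling on top of it). With approximations f_0, ..., f_L of increasing fidelity,

\mathbb{E}[f_L] \;=\; \mathbb{E}[f_0] + \sum_{\ell=1}^{L} \mathbb{E}\!\left[f_\ell - f_{\ell-1}\right],
\qquad
\widehat{f}_{\mathrm{ML}} \;=\; \frac{1}{N_0}\sum_{i=1}^{N_0} f_0^{(i)} + \sum_{\ell=1}^{L} \frac{1}{N_\ell}\sum_{i=1}^{N_\ell}\Bigl( f_\ell^{(i)} - f_{\ell-1}^{(i)} \Bigr),

where most samples are drawn on the cheap coarse levels and only a few corrections are computed on the expensive fine levels, because the variance of f_\ell - f_{\ell-1} decays with \ell.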


Jonathan Tennyson (Uncertainty Quantification for Theoretical Atomic and Molecular Data) discussed the need to end the practice, and the culture, in theory and simulation in atomic and molecular physics and chemistry of not providing error bars or performing uncertainty analysis. Benchmark atomic and molecular calculations should follow accepted experimental practice and include an uncertainty estimate alongside any numerical values presented. This is particularly important when computational data are used as the primary source, such as cross sections for input into modeling codes for plasma and radiative-transport studies; such data should be accompanied by estimated uncertainties. Since 2013, Phys. Rev. A has had an editorial policy requiring uncertainty estimates in publications of theoretical data. Tennyson stressed that this is not only an issue of mathematical tools for UQ; it equally requires a change in culture. Among the successful examples are the roughly 20 contributions to the dissociation energy of water, where a theoretical prediction with an error bar (4 cm^-1) was subsequently confirmed by an experiment falling within the error bars. A prominent example of a very active area is the study of sources and sinks of CO2 in the atmosphere, where line intensities are needed to 0.5% accuracy. Ab initio theory for solving the many-body Schrodinger equation provides routes to the right answer with quantifiable errors; the challenge is that DFT does not have a systematic path for improvement. While static, spectroscopic-state-oriented calculations are reaching acceptable UQ, this is not the case for dynamic scattering calculations, where epistemic uncertainty due to model selection dominates. There are many different processes, and no single code deals with all of them at the same time (mostly because of the treatment of nuclear motion, resonances, etc.).

Alexander Kramida (Critical Evaluation and Estimation of Uncertainties of Atomic Spectral Data at NIST) described the procedures used in the Atomic Spectroscopy Group at NIST for the critical evaluation of experimental and theoretical data for atomic energy levels, wavelengths and radiative rates. Interestingly, some of the data date as far back as 1802. A need is recognized for more statistical and UQ analysis in evaluating the data: how do we estimate uncertainties in spectroscopic data, and how do we compare with experiments? Convergence can be reached by adding more configurations. He stressed the need to be specific about which data are validated or correlated, since some observables have a strong energy dependence that can rescale errors by factors of thousands.

Matthew Dunlop (Bayesian Level Set Inversion) outlined a new level set inversion method using a Bayesian framework. Level set methods embed the boundary information of images and data in a higher-dimensional function, the signed distance function to the boundaries. In geometrically defined inverse problems, such as electrical impedance tomography, the goal is to determine an unknown, piecewise-constant function from a finite set of indirect measurements. A Bayesian approach that places probability distributions on the boundaries of these piecewise-constant functions provides a more complete reconstruction for geometrically defined inverse problems given a limited set of measurements. The talk demonstrated the improvement brought by the level set method and outlined faster computational approaches using a hierarchical Bayesian level set inversion method.
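
The core level set parameterization can be sketched as follows (a generic illustration with invented values; the hierarchical priors and samplers of the talk are not reproduced): a continuous latent field u is mapped to a piecewise-constant physical field by thresholding, so inference over geometry becomes inference over the smooth field u.

import numpy as np

def level_set_to_field(u, levels=(-np.inf, 0.0, np.inf), values=(1.0, 10.0)):
    """Map a latent field u to a piecewise-constant field: values[i] where levels[i] <= u < levels[i+1]."""
    field = np.empty_like(u)
    for i, v in enumerate(values):
        field[(u >= levels[i]) & (u < levels[i + 1])] = v
    return field

# Toy usage: a smooth random latent field on a 1D grid defines a two-phase medium.
x = np.linspace(0.0, 1.0, 100)
rng = np.random.default_rng(0)
u = np.sum([rng.normal() * np.sin((k + 1) * np.pi * x) / (k + 1)
            for k in range(10)], axis=0)     # crude draw from a smooth Gaussian prior
kappa = level_set_to_field(u)                # piecewise-constant "conductivity" field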

Alan Calder (Verification, Validation and Uncertainty Quantification in Astrophysics) talked about how verification, validation and UQ in astrophysics present challenges, since stellar interiors cannot be fully reproduced in a terrestrial laboratory. Still, in order to make confident predictions, UQ is much needed. To validate the calculated data we need experiments that can be performed on Earth, which stresses the importance of talking to experimentalists when trying to validate a code. Alan stressed that agreement of a simulation with an experiment is not necessarily the end of validation and UQ: for example, if physics and components are known to be missing, how do we interpret and use the agreement? Is this "agreement for the right reasons"? Therefore, one cannot use UQ as a "black box" in this area of physics.

Richard Archibald (Sparse Sampling Methods for Experimental Data from the DOE Facilities) discussed how designing computational simulations that best capture uncertainty at the extreme scale requires scalable algorithms to estimate error for a broad range of situations. To take full advantage of high-performance computing, scalable error estimation of stochastic simulations needs to be developed on unstructured data, so that uncertainty quantification methods can operate on measured data as well as on computations that are either guided by or drawn from a legacy database. The talk described how these fast methods are adapting to high-performance computing, described the connection between polynomial approximation (polynomial chaos methods) and Gaussian processes, and gave specific examples of these UQ methods applied to climate modeling and neutron tomography.
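
As a pointer to the Gaussian-process side of that connection, a minimal GP regression on scattered (unstructured) 1D samples is sketched below (textbook formulation with a squared-exponential kernel and made-up data; not the specific sparse-sampling algorithms of the talk):

import numpy as np

def sqexp_kernel(A, B, length=0.3, variance=1.0):
    d2 = (A[:, None] - B[None, :])**2
    return variance * np.exp(-0.5 * d2 / length**2)

# Scattered training samples of an unknown function, with observation noise.
rng = np.random.default_rng(1)
x_train = rng.uniform(0.0, 1.0, 15)
y_train = np.sin(2 * np.pi * x_train) + 0.05 * rng.normal(size=15)
x_test = np.linspace(0.0, 1.0, 200)

noise = 0.05**2
K = sqexp_kernel(x_train, x_train) + noise * np.eye(15)
Ks = sqexp_kernel(x_test, x_train)
Kss = sqexp_kernel(x_test, x_test)

# Posterior mean and covariance of the GP surrogate (O(N^3) in the training points).
alpha = np.linalg.solve(K, y_train)
mean = Ks @ alpha
cov = Kss - Ks @ np.linalg.solve(K, Ks.T)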

Third Day:

Hyun-Kyung Chung (Internationally Coordinated Activities of Uncertainty Quantification and Assessment of Atomic, Molecular and Plasma-Surface Interaction Data for Fusion Applications) addressed the progress toward evaluating A+M collision data and reviewed the internationally coordinated activities at the IAEA toward a UQ science for A+M/PMI data. The IAEA A+M Data Unit is encouraging work to develop guidelines for critically assessing theoretical A+M structure and collision data, taking into account the processes and quantities of interest as well as the specific theoretical methods employed in the calculations. A joint ITAMP-IAEA workshop was organized in July 2014 to discuss sources of uncertainty in the physical models of interest: electron/atom/molecule collisions as well as electronic-structure calculations and PMI processes. It is important to connect theory and experiment; a good example is the failure of NIF.


CONFERENCE CONCLUSIONS & FUTURE PLANS

There is a profound lack of a culture of uncertainty quantification in computer simulations for atomic and molecular physics (AMO), plasma physics (PP) and the plasma-material interface (PMI). Quantifying uncertainties involves sensitivity analysis (which parameters are most important), variability analysis (intrinsic variation associated with inherent randomness in a physical system), and epistemic uncertainty analysis (the degree of confidence in data and models). While calculations without uncertainties were acceptable at the initial stages of development of the theoretical sciences, further progress now dictates a change of paradigm: calculated values without uncertainty estimates at best have diminished value for fundamental science and certainly for engineering applications, and at worst are meaningless and even harmful and misleading. UQ has become key to credible predictions and decision making, as well as a necessary ingredient for validating theory against experiment. Thus, all published simulation results should be accompanied by an analysis of uncertainties, in much the same manner as is already done in the best experimental papers. Still, the sensitivity and variability analyses needed for estimating uncertainties require educating theorists, as well as experimentalists, in applied statistics. Closer interaction between AMO+PP+PMI scientists and applied statisticians, such as this workshop provided, is therefore highly desirable. Such interactions reveal to AMO+PP+PMI scientists the plethora of well-developed methods for treating uncertainties and other statistical properties of quantities of interest, and, conversely, communicate to applied statisticians new types of problems that could lead to new methods and further advances in applied statistics.
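
To make the sensitivity-analysis ingredient concrete, a minimal variance-based (Sobol-type) first-order index estimator is sketched below, using the standard pick-freeze construction on an invented three-parameter toy model (a generic illustration, not a method advocated by any particular talk):

import numpy as np

def model(X):
    # Hypothetical model: three inputs with very different influence on the output.
    return np.sin(X[:, 0]) + 5.0 * X[:, 1]**2 + 0.1 * X[:, 2]

def first_order_sobol(model, n_inputs, n_samples=100_000, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    A = rng.uniform(-np.pi, np.pi, size=(n_samples, n_inputs))
    B = rng.uniform(-np.pi, np.pi, size=(n_samples, n_inputs))
    fA, fB = model(A), model(B)
    total_var = np.var(np.concatenate([fA, fB]))
    indices = []
    for i in range(n_inputs):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                 # "pick-freeze": replace only column i
        fABi = model(ABi)
        Vi = np.mean(fB * (fABi - fA))      # estimator of Var(E[Y | X_i])
        indices.append(Vi / total_var)
    return indices

print(first_order_sobol(model, 3))          # the second input should dominate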

The immaturity of UQ application in AMO physics is in part a multifaceted cultural issue arising from, for instance, the relationship of theorists to the errors and uncertainties of their own work; the lack of routine exchange of detailed information between different, mutually coupled branches of physics; and the lack of routine collaboration with the mathematicians, statisticians and computer scientists who are capable of developing and specializing UQ techniques for particular fields of physics. A terminology barrier exists here, and this too is a cultural issue. Superficially, it is also in part an issue of differing scientific interests. Most current development and application of UQ is oriented toward complex physical and engineering systems governed by the 3D PDEs of fluid flow and structural mechanics, for example UQ of long-range forecasting of the coupled atmosphere-ocean system, whereas the AMO, materials and plasma communities work with equations in 3D, 6D, ND and even infinite dimensions, rooted in quantum mechanics, statistical mechanics, magnetohydrodynamics and so forth. Thus, from the point of view of computational complexity, AMO physics poses problems unfamiliar to the UQ field, though it appears that the UQ field, in its present state, has the potential to be more immediately relevant to work on PMI and PP than to work on AMO data. For PMI, the handling of multiple timescales and of uncertainty propagation is a central concern. In a complex system like a tokamak, reduced PP-scale models are augmented with experimental validation, but there are limits: reduced-fidelity simulations, forced by computational limitations, can lose entire physical phenomena. Current divertor-plasma simulations take several months, about 5 per year. So how do we study UQ if we cannot get statistics?

Part of the answer lies in improving epistemic uncertainty, but for epistemic uncertainty there is no systematic approach; it is still treated case by case. For ab initio quantum theory approaches, such as AMO theory, UQ is almost always associated with model choices and solutions rather than with the statistics or numerics of the solution. With multiscale exploration, the distinction between aleatoric (system sensitivity and variability) and epistemic uncertainties is blurred: what appears aleatoric at one scale becomes epistemic at another, and uncertainty at the finer scale becomes aleatoric at the coarser one. In the context of polynomial chaos expansions, the two types can be handled simultaneously using polynomial expansions with random coefficients.
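
One way to read that last remark in formulas (a standard construction, added here for clarity rather than taken from the talks): let \xi carry the aleatoric variability and \theta the epistemic parameter uncertainty; then

u(\xi,\theta) \;\approx\; \sum_{k=0}^{P} c_k(\theta)\,\Psi_k(\xi),

so the chaos basis \Psi_k propagates the aleatoric part, while the coefficients c_k(\theta) remain random through the epistemic parameters \theta (and can themselves be expanded in a second basis in \theta if desired).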


This discussion gave rise to an important question at the meeting: how do we integrate UQ into a particular field of physics? While the UQ field is mature in the general theory of parametric uncertainty, its application to particular physical models is in its infancy. There is a difference between mathematical research and the application-driven development of techniques and their maturation into practical tools; both are needed, but their impact, and the timeline of that impact, differ. UQ is presently most applicable to reasonably well understood and well modeled systems. However, all fields of science grow as they encounter new applications, and in that sense there was no conceptual obstacle in any of the problems considered at this workshop. While many UQ approaches can be computationally challenging out of the box, experience suggests they can be made more efficient once tailored to the structure of a given physical problem. For instance, perfectly parallel sampling algorithms for Bayesian posterior inference do exist and work in practice.

While workshops and events like this one can make personal and technical connections between UQ and physics scientists, a real step forward would be building integrated and jointly funded collaborative teams, which would provide physics knowledge to the mathematicians and mathematical skills to the physicists. Otherwise, incorrect interpretations and terminology barriers can lead to misapplications and thus to false conclusions. While the mathematical side has more formal definitions and language, these terms must be translated into each new application space, all of which requires interdisciplinary communication and engagement. A plausible model for a collaborative approach across disciplines is the DOE SciDAC program, including its UQ institute, which is presently focused primarily on engineering applications.

There are a number of successful applications of UQ in climate, in engineering, in the DOE ASC program and in the IAEA nuclear data program. Plasma-edge simulations provide success stories based on merging multiple data sources and diagnostics; as a result of this effort, that application has the best UQ model, although running the model remains expensive and is usually addressed by reducing the resolution. For AMO systems, computation is not the main barrier; designing an adequate test of a particular model's limits and extending the model toward large systems are the real computational challenges. The Monte Carlo approach is probably the only "black box" that is ready to go now; everything else seems to require expert collaboration between the application side and UQ mathematics. Still, there are success stories in AMO and plasma physics: atomic structure calculations (Gordon Drake, Klaus Bartschat, Igor Bray, Dmitry Fursa); examples that close the experiment-models-prediction cycle in employing AMO data in astrophysics (Daniel Savin); reduced-order modeling in plasma, making an intractable problem tractable (C.S. Chang); and emulators and surrogate models, physics-based reduction ensuring the overall features are correct, with subsequent validation of assumptions (U. von Toussaint).

Different classes of challenges for applying UQ exist in the various physics areas. In some cases the boundary conditions are variable and unknown (e.g., in the PMI and PP of fusion), which can have severe implications for tokamak plasma-physics predictions. In other tokamak simulation cases the cost of a single simulation is exorbitant; one must then find a compromise between a very detailed but barely feasible single simulation and multiple, less resolved simulations that include UQ. The choice is not obvious: it seems that for tokamak problems some transformative algorithms, both physics-based and data-driven, are required. Another class of systems is those that are multiscale in space and time. Molecular dynamics cannot handle the low fluxes or long time scales (spanning more than 10 orders of magnitude). How, then, do we robustly couple atomistic and continuum models in this case, and in particular how do we propagate uncertainties between them?

While it is in principle possible to combine models to span the length scales, how do we construct such models with some rigor? Can the approach of Plechac be applied in the real world, where heuristics are used to "pass parameters" between scales? Models alone cannot reach the level of maturity needed for predictive understanding without multiple validation steps. PMI models are still too limited, and understanding of the morphological and topographical evolution of the plasma-material interface is nascent. Multiscale approaches to validation, via controlled micro-experiments, provide more control and instrumentation than is possible in a full device and are in the spirit of isolating phenomena.

The participants concluded from the "dynamics and energy in the room," and from the need to establish collaboration between UQ and physics scientists, that the community should meet regularly, probably once every one or two years. It was proposed to co-locate this workshop with SIAM-UQ meetings and to include classes and tutorials. It was also suggested that future meetings be co-located with APS-DPP, APS-DAMOP, APS-MATERIALS, and/or the biannual PSI meetings, as well as coordinated with the IAEA technical meetings devoted to the UQ theme. Annual meetings seem better suited to developments in applied mathematics and to the current needs for UQ in physics. One possibility is to co-locate the 2016 meeting with either the SIAM Annual Meeting in Boston, MA (July 11-14) or the APS-DPP annual meeting in San Jose, CA (October 31-November 4, 2016), and then to alternate in 2017. The final decision on future workshops will be made by the Workshop Scientific Committee after a wide community discussion.


Appendix 1

Agenda

Saturday, November 7

08:00 am-09:00 am Continental Breakfast/Registration

08:55 am-09:00 am Opening (R. Harrison, P. Krstic)

Morning Session (Chair: J. Tennyson, UCL)

09:00 am-09:45 am

Roger Ghanem
Polynomial Chaos as a Comprehensive Modeling Tool for Complex Systems

09:45 am-10:30 am

Petr Plechac
Information-Theoretic Tools for Uncertainty Quantification of High Dimensional Stochastic Models

10:30 am-11:00 am Coffee Break

11:00 am-11:45 am

Robert Moser
Reliability and Uncertainty in the Simulation of Tokamak Plasmas

11:45 am-12:30 pm

Udo von Toussaint
The Vlasov-Poisson Plasma Model with Uncertain Inputs: A Bayesian Modeling Approach

12:30 pm-02:00 pm Sandwich Lunch

Afternoon Session (Chair: P. Plechac, U. of Delaware)

02:00 pm-02:45 pm

Panagiotis Angelikopoulos
Bayesian Uncertainty Quantification for Molecular Dynamics Simulations

02:45 pm-03:30 pm

Daniel Savin
Astrochemistry from the First Stars to the Origins of Organic Chemistry

03:30 pm-04:15 pm

Francesco Rizzi
Uncertainty Quantification in Molecular Dynamics Simulations: Forward and Inverse Problem

04:15 pm-04:45 pm Coffee Break

04:45 pm-05:30 pm

Sophie Blondel
Uncertainty Quantification Effort within PSI-SciDAC

05:30 pm-06:15 pm

Dimitrios Giannakis
Data-Driven Spectral Decomposition and Forecasting of Ergodic Dynamical Systems

Sunday, November 8

08:00 am-09:00 am Continental Breakfast/Registration

Morning Session (Chair: D. Stotler, PPPL)

09:00 am-09:45 am

Choong-Seock Chang
Telescoping UQ Method for Extreme Scale Kinetic Simulation of Magnetic Fusion Plasma

09:45 am-10:30 am

Jean-Paul Allain
Challenges and Strategies to Experimental Validation of Multi-Scale Nuclear Fusion PMI Computational Modeling

10:30 am-11:00 am Coffee Break

11:00 am-11:45 am

Kody J.H. Law
Multilevel Sequential Monte Carlo Samplers

11:45 am-02:00 pm Sandwich Lunch and Poster Session


Afternoon Session (Chair: U. von Toussaint, IPP Garching, Germany)

02:00 pm-02:45 pm

Jonathan Tennyson
Uncertainty Quantification for Theoretical Atomic and Molecular Data

02:45 pm-03:30 pm

Alexander Kramida
Critical Evaluation and Estimation of Uncertainties of Atomic Spectral Data at NIST

03:30 pm-04:15 pm

Matthew Dunlop
Bayesian Level Set Inversion

04:15 pm-04:45 pm Coffee Break

04:45 pm-05:30 pm

Alan Calder
Verification, Validation and Uncertainty Quantification in Astrophysics

05:30 pm-06:15 pm

Richard Archibald
Sparse Sampling Methods for Experimental Data from the DOE Facilities

POSTER SESSION, Sunday, 11:45-12:30 (Chair: Sophie Blondel, ORNL)

Varis Carey
Telescoping Methods for Uncertainty Quantification

Ozgur Cekmer
Uncertainty Quantification Analysis for Xolotl

Scott Ferson
Computing with Confidence

Scott Ferson
Sensitivity Analysis of Probabilistic Models

Michael Probst
Data Accuracy in Modeling and Computation: Some Examples

Monday, November 9

08:00 am-09:00 am Continental Breakfast

Morning Session I (Chair: R. Archibald, ORNL)

09:00 am-09:45 am

Hyun-Kyung Chung
Internationally Coordinated Activities of Uncertainty Quantification and Assessment of Atomic, Molecular and Plasma-Surface Interaction Data for Fusion Applications

Morning Session II (Chairs: TBD)

09:45 am-10:30 am Panel Discussion

10:30 am-12:00 pm Conference Conclusions & Future Plans

Appendix 2

Budget

2015 UQ Account (1128723 / 73191): Master List of All Expenditures

Beginning Balance: $15,000.00

Date     Category  Vendor             Charges    Balance     Description
11/6/15  Travel    Hilton             $1,008.00  $13,992.00  Lodging for UQ speakers
11/6/15  Travel    Hilton             $3,528.00  $10,464.00  Lodging for UQ speakers
11/6/15  Media     Chris Rosaschi     $1,210.00  $9,254.00   Graphic design
11/6/15  Media     Eco printing       $838.30    $8,415.70   Brochure
11/6/15  Travel    Spartan            $1,974.40  $6,441.30   Ground transportation for UQ speakers
11/6/15  Food      Bliss              $2,152.00  $4,289.30   Catering
11/6/15  Travel    Hilton             $1,512.00  $2,777.30   Lodging for UQ speakers
11/7/15  Travel    Enterprise         $126.96    $2,650.34   Rental cars for UQ speakers
11/7/15  Travel    Jonathan Tennyson  $476.86    $2,173.48   Round-trip airfare reimbursement (UQ speaker)
11/7/15  Travel    Matthew Dunlop     $30.50     $2,142.98   Round-trip LIRR ticket reimbursement (UQ speaker)

Total expenditures: $12,857.02; remaining balance: $2,142.98
Revenue: conference participant registration, $910.00
Final balance: $3,052.98


Appendix 3

List of Participants

Panagiotis Angelikopoulos CSE Lab, Institute of Computational Science, Zurich, Switzerland

Richard Archibald ORNL, Oak Ridge, TN, USA

Johan Bengtsson JB Optima, LLC

Sophie Blondel ORNL, Oak Ridge, TN, USA

Alan Calder Stony Brook University, Stony Brook, NY, USA

Varis Carey CU- Denver, Denver, CO, USA

Ozgur Cekmer ORNL, Oak Ridge, TN, USA

Choong-Seock Chang PPPL, Princeton, NJ, USA

Hyun-Kyung Chung IAEA, AT

Matthew Dunlop University of Warwick, Coventry, GB

Scott Ferson Applied Biomathematics, Setauket, NY, USA

Roger Ghanem University of Southern California, CA, USA

Dimitrios Giannakis New York University, NY, USA

Javier Dominguez-Gutierrez Stony Brook University, Stony Brook, NY, USA

Longtao Han Stony Brook University, Stony Brook, NY, USA

Robert Harrison Stony Brook University, Stony Brook, NY, USA

Alexander Kramida NIST, MD, USA

Predrag Krstic Stony Brook University, Stony Brook, NY, USA

Kody J.H. Law KAUST, Saudi Arabia

Craig Michoski University of Texas, Austin, TX, USA

Robert Moser University of Texas, Austin, TX, USA

Petr Plechac University of Delaware, DE, USA

Michael Probst University of Innsbruck, Innsbruck, AT

Maksim Rakitin Stony Brook University, Stony Brook, NY, USA

Francesco Rizzi Sandia NL, Albuquerque, NM, USA

Daniel Savin Columbia University, NY, USA

Daren Stotler PPPL, Princeton, NJ, USA

Jonathan Tennyson University College London, GB

Udo von Toussaint Max Planck Institute for Plasma Physics, Germany

Shinjae Yoo BNL, Upton, NY, USA

Yang Zhang EMNL, Stony Brook, NY, USA
