Mini-course: Uncertainty Quantification and Approximation Theory for Parameterized PDEs
Time: 08:30, 14/11/2016 to 12:00, 17/11/2016
Venue/Location: C2-714, VIASM
Invited Speakers (tentative): Clayton Webster, Hoang Tran, and Guannan Zhang (Oak Ridge National Laboratory, USA)
Content:
Abstract:
In this course, we will present a general class of numerical methods, together with their analysis, for approximating systems of partial differential equations (PDEs) parameterized by many deterministic (e.g., spatial position and velocity) and stochastic variables ranging over a multi-dimensional domain, where the parametric dimension can be very large, or even infinite.
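For illustration only (this example is not taken from the course materials), one standard model problem in this class is the elliptic equation with a parameterized diffusion coefficient:
$-\nabla \cdot \big(a(x,y)\,\nabla u(x,y)\big) = f(x)$ in $D$, $\quad u(x,y) = 0$ on $\partial D$,
where $x$ denotes the spatial variable, $y = (y_1, y_2, \dots)$ collects the (possibly countably many) stochastic parameters, and the coefficient admits an affine expansion $a(x,y) = a_0(x) + \sum_{j \ge 1} y_j \psi_j(x)$. The task is then to approximate the parameter-to-solution map $y \mapsto u(\cdot,y)$, or a quantity of interest $Q(u(\cdot,y))$, efficiently in high dimensions.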
This tutorial will explore numerical and functional analysis techniques for solving such problems, including the well-posedness and regularity of the resulting parameterized PDE problems. We will present a detailed overview and convergence analysis of several methods for propagating the uncertainties in the input information onto desired quantities of interest, covering forward and inverse uncertainty quantification (UQ) approaches, the necessary theory of stochastic processes and random fields, error analysis, anisotropy, adaptive methods, high-dimensional best s-term approximation, compressed sensing, discrete least squares, deterministic and random sampling, and sparse grids.
Topics:
* Background and motivation (necessary probability and approximation theory, and results from PDEs)
* Overview of deterministic and stochastic parameterized PDEs (high- and infinite-dimensional problems, deterministic and random sampling methods, polynomial approximations)
*** Main topics ***
* (Quasi-)Monte Carlo methods
* L2-projection techniques and best approximations
* Reduced basis methods
* Interpolation and sparse grids
* Compressed sensing and discrete least squares
* Inverse problems and Bayesian inference
* Multilevel and multi-index methods
* Other topics, applications, open discussions
* Opportunities for students and postdocs