Information Based Complexity
Special topics in numerical analysis: Information-Based Complexity
Lecturer: PD Peter Kritzer (RICAM)
Lecture no.: 325.310
ECTS: 3, hours: 2
Schedule: see KUSSS
Abstract: In many applications (e.g., in finance, physics, or computer science) we have to deal with multivariate continuous problems which are defined on spaces of functions depending on d variables. In many cases, d can be in the hundreds or thousands. The information complexity n(epsilon; d) of a d-variate problem is the minimal number of information operations that is needed to solve the problem within an error threshold of epsilon. Typical examples of information operations are function values or linear functionals. The field of information-based complexity, founded by Traub and Woźniakowski in the 1980s, deals with the analysis of the information complexity of multivariate problems and, in particular, with the question of how the information complexity depends on d and epsilon. A crucial question is, for instance, under which circumstances one can avoid a curse of dimensionality. In this introductory course, we will present basic concepts of information-based complexity and illustrate some of the theory by examples such as high-dimensional numerical integration or function approximation.
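As a small illustration of the ideas above (not part of the official course material): plain Monte Carlo integration uses n random function evaluations as its information operations, and its root-mean-square error decays like n^(-1/2) with a constant that may or may not depend on d. The sketch below, with a hypothetical test integrand chosen so that its exact integral over the unit cube is 1 for every d, shows the estimate staying accurate even as d grows.

```python
import random

def monte_carlo_integrate(f, d, n, seed=0):
    """Estimate the integral of f over the unit cube [0,1]^d
    using n random function values (the information operations)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = [rng.random() for _ in range(d)]  # uniform sample point in [0,1]^d
        total += f(x)
    return total / n

def f(x):
    """Hypothetical test integrand: a product of factors 1 + (x_j - 1/2)/d.
    Each factor integrates to 1 over [0,1], so the exact integral is 1
    for every dimension d."""
    d = len(x)
    prod = 1.0
    for xj in x:
        prod *= 1.0 + (xj - 0.5) / d
    return prod

for d in (2, 100, 1000):
    est = monte_carlo_integrate(f, d, n=10_000)
    print(f"d = {d:5d}   estimate = {est:.4f}   error = {abs(est - 1.0):.2e}")
```

Note that for this particular integrand the variance of each factor shrinks like 1/d^2, so the error does not blow up with the dimension; a central theme of information-based complexity is characterizing for which problem classes such dimension-robust behavior is possible at all.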