HDA 2025
10th Workshop on High-Dimensional Approximation

8th to 12th September 2025 — Bonn, Germany

Welcome

The High-Dimensional Approximation (HDA) Workshop is a series of biennial international meetings covering current research on high-dimensional problems.

HDA2025 will cover a range of methods and applications central to modern high-dimensional approximation.

Participants are welcome from all around the world. A key feature of this workshop is that there are no parallel sessions. Participants are welcome to present work in progress, and time will be set aside for informal discussions. The number of talks will be limited, and it is not essential that everyone gives a talk. Collaborators are encouraged to coordinate and elect a representative to present joint work.

Invited Speakers

We are happy to announce the list of invited speakers for HDA2025:

Speaker Affiliation
Sergey Dolgov University of Bath
Omar Ghattas University of Texas at Austin
Takashi Goda University of Tokyo
Fred J. Hickernell Illinois Institute of Technology
Gitta Kutyniok Ludwig-Maximilians-Universität München
Benjamin Peherstorfer New York University
Peter Richtárik King Abdullah University of Science and Technology
Ian Sloan University of New South Wales
Michael Unser Ecole Polytechnique Fédérale de Lausanne

Program

The timetable and the book of abstracts are now available.

The workshop is scheduled to begin with lunch on Monday and end at lunchtime on Friday.

Timetable

Monday, September 8th, 2025
Time Speaker Talk
12:30 - 13:30 Registration & Lunch
13:30 - 13:50 Welcome
13:50 - 14:20 Yuya Suzuki Approximations of Differential Entropy in Bayesian Optimal Experimental Design
14:20 - 14:50 Jürgen Dölz Uncertainty quantification of spectral clusterings
14:50 - 15:30 Peter Richtárik From the Ball-proximal (Broximal) Point Method to Efficient Training of LLMs
15:30 - 16:00 Coffee break
16:00 - 16:30 Jacob Heieck Lyapunov Stability of Consensus-Based Optimization
16:30 - 17:00 Guanglian Li Gradient-enhanced sparse Hermite polynomial expansions for pricing and hedging high-dimensional American options
17:00 - 17:30 Priyanka Roy Gradient Descent Algorithm in Hilbert Spaces under Stationary Markov Chains with phi- and beta-Mixing
Tuesday, September 9th, 2025
Time Speaker Talk
09:00 - 09:30 Matthew Fernandes Surrogate Bayesian Inversion for a Class of Wave Configuration Parameters
09:30 - 10:00 Johannes Schmidt Sparse Grid for Multi-Level Combination Methods in Machine Learning
10:00 - 10:30 Julius von Smercek Gradient-Based Adaptive Refinement on Sparse Grids for Critical Event Estimation
10:30 - 11:00 Coffee break
11:00 - 11:40 Gitta Kutyniok to be announced
11:40 - 12:10 Holger Wendland Kernel-based approximation of high-dimensional functions with small efficient dimension
12:10 - 12:40 Rüdiger Kempf Revisiting the Tensor Product Multilevel Method: New Insights into Kernel-Based Sparse Grid Approximation
12:40 - 13:50 Lunch
13:50 - 14:20 Anna Little Functional Multi-Reference Alignment via Deconvolution
14:20 - 14:50 Shiv Mishra Consistent PINNs for parabolic PDEs
14:50 - 15:30 Michael Unser Variational Connection between Radial Basis Functions and Neural Networks
15:30 - 16:00 Coffee break
16:00 - 16:30 Josef Teichmann Path dependent time series models
16:30 - 17:00 Tino Ullrich Sparse sampling recovery of functions - square root lasso, orthogonal matching pursuit and instance optimality
17:00 - 17:30 Marcin Wnuk Adaptive and non-adaptive approximation of high-dimensional vectors
Wednesday, September 10th, 2025
Time Speaker Talk
09:00 - 09:30 Tùng Lê Quasi-Monte-Carlo method for Gevrey class functions governed by non-linear PDEs
09:30 - 10:00 Marc Schmidlin A framework for proving parametric regularity of high-dimensional problems
10:00 - 10:30 André-Alexander Zepernick Domain UQ for stationary and time-dependent PDEs using QMC
10:30 - 11:00 Coffee break
11:00 - 11:40 Ian Sloan Approximation of periodic functions in high dimensions
11:40 - 12:10 Felix Bartel Minimal Subsampled Rank-1 Lattices for Multivariate Approximation with Optimal Convergence Rate
12:10 - 12:40 Alexander Gilbert A novel CBC construction for lattice rules for L^2 approximation with POD and SPOD weights
12:40 - 13:50 Lunch
13:50 - 14:20 Yoshihito Kazashi (Near-)Optimality of Quasi-Monte Carlo methods and Suboptimality of the Sparse-Grid Gauss-Hermite Rule in Gaussian Sobolev Spaces
14:20 - 14:50 Michael Gnewuch Data Compression using Rank-1 Lattices for Parameter Estimation in Machine Learning
14:50 - 15:30 Takashi Goda Quasi-uniform quasi-Monte Carlo point sets and sequences
15:30 - 16:00 Coffee break
16:00 - 16:40 Benjamin Peherstorfer DICE: Discrete inverse continuity equation for learning population dynamics
16:40 - 17:10 Zexin Pan Dimension-independent convergence rates of median randomized nets
Thursday, September 11th, 2025
Time Speaker Talk
09:00 - 09:30 Peter Kritzer Median lattice rules for function approximation
09:30 - 10:00 Dirk Nuyens Least squares using Kronecker points
10:00 - 10:30 Yiqing Zhou Minimization of Costly Functions Using Sparse FFT-Based Interpolation
10:30 - 11:00 Coffee break
11:00 - 11:40 Omar Ghattas to be announced
11:40 - 12:10 Gregor Maier Near-Optimal Learning of Lipschitz Operators with respect to Gaussian Measures
12:10 - 12:40 Remo von Rickenbach On Sobolev and Besov Spaces With Hybrid Regularity
12:40 - 13:50 Lunch
13:50 - 14:20 Charles Miranda Properties and optimisation of compositional tensor networks
14:20 - 14:50 Yingda Cheng Low-rank Anderson Acceleration
14:50 - 15:30 Sergey Dolgov Deep tensor train approximation of transport maps for Bayesian inverse problems
15:30 - 16:00 Coffee break
16:00 - 16:30 Andreas Zeiser Discontinuous Galerkin discretization of conservative dynamical low-rank approximation schemes for the Vlasov-Poisson equation
16:30 - 17:00 Roman Khotyachuk Dimensionality reduction techniques for numerical solutions of the Elder problem
17:00 - 17:30 Dinh Dũng Sampling recovery in Bochner spaces and applications to parametric PDEs with random inputs
19:00 - 21:00 Workshop dinner
Friday, September 12th, 2025
Time Speaker Talk
09:00 - 09:30 Heinz-Jürgen Flad Higher Order Singularities in High Dimensions
09:30 - 10:00 Henrik Eisenmann Optimal solvers for infinite-dimensional sparse approximations in adaptive stochastic Galerkin finite element methods
10:00 - 10:30 Huqing Yang Sparse and low-rank approximations of parametric PDEs: the best of both worlds
10:30 - 11:00 Coffee break
11:00 - 11:40 Fred Hickernell The Quality of Lattice and Kronecker Sequences
11:40 - 12:10 Peter Zaspel Data-driven identification of port-Hamiltonian DAE systems by Gaussian processes
12:10 - 12:40 Jochen Garcke Passive and Model-Agnostic Sampling in Machine Learning Regression
12:40 - 13:50 Closing remarks & Lunch/Lunchbags

Registration

Registration is now closed.

Organization

You can contact us at hda2025 (at) ins.uni-bonn (dot) de.

Program Committee

Local Organization

Venue

The workshop will be held at the University Club of the University of Bonn, located on the Rhine close to the city centre.

Address: Konviktstr. 9, 53113 Bonn, Germany.

Hotels

For local information and hotel reservations, please use the links provided by the Bonn tourism agency.

Transportation

Financial Support

Financial support is provided by the TRA Modelling (University of Bonn) and the Hausdorff Center for Mathematics (HCM) as part of the Excellence Strategy of the federal and state governments, as well as by the CRC 1639 NuMeriQS.