Graduate Seminar on Numerical Simulation (SS 21)

The approximation theory of deep learning

Offered by: Prof. Michael Griebel
Assistant: Dinesh Kannan

Content: One of the most important topics in current mathematical research on machine learning is the analysis of complex deep learning models. Here, the capacity - or approximation power - of a deep learning model class is an important indicator of the power of the underlying algorithm. We will discuss recent results on the approximation power of deep ReLU networks and variants thereof; sufficient approximation power proves to be a necessary prerequisite for their success in deep learning.
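To give a flavour of the kind of result treated in the seminar, the following is a minimal NumPy sketch (illustrative only, not part of the seminar materials) of the classical Yarotsky-type construction: compositions of a hat function, which is exactly expressible with three ReLU units, approximate the map x -> x^2 on [0,1], and the error shrinks by a factor of four with each additional layer of depth.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # The "tooth" function g: [0,1] -> [0,1], written exactly with three
    # ReLU units: g(x) = 2 relu(x) - 4 relu(x - 1/2) + 2 relu(x - 1).
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def square_approx(x, m):
    # f_m(x) = x - sum_{s=1}^m g_s(x) / 4^s, where g_s is the s-fold
    # composition of the hat function. Each extra composition (i.e. extra
    # depth) halves the dyadic grid width and quarters the sup-norm error.
    g = x
    approx = x.copy()
    for s in range(1, m + 1):
        g = hat(g)
        approx -= g / 4.0 ** s
    return approx

if __name__ == "__main__":
    x = np.linspace(0.0, 1.0, 10001)
    for m in (1, 2, 4, 8):
        err = np.max(np.abs(square_approx(x, m) - x ** 2))
        print(f"depth m = {m:2d}: max error ~ {err:.2e}")  # ~ 4^-(m+1)
```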

Please send an email to ed tod nnob-inu tod sni ta nannaka tod b@foo tod de if you want to take part in this seminar. The slots will be given out on a first-come, first-served basis. There will be an initial online meeting to discuss the potential topics and research papers (after 10th of February). Details of this meeting (Zoom link, time, etc.) will be sent to those who send an email confirming their interest in the seminar.

After that, the participants will be asked to name their top two or three topic choices, based on which we will assign the topics to everyone.

Time: Thursdays at 12:15

Venue: Most probably Zoom meetings; links and details will be communicated via email.

New Info:

Preliminary meeting on March 11th at 14:15. Please send an email to ed tod nnob-inu tod sni ta nannaka tod b@foo tod de if you would like to receive the Zoom link, or if you are interested in the seminar but cannot make it to this appointment.