Recording: https://disk.pku.edu.cn/link/AAEE6EE3DCAB384FE79258F879CF21499C
Abstract: The talk discusses the estimation problem for complex high-dimensional models such as Deep Neural Networks (DNNs). The approach reduces the original problem to a perturbed optimization problem. Under mild conditions, we establish finite-sample expansions for the estimation loss and for the excess risk. This enables us to obtain sharp nonasymptotic risk bounds in terms of the so-called effective dimension of the problem. The results are specialized to the case of nonlinear regression and DNN training.
Bio: Professor Vladimir Spokoiny is a leading expert in Machine Learning, Statistics, and Applied Analysis. Having graduated in 1981 from the Department of Technical Cybernetics at the Moscow Institute of Railway Engineering, he obtained his PhD degree at Moscow State University under the supervision of Prof. M.B. Maljutov and Prof. A.N. Shiryaev.
Prof. Spokoiny has worked at Moscow State University, University Paris-Sud, and the University of Würzburg. He now heads a research group at the Weierstrass Institute, combining this with a professorship at Humboldt University, and serves as an associate editor of the journals Annals of Statistics and Statistics and Decisions.
His research interests include:
- adaptive nonparametric smoothing and hypothesis testing
- high dimensional data analysis
- statistical methods in finance
- image analysis, applications to medicine
- classification
- nonlinear time series.