

Statistical estimation theory using perturbed optimization

  • Speaker: Vladimir Spokoiny (Weierstrass Institute for Applied Analysis and Stochastics)
  • Organizer: Beijing-Saint Petersburg Mathematics Colloquium
  • Start Time: 2024-11-28 21:00
  • End Time: 2024-11-28 22:00
  • Venue: Online


To join the Tencent Meeting: https://meeting.tencent.com/dm/TZbwDpT1L9GJ

Meeting ID: 564-6487-0548    Password: 202410


Abstract: The talk discusses the estimation problem for complex high-dimensional models such as deep neural networks (DNNs). The approach reduces the original problem to a perturbed optimization problem. Under mild conditions, we establish finite-sample expansions for the estimation loss and for the excess risk. This enables us to obtain sharp non-asymptotic risk bounds in terms of the so-called efficient dimension of the problem. The results are then specialized to the cases of nonlinear regression and DNN training.
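
For orientation only, here is a schematic sketch of the perturbed-optimization viewpoint in a generic M-estimation setting; the notation below is an illustrative assumption and is not taken from the talk. The random objective is written as its expectation plus a stochastic perturbation:

L(\theta) = \mathbb{E} L(\theta) + \zeta(\theta), \qquad
\tilde{\theta} = \operatorname{argmax}_{\theta \in \Theta} L(\theta), \qquad
\theta^{*} = \operatorname{argmax}_{\theta \in \Theta} \mathbb{E} L(\theta),

where \zeta(\theta) = L(\theta) - \mathbb{E} L(\theta) is the perturbation. In this reading, the estimation loss measures how far \tilde{\theta} is from the target \theta^{*}, and the excess risk is \mathbb{E} L(\theta^{*}) - \mathbb{E} L(\tilde{\theta}); the finite-sample expansions mentioned above bound such quantities in terms of an effective (efficient) dimension rather than the nominal parameter dimension.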


Bio: Professor Vladimir Spokoiny is a leading expert in machine learning, statistics, and applied analysis. He graduated in 1981 from the Department of Technical Cybernetics at the Moscow Institute of Railway Engineering and obtained his PhD at Moscow State University under the supervision of Prof. M.B. Maljutov and Prof. A.N. Shiryaev.

Prof. Spokoiny has worked at Moscow State University, University Paris-Sud, and the University of Würzburg. He currently heads a research group at the Weierstrass Institute, holds a professorship at Humboldt University, and serves as an associate editor of the Annals of Statistics and of Statistics and Decisions.

His research interests include:

- adaptive nonparametric smoothing and hypothesis testing

- high dimensional data analysis

- statistical methods in finance

- image analysis with applications to medicine

- classification

- nonlinear time series.

