Seminar Announcement: Dr. Zhihui Zhu, Johns Hopkins University (USA), June 5
Posted by: Communications Discipline    Date: 2019-06-03

Time: 1:30 PM, June 5

Venue: Room B513, Information Building

Title: Nonconvex Approaches for Data Science

Speaker: Dr. Zhihui Zhu, Johns Hopkins University, USA


Abstract: As technological advances in fields such as the Internet, medicine, finance, and remote sensing produce larger and more complex data sets, we face the challenge of efficiently and effectively extracting meaningful information from large-scale, high-dimensional signals and data. Many modern approaches to this challenge (e.g., deep learning) naturally lead to nonconvex optimization formulations. In theory, however, it is often difficult to find a global minimum of a general nonconvex problem; even obtaining a local minimizer can be computationally hard.


In this talk, I will present specific benign geometric properties, coupled with algorithmic developments, for nonconvex optimization: (i) a benign global landscape, in the sense that every local minimum is global and every other critical point is a strict saddle, which ensures convergence of iterative algorithms from arbitrary or random initializations; and (ii) a sufficiently large basin of attraction around the global minima, which enables optimization algorithms that rapidly converge to a global minimum from a data-driven initialization. I will discuss the effectiveness of this geometric analysis framework in the context of training shallow neural networks, variants of low-rank matrix optimization problems in data science, blind deconvolution, robust principal component analysis, and other problems. At the end of the talk, I will discuss open problems and future directions for enriching this framework.
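The benign-landscape property in (i) can be illustrated on one of the problem classes the talk covers, low-rank matrix optimization. The sketch below is an illustrative toy, not code from the talk; the problem sizes, step size, and iteration count are assumptions. It runs plain gradient descent on the nonconvex factorization objective min_U ||U Uᵀ − M||²_F, whose landscape is known to have no spurious local minima and only strict saddles, so a random initialization suffices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 20, 3  # assumed toy dimensions

# Ground-truth rank-r PSD matrix M = U* U*^T.
U_star = rng.standard_normal((n, r))
M = U_star @ U_star.T

# Plain gradient descent on f(U) = ||U U^T - M||_F^2,
# starting from a small random initialization (no spectral init needed).
U = 0.1 * rng.standard_normal((n, r))
step = 1e-3  # assumed step size, small enough for stability here
for _ in range(20000):
    grad = 4 * (U @ U.T - M) @ U  # gradient of ||U U^T - M||_F^2
    U -= step * grad

# Because every local minimum of this objective is global, the iterates
# recover M up to a small numerical residual.
residual = np.linalg.norm(U @ U.T - M) / np.linalg.norm(M)
print(f"relative error: {residual:.2e}")
```

Under the strict-saddle property, gradient descent from a random initialization avoids saddle points almost surely, which is why no special initialization is needed in this toy; part (ii) of the talk concerns settings where a data-driven initialization placed inside the basin of attraction yields fast, provable convergence.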


Biography: Zhihui Zhu (朱志辉) is a Postdoctoral Fellow in the Mathematical Institute for Data Science at Johns Hopkins University. He received his B.Eng. degree in communications engineering from Zhejiang University of Technology (Jianxing Honors College) in 2012, and his Ph.D. degree in electrical engineering from the Colorado School of Mines in 2017, where he received a Graduate Research Award. His research interests span machine learning, signal processing, data science, and optimization. His current research focuses on the theory and applications of nonconvex optimization and low-dimensional models in large-scale machine learning and signal processing problems.