2020-09-08 | Xiao Wang: Network Architecture and Optimization: Methods of Dynamical Systems in Machine Learning


Abstract

The recent development of machine learning has provided new and challenging research problems in the areas of non-convex optimization and dynamical systems. We focus on two fundamental aspects of machine learning: training the network parameters and selecting the architecture of a neural network, where techniques from differentiable and topological dynamical systems play an essential role. In the first part of the talk, we present a result that a broad family of first-order methods (gradient descent, mirror descent, manifold gradient descent, etc.) avoids saddle points, obtained via the global stability theory of dynamical systems. In the second part, we present a new result on the expressive power of deep neural networks: depth-width trade-offs for ReLU networks via Sharkovskii's theorem. The related research has a variety of applications in deep learning, approximation theory, and partial differential equations, and has connections to Hilbert's 13th problem. As representation learning remains an open-ended area, many more open problems await answers in the future.
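To give a flavor of the saddle-avoidance phenomenon from the first part, here is a minimal sketch (not taken from the talk; the function, step size, and variable names are chosen purely for illustration): gradient descent on f(x, y) = x^2 - y^2, which has a strict saddle at the origin. The stable-manifold viewpoint behind results of this type says that the set of initializations converging to the saddle has measure zero, so a generic starting point escapes along the negative-curvature direction.

```python
import numpy as np

# f(x, y) = x^2 - y^2 has a strict saddle at the origin:
# its Hessian diag(2, -2) has one negative eigenvalue.
def grad_f(p):
    x, y = p
    return np.array([2.0 * x, -2.0 * y])

# Start slightly off the saddle (a generic initialization).
p = np.array([1e-6, 1e-6])
eta = 0.1  # step size

for _ in range(100):
    p = p - eta * grad_f(p)

# The x-coordinate contracts (factor 0.8 per step) while the
# y-coordinate expands (factor 1.2 per step): the iterates leave
# every neighborhood of the saddle instead of converging to it.
print(p)
```

Only initializations lying exactly on the stable manifold (here, the x-axis) converge to the saddle; this measure-zero observation, made rigorous via the center-stable manifold theorem, is the dynamical-systems mechanism the talk builds on.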


Time

September 8, 2020 (Tuesday), 10:00-11:00 (Beijing time)


Speaker

Xiao Wang received his Ph.D. in differential geometry from the University at Buffalo, advised by Mohan Ramachandran, and his B.S. from China University of Geosciences (Beijing). He is now a research fellow at the Singapore University of Technology and Design (SUTD), with interests in non-convex and manifold optimization, representation learning, and game theory. His recent research focuses on questions of equilibration, robustness, and performance of algorithms and complex systems. Overall, his work combines and develops new techniques in algorithms, dynamical systems, optimization, and learning theory.


Venue

Zoom ID: 91389575421

Password: 123456

Meeting link: https://zoom.com.cn/j/91389575421