Academic Seminar No. 201801 - School of Automation, Huazhong University of Science and Technology

Published: 2018-01-03 | Editor: Wang Zhao

Title: Achieving Acceleration in Distributed Gradient Methods

Speaker: Guannan Qu

Time: January 4, 2018, 10:00 a.m.

Venue: Room C402, East Wing, Advanced Manufacturing Building

Abstract:

We consider the distributed optimization problem over a network, where the objective is to optimize a global function formed by a sum of local functions, using only local computation and communication. We develop an Accelerated Distributed Nesterov Gradient Descent (Acc-DNGD) method. When the objective function is convex and $L$-smooth, we show that it achieves an $O(\frac{1}{t^{1.4-\epsilon}})$ convergence rate for all $\epsilon\in(0,1.4)$. We also show that the convergence rate can be improved to $O(\frac{1}{t^2})$ if the objective function is a composition of a linear map and a strongly convex, smooth function. When the objective function is $\mu$-strongly convex and $L$-smooth, we show that it achieves a linear convergence rate of $O([1 - C(\frac{\mu}{L})^{5/7}]^t)$, where $\frac{L}{\mu}$ is the condition number of the objective and $C>0$ is a constant that does not depend on $\frac{L}{\mu}$.
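To illustrate the setting, the following toy sketch shows agents on a ring network minimizing a sum of local functions using only neighbor communication and local gradients, with a Nesterov-style momentum term. This is a simplified illustration under assumed parameters (the ring topology, mixing weights, step size, and momentum coefficient are all hypothetical), not the exact Acc-DNGD update from the talk; in particular, exact convergence in Acc-DNGD relies on a gradient-tracking mechanism omitted here, so these agents only reach a neighborhood of the global minimizer.

```python
import numpy as np

# Toy setup: n agents each hold a local quadratic
# f_i(x) = 0.5 * a_i * x^2 + b_i * x; the global objective is sum_i f_i.
rng = np.random.default_rng(0)
n = 5
a = rng.uniform(1.0, 2.0, n)   # local curvatures (mu <= a_i <= L)
b = rng.uniform(-1.0, 1.0, n)  # local linear terms
x_star = -b.sum() / a.sum()    # minimizer of the global objective

def local_grad(i, x):
    """Gradient of agent i's local function, its only gradient information."""
    return a[i] * x + b[i]

# Doubly stochastic mixing matrix for a ring network: each agent averages
# uniformly over itself and its two neighbors (only local communication).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 1 / 3
    W[i, (i - 1) % n] = 1 / 3
    W[i, (i + 1) % n] = 1 / 3

# Simplified accelerated distributed scheme: mix with neighbors, form a
# Nesterov-style extrapolation, then step along the local gradient.
eta, beta = 0.1, 0.5   # assumed step size and momentum coefficient
x = np.zeros(n)        # one scalar iterate per agent
x_prev = np.zeros(n)
for t in range(200):
    y = W @ x + beta * (W @ x - W @ x_prev)  # consensus + momentum
    g = np.array([local_grad(i, y[i]) for i in range(n)])
    x_prev = x
    x = y - eta * g

# Without gradient tracking, agents settle near (not exactly at) x_star.
print(np.max(np.abs(x - x_star)))
```

The bias away from `x_star` shrinks with the step size `eta`; Acc-DNGD removes it by having each agent track an estimate of the global gradient, which is what enables the accelerated rates quoted in the abstract.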

Speaker Bio:

Guannan Qu received his B.S. degree in Electrical Engineering from Tsinghua University in Beijing, China in 2014. Since 2014 he has been a graduate student in the School of Engineering and Applied Sciences at Harvard University. His research interest lies in network control and optimization.