Venue: Tencent Meeting 828 322 384
Speaker bio: LUO Shan received her Ph.D. in Statistics from the National University of Singapore and was a visiting scholar in the Department of Biostatistics at the University of Michigan. She is currently a tenured Associate Professor and doctoral advisor in the School of Mathematical Sciences, Shanghai Jiao Tong University. Her research focuses on model selection criteria and variable selection methods in high-dimensional statistical inference, as well as functional data and higher-order data analysis. Her work has appeared in statistics journals such as JASA, Statistica Sinica, the Journal of Multivariate Analysis, and Computational Statistics and Data Analysis, and in top machine learning conferences.
Abstract: As a prevalent distributed learning paradigm, Federated Learning (FL) trains a global model across a massive number of devices with infrequent communication. This paper investigates a class of composite optimization and statistical recovery problems in the FL setting, whose loss function consists of a data-dependent smooth loss and a non-smooth regularizer. Examples include sparse linear regression using the Lasso and low-rank matrix recovery using nuclear-norm regularization. In the existing literature, federated composite optimization algorithms are designed only from an optimization perspective, without statistical guarantees, and they do not exploit the (restricted) strong convexity commonly present in statistical recovery problems. We advance the frontiers of this problem from both the optimization and statistical perspectives. On the optimization front, we propose a new algorithm, Fast Federated Dual Averaging, for strongly convex and smooth losses; it provably enjoys linear speedup, and the result matches the best-known convergence rate for the regularizer-free setting. On the statistical front, for restricted strongly convex and smooth losses, we design another algorithm, Multi-stage Federated Dual Averaging, and provide a high-probability complexity bound with linear speedup up to statistical precision. Numerical experiments on both synthetic and real data demonstrate that our methods outperform other baselines. To the best of our knowledge, this is the first work providing fast optimization algorithms and statistical optimality guarantees for composite problems in FL.
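As a toy illustration of the composite structure the abstract describes (a data-dependent smooth loss plus a non-smooth regularizer, here the Lasso), the sketch below runs a generic server-side proximal-gradient loop over simulated clients. This is only an illustrative baseline, not the Fast Federated Dual Averaging or Multi-stage Federated Dual Averaging algorithms of the talk; the data sizes, step size, regularization weight, and the `soft_threshold` helper are all hypothetical choices for the sketch.

```python
import numpy as np

def soft_threshold(v, lam):
    # Proximal map of lam * ||.||_1 (handles the non-smooth regularizer).
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

rng = np.random.default_rng(0)
M, n, d = 4, 50, 20                       # clients, samples per client, dimension
x_true = np.zeros(d)
x_true[:3] = [2.0, -1.5, 1.0]             # sparse ground truth
clients = []
for _ in range(M):
    A = rng.standard_normal((n, d))
    b = A @ x_true + 0.1 * rng.standard_normal(n)
    clients.append((A, b))

lam, eta = 0.1, 0.3                       # regularization weight, step size
x = np.zeros(d)
for t in range(300):                      # each loop = one communication round
    # Each client computes the gradient of its local least-squares loss.
    grads = [A.T @ (A @ x - b) / n for A, b in clients]
    g_bar = np.mean(grads, axis=0)        # server averages client gradients
    # Composite update: gradient step on the smooth part, prox on the l1 part.
    x = soft_threshold(x - eta * g_bar, eta * lam)

support = np.nonzero(np.abs(x) > 1e-3)[0]  # recovered sparse support
```

In a real FL deployment the averaging step would happen over a network after several local updates; here it is collapsed into a single loop purely to show how the smooth gradient and the proximal step interact.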