Campus Units

Statistics, Computer Science

Document Type

Conference Proceeding

Publication Version

Submitted Manuscript

Link to Published Version

https://arxiv.org/abs/1912.03208v1

Publication Date

2019

Journal or Book Title

arXiv

Abstract

Network-distributed optimization has attracted significant attention in recent years due to its ever-increasing applications. However, the classic decentralized gradient descent (DGD) algorithm is communication-inefficient for large-scale and high-dimensional network-distributed optimization problems. To address this challenge, many compressed DGD-based algorithms have been proposed. However, most existing works have high complexity and assume compressors with bounded noise power. To overcome these limitations, in this paper, we propose a new differential-coded compressed DGD (DC-DGD) algorithm. The key features of DC-DGD include: i) DC-DGD works with general SNR-constrained compressors, relaxing the bounded noise power assumption; ii) the differential-coded design entails the same convergence rate as the original DGD algorithm; and iii) DC-DGD has the same low-complexity structure as the original DGD due to a self-noise-reduction effect. Moreover, these features inspire us to develop a hybrid compression scheme that offers a systematic mechanism to minimize the communication cost. Finally, we conduct extensive experiments to verify the efficacy of the proposed DC-DGD algorithm and hybrid compressor.
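
For readers unfamiliar with the setup, the sketch below illustrates the general idea of differential-coded compression on top of classic DGD: each node broadcasts a compressed version of the change in its local iterate rather than the iterate itself, and all nodes maintain a shared estimate of each other's iterates. This is only a minimal illustration, not the paper's DC-DGD algorithm; the top-k compressor, step size, mixing matrix, and toy objective are assumptions made for the example.

# Illustrative sketch only (not the paper's exact DC-DGD): classic decentralized
# gradient descent (DGD) where each node transmits a compressed *difference*
# between its current iterate and the last value it broadcast. The top-k
# compressor, step size, and mixing matrix below are illustrative assumptions.
import numpy as np

def top_k(v, k):
    # Keep the k largest-magnitude entries of v and zero the rest.
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def dgd_differential_coded(grads, W, d, alpha=0.1, k=1, iters=300):
    # grads: list of per-node gradient functions; W: doubly stochastic mixing
    # matrix; d: problem dimension.
    n = W.shape[0]
    x = np.zeros((n, d))       # local iterates
    x_hat = np.zeros((n, d))   # commonly held estimates of each node's iterate
    for _ in range(iters):
        # Each node compresses and broadcasts only the change since its last
        # transmission; the shared estimate is updated accordingly.
        for i in range(n):
            x_hat[i] += top_k(x[i] - x_hat[i], k)
        # DGD-style update using the (compressed) estimates of the neighbors.
        x = W @ x_hat - alpha * np.array([grads[i](x[i]) for i in range(n)])
    return x

# Toy usage: three nodes minimizing the sum of f_i(x) = 0.5 * ||x - b_i||^2;
# with a small constant step size, each row of x should settle near the
# average of the b_i (up to the usual DGD/compression error neighborhood).
if __name__ == "__main__":
    b = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
    grads = [lambda x, bi=bi: x - bi for bi in b]
    W = np.array([[0.50, 0.25, 0.25],
                  [0.25, 0.50, 0.25],
                  [0.25, 0.25, 0.50]])
    print(dgd_differential_coded(grads, W, d=2))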

Comments

This is a preprint of Zhang, Xin, Jia Liu, Zhengyuan Zhu, and Elizabeth S. Bentley. "Communication-Efficient Network-Distributed Optimization with Differential-Coded Compressors." arXiv preprint arXiv:1912.03208 (2019).

Rights

Works produced by employees of the U.S. Government as part of their official duties are not copyrighted within the U.S. The content of this document is not copyrighted.

Language

en

File Format

application/pdf
