$D^2$: Decentralized Training over Decentralized Data

Ce Zhang, Ming Yan, Hanlin Tang, Ji Liu, Xiangru Lian


Abstract

While training a machine learning model with multiple workers, each of which collects data from its own data source, it would be useful if the data collected by different workers could be unique and different. Ironically, recent analysis of decentralized parallel stochastic gradient descent (D-PSGD) relies on the assumption that the data hosted on different workers are not too different. In this paper, we ask the question: Can we design a decentralized parallel stochastic gradient descent algorithm that is less sensitive to the data variance across workers? We present $D^2$, a novel decentralized parallel stochastic gradient descent algorithm designed for large data variance among workers (imprecisely, "decentralized" data). The core of $D^2$ is a variance reduction extension of D-PSGD. It improves the convergence rate from $O\left(\frac{\sigma}{\sqrt{nT}} + \frac{(n\zeta^2)^{1/3}}{T^{2/3}}\right)$ to $O\left(\frac{\sigma}{\sqrt{nT}}\right)$, where $\zeta^2$ denotes the variance among the data on different workers. As a result, $D^2$ is robust to data variance among workers. We empirically evaluate $D^2$ on image classification tasks, where each worker has access to only the data of a limited set of labels, and find that $D^2$ significantly outperforms D-PSGD.
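The page gives only the abstract, so the following is a minimal, hypothetical sketch of what a $D^2$-style update could look like based on the description above (a variance-reduction extension of D-PSGD): each worker combines its current stochastic gradient step with a correction built from its previous iterate and previous gradient, then averages with its neighbours through a mixing matrix W. The ring topology, the toy least-squares objective, and all hyperparameters below are illustrative assumptions, not the paper's exact algorithm or experimental setup.

```python
# Hedged sketch: a numpy simulation of a D2-style update on a toy problem.
# Everything here (topology, loss, step size) is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)
n_workers, dim, gamma, steps = 4, 10, 0.05, 200

# Ring-topology mixing matrix W (doubly stochastic):
# each worker averages with its two neighbours.
W = np.zeros((n_workers, n_workers))
for i in range(n_workers):
    W[i, i] = 0.5
    W[i, (i - 1) % n_workers] = 0.25
    W[i, (i + 1) % n_workers] = 0.25

# "Decentralized" data: each worker's loss is centred at a different target,
# so local gradients disagree strongly (large zeta^2 in the abstract's notation).
local_targets = rng.normal(scale=5.0, size=(n_workers, dim))

def stoch_grad(x, i):
    # Noisy gradient of 0.5 * ||x - local_targets[i]||^2 on worker i.
    return (x - local_targets[i]) + rng.normal(scale=0.1, size=dim)

X = np.zeros((n_workers, dim))  # one row of parameters per worker

# First step: a plain D-PSGD step (local SGD followed by neighbour averaging).
X_prev = X.copy()
G_prev = np.stack([stoch_grad(X_prev[i], i) for i in range(n_workers)])
X = W @ (X_prev - gamma * G_prev)

for t in range(1, steps):
    G = np.stack([stoch_grad(X[i], i) for i in range(n_workers)])
    # D2-style correction: reuse the previous iterate and gradient so the
    # disagreement between local data distributions cancels over time,
    # then average with neighbours via W.
    X_next = W @ (2 * X - X_prev - gamma * (G - G_prev))
    X_prev, G_prev, X = X, G, X_next

# All workers should end up close to the minimizer of the average objective,
# i.e. the mean of the local targets, despite never seeing each other's data.
print(np.abs(X - local_targets.mean(axis=0)).max())
```

In this toy setting, plain D-PSGD with the same heterogeneous data would be biased by the disagreement between local gradients; the previous-iterate correction above is the kind of mechanism that, per the abstract, makes $D^2$ robust to large data variance among workers.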

Benchmarks

Benchmark: multi-view-subspace-clustering-on-orl
Methodology: DCSC
Metrics: Accuracy: 0.811
