Journal article, Neurocomputing, 2019

Distributed Optimization for Deep Learning with Gossip Exchange

Abstract

We address the issue of speeding up the training of convolutional neural networks by studying a distributed method adapted to stochastic gradient descent. Our parallel optimization setup uses several threads, each applying an individual gradient descent to a local variable. We propose a new way of sharing information between threads based on gossip algorithms, which exhibit good consensus convergence properties. Our method, called GoSGD, has the advantage of being fully asynchronous and decentralized.
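As a rough illustration of the idea described in the abstract, the sketch below simulates several workers that each run local SGD on a toy least-squares problem and occasionally exchange their variables through a random pairwise averaging step. This is only a simplified, synchronous simulation under assumed details: the names (`n_workers`, `gossip_prob`, `gossip` averaging with equal weights) and the symmetric exchange rule are illustrative choices, not the authors' GoSGD protocol, which is asynchronous and decentralized as stated above.

```python
# Minimal sketch (not the paper's implementation) of gossip-style
# distributed SGD on a toy least-squares problem, using numpy only.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem shared by all workers.
A = rng.normal(size=(200, 10))
x_true = rng.normal(size=10)
b = A @ x_true + 0.01 * rng.normal(size=200)

def grad(x, batch_idx):
    """Stochastic gradient of 0.5 * ||A x - b||^2 on a mini-batch."""
    Ab, bb = A[batch_idx], b[batch_idx]
    return Ab.T @ (Ab @ x - bb) / len(batch_idx)

n_workers = 4        # assumed number of parallel workers
gossip_prob = 0.1    # assumed chance of gossiping after a local step
lr = 0.01
xs = [rng.normal(size=10) for _ in range(n_workers)]

for step in range(2000):
    for i in range(n_workers):
        batch = rng.choice(len(b), size=16, replace=False)
        xs[i] = xs[i] - lr * grad(xs[i], batch)
        # Gossip exchange: with small probability, average the local
        # variable with a randomly chosen peer, so workers drift toward
        # consensus without any central parameter server.
        if rng.random() < gossip_prob:
            j = rng.choice([k for k in range(n_workers) if k != i])
            avg = 0.5 * (xs[i] + xs[j])
            xs[i], xs[j] = avg.copy(), avg.copy()

consensus = np.mean(xs, axis=0)
print("distance to optimum:", np.linalg.norm(consensus - x_true))
```

In a real deep-learning setting each worker would hold a copy of the network parameters and the exchange would be performed asynchronously between threads or machines; the toy loop above only illustrates how sparse pairwise averaging can keep independently optimized local variables close to a common solution.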

Dates and versions

hal-01930346, version 1 (15-01-2019)

Cite

Michael Blot, David Picard, Nicolas Thome, Matthieu Cord. Distributed Optimization for Deep Learning with Gossip Exchange. Neurocomputing, 2019, 330, pp.287-296. ⟨10.1016/j.neucom.2018.11.002⟩. ⟨hal-01930346⟩