[theory-seminar] Theory Lunch 08/06: Margalit Glasgow
Noah Shutty
noaj at stanford.edu
Wed Aug 5 09:30:00 PDT 2020
Hi all,
Theory lunch will be happening tomorrow at
https://stanford.zoom.us/j/96247688402?pwd=MmxYTi9RTzFUKzFLd3Vab1VYTUcyQT09
(12pm PT) and Margalit will tell us about *Approximate Gradient Codes with
Optimal Decoding*. An abstract can be found below.
*Abstract:*
Gradient codes are used in distributed machine learning problems to more
accurately compute batch gradients in the presence of slow machines, called
stragglers. A gradient code describes a pattern of replicating and
assigning data to each machine, and may guarantee that the full batch
gradient can be computed *exactly* or *approximately*. In this talk, I'll
discuss a new design for gradient codes based on expander graphs that
closely approximate the full gradient under both random and adversarial
stragglers.
The performance of these codes can be analyzed by studying the connected
components in randomly sparsified expander graphs! I'll also show that
using these codes yields good convergence guarantees for gradient descent
with stragglers.
Joint work with Mary Wootters
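If you'd like a concrete feel for the setting before the talk, here is a
small self-contained sketch in Python/NumPy of the general gradient-code
setup. To be clear, this is not the expander-graph construction from the
talk: the cyclic block assignment, the block and straggler counts, and the
least-squares decoder are all illustrative assumptions, only meant to show
what "replicating and assigning data" and approximately recovering the full
gradient under stragglers look like in code.

# Illustrative sketch of the gradient-code setting (not the construction
# from the talk). n_blocks data blocks are replicated across n_blocks
# machines via a cyclic assignment; some machines straggle and never
# respond; the surviving partial gradients are recombined to approximate
# the full batch gradient.
import numpy as np

rng = np.random.default_rng(0)

n_blocks = 12      # number of data blocks (one machine per block here)
d = 5              # gradient dimension
replication = 3    # each machine holds this many consecutive blocks

# Stand-ins for the per-block gradients of some loss function.
block_grads = rng.normal(size=(n_blocks, d))
full_gradient = block_grads.sum(axis=0)

# Assignment matrix A: machine i holds blocks i, i+1, ..., i+replication-1
# (indices taken cyclically).
A = np.zeros((n_blocks, n_blocks))
for i in range(n_blocks):
    for j in range(replication):
        A[i, (i + j) % n_blocks] = 1.0

# Each machine returns the sum of the gradients of its assigned blocks.
machine_outputs = A @ block_grads            # shape (n_machines, d)

# Simulate random stragglers: a subset of machines never responds.
stragglers = rng.choice(n_blocks, size=4, replace=False)
alive = np.array([i for i in range(n_blocks) if i not in stragglers])

# Simple decoder: find weights on the surviving machine outputs so that the
# combined per-block weights are as close to the all-ones vector as possible
# (least squares), then form the weighted combination.
A_alive = A[alive]
w, *_ = np.linalg.lstsq(A_alive.T, np.ones(n_blocks), rcond=None)
approx_gradient = w @ machine_outputs[alive]

err = np.linalg.norm(approx_gradient - full_gradient) / np.linalg.norm(full_gradient)
print(f"relative decoding error with {len(stragglers)} stragglers: {err:.3f}")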
See you there!
Noah