
[theory-seminar] Theory Lunch 08/06: Margalit Glasgow

Noah Shutty noaj at
Wed Aug 5 09:30:00 PDT 2020

Hi all,

Theory lunch will be happening tomorrow at
PT) and Margalit will tell us about *Approximate Gradient Codes with
Optimal Decoding*. An abstract can be found below.

*Abstract: *
Gradient codes are used in distributed machine learning problems to more
accurately compute batch gradients in the presence of slow machines, called
stragglers. A gradient code describes a pattern of replicating and
assigning data to each machine, and may guarantee that the full batch
gradient can be computed *exactly* or *approximately*. In this talk, I'll
discuss a new design for gradient codes, based on expander graphs, that
approximates the full gradient well under both random and adversarial
stragglers.
The performance of these codes can be analyzed by studying the connected
components in randomly sparsified expander graphs! I'll also show that
using these codes yields good convergence guarantees for gradient descent
with stragglers.
Joint work with Mary Wootters.
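To make the idea concrete, here is a minimal toy sketch (my own illustration, not the construction from the talk): each data block is replicated on d machines chosen at random, standing in for an expander-like bipartite assignment, and the decoder sums the gradient of every block that survives on at least one non-straggler machine. The function names and parameters are all hypothetical.

```python
import random

def assign_blocks(n_machines, n_blocks, d, rng):
    """Replicate each data block on d distinct machines
    (a random stand-in for an expander-like assignment)."""
    return {b: rng.sample(range(n_machines), d) for b in range(n_blocks)}

def decode(assignment, gradients, stragglers):
    """Approximate the full-batch gradient sum from surviving machines:
    each block whose copies are not all on stragglers is counted once."""
    total = 0.0
    for b, machines in assignment.items():
        alive = [m for m in machines if m not in stragglers]
        if alive:  # block recoverable from at least one surviving copy
            total += gradients[b]
    return total

rng = random.Random(0)
n_machines, n_blocks, d = 10, 50, 3
gradients = [rng.uniform(-1, 1) for _ in range(n_blocks)]  # toy scalar gradients
assignment = assign_blocks(n_machines, n_blocks, d, rng)

stragglers = set(rng.sample(range(n_machines), 2))  # two machines are slow
approx = decode(assignment, gradients, stragglers)
exact = sum(gradients)  # full-batch gradient with no stragglers
```

With replication factor d, a block is lost only if all d of its copies land on stragglers, so the approximation error stays small for modest straggler counts; the talk's expander-based construction makes this kind of guarantee precise, including against adversarial stragglers.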

See you there!
