[theory-seminar] Two Theory Seminars This Week: Saeed Seddighin on 12/5 and Sam Hopkins on 12/7

Ofir Geri ofirgeri at stanford.edu
Mon Dec 3 12:35:35 PST 2018


Hi all,


This week we will have two theory seminars:

  1.  Saeed Seddighin (University of Maryland) on Wednesday 12/5, 3:00 PM in Gates 463A.
  2.  Sam Hopkins (UC Berkeley) on Friday 12/7, 3:00 PM in Gates 392.

Please see the abstracts below. If you are interested in meeting with Sam Hopkins, please email Mary at marykw at stanford.edu.

Hope to see you there!
Ofir

Fast and Parallel Algorithms for Edit Distance and Longest Common Subsequence
Wednesday 12/5, 3:00 PM, Gates 463A
Speaker: Saeed Seddighin (University of Maryland)

Computing string similarity measures is among the most fundamental problems in computer science. Notable examples are edit distance (ED) and longest common subsequence (LCS). These problems find applications in various contexts such as computational biology, text processing, compiler optimization, data analysis, and image analysis. In this talk, I'll present fast and parallel algorithms for both problems. In the first part of my talk, I will present an algorithm for approximating edit distance within a constant factor in truly subquadratic time. This question had been open for three decades, and only recently were we able to answer it positively.
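(For background, not part of the talk: the exact baseline that the subquadratic approximation improves upon is the textbook O(nm) dynamic program for edit distance. A minimal sketch:)

```python
def edit_distance(a: str, b: str) -> int:
    """Classic quadratic-time DP: dp[i][j] = edit distance
    between a[:i] and b[:j] (insertions, deletions, substitutions)."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j  # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,      # delete a[i-1]
                           dp[i][j - 1] + 1,      # insert b[j-1]
                           dp[i - 1][j - 1] + sub)  # match/substitute
    return dp[m][n]

print(edit_distance("kitten", "sitting"))  # 3
```

The talk's result shows that a constant-factor approximation can beat this quadratic running time.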

In the second part of my talk, I will present MPC algorithms for both edit distance and longest common subsequence. These algorithms can be seen as extensions of the previous ideas to the MPC model. The algorithms are optimal with respect to round complexity, time complexity, and approximation factor.

Mean Estimation with Sub-Gaussian Rates in Polynomial Time
Friday 12/7, 3:00 PM, Gates 392
Speaker: Sam Hopkins (UC Berkeley)

We study polynomial-time algorithms for a fundamental statistics problem: estimating the mean of a random vector from i.i.d. samples. Focusing on the heavy-tailed case, we assume only that the random vector X has finite mean and covariance. In this setting, the radius of the confidence intervals achieved by the empirical mean is large compared to the case where X is Gaussian or sub-Gaussian. On the other hand, estimators based on high-dimensional medians can achieve tighter confidence intervals, at the cost of potential computational intractability.

We offer the first polynomial-time algorithm to estimate the mean with sub-Gaussian-size confidence intervals under such mild assumptions. Our algorithm is based on a new semidefinite programming relaxation of a high-dimensional median. Previous estimators that assumed only the existence of finitely many moments of X either sacrifice sub-Gaussian performance or are only known to be computable via brute-force search procedures requiring time exponential in the dimension.
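(Again as background, not the speaker's algorithm: the classical one-dimensional median-of-means estimator illustrates why medians outperform the empirical mean under heavy tails; the talk's contribution is a tractable high-dimensional analogue. A sketch:)

```python
import statistics

def median_of_means(samples, k):
    """Split the samples into k equal blocks, average each block,
    and return the median of the block means. A few wild outliers
    corrupt only the blocks they land in, so the median is robust."""
    n = len(samples)
    block = n // k  # any leftover samples are dropped for simplicity
    means = [sum(samples[i * block:(i + 1) * block]) / block
             for i in range(k)]
    return statistics.median(means)

# One huge outlier drags the empirical mean far from 0,
# but leaves the median-of-means untouched.
data = [0.0] * 9 + [1000.0]
print(sum(data) / len(data))       # 100.0
print(median_of_means(data, 5))    # 0.0
```

In one dimension this achieves sub-Gaussian-style deviation bounds assuming only finite variance; extending that guarantee to high dimensions in polynomial time is exactly the problem the talk addresses.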


