From hongyang at cs.stanford.edu Sun Jun 2 22:25:56 2019
From: hongyang at cs.stanford.edu (Hongyang Zhang)
Date: Sun, 2 Jun 2019 22:25:56 -0700
Subject: [theory-seminar] University Oral Exam: Hongyang Zhang,
Monday June 3, 10:30am, Gates 463A
In-Reply-To:
References:
Message-ID:
Reminder that this is tomorrow morning at *10:30am, Gates 463A* (theory
seminar room)! Looking forward to seeing you all.
Best,
Hongyang
On Sat, May 25, 2019 at 10:01 PM Hongyang Zhang
wrote:
> University Oral Examination
>
> *Title: Algorithms and Generalization for Large-Scale Matrices and Tensors*
>
> Hongyang Zhang
> Computer Science Department
> Stanford University
>
> Advised by Ashish Goel and Gregory Valiant
>
> *Monday, June 3, 2019 at 10:30am* (refreshments served at 10:15am)
> *Gates Building, Room 463A*
>
> *Abstract:* Over the past decade, machine learning methods such as deep
> neural networks have made a huge impact on a variety of complex tasks. On
> the other hand, very little is understood about when and why these ML
> methods work in practice. Bridging this gap requires better understanding
> of the non-convex optimization paradigm commonly used in training deep
> neural networks, as well as better modeling of real world data. My thesis
> aims at providing principled algorithms and insights by examining
> analytically tractable objects, such as matrices and tensors, that are
> intimately connected to neural networks.
>
> This talk will present a few results:
> i) We study gradient-based optimization methods and their generalization
> performance (or sample efficiency) in over-parameterized matrix models. Our
> result highlights the role of the optimization algorithm in explaining
> generalization when there are more trainable parameters than the size of
> the dataset.
> ii) We consider the problem of predicting the missing entries of
> high-dimensional tensor data. We show an interesting representation-sample
> trade-off in the choice of tensor models for fitting the data.
> iii) We present new methods for the classic distance query problem that
> create state-of-the-art data structures on a variety of large-scale graph
> data.
>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
From wyma at stanford.edu Mon Jun 3 14:24:27 2019
From: wyma at stanford.edu (Weiyun Ma)
Date: Mon, 3 Jun 2019 21:24:27 +0000
Subject: [theory-seminar] Theory Lunch 6/6 -- Mingda Qiao
Message-ID:
Hi everyone,
This Thursday at theory lunch, Mingda will tell us about "Selective Prediction." (See abstract below.)
As always, please join us from noon to 1pm at 463A. This will be the last theory lunch of the quarter. I'd love to continue theory lunch in the summer, starting sometime in July. Please watch for an email containing the sign-up information later.
----------------------------------------------------------
Selective Prediction
Speaker: Mingda Qiao
We consider a model of selective prediction, where the prediction algorithm is given a data sequence in an online fashion and asked to predict a pre-specified statistic of the upcoming data points. The algorithm is allowed to choose when to make the prediction as well as the length of the prediction window, possibly depending on the observations so far. We prove that, even without any distributional assumption on the input data stream, a large family of statistics can be estimated to non-trivial accuracy.
To give one concrete example, suppose that we are given access to an arbitrary binary sequence x_1, ..., x_n of length n. Our goal is to accurately predict the average observation, and we are allowed to choose the window over which the prediction is made: for some t < n and m <= n - t, after seeing t observations we predict the average of x_{t+1}, ..., x_{t+m}. This particular problem was first studied in Drucker (2013) and referred to as the "density prediction game." We show that the expected squared error of our prediction can be bounded by O(1/log(n)) and prove a matching lower bound, which resolves an open question raised in Drucker (2013). This result holds for any sequence (that is not adaptive to when the prediction is made, or the predicted value), and the expectation of the error is with respect to the randomness of the prediction algorithm. Our results apply to more general statistics of a sequence of observations, and we highlight several open directions for future work.
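For intuition only, here is a toy selective predictor for the binary-sequence example, not the algorithm from the talk: it picks a dyadic window length at random, observes one block of that length, and predicts that block's average for the next block. The function name and the scale-selection rule are illustrative assumptions.

```python
import random

def selective_predict(stream, n):
    """Toy selective predictor for a binary sequence of length n.

    Chooses a dyadic block length m = 2^k with k uniform at random,
    observes the first block, and predicts its average for the next
    block. Returns (t, m, prediction): after t observations, the
    prediction is for the mean of stream[t:t+m].
    """
    k_max = max(1, n.bit_length() - 1)   # roughly log2(n) candidate scales
    k = random.randrange(k_max)          # pick one scale at random
    m = 2 ** k                           # prediction-window length
    t = m                                # observe exactly one block first
    prediction = sum(stream[:t]) / m     # predict the next block's average
    return t, m, prediction
```

Because t = m = 2^k with k < floor(log2 n), the window t + m never runs past the end of the sequence.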
Based on joint work with Jay Mardia and Gregory Valiant.
----------------------------------------------------------
Best,
Anna
From ccanonne at cs.stanford.edu Tue Jun 4 09:09:59 2019
From: ccanonne at cs.stanford.edu (Clément Canonne)
Date: Tue, 4 Jun 2019 09:09:59 -0700
Subject: [theory-seminar] ITCS 2020 Call for Papers
Message-ID:
Hi all,
I am forwarding the ITCS'20 (Innovations in Theoretical Computer
Science) announcement. ITCS is a great and really fun conference: I
encourage you to submit your work there!
As a bonus, for those graduating, recently graduated, or close to, it
features a (now traditional) "graduating bits" part:
> Participants near to graduation (on either side) will be given 4 minutes to present their results, research, plans, personality, and so on. This is one of the important traditions of ITCS, and not to be missed!
Best,
-- Clément
-----------
Dear colleague,
We invite you to submit your papers to the 11th Innovations in
Theoretical Computer Science (ITCS). The conference will be held at
the University of Washington in Seattle, Washington from January 12-14,
2020.
ITCS seeks to promote research that carries a strong conceptual message
(e.g., introducing a new concept, model or understanding, opening a new
line of inquiry within traditional or interdisciplinary areas,
introducing new mathematical techniques and methodologies, or new
applications of known techniques). ITCS welcomes both conceptual and
technical contributions whose contents will advance and inspire the
greater theory community.
Important dates
Submission deadline: September 9, 2019 (05:59pm PDT)
Notification to authors: October 31, 2019
Conference dates: January 12-14, 2020
See the website at http://itcs-conf.org/itcs20/itcs20-cfp.html for
detailed information regarding submissions.
Program committee
Nikhil Bansal, CWI + TU Eindhoven
Nir Bitansky, Tel-Aviv University
Clement Canonne, Stanford
Timothy Chan, University of Illinois at Urbana-Champaign
Edith Cohen, Google and Tel-Aviv University
Shaddin Dughmi, University of Southern California
Sumegha Garg, Princeton
Ankit Garg, Microsoft Research
Ran Gelles, Bar-Ilan University
Elena Grigorescu, Purdue
Tom Gur, University of Warwick
Sandy Irani, UC Irvine
Dakshita Khurana, University of Illinois at Urbana-Champaign
Antonina Kolokolova, Memorial University of Newfoundland
Pravesh Kothari, Carnegie Mellon University
Rasmus Kyng, Harvard
Katrina Ligett, Hebrew University
Nutan Limaye, IIT Bombay
Pasin Manurangsi, UC Berkeley
Tamara Mchedlidze, Karlsruhe Institute of Technology
Dana Moshkovitz, UT Austin
Jelani Nelson, UC Berkeley
Merav Parter, Weizmann Institute
Krzysztof Pietrzak, IST Austria
Elaine Shi, Cornell
Piyush Srivastava, Tata Institute of Fundamental Research, Mumbai
Li-Yang Tan, Stanford
Madhur Tulsiani, TTIC
Gregory Valiant, Stanford
Thomas Vidick, California Institute of Technology (chair)
Virginia Vassilevska Williams, MIT
Ronald de Wolf, CWI and University of Amsterdam
David Woodruff, Carnegie Mellon University
From moses at cs.stanford.edu Tue Jun 4 21:22:54 2019
From: moses at cs.stanford.edu (Moses Charikar)
Date: Tue, 4 Jun 2019 21:22:54 -0700
Subject: [theory-seminar] Fwd: Combinatorics seminar and Colloquium talks by
Joel Spencer on Thursday at 2pm and 4:30pm
In-Reply-To:
References:
Message-ID:
Hi folks,
Joel Spencer is speaking at the combinatorics seminar this week (Thu at
2pm) and the Math colloquium at 4:30pm.
See the talk details below. Should be of interest to many of you.
Cheers,
Moses
---------- Forwarded message ---------
From: Jacob Fox
Date: Tue, Jun 4, 2019 at 7:41 PM
Subject: Combinatorics seminar and Colloquium talks by Joel Spencer on
Thursday at 2pm and 4:30pm
To: mathcolloq at lists.stanford.edu
On Thursday (June 6), Joel Spencer will be giving both the combinatorics
seminar at 2pm and the department colloquium at 4:30pm. The combinatorics
seminar information is below.
When: Thursday, June 6, 2pm-3pm
Room: 384-H
Speaker: Joel Spencer (NYU)
Title: Four Discrepancies
Abstract: Paul selects n vectors in n-space, all coordinates one or minus
one. Carole is a balancer, assigning signs to each vector yielding a signed
vector sum P. The value V, which Carole attempts to minimize, is the
maximal absolute value of the coordinates of P.
We consider four variants of this problem. Paul may play randomly (then
Carole minimizes the expectation of V) or adversarially. Carole may play
On-Line, selecting each sign immediately upon seeing its vector; or
Off-Line, waiting until Paul has given all the vectors before deciding on
the signs.
All four variants are interesting and will be discussed. The order of V is
known in all variants, though the constants remain elusive. We emphasize
new results, with Nikhil Bansal, for the random on-line variant.
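As a small illustration of the on-line variant (a natural greedy heuristic, not the Bansal-Spencer algorithm): upon seeing each ±1 vector, Carole picks whichever sign keeps the running signed sum's largest absolute coordinate smallest. The function name is a hypothetical choice for this sketch.

```python
def greedy_balance(vectors):
    """Greedy on-line signing heuristic (illustration only): for each
    +/-1 vector, choose the sign minimizing the max |coordinate| of
    the running signed sum. Returns (signs, final_signed_sum)."""
    if not vectors:
        return [], []
    total = [0] * len(vectors[0])
    signs = []
    for v in vectors:
        plus = [t + x for t, x in zip(total, v)]    # sum if sign = +1
        minus = [t - x for t, x in zip(total, v)]   # sum if sign = -1
        if max(abs(c) for c in plus) <= max(abs(c) for c in minus):
            total, s = plus, 1
        else:
            total, s = minus, -1
        signs.append(s)
    return signs, total
```

On adversarial inputs this greedy rule is far from optimal, which is part of what makes the gap between the on-line and off-line variants interesting.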
The seminar webpage is:
http://mathematics.stanford.edu/combinatorics-seminar/
In addition, Joel Spencer will speak on "Probability, Combinatorics, ...
and Logic" in the department colloquium at 4:30 in room 380-W.
From rmhulett at stanford.edu Wed Jun 5 15:59:21 2019
From: rmhulett at stanford.edu (Reyna Marie Hulett)
Date: Wed, 5 Jun 2019 22:59:21 +0000
Subject: [theory-seminar] Quals Talk: Coding for DNA Storage
Message-ID:
Hi all,
I will be giving my quals presentation on Coding over Sets and Multisets for DNA Storage this Friday at 12:30 pm in Gates 463A. Feel free to come if you are interested. Lunch will be provided (thanks Mary!).
Best,
-Reyna
From rmhulett at stanford.edu Fri Jun 7 12:24:31 2019
From: rmhulett at stanford.edu (Reyna Marie Hulett)
Date: Fri, 7 Jun 2019 19:24:31 +0000
Subject: [theory-seminar] Quals Talk: Coding for DNA Storage
In-Reply-To:
References:
Message-ID: <01d406d5-700c-4303-af66-617099836e7e@email.android.com>
Reminder: this is happening in 5 minutes!
On Jun 5, 2019 3:59 PM, Reyna Marie Hulett wrote:
Hi all,
I will be giving my quals presentation on Coding over Sets and Multisets for DNA Storage this Friday at 12:30 pm in Gates 463A. Feel free to come if you are interested. Lunch will be provided (thanks Mary!).
Best,
-Reyna
From whkong at stanford.edu Tue Jun 11 16:02:37 2019
From: whkong at stanford.edu (Weihao Kong)
Date: Tue, 11 Jun 2019 23:02:37 +0000
Subject: [theory-seminar] Weihao's thesis defense and farewell lunch
Message-ID:
Hi friends,
I'm going to defend my thesis on June 17th at 10AM in Gates Building, 463A, and will organize a farewell lunch starting from 12:30PM in the same place. Please fill in this form if you plan to come to either of the events, so I can have an idea of how much food and drink I should order.
https://docs.google.com/forms/d/e/1FAIpQLSfQry3B47-A972jvoVIqkgiREUflnOhhWNbqYlU9NbYyVAC8w/viewform?usp=pp_url
Thanks,
Weihao
From whkong at stanford.edu Sun Jun 16 23:38:15 2019
From: whkong at stanford.edu (Weihao Kong)
Date: Mon, 17 Jun 2019 06:38:15 +0000
Subject: [theory-seminar] University Oral Exam: Weihao Kong,
Monday June 17, 10:00AM, Gates 463A
In-Reply-To: <77132788-88A3-46CE-AB5B-4D577CEF71C4@stanford.edu>
References: <77132788-88A3-46CE-AB5B-4D577CEF71C4@stanford.edu>
Message-ID: <504CA9F5-2457-406C-B99B-F5BF0FD8E22D@stanford.edu>
Reminder, this is tomorrow morning at 10AM.
Thanks,
Weihao
> On Jun 10, 2019, at 11:26 PM, Weihao Kong wrote:
>
> University Oral Examination
>
> Title: The Surprising Power of Little Data.
>
> Weihao Kong
> Computer Science Department
> Stanford University
>
> Advised by Gregory Valiant
>
> Monday, June 17, 2019 at 10:00am
> Gates Building, Room 463A
>
> Abstract:
>
> Despite the rapid growth of the size of our datasets, the inherent complexity of the problems we are solving is also growing, if not at an even faster rate. This prompts the question of how to infer the most information from the available data.
>
> I will discuss several examples of my research that reveal a surprising ability to extract accurate information from modest amounts of data. The first setting that I discuss considers data provided by a large number of heterogeneous individuals, and we show that the empirical distribution of the data can be significantly "de-noised". The second setting considers estimating the covariance spectrum of a high-dimensional distribution, in the sublinear sample regime where the empirical distribution of the data is misleading. The final portion of my talk focuses on estimating "learnability": given too little data to learn an accurate prediction model, we can accurately estimate the value of collecting more data. Specifically, for some natural model classes, we can estimate the performance of the best model in the class, given too little data to find any model in the class that would achieve good prediction error. We extend our techniques for estimating learnability to more general stochastic optimization problems, including those in the contextual bandit setting. In most of these settings, our algorithms are provably information-theoretically optimal and are also highly practical.
>
>
From wyma at stanford.edu Sat Jun 22 22:28:15 2019
From: wyma at stanford.edu (Weiyun Ma)
Date: Sun, 23 Jun 2019 05:28:15 +0000
Subject: [theory-seminar] Theory Lunch -- Summer Edition
Message-ID:
Hi everyone,
Hope your summer is starting off well!
This is to announce that theory lunch for the summer quarter will start on 7/11. We will not meet on 6/27 because of STOC or on 7/4 because of the Independence Day holiday. If you'd like to give a talk or organize some activities, please sign up using the following link:
https://docs.google.com/document/d/1S0QcDMTn-JRaP1cRFRihZyeNfHUmpYLkyORKMcGIBgY/edit
Please also share this information and the above sign-up link with any visitors that you know. We'd love to hear from them as well!
Best,
Anna
From samkwong at stanford.edu Mon Jun 24 17:15:53 2019
From: samkwong at stanford.edu (Samuel Kwong)
Date: Tue, 25 Jun 2019 00:15:53 +0000
Subject: [theory-seminar] RSVP to attend the WHIL AI Seminar Thursday,
June 27 @ Stanford EHS
Message-ID:
Stanford's Workplace Health Innovation Lab (WHIL) invites you to attend a special seminar.
TOPIC: Selfie: Self-supervised Pretraining for Image Embedding
PRESENTER: Trieu H. Trinh (Google Brain)
DATE/TIME: Thursday, June 27, 2019 @ 11:00 AM - 12:00 PM
LOCATION: Stanford University Environmental Health & Safety, 484 Oak Road, Stanford, CA 94305 (El Capitan Conference Room, Rm 118)
ABSTRACT: Selfie is a pretraining technique, which stands for SELF-supervised Image Embedding. Selfie generalizes the concept of masked language modeling to continuous data, such as images. Given masked-out patches in an input image, the method learns to select the correct patch, among other "distractor" patches sampled from the same image, to fill in the masked location. This classification objective sidesteps the need for predicting exact pixel values of the target patches. The pretraining architecture of Selfie includes a network of convolutional blocks to process patches, followed by an attention pooling network to summarize the content of unmasked patches before predicting masked ones. During finetuning, it reuses the convolutional weights found by pretraining. Selfie is evaluated on three benchmarks (CIFAR-10, ImageNet 32 × 32, and ImageNet 224 × 224) with varying amounts of labeled data, from 5% to 100% of the training sets. This pretraining method provides consistent improvements to ResNet-50 across all settings compared to the standard supervised training of the same network. Notably, on ImageNet 224 × 224 with 60 examples per class (5%), the method improves the mean accuracy of ResNet-50 from 35.6% to 46.7%, an improvement of 11.1 points in absolute accuracy. This pretraining method also improves ResNet-50 training stability, especially in the low-data regime, by significantly lowering the standard deviation of test accuracies across datasets.
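To make the classification objective concrete, here is a toy sketch, not the paper's network: summarize the unmasked patch embeddings by their mean (a crude stand-in for attention pooling), score each candidate patch (the true patch plus distractors) by dot product with the summary, and select the highest-scoring one. The function name and the mean-pooling choice are assumptions for illustration.

```python
def select_patch(unmasked, candidates):
    """Toy version of Selfie's patch-selection objective: pick the
    candidate patch embedding that best matches a summary of the
    unmasked patches. Embeddings are plain lists of floats."""
    d = len(unmasked[0])
    # mean of unmasked patch embeddings as a crude summary vector
    summary = [sum(p[i] for p in unmasked) / len(unmasked) for i in range(d)]
    # dot-product score for each candidate patch
    scores = [sum(s * c for s, c in zip(summary, cand)) for cand in candidates]
    return scores.index(max(scores))
```

In the actual method this selection is trained with a cross-entropy loss over the candidate scores rather than a hard argmax.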
LUNCH PROVIDED, RSVP ONLY
RSVP: CHRISTINE LABSON AT CLABSON at STANFORD.EDU
ABOUT US: Stanford's new Workplace Health Innovation Lab (WHIL) works at the intersection of medicine and computer science to improve workplace safety through the application of novel technologies and the creation of new clinical paradigms. Our multidisciplinary approach partners our physicians specializing in Occupational Medicine with leading and emerging computer vision and AI experts to provide real-world solutions to problems in workplace health and safety. (Learn more at WHIL.stanford.edu)
--
Samuel Kwong
Stanford University | Class of 2020
B.S. Candidate | Computer Science
M.S. Candidate | Computer Science
-------------- next part --------------
A non-text attachment was scrubbed...
Name: WHIL AI Seminar_ June 27.pdf
Type: application/pdf
Size: 130447 bytes
Desc: WHIL AI Seminar_ June 27.pdf
URL: