
Beyond Berry Esseen: Structure and Learning of Sums of Random Variables

Konstantinos Daskalakis (MIT)

It is well known that ~1/eps^2 coin tosses are necessary and sufficient to estimate the bias of a coin. But how many samples are needed to learn the distribution of the sum of n independent Bernoullis? I will show that the answer is still O(1/eps^2), independent of n, and will generalize this bound to sums of independent bounded integer random variables (SIIRVs). I will argue that the celebrated Berry-Esseen theorem and its variants fall short of characterizing the general structure of SIIRVs, and offer stronger finitary central limit theorems that tightly characterize their structure and have the aforementioned implications for learning.
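The opening claim about coin tosses can be illustrated with a short simulation. The sketch below (illustrative only, not from the talk) uses Hoeffding's inequality, which gives n = ceil(log(2/delta) / (2*eps^2)) tosses to estimate the bias to within eps with probability at least 1 - delta; the function name, signature, and default delta are my own choices for the example.

```python
import math
import random

def estimate_bias(true_p, eps, delta=0.05, seed=0):
    """Estimate a coin's bias to within +/- eps, with failure prob. <= delta.

    Hoeffding's inequality implies that n = ceil(log(2/delta) / (2*eps^2))
    independent tosses suffice; note n depends on eps and delta but not on
    the unknown bias itself.
    """
    rng = random.Random(seed)
    n = math.ceil(math.log(2 / delta) / (2 * eps ** 2))
    # Simulate n tosses of a coin with heads probability true_p.
    heads = sum(rng.random() < true_p for _ in range(n))
    return heads / n, n

# Example: estimate a bias of 0.3 to within eps = 0.05.
est, n = estimate_bias(true_p=0.3, eps=0.05)
```

Halving eps quadruples n, matching the ~1/eps^2 scaling in the abstract; the talk's point is that learning the full distribution of a sum of n Bernoullis needs no more samples than this single-parameter task.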

Speaker bio

Constantinos Daskalakis is the X-Window Consortium Associate Professor of Computer Science at MIT. He holds a diploma in electrical and computer engineering from the National Technical University of Athens, and a Ph.D. in electrical engineering and computer sciences from UC Berkeley. His research interests lie in theoretical computer science and its interface with economics and probability. Daskalakis has been honored with the 2007 Microsoft Graduate Research Fellowship, the 2008 ACM Doctoral Dissertation Award, the Game Theory and Computer Science Prize from the Game Theory Society, the 2010 Sloan Fellowship in Computer Science, the 2011 SIAM Outstanding Paper Prize, the 2011 Ruth and Joel Spira Award for Distinguished Teaching, and the 2012 Microsoft Research Faculty Fellowship. He is also a recipient of Best Paper awards at the ACM Conference on Economics and Computation in 2006 and in 2013.
