<div dir="ltr"><div class="m_4743305257957148649gmail-ajy">Hi Everyone,<br><br></div><div>Yu Lu will talk in the YPNG seminar this week. This is a continuation of the work he presented a few weeks ago.<br></div><div><br></div><div>Title: Statistical and Computational Guarantees of Lloyd's Algorithm</div><div><br></div><div>Abstract: <span style="font-size:9pt">Clustering is a fundamental problem in statistics and machine
learning. Lloyd’s algorithm, proposed in 1957 by Stuart Lloyd, is still possibly the most widely used clustering algorithm in practice due to its simplicity and empirical performance. </span>However, there has been little theoretical investigation into the statistical and computational guarantees of Lloyd's algorithm. In this paper, we show that under a suitable initialization, Lloyd's algorithm converges to an exponentially small clustering error after an order of log n iterative steps when clustering mixtures of sub-Gaussians, where n is the sample size. Implications of our results for random initialization are discussed.</div><div><br></div>See you Friday at 11am in the Stats classroom.<br><br>Regards,<br>sekhar</div>
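<div><br></div><div>For anyone who wants a concrete reminder of what Lloyd's algorithm does before the talk, here is a minimal NumPy sketch of its two alternating steps. This is illustrative only, not code from the paper: the toy data, the choice of k = 2, and the hand-picked initialization are all assumptions made for the example.</div>

```python
import numpy as np

def lloyd(X, centers, n_iter=20):
    """Run Lloyd's iterations: assign each point to its nearest center,
    then recompute each center as the mean of its assigned points."""
    for _ in range(n_iter):
        # Assignment step: squared distance from every point to every center.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        # Update step: move each center to the mean of its cluster
        # (keep the old center if a cluster happens to be empty).
        for k in range(centers.shape[0]):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(axis=0)
    return labels, centers

# Two well-separated sub-Gaussian (here: Gaussian) clusters in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, (50, 2)),
               rng.normal(5.0, 0.5, (50, 2))])

# Illustrative "suitable initialization": one center near each cluster.
init = np.array([[0.0, 0.0], [5.0, 5.0]])
labels, centers = lloyd(X, init.copy())
```

<div>With well-separated clusters and a reasonable initialization, the iterations stabilize almost immediately, which is the regime the log n iteration bound in the abstract speaks to.</div>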