<div dir="ltr"><div class="gmail-group-header" style="box-sizing:inherit;font-family:Mallory,Verdana,Arial,Helvetica,sans-serif;font-size:17px"><span class="gmail-field gmail-field-name-title gmail-field-type-ds gmail-field-label-hidden" style="box-sizing:inherit"><span style="box-sizing:inherit"><span class="gmail-odd gmail-first gmail-last" style="box-sizing:inherit"><h1 class="gmail-title" style="box-sizing:inherit;font-weight:300;padding:0px;font-feature-settings:"kern","liga","dlig";font-size:1.76471em;line-height:normal;font-stretch:normal;color:rgb(0,60,118);text-transform:uppercase;display:inline-block">ILIAS ZADIK</h1></span></span></span>, <span class="gmail-field gmail-field-name-field-university gmail-field-type-text gmail-field-label-hidden" style="box-sizing:inherit;margin-left:5px"><span style="box-sizing:inherit"><span class="gmail-odd gmail-first gmail-last" style="box-sizing:inherit">MIT</span></span></span><div class="gmail-field gmail-field-name-field-abstract-title gmail-field-type-text gmail-field-label-hidden" style="box-sizing:inherit"><div class="gmail-field-items" style="box-sizing:inherit"><div class="gmail-field-item even" style="box-sizing:inherit;font-size:20px;font-weight:600;line-height:1.2;margin-bottom:1em;margin-top:0.5em">The price of computational efficiency in high-dimensional estimation</div></div></div></div><div class="gmail-group-left" style="box-sizing:inherit;float:left;width:auto;padding-right:15.3125px;max-width:30%;font-family:Mallory,Verdana,Arial,Helvetica,sans-serif;font-size:17px"><div class="gmail-field gmail-field-name-field-image gmail-field-type-image gmail-field-label-hidden" style="box-sizing:inherit"><div class="gmail-field-items" style="box-sizing:inherit"><div class="gmail-field-item even" style="box-sizing:inherit"><img src="https://statistics.yale.edu/sites/default/files/styles/user_picture_node/public/s_zadik_ilias_0.jpg?itok=KL-5ZR-G" width="400" height="480" alt="" style="box-sizing: inherit; border: 
0px; max-width: 100%; height: auto; vertical-align: bottom;"></div></div></div></div><div class="gmail-group-right" style="box-sizing:inherit;float:left;width:auto;max-width:65%;padding-left:22.9766px;font-family:Mallory,Verdana,Arial,Helvetica,sans-serif;font-size:17px"><div class="gmail-field gmail-field-name-field-event-time gmail-field-type-datetime gmail-field-label-hidden" style="box-sizing:inherit"><div class="gmail-field-items" style="box-sizing:inherit"><div class="gmail-field-item even" style="box-sizing:inherit;color:rgb(0,60,118);font-size:18px;line-height:1.4"><span class="gmail-date-display-single" style="box-sizing:inherit">Thursday, February 23, 2023<span class="gmail-date-display-range" style="box-sizing:inherit;float:left;width:419.359px"><span class="gmail-date-display-start" style="box-sizing:inherit">10:30AM</span> to <span class="gmail-date-display-end" style="box-sizing:inherit">11:30AM</span></span></span></div></div></div><div class="gmail-field gmail-field-name-field-location gmail-field-type-location gmail-field-label-hidden" style="box-sizing:inherit"><div class="gmail-field-items" style="box-sizing:inherit"><div class="gmail-field-item even" style="box-sizing:inherit"><div class="gmail-location gmail-vcard" style="box-sizing:inherit"><div class="gmail-adr" style="box-sizing:inherit"><span class="gmail-fn" style="box-sizing:inherit">Dunham Lab. 
Room 220</span> <span class="gmail-map-icon" style="box-sizing:inherit;margin-left:0.25em;font-size:0.925em;line-height:1.55;letter-spacing:0.05em;word-spacing:0.05em;text-transform:lowercase;font-feature-settings:"smcp""><a href="http://maps.google.com/?q=10+Hillhouse+Avenue%2C+2nd+Floor%2C+New+Haven%2C+%2C+%2C+us" style="box-sizing:inherit;outline:none;line-height:inherit;color:rgb(40,109,192)">see map</a> </span><div class="gmail-street-address" style="box-sizing:inherit">10 Hillhouse Avenue, 2nd Floor</div><span class="gmail-locality" style="box-sizing:inherit">New Haven</span></div></div></div></div></div><div class="gmail-field gmail-field-name-field-website gmail-field-type-link-field gmail-field-label-hidden" style="box-sizing:inherit"><div class="gmail-field-items" style="box-sizing:inherit"><div class="gmail-field-item even" style="box-sizing:inherit"><a href="https://iliaszadik.github.io/" style="box-sizing:inherit;text-decoration-line:none;outline:none;line-height:1.5;color:rgb(0,60,118);font-size:16px">Website</a></div></div></div></div><div class="gmail-group-footer" style="box-sizing:inherit;clear:both;padding-top:15px;font-family:Mallory,Verdana,Arial,Helvetica,sans-serif;font-size:17px"><div class="gmail-field gmail-field-name-body gmail-field-type-text-with-summary gmail-field-label-above" style="box-sizing:inherit"><div class="gmail-field-label" style="box-sizing:inherit;font-weight:bold">Information and Abstract: </div><div class="gmail-field-items" style="box-sizing:inherit"><div class="gmail-field-item even" style="box-sizing:inherit"><p style="box-sizing:inherit;margin:0px 0px 1em;padding:0px">In recent years we have experienced a remarkable growth in the number and size of available datasets. Such growth has led to the intense and challenging pursuit of estimators that are provably both computationally efficient and statistically accurate. 
Notably, the analysis of polynomial-time estimators has revealed intriguing phenomena in several high-dimensional estimation tasks, such as the apparent failure of such estimators to reach the optimal statistical guarantees achievable among all estimators (that is, the presence of a non-trivial “computational-statistical trade-off”). </p><p style="box-sizing:inherit;margin:0px 0px 1em;padding:0px">In this talk, I will present new algorithmic results of this kind for the well-studied planted clique model and for the fundamental sparse regression model. For planted clique, we reveal the surprisingly severe failure of the Metropolis process to work in polynomial time, even when simple degree heuristics succeed. In particular, our result resolves a well-known 30-year-old open problem on the performance of the Metropolis process for this model, posed by Jerrum in 1992. For sparse regression, we show the failure of large families of polynomial-time estimators, such as MCMC and low-degree polynomial methods, to improve upon the best-known polynomial-time regression methods. As a consequence, our work offers rigorous evidence that popular regression methods such as LASSO optimally balance their computational and statistical resources.</p></div></div></div></div></div>
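For readers unfamiliar with the setting of the talk, the following is a minimal illustrative sketch of the Metropolis process on the planted clique model, in the spirit of Jerrum's 1992 formulation: a chain over cliques of a graph that proposes toggling a random vertex, always accepts valid additions, and accepts removals with probability e^(-beta). This is a toy rendition for intuition only, not the construction analyzed in the talk; all function names and parameter choices here are illustrative assumptions.

```python
import math
import random

def planted_clique_graph(n, k, seed=0):
    """Sample G(n, 1/2) with a clique planted on k random vertices.

    Returns the adjacency matrix and the planted vertex set.
    """
    rng = random.Random(seed)
    planted = set(rng.sample(range(n), k))
    adj = [[False] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            # Edge is forced inside the planted clique, fair coin otherwise.
            edge = (i in planted and j in planted) or rng.random() < 0.5
            adj[i][j] = adj[j][i] = edge
    return adj, planted

def metropolis_clique(adj, beta=1.0, steps=20000, seed=1):
    """Metropolis chain on cliques with stationary weight exp(beta * |C|).

    Propose toggling a uniform vertex: an addition that keeps the set a
    clique has acceptance ratio e^beta >= 1 (always accepted); a removal
    is accepted with probability e^(-beta). Tracks the largest clique seen.
    """
    rng = random.Random(seed)
    n = len(adj)
    current, best = set(), set()
    for _ in range(steps):
        v = rng.randrange(n)
        if v in current:
            if rng.random() < math.exp(-beta):
                current.remove(v)
        elif all(adj[v][u] for u in current):
            current.add(v)
        if len(current) > len(best):
            best = set(current)
    return best

if __name__ == "__main__":
    adj, planted = planted_clique_graph(60, 12)
    found = metropolis_clique(adj, beta=1.0, steps=30000)
    print(f"planted clique size: {len(planted)}, "
          f"largest clique found by Metropolis: {len(found)}")
```

At this toy scale the chain typically finds a sizable clique; the talk's result concerns why, at the relevant asymptotic scales, such chains provably fail in polynomial time even where simple degree heuristics succeed.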