[Sds-seminars] [Sds-announce] S&DS Seminar: Thuy-Duong "June" Vuong, 03/24/25, 4pm-5pm, KT Rm. 1327, "Efficiently learning and sampling from multimodal distributions using data-based initialization"
Torres, Elizavette
elizavette.torres at yale.edu
Mon Mar 17 14:18:38 EDT 2025
Department of Statistics and Data Science <https://statistics.yale.edu/>
Thuy-Duong "June" Vuong, Miller Institute, Berkeley
Date: Monday, March 24, 2025
Time: 4:00PM to 5:00PM
Location: Kline Tower, 13th Floor, Rm. 1327. See map: <http://maps.google.com/?q=219+Prospect+Street%2C+New+Haven%2C+CT%2C+06511%2C+us>
219 Prospect Street
New Haven, CT 06511
Webcast Option: https://yale.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=ea4e0f03-5e28-48a7-b343-b233012bce08
Title: Efficiently learning and sampling from multimodal distributions using data-based initialization
Information and Abstract: Learning to sample is a central task in generative AI: the goal is to generate (infinitely many more) samples from a target distribution $\mu$ given a small number of samples from $\mu$. It is well-known that traditional algorithms such as Glauber or Langevin dynamics are highly inefficient when the target distribution is multimodal, as they take exponential time to converge from a \emph{worst-case start}, while recently proposed algorithms such as denoising diffusion (DDPM) require information that is computationally hard to learn. In this talk, we propose a novel and conceptually simple algorithmic framework for learning multimodal target distributions by initializing traditional sampling algorithms at the empirical distribution. As applications, we show new results for two representative distribution families: Gaussian mixtures and Ising models. When the target distribution $\mu$ is a mixture of $k$ well-conditioned Gaussians, we show that the (continuous) Langevin dynamics initialized from the empirical distribution over $\tilde{O}(k/\epsilon^2)$ samples converge, with high probability over the samples, to $\mu$ in $\tilde{O}(1)$ time; both the number of samples and the convergence time are optimal. When $\mu$ is a low-complexity Ising model, we show a similar result for the Glauber dynamics with approximate marginals learned via pseudolikelihood estimation, demonstrating for the first time that such low-complexity Ising models can be efficiently learned from samples.
Based on joint work with Frederic Koehler and Holden Lee.
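A minimal Python sketch of the framework's core idea, for readers who want a concrete picture: discretized Langevin chains seeded at the observed data rather than a worst-case start. This is illustrative only and not code from the talk; the mixture means (and hence the score) are assumed known here, whereas the talk's setting assumes access only to samples from $\mu$.

import numpy as np

def mixture_score(x, means, var=1.0):
    # Score (gradient of log-density) of an equal-weight Gaussian
    # mixture with the given means and shared isotropic variance.
    d2 = np.array([np.sum((x - m) ** 2) for m in means])
    w = np.exp(-(d2 - d2.min()) / (2 * var))   # component responsibilities
    w /= w.sum()
    return sum(wi * (m - x) / var for wi, m in zip(w, means))

def langevin_from_data(samples, means, step=1e-2, n_steps=2000, seed=0):
    # Discretized Langevin dynamics with one chain per observed sample,
    # i.e. initialized at the empirical distribution of the data.
    rng = np.random.default_rng(seed)
    chains = samples.astype(float).copy()
    for _ in range(n_steps):
        for i in range(len(chains)):
            noise = rng.standard_normal(chains[i].shape)
            chains[i] += step * mixture_score(chains[i], means) + np.sqrt(2 * step) * noise
    return chains

# Toy usage: a well-separated bimodal mixture in one dimension. Chains
# started at the data already cover both modes, so no mode-crossing is
# needed for the dynamics to mix.
means = [np.array([-5.0]), np.array([5.0])]
rng = np.random.default_rng(1)
data = np.concatenate([m + rng.standard_normal((50, 1)) for m in means])
new_samples = langevin_from_data(data, means)

For the Ising-model result, the analogous sketch would replace the Langevin update with Glauber dynamics driven by conditional marginals estimated via pseudolikelihood; that estimation step is omitted here.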
3:30pm - Pre-talk meet-and-greet tea time - 219 Prospect Street, 13th floor. Light snacks and beverages will be available in the kitchen area.
For more details and upcoming events visit our website at https://statistics.yale.edu/calendar.
Department of Statistics and Data Science
Yale University
Kline Tower
219 Prospect Street
New Haven, CT 06511
https://statistics.yale.edu/