[Sds-seminars] S&DS In-Person Seminar, Jiaoyang Huang, 12/19, 4pm, "Efficient derivative-free Bayesian inference for large-scale inverse problems"
elizavette.torres at yale.edu
Wed Dec 14 12:39:57 EST 2022
<https://statistics.yale.edu/>
Department of Statistics and Data Science
In-person seminars will be held at Dunham Lab, 10 Hillhouse Ave., Room 220,
with an option of remote participation via Zoom.
3:30pm - Pre-talk meet and greet, DL Suite 222, Room 228
JIAOYANG HUANG, University of Pennsylvania
Date: Monday, December 19, 2022
Time: 4:00PM to 5:00PM
Location: Dunham Lab., 10 Hillhouse Avenue, Rm. 220, New Haven, CT 06511
See map: <http://maps.google.com/?q=10+Hillhouse+Avenue%2C+Rm.+220%2C+New+Haven%2C+CT%2C+06511%2C+us>
Website: <https://statistics.wharton.upenn.edu/profile/huangjy/>
Title: Efficient derivative-free Bayesian inference for large-scale inverse
problems
Information and Abstract:
We consider Bayesian inference for large-scale inverse problems, where
computational challenges arise from the need for repeated evaluations of an
expensive forward model, which is often given as a black box or is
impractical to differentiate. In this talk I will propose a new
derivative-free algorithm, Unscented Kalman Inversion, which uses ideas from
the Kalman filter to efficiently solve these inverse problems. First, I will
explain some basics of variational inference under general metric tensors;
in particular, under the Fisher-Rao metric, Gaussian variational inference
leads to natural gradient descent. Next, I will discuss two different views
of our algorithm: it can be obtained from a Gaussian approximation of the
filtering distribution of a novel mean-field dynamical system, and it can
also be viewed as a derivative-free approximation of natural gradient
descent. I will also discuss theoretical properties for linear inverse
problems. Finally, I will discuss an extension of our algorithm using a
Gaussian mixture approximation, which leads to Gaussian Mixture Kalman
Inversion, an efficient derivative-free Bayesian inference approach capable
of capturing multiple modes. I will demonstrate the effectiveness of this
approach in several numerical experiments with multimodal posterior
distributions, where the algorithm typically converges within O(10)
iterations.
This is based on joint work with Yifan Chen, Daniel Zhengyu Huang,
Sebastian Reich, and Andrew Stuart.
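For readers unfamiliar with derivative-free, Kalman-style inversion, below is
a minimal Python sketch of the general idea on a toy linear inverse problem:
sigma points of the current Gaussian approximation are pushed through the
black-box forward model, and the resulting cross-covariances drive a
Kalman-style update of the mean and covariance. The toy forward model, the
variable names, and the simplified repeated-update iteration are illustrative
assumptions only; the weights, regularization, and mean-field dynamical
system of the actual Unscented Kalman Inversion algorithm presented in the
talk differ.

import numpy as np

# Toy linear inverse problem: recover theta from y = G(theta) + noise.
# The forward map is treated as a black box -- only evaluations are used.
rng = np.random.default_rng(0)
d, p = 2, 3                      # parameter and observation dimensions
A = rng.normal(size=(p, d))      # hidden linear operator (unknown to the solver)
def G(theta):                    # black-box forward model
    return A @ theta
theta_true = np.array([1.0, -0.5])
Sigma_obs = 0.05 * np.eye(p)     # observation noise covariance
y = G(theta_true) + rng.multivariate_normal(np.zeros(p), Sigma_obs)

# Gaussian approximation of the posterior: mean m and covariance C.
m = np.zeros(d)
C = np.eye(d)

def sigma_points(m, C, kappa=1.0):
    """Standard symmetric sigma points of N(m, C) with their weights."""
    n = len(m)
    L = np.linalg.cholesky((n + kappa) * C)
    pts = [m] + [m + L[:, i] for i in range(n)] + [m - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

for it in range(20):
    pts, w = sigma_points(m, C)
    evals = np.array([G(pt) for pt in pts])      # black-box evaluations only
    g_mean = w @ evals
    dtheta = pts - m
    dg = evals - g_mean
    C_tg = dtheta.T @ (w[:, None] * dg)           # cross-covariance Cov(theta, G)
    C_gg = dg.T @ (w[:, None] * dg) + Sigma_obs   # predicted output covariance
    K = C_tg @ np.linalg.inv(C_gg)                # Kalman gain
    m = m + K @ (y - g_mean)                      # Kalman-style mean update
    C = C - K @ C_gg @ K.T                        # covariance update
    C = C + 1e-6 * np.eye(d)                      # keep C positive definite

print("estimate:", m, " truth:", theta_true)

Note that no derivatives of G are ever computed: each iteration needs only
2d + 1 forward-model evaluations, which is what makes this style of update
attractive when the forward model is an expensive black box.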
Zoom Option: Join from PC, Mac, Linux, iOS or Android:
https://yale.zoom.us/j/92411077917?pwd=aXhnTnFGRXFoaTVDczNjeFFKeWpTQT09
Password: 24
Or Telephone: 203-432-9666 (2-ZOOM if on-campus) or 646 568 7788
Meeting ID: 924 1107 7917
Department of Statistics and Data Science
Yale University
24 Hillhouse Avenue
New Haven, CT 06511
t 203.432.0666
f 203.432.0633
For more details and upcoming events visit our website at
http://statistics.yale.edu/