Lianghao Cao
computational science and engineering researcher
My last name is pronounced ts'ao
I am a Postdoctoral Scholar Research Associate in Computing and Mathematical Sciences at Caltech, sponsored by Andrew M. Stuart. Before this position, I was a Postdoctoral Fellow at the Oden Institute, The University of Texas at Austin, where I obtained my Ph.D. in Computational Science, Engineering, and Mathematics under the supervision of J. Tinsley Oden and Omar Ghattas.
My research addresses issues at the heart of computational engineering, sciences, and medicine: understanding, enhancing, and controlling the quality, validity, and reliability of simulation-based predictions of complex physical systems. I have worked on a variety of uncertainty quantification and optimization problems for models governed by parametric partial differential equations, with extensive applications in computational polymer science. I am currently working on constitutive modeling for solid mechanics.
More on my research page and publication page.
In 2025, I will be looking for full-time positions related to computational science and engineering based in the USA.
I am interested in
A tenure-track position at an R1 research university.
A research position at a national lab or a company with a clear focus on addressing grand challenges in science, engineering, and medicine.
Please feel free to contact me about potential positions.
Links: [Google scholar] [LinkedIn] [CV]
Email: lianghao@caltech.edu
Updates
Nov. 2024
Our preprint titled "LazyDINO: Fast, scalable and efficiently amortized Bayesian inversion via structure-exploiting and surrogate-driven measure transport" is now available on arXiv. This is joint work with Joshua Chen, Michael Brennan, Tom O'Leary-Roseberry, Youssef Marzouk, and Omar Ghattas. [link]
Oct. 2024
DC and I organized a minisymposium and a minisymposterium at SIAM MDS24. The minisymposium (MS5, Monday 9:00-10:40 AM @ 218) focused on Scientific Machine Learning for Inference and Control of High-dimensional Systems. The minisymposterium (Monday 4:30-6:00 PM @ Grand Ballroom A-B) focused on derivative-informed operator learning for large-scale uncertainty quantification and optimization.
Mar. 2024
Our preprint titled "Derivative-informed neural operator acceleration of geometric MCMC for infinite-dimensional Bayesian inverse problems" is now available on arXiv. This is joint work with Tom O'Leary-Roseberry and Omar Ghattas. [link]
Feb. 2024
Our manuscript, "Bayesian model calibration for block copolymer self-assembly: Likelihood-free inference and expected information gain via measure transport", is now published in the Journal of Computational Physics. Congratulations to Ricardo, Josh, Fengyi, Omar, Youssef, and Dr. Oden! [link]
You can see more updates here.
Talks
Oct. 2024
I spoke at SIAM MDS24 (MS5, Monday 10:20 AM @ 218) on derivative-informed operator learning and its application to accelerating large-scale Bayesian inverse problems. [slides][preprint]
Sep. 2024
I gave a talk at MORe 2024 on derivative-informed operator learning and its application to accelerating large-scale Bayesian inverse problems. [slides][preprint]
Dec. 2023
I gave an Oden Institute Seminar talk on Dec. 12. I presented ongoing work on efficient geometric MCMC enabled by derivative-informed neural operators. The paper will be available on arXiv soon! Please contact me directly if you'd like to learn more about this work.
Nov. 2023
I gave a talk at the SIAM TX-LA regional meeting in MS29. I presented ongoing work on efficient geometric MCMC for PDE-constrained Bayesian inversion enabled by neural operators with parametric derivative training (DINO). Please contact me directly if you'd like to learn more about this work.
You can see my old talks and slides here.