Lianghao Cao
computational science and engineering researcher
My last name is pronounced ts'ao
Check out our paper on amortized solutions of model-constrained Bayesian inverse problems using surrogate-driven measure transport.
I am a Postdoctoral Scholar Research Associate in Computing and Mathematical Sciences at Caltech, sponsored by Andrew M. Stuart. Before this position, I obtained a Ph.D. in Computational Science, Engineering, and Mathematics under the supervision of J. Tinsley Oden and Omar Ghattas from the Oden Institute at The University of Texas at Austin.
My research addresses issues at the heart of computational engineering and sciences: understanding, enhancing, and controlling the quality, validity, and reliability of simulation-based predictions of complex physical systems. I have worked on a variety of uncertainty quantification and optimization problems for models governed by parametric partial differential equations, with extensive experience in applications from computational polymer science. I am currently working on constitutive modeling for solid mechanics.
More on my research page and publication page.
I am interested in:
A tenure-track position at an R1 research university.
A research position at a national lab or a company with a clear focus on addressing grand challenges in science, engineering, and medicine.
Please feel free to contact me about potential positions.
Links: [Google scholar] [LinkedIn] [CV]
Email: lianghao@caltech.edu
Mar. 2025
George, Margaret, and I are organizing a mini-symposium at SIAM CSE25. The mini-symposium (MS249, Thursday 2:10–3:25 pm and 4:20–5:35 pm) is focused on data-driven methods for multiscale modeling and homogenization. You are cordially invited to attend!
Feb. 2025
Our preprint titled "Learning Memory and Material Dependent Constitutive Laws" is now available on arXiv. This is joint work with Kaushik Bhattacharya, George Stepaniants, Andrew Stuart, and Margaret Trautner. [link]
Dec. 2024
The special issue on "Scientific Machine Learning" is now available online in Foundations of Data Science, a publication of the American Institute of Mathematical Sciences. I am one of the guest editors on this special issue. [link]
Nov. 2024
Our preprint titled "LazyDINO: Fast, scalable, and efficiently amortized Bayesian inversion via structure-exploiting and surrogate-driven measure transport" is now available on arXiv. This is joint work with Joshua Chen, Michael Brennan, Tom O'Leary-Roseberry, Youssef Marzouk, and Omar Ghattas. [link]
You can see more updates here.
July 2025
I will give two talks at USNCCM18 on learning history- and microstructure-dependent constitutive laws: (i) in MS 233, I will present ongoing work on Bayesian optimal experimental design, and (ii) in the special session on SciML, I will present operator learning methods with a focus on theoretical results.
Mar. 2025
I presented (on Zoom) at the Frontiers in Scientific Machine Learning Seminar Series hosted by the University of Michigan on Mar. 28 at 12–1 PM EDT. The talk was on derivative-informed surrogate models (i.e., DINO) and their applications in accelerating Bayesian inversion. [slides with typos fixed!]
Mar. 2025
I presented at SIAM CSE25 in MS281 on Friday at 9:50 am @ 206 on learning history- and microstructure-dependent constitutive laws using operator learning and optimal experimental design. [slides][preprint]
Oct. 2024
I spoke at SIAM MDS24 in MS5 on Monday at 10:20 am @ 218 on derivative-informed operator learning and its application to accelerating large-scale Bayesian inverse problems. [slides][preprint]
You can see my old talks and slides here.