Lianghao Cao
A Computational Science, Engineering, and Mathematics Researcher
I am a Postdoctoral Scholar Research Associate in Computing and Mathematical Sciences at the California Institute of Technology, hosted by Andrew M. Stuart. Before this position, I obtained a Ph.D. in Computational Science, Engineering, and Mathematics under the supervision of J. Tinsley Oden and Omar Ghattas at the Oden Institute of the University of Texas at Austin. A brief professional bio can be found here.
My research combines mechanistic modeling, uncertainty quantification, and scientific machine learning to enhance the reliability of physical simulations and support risk-aware decision-making. I have experience using modeling and simulation to address challenges in polymer science, multiscale inelasticity, epidemiology, and geophysics. Read more about my research here.
I am applying to tenure-track positions in engineering, statistics/data science, or computational mathematics. Please also feel free to contact me about industry research positions.
Predictive Modeling of
Inhomogeneous Polymers
Numerical predictions of a block copolymer thin-film pattern before and after assimilating microscopy data
Fast and Scalable Methods for Uncertainty Quantification
Riemannian Manifold Hamiltonian Monte Carlo Sampling
Multiscale Modeling and Simulation
Micromechanics simulation of a two-scale viscoelastic material
Design of Advanced Mechanical Testing of Materials
Image data obtained from uniaxial tests of a viscoelastic material
March 2026
Our preprint titled "Optimal Experimental Design for Reliable Learning of History-Dependent Constitutive Laws" is now available on arXiv. This is a joint work with Kaushik Bhattacharya and Andrew Stuart. [link]
March 2026
Our paper titled "LazyDINO: Fast, Scalable, and Efficiently Amortized Bayesian Inversion via Structure-Exploiting and Surrogate-Driven Measure Transport" has been published in the Journal of Machine Learning Research. This is a joint work with Joshua Chen (joint first author), Michael Brennan, Thomas O'Leary-Roseberry, Youssef Marzouk, and Omar Ghattas. [link]
December 2025
Our preprint titled "Derivative-Informed Fourier Neural Operator: Universal Approximation and Applications to PDE-Constrained Optimization" is now available on arXiv. This is a joint work with Boyuan John Yao, Dingcheng Luo, Nikola Kovachki, Thomas O'Leary-Roseberry, and Omar Ghattas. [link]
March 2026
I will give a talk at the Department of Mathematics at Ohio State University. The topic is learning history-dependent constitutive laws, with a focus on representation theory, model complexity, identifiability, and well-posedness of internal-variable models.
March 2026
I will give a talk at SIAM UQ26 in MS 89 (Monday, March 23, 6:00–6:25 PM) on my ongoing work on amortized sequential Bayesian inference and experimental design. See the abstract of the talk here.
February 2026
I will give a colloquium talk at the Department of Mathematics and Statistics at Hunter College on February 5th. The topic is operator learning for predictive scientific computing.
See past updates and presentations here.