Instructors: Ulisses Braga-Neto and Ming Zhong
Course Description: This graduate course provides an introduction to the algorithmic and computational foundations of Scientific Machine Learning (SciML). The course starts with an introduction to scientific machine learning, followed by a brief review of traditional machine learning. The course covers the basics of scientific computation, including ODE and PDE numerical methods, and the basic SciML algorithms: automatic differentiation, physics-informed neural networks, and physics-informed Gaussian processes. Applications to forward prediction, inverse modeling, and uncertainty quantification are presented. Additional material will be discussed through student presentations of selected publications in the area. The course is integrated with, and benefits from, the educational activities of the TAMIDS SciML Thematic Lab.
Acknowledgment: The development of this course is supported by TAMIDS through its Data Science Course Development Program.
Lecture 1: Introduction to Scientific Machine Learning
- An overview of scientific machine learning and scientific laws expressed through equations, including several examples of applications.
- Lecture Slides (animations play if opened in Keynote or PowerPoint)
- Lecture Slides (static PDF)
- Additional resources:
- Rackauckas: The Use and Practice of Scientific Machine Learning
- DOE Report: Basic Research Needs for Scientific Machine Learning
- Karniadakis et al.: Physics-Informed Machine Learning
Lecture 2: Review of Machine Learning
- A review of the basic concepts of machine learning, followed by an overview of least-squares parametric regression using linear and neural network models, and nonparametric regression using Gaussian processes.
- Lecture Slides (PDF)
- Assignment 1
- Additional resources:
- Bishop: Pattern Recognition and Machine Learning
- Jazwinski: Stochastic Processes and Filtering Theory
- Rasmussen and Williams: Gaussian Processes for Machine Learning
- Braga-Neto: Fundamentals of Pattern Recognition and Machine Learning
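As a minimal illustration of the least-squares parametric regression covered in this lecture, the sketch below fits a linear model by solving the normal equations in NumPy (the synthetic data and function name are illustrative, not taken from the course materials):

```python
import numpy as np

# Least-squares parametric regression: fit y ~ X w by minimizing ||Xw - y||^2.
# The closed-form solution is w = (X^T X)^{-1} X^T y (the normal equations).

def fit_least_squares(X, y):
    """Solve the least-squares problem; lstsq is numerically safer than inverting X^T X."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Noiseless synthetic data: y = 2x + 1, with a bias column in the design matrix.
x = np.linspace(0.0, 1.0, 20)
X = np.column_stack([x, np.ones_like(x)])   # design matrix [x, 1]
y = 2.0 * x + 1.0
w = fit_least_squares(X, y)                 # recovers slope 2 and intercept 1
```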
Lecture 3: Introduction to Scientific Computing
- An introduction to the basic notions of differential equations and their numerical solutions using classical discretization schemes, with an emphasis on the methods most closely related and relevant to scientific machine learning.
- Part 1 – Basic Concepts and Ordinary Differential Equations: Lecture Slides (PDF)
- Part 2 – Partial Differential Equations: Lecture Slides (PDF)
- Assignment 2
- Additional resources:
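As a taste of the classical discretization schemes covered in this lecture, here is a minimal forward-Euler sketch for an ODE initial value problem (the test problem y' = -y is an illustrative choice, not from the slides):

```python
import numpy as np

# Forward Euler: the simplest explicit one-step scheme for y' = f(t, y).
# Each step advances the solution along the local tangent:
#   y_{n+1} = y_n + h f(t_n, y_n)

def forward_euler(f, y0, t0, t1, n_steps):
    h = (t1 - t0) / n_steps
    t, y = t0, y0
    for _ in range(n_steps):
        y = y + h * f(t, y)
        t = t + h
    return y

# Test problem: y' = -y, y(0) = 1, with exact solution y(t) = exp(-t).
approx = forward_euler(lambda t, y: -y, 1.0, 0.0, 1.0, 1000)
exact = np.exp(-1.0)
```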
Lecture 4: Automatic Differentiation
- Introduction to forward-mode and reverse-mode automatic differentiation with examples, and a detailed exposition of backpropagation training for neural networks.
- Lecture Slides (PDF)
- Additional Resources:
- Rall: “Perspectives on Automatic Differentiation: Past, Present, and Future?”. Chapter 1 of Automatic Differentiation: Applications, Theory, and Implementations.
- Baydin et al.: Automatic Differentiation in Machine Learning: a Survey.
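The dual-number view of forward-mode automatic differentiation discussed in this lecture can be sketched in a few lines of Python (a toy class supporting only addition and multiplication, not a full AD implementation):

```python
# Forward-mode AD via dual numbers: carry (value, derivative) pairs through
# arithmetic; the derivative slot is propagated by the chain rule.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """Seed the input with derivative 1 and read off f'(x)."""
    return f(Dual(x, 1.0)).dot

# f(x) = x^2 + 3x  ->  f'(x) = 2x + 3, so f'(2) = 7
dfdx = derivative(lambda x: x * x + 3 * x, 2.0)
```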
Lecture 5: Forward Modeling with Physics-Informed Neural Networks (PINNs)
- Description of the basic physics-informed neural network (PINN) algorithm. PINN training: initialization, gradient descent with adaptive learning rate and momentum, learning rate schedules, second-order methods. TensorFlow 2.x implementation of a PINN to solve the Burgers PDE. Self-Adaptive PINNs.
- Lecture Slides (PDF)
- SciML_Burgers.ipynb (Jupyter notebook on Google Colab)
- Additional Resources:
- Raissi, Perdikaris, and Karniadakis: Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations.
- McClenny and Braga-Neto: Self-Adaptive Physics-Informed Neural Networks using a Soft Attention Mechanism.
- Geron: Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow, 2nd Edition.
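The optimizer ingredients listed above (gradient descent with momentum and a learning-rate schedule) can be sketched in NumPy on a toy quadratic loss; an actual PINN would supply autodiff gradients of the physics-informed loss instead, and all parameter values here are illustrative:

```python
import numpy as np

# Gradient descent with momentum and an exponential learning-rate schedule,
# applied to the quadratic loss L(w) = ||w - w*||^2 with gradient 2(w - w*).

def train(grad, w0, lr0=0.1, decay=0.999, momentum=0.9, n_iter=2000):
    w = np.asarray(w0, dtype=float)
    v = np.zeros_like(w)                 # velocity (momentum buffer)
    for k in range(n_iter):
        lr = lr0 * decay ** k            # exponential learning-rate schedule
        v = momentum * v - lr * grad(w)  # accumulate a decaying gradient average
        w = w + v
    return w

w_star = np.array([1.0, -2.0])
w = train(lambda w: 2.0 * (w - w_star), np.zeros(2))  # converges to w_star
```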
Lecture 6: Forward Modeling with Physics-Informed Gaussian Processes (PIGPs)
- Introduction to Reproducing Kernel Hilbert Spaces (RKHS). Solving nonlinear PDEs by Gaussian processes as optimal recovery in an RKHS. Transformation into a finite-dimensional optimization problem and solution using the Gauss-Newton method. Solution of the Heat, Darcy, Eikonal, and Viscous Burgers PDEs. Alternative approaches.
- Lecture Slides (PDF)
- SciML_Elliptic.ipynb, SciML_Heat.ipynb (Jupyter notebooks on Google Colab)
- Additional Resources:
- Chen, Hosseini, Owhadi, and Stuart: Solving and Learning Nonlinear PDEs with Gaussian Processes.
- Raissi, Perdikaris, and Karniadakis: Machine Learning of Linear Differential Equations using Gaussian Processes.
- Raissi, Perdikaris, and Karniadakis: Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations.
- Long, Wang, Krishnapriyan, Kirby, Zhe, and Mahoney: AutoIP: A Unified Framework to Integrate Physics into Gaussian Processes.
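The optimal-recovery viewpoint can be illustrated in its simplest linear form: the minimum-RKHS-norm interpolant of scattered data is a kernel expansion whose coefficients solve a linear system with the Gram matrix. The sketch below is a simplified stand-in for the lecture's nonlinear-PDE setting (which additionally requires Gauss-Newton iterations); the data and kernel length scale are illustrative:

```python
import numpy as np

# Optimal recovery in an RKHS: the minimum-norm interpolant of data (x_i, y_i)
# is s(x) = sum_i a_i k(x, x_i), where the coefficients solve K a = y and
# K_ij = k(x_i, x_j) is the kernel (Gram) matrix.

def rbf(x1, x2, length=0.1):
    """Squared-exponential (RBF) kernel matrix between two point sets."""
    return np.exp(-((x1[:, None] - x2[None, :]) ** 2) / (2 * length**2))

x_train = np.linspace(0.0, 1.0, 8)
y_train = np.sin(2 * np.pi * x_train)

K = rbf(x_train, x_train) + 1e-12 * np.eye(len(x_train))  # jitter for conditioning
a = np.linalg.solve(K, y_train)

def interpolant(x_new):
    return rbf(np.atleast_1d(x_new), x_train) @ a

# The interpolant reproduces the training data (up to the tiny jitter).
```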
Lecture 7: Inverse Problems
- Introduction to the theory of inverse problems involving differential equations. Integral equations. Criteria for well-posedness. Tikhonov regularization. Solving inverse problems using PINNs and PIGPs.
- Part 1 – Theoretical Concepts and PINNs for Inverse Problems: Lecture Slides (PDF)
- Part 2 – PIGPs for Inverse Problems: Lecture Slides (PDF)
- SciML_Burgers_Inverse.ipynb (Jupyter notebook on Google Colab)
- Assignment 3
- Additional Resources:
- Isakov: Inverse Problems for Partial Differential Equations
- Hasanoğlu and Romanov: Introduction to Inverse Problems for Differential Equations
- Raissi, Yazdani, and Karniadakis: Hidden Fluid Mechanics: Learning velocity and pressure fields from flow visualizations
- Raissi, Perdikaris, and Karniadakis: Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations.
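Tikhonov regularization, one of the lecture's central tools, can be demonstrated on a small ill-conditioned linear inverse problem. The Hilbert-matrix forward map, noise level, and regularization weight below are illustrative choices:

```python
import numpy as np

# Tikhonov regularization: for an ill-conditioned forward operator A, replace
# the unstable least-squares solution with
#   x_lambda = argmin ||A x - b||^2 + lambda ||x||^2
#            = (A^T A + lambda I)^{-1} A^T b,
# trading a small bias for a large reduction in noise amplification.

def tikhonov(A, b, lam):
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

rng = np.random.default_rng(0)
n = 10
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)  # Hilbert matrix
x_true = np.ones(n)
b = A @ x_true + 1e-6 * rng.standard_normal(n)   # data with small noise

x_naive = np.linalg.solve(A, b)    # noise amplified by cond(A), far from x_true
x_reg = tikhonov(A, b, lam=1e-6)   # stabilized reconstruction
```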
Lecture 8: Uncertainty Quantification
- Uncertainty in practical systems. Aleatoric and epistemic uncertainty. Physics-informed surrogate models. Bayesian models. Uncertainty quantification by Bayesian PINNs, ensembles of PINNs, and PIGPs.
- Lecture Slides (PDF)
- Additional Resources:
- Psaros, Meng, Zou, Guo, and Karniadakis: Uncertainty Quantification in Scientific Machine Learning: Methods, Metrics, and Comparisons
- Sudret, Marelli, and Wiart: Surrogate models for uncertainty quantification: An overview
- Neal: Bayesian Learning for Neural Networks
- Yang, Meng, and Karniadakis: B-PINNs: Bayesian Physics-Informed Neural Networks for Forward and Inverse PDE Problems with Noisy Data
- Davi and Braga-Neto: PSO-PINN: Physics-Informed Neural Networks Trained with Particle Swarm Optimization
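The ensemble approach to uncertainty quantification can be sketched with a bootstrap ensemble of linear least-squares fits standing in for an ensemble of PINNs; all data and parameter values below are illustrative:

```python
import numpy as np

# Ensemble-based uncertainty quantification (a simplified stand-in for an
# ensemble of PINNs): fit many models on bootstrap resamples of the data and
# use the spread of their predictions as an epistemic-uncertainty estimate.

rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 30)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(30)   # noisy linear data
X = np.column_stack([x, np.ones_like(x)])

x_query = np.array([[0.5, 1.0]])                    # predict at x = 0.5
preds = []
for _ in range(200):
    idx = rng.integers(0, len(x), len(x))           # bootstrap resample
    w, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    preds.append((x_query @ w).item())

mean_pred = np.mean(preds)   # ensemble mean, close to the true value 2.0
std_pred = np.std(preds)     # spread quantifies epistemic uncertainty
```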