Instructor: Ulisses Braga-Neto
Course Description: This course introduces the foundations of Scientific Machine Learning (SciML), a rapidly developing area that brings together the fields of Machine Learning and Scientific Computation. After an introduction to the field, we review classical ODE and PDE discretization methods. Next, we discuss automatic differentiation and introduce the main physics-informed methods in SciML, namely, PDE-constrained neural network regression and PDE-constrained Gaussian Process regression. We then present data-driven SciML methods, including operator learning, foundation models, and data-driven reduced-order models. Throughout, we cover the application of SciML methods to forward prediction, inverse modeling, and uncertainty quantification.
Acknowledgment: The development of this course was supported by TAMIDS through its Data Science Course Development Program.
Class Contents: (under construction)
Lecture 1: Introduction to Scientific Machine Learning
- An overview of scientific machine learning and scientific laws expressed through equations, including several examples.
- Lecture Slides (animations will play if opened in Keynote)
- Lecture Slides (static PDF)
- Additional resources:
- Rackauckas: The Use and Practice of Scientific Machine Learning
- Braga-Neto: Fundamentals of Pattern Recognition and Machine Learning, 2nd Edition (Chapter 12)
- DOE Report: Basic Research Needs for Scientific Machine Learning
- Karniadakis et al.: Physics-Informed Machine Learning
- Evans: Partial Differential Equations
Lecture 2: Introduction to ODE and PDE Discretization Methods
- An introduction to classical discretization methods for ODEs and PDEs: Runge-Kutta methods for ODEs, finite-difference methods for steady-state PDEs, and semidiscretization methods for time-evolution PDEs, including the method of lines and Galerkin semidiscretization. The lecture also covers basic elements of the theory of order, convergence, and stability for ODEs and time-evolution PDEs.
- Lecture Slides (PDF)
- Reference:
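To make the ODE discretization ideas concrete, here is a minimal sketch of one step of the classical fourth-order Runge-Kutta method, applied to the test problem u' = -u, u(0) = 1. The function name `rk4_step` and the step size are illustrative choices, not taken from the lecture materials.

```python
def rk4_step(f, t, y, h):
    """One step of the classical fourth-order Runge-Kutta method."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Test problem: y' = -y, y(0) = 1, exact solution y(t) = exp(-t)
f = lambda t, y: -y
t, y, h = 0.0, 1.0, 0.1
for _ in range(10):
    y = rk4_step(f, t, y, h)
    t += h
# At t = 1, y approximates exp(-1) with a global error of order h^4
```

Halving the step size h reduces the global error by roughly a factor of 16, which is the hallmark of a fourth-order method.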
Lecture 3: Automatic Differentiation
- Introduction to forward-mode and reverse-mode automatic differentiation, with application to the differentiation of neural networks.
- Lecture Slides (PDF)
- References:
- Rall: “Perspectives on Automatic Differentiation: Past, Present, and Future?”. Chapter 1 of Automatic Differentiation: Applications, Theory, and Implementations.
- Baydin et al.: Automatic Differentiation in Machine Learning: a Survey.
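The forward/reverse distinction covered in this lecture can be illustrated with a short JAX snippet (a sketch, assuming a toy scalar-valued function; the function `f` and direction `v` are illustrative). Reverse mode (`jax.grad`) computes the full gradient in one backward pass, which is efficient for the many-inputs, one-output functions arising in neural network training; forward mode (`jax.jvp`) computes one directional derivative per pass.

```python
import jax
import jax.numpy as jnp

def f(x):
    # Scalar-valued function of a vector input (toy example)
    return jnp.sum(jnp.sin(x) ** 2)

x = jnp.array([0.1, 0.2, 0.3])

# Reverse mode: one backward pass yields the full gradient
grad_f = jax.grad(f)(x)

# Forward mode: one pass yields the directional derivative along v
v = jnp.ones_like(x)
_, jvp_val = jax.jvp(f, (x,), (v,))

# Consistency check: the JVP in direction v equals grad_f . v
```

For f(x) = sum(sin^2(x_i)), the gradient is sin(2x), so both quantities equal sin(0.2) + sin(0.4) + sin(0.6).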
Lecture 4: Physics-Informed Neural Networks
- A survey of methods for constraining neural networks via partial differential equations (including ODEs and algebraic equations as special cases). Topics: regression, including least-squares parametric regression and neural network regression; neural network training and backpropagation; soft constraints via physics-informed neural networks (PINNs); JAX implementation of PINNs; multi-objective loss functions and self-adaptive PINNs; PINNs for inverse problems; PINNs with hard constraints; hybrid PINNs: discrete PINNs with Euler and Runge-Kutta semidiscretization.
- Lecture Slides (PDF)
- PINN_Burgers.ipynb (JAX implementation of PINN)
- PINN_Burgers_Inverse.ipynb (JAX implementation of inverse PINN)
- References:
- Bishop: Deep Learning
- Braga-Neto: Fundamentals of Pattern Recognition and Machine Learning, 2nd Edition
- Sapunov: Deep Learning with JAX
- Raissi, Perdikaris, and Karniadakis: Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations.
- McClenny and Braga-Neto: Self-Adaptive Physics-Informed Neural Networks.
- Isakov: Inverse Problems for Partial Differential Equations
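The soft-constraint PINN loss discussed in this lecture can be sketched in a few lines of JAX. This is a minimal toy example for the ODE u'(x) = -u(x), u(0) = 1, not the course's PINN_Burgers notebook; the helper names (`init_params`, `mlp`, `pinn_loss`) and the network sizes are illustrative assumptions. The composite loss combines a mean-squared PDE residual on collocation points with a boundary-condition penalty.

```python
import jax
import jax.numpy as jnp

# Tiny MLP u_theta(x) with a toy random initialization (illustrative)
def init_params(key, sizes=(1, 16, 16, 1)):
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def mlp(params, x):
    h = jnp.atleast_1d(x)
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b)[0]

# Soft-constraint PINN loss for u'(x) = -u(x), u(0) = 1:
# mean-squared residual on collocation points + boundary penalty
def pinn_loss(params, xs):
    u = lambda x: mlp(params, x)
    du = jax.grad(u)                       # u'(x) via autodiff
    residual = jax.vmap(lambda x: du(x) + u(x))(xs)
    return jnp.mean(residual ** 2) + (u(0.0) - 1.0) ** 2

key = jax.random.PRNGKey(0)
params = init_params(key)
xs = jnp.linspace(0.0, 1.0, 32)            # collocation points
loss0 = pinn_loss(params, xs)

# One plain gradient-descent step on the composite loss
grads = jax.grad(pinn_loss)(params, xs)
params = [(W - 1e-2 * gW, b - 1e-2 * gb)
          for (W, b), (gW, gb) in zip(params, grads)]
```

In practice the two loss terms are weighted, and the self-adaptive PINN approach covered in the lecture learns such weights during training rather than fixing them by hand.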