AI Bootcamp XI - Gaussian Processes
Powell-Booth 100 (Seminar Room)
- Internal Event
Welcome to the Eleventh EAS Gaussian Processes Bootcamp, scheduled for June 9th to June 12th, 2025. This intensive, four-day workshop is designed for researchers and practitioners who want to gain a deep understanding of Gaussian process (GP) models and learn how to apply them to real-world problems.
What to Expect:
- Daily Structure
Each day will consist of:
- One to two lectures introducing foundational GP concepts (theoretical motivations, kernel design, inference).
- Two or more practical, hands-on sessions where you'll implement GP models in Python (using libraries such as GPy, scikit-learn, or GPflow) and apply them to example datasets from various domains.
- Topics Covered
- Core GP Concepts & Kernels: Understand Gaussian processes as distributions over functions, explore common kernels (e.g., RBF, periodic), and learn how to combine kernels for richer models.
- Exact Inference & Hyperparameter Tuning: Derive and implement the posterior mean/variance formulas, and learn how to tune kernel hyperparameters by maximizing the marginal likelihood.
- Real-World Applications: See concrete examples in Bayesian optimization (robotics) and active learning.
- Hands-On Implementation & Best Practices: Throughout, write Python code (GPy, scikit-learn, GPflow/GPyTorch) to fit models, visualize uncertainty, and build small Bayesian optimization loops (a short example of this kind of exercise follows this list).
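To give a flavor of the hands-on sessions, here is a minimal sketch, not taken from the bootcamp materials, of fitting a GP regression model with a combined RBF-plus-periodic kernel in scikit-learn and visualizing the posterior uncertainty; the toy dataset and kernel settings are assumptions made for illustration.

# Illustrative sketch: GP regression with a combined kernel in scikit-learn.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, WhiteKernel

# Toy data: a noisy periodic signal with a mild trend (assumed example data).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(30, 1))
y = np.sin(X).ravel() + 0.1 * X.ravel() + 0.1 * rng.standard_normal(30)

# Sum of a smooth trend kernel and a periodic kernel, plus observation noise.
kernel = (RBF(length_scale=2.0)
          + ExpSineSquared(length_scale=1.0, periodicity=6.0)
          + WhiteKernel(noise_level=0.01))
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X, y)  # hyperparameters are tuned by maximizing the log marginal likelihood

# Predict on a grid and plot the posterior mean with a +/- 2 std band.
X_test = np.linspace(0, 12, 200).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)

plt.scatter(X, y, c="k", s=15, label="observations")
plt.plot(X_test, mean, label="posterior mean")
plt.fill_between(X_test.ravel(), mean - 2 * std, mean + 2 * std,
                 alpha=0.3, label="+/- 2 std")
plt.legend()
plt.show()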
Objectives:
By the end of this bootcamp, you will be able to:
- Derive and implement the GP regression and classification posterior formulas.
- Choose and design appropriate covariance functions for different problem settings.
- Perform hyperparameter learning in practice and understand computational bottlenecks.
- Apply sparse/approximate methods to scale GPs to larger datasets.
- Integrate GP models into real-world workflows, such as Bayesian optimization or uncertainty quantification in scientific experiments.
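As an illustration of the last objective, the following is a minimal sketch of a Bayesian optimization loop of the kind participants will build, assuming a toy 1-D objective, a Matern-kernel GP, and an expected-improvement acquisition evaluated on a grid; none of the names or settings below come from the bootcamp materials.

# Illustrative sketch: a simple grid-based Bayesian optimization loop.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Toy objective to maximize; stands in for an expensive experiment.
    return -np.sin(3 * x) - x**2 + 0.7 * x

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(3, 1))            # initial design points
y = objective(X).ravel()
grid = np.linspace(-2, 2, 500).reshape(-1, 1)  # candidate inputs

for _ in range(10):
    # Refit the GP surrogate on all data gathered so far.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    mean, std = gp.predict(grid, return_std=True)

    # Expected improvement over the best observation so far.
    best = y.max()
    z = (mean - best) / np.maximum(std, 1e-9)
    ei = (mean - best) * norm.cdf(z) + std * norm.pdf(z)

    # Evaluate the objective at the most promising candidate and add it.
    x_next = grid[np.argmax(ei)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best input found:", X[np.argmax(y)], "value:", y.max())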
Prerequisites:
To get the most out of this bootcamp, you should be comfortable with:
- Linear Algebra: Vectors, matrices, eigenvalues/eigenvectors, norms, and basic matrix decompositions
- Multivariable Calculus: Partial derivatives, gradients, and understanding of how to compute derivatives of scalar and vector-valued functions.
- Probability Theory: Familiarity with random variables, Gaussian distributions, joint/marginal distributions, and Bayesian inference concepts.
- Python Programming: Basic syntax and experience with NumPy. During Day 1, we will cover additional libraries like Matplotlib, GPy, and scikit-learn.
Deadline for Registration: 11:59 PM Pacific Time on June 6th, 2025.
Registration link: https://caltech.instructure.com/enroll/JPE69L
Quiz link: https://caltech.instructure.com/courses/8761/assignments/72657
For more information, please contact Reza Sadri, Director, by email at [email protected] or visit https://aibootcamp.caltech.edu.
Event Series
AI Bootcamp