Online Workshop Every Week

Join our free weekly interactive learning sessions.

Master AI/ML with instant feedback and personalized learning

"Cogito, ergo sum" (I think, therefore I am)

— René Descartes

Free Problems
8 RG Flow of the Neural Tangent Kernel (PDLT)
This problem set covers the RG flow analysis of the Neural Tangent Kernel (NTK) in deep neural networks. The problems progress from basic concepts to advanced analytical derivations, focusing on the statistical properties of the NTK, its layer-to-layer evolution, and the finite-width effects that govern gradient-based learning dynamics.
34 pts Hard 97 ntk-definition gradient-descent neural-networks +7
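For orientation, the NTK referenced above is conventionally defined as the parameter-space inner product of the output gradients for two inputs; this is the textbook form, and PDLT's own version additionally carries a learning-rate tensor:

$$ \Theta_{i_1 i_2}(x_1, x_2) = \sum_{\mu} \frac{\partial z_{i_1}(x_1;\theta)}{\partial \theta_{\mu}} \, \frac{\partial z_{i_2}(x_2;\theta)}{\partial \theta_{\mu}} $$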
7 Gradient-Based Learning (PDLT)
This problem set covers key concepts from Chapter 7 on Gradient-Based Learning, including supervised learning, gradient descent optimization, the neural tangent kernel (NTK), and their roles in training neural networks. The problems progress from basic conceptual understanding to advanced analytical applications of gradient-based learning theory.
23 pts Medium 96 supervised-learning conditional-distribution discriminative-model +7
Bayesian Learning in Deep Neural Networks
This problem set explores Bayesian learning in deep neural networks based on the chapter "6 Bayesian Learning (PDLT)". The problems cover Bayesian probability theory, model fitting, model comparison, and the differences between infinite-width and finite-width neural networks in the Bayesian learning framework. You'll examine how Bayesian inference provides a principled approach to learning, how the Bayesian evidence favors critical initialization, and how finite-width networks enable representation learning through neural associations.
31 pts Medium 99 bayesian-probability inference product-rule +7
Effective Theory of Preactivations at Initialization
This problem set covers key concepts from Chapter 5, "Effective Theory of Preactivations at Initialization," focusing on criticality analysis, kernel recursions, universality classes, and finite-width effects in deep neural networks. The problems progress from fundamental concepts to advanced analytical derivations.
39 pts Hard 97 criticality initialization deep-learning +7
RG Flow of Preactivations in Deep Neural Networks
This problem set explores the key concepts from the chapter "4 RG Flow of Preactivations (PDLT)", which develops an effective theory for understanding how preactivation distributions evolve through layers in deep neural networks. The problems cover Gaussian distributions in the first layer, emergence of non-Gaussianity in deeper layers, the large-width expansion, and the connection to renormalization group flow in physics.
23 pts Medium 104 neural-networks preactivations gaussian-distribution +7
Deep Linear Networks at Initialization
This problem set explores the key concepts from the chapter "3 Effective Theory of Deep Linear Networks at Initialization (PDLT)". The problems cover criticality, fluctuations, non-Gaussian statistics, and the emergent depth-to-width ratio in deep linear networks. Work through these problems to understand how initialization hyperparameters affect network behavior and how finite-width effects emerge in deep networks.
42 pts Medium 100 deep-linear-networks activation-functions toy-models +7
Premium Problems
Python I/O and Data Pipeline Assessment - Part 4
20 questions focused on PyTorch Dataset/DataLoader design: map/iterable datasets, transforms, custom collate/padding, worker seeding/sharding, num_workers/pin_memory/prefetch_factor, caching, memmap/shared memory, batching by size, profiling, and performance tuning.
10.00 60 pts Medium 98 torch.utils.data.dataset pytorch dataset +7
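For a taste of the API surface listed above, here is a minimal sketch of a map-style Dataset with a custom padding collate_fn and the common DataLoader performance knobs; the dataset contents and parameter values are made up for illustration, not taken from the assessment.

```python
import torch
from torch.utils.data import Dataset, DataLoader


class ToySequenceDataset(Dataset):
    """Map-style dataset yielding variable-length sequences (contents are made up)."""

    def __init__(self, lengths):
        self.lengths = lengths

    def __len__(self):
        return len(self.lengths)

    def __getitem__(self, idx):
        # Fake sample: a 1-D tensor whose length varies by index, plus a toy label.
        return torch.arange(self.lengths[idx]), idx % 2


def pad_collate(batch):
    """Custom collate_fn: right-pad the sequences in a batch to a common length."""
    seqs, labels = zip(*batch)
    padded = torch.nn.utils.rnn.pad_sequence(seqs, batch_first=True, padding_value=0)
    return padded, torch.tensor(labels)


if __name__ == "__main__":  # guard needed because num_workers > 0 spawns subprocesses
    loader = DataLoader(
        ToySequenceDataset([3, 5, 2, 7, 4, 6]),
        batch_size=2,
        shuffle=True,
        num_workers=2,       # parallel worker processes
        pin_memory=True,     # page-locked host memory, speeds copies to GPU (harmless on CPU)
        prefetch_factor=2,   # batches preloaded per worker; requires num_workers > 0
        collate_fn=pad_collate,
    )
    for padded, labels in loader:
        print(padded.shape, labels)
```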
Chapter 02 - Numeric Python
This problem set covers key concepts from Chapter 2: Vectors, Matrices, and Multidimensional Arrays. The problems test understanding of NumPy array fundamentals, including array creation, indexing, slicing, operations, and vectorized computing. Each question is designed to reinforce the core concepts presented in the chapter.
5.00 26 pts Medium 97 numpy-arrays array-attributes shape +7
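As a quick refresher on the chapter's core operations, a small example of array creation, attributes, indexing, slicing, and vectorized arithmetic (the values are arbitrary):

```python
import numpy as np

a = np.arange(12).reshape(3, 4)      # array creation + reshape
print(a.shape, a.dtype, a.ndim)      # attributes: (3, 4), platform-dependent int dtype, 2

row = a[1]                           # indexing: second row
block = a[:2, 1:3]                   # slicing: first two rows, columns 1-2
col_sums = a.sum(axis=0)             # vectorized reduction down each column
scaled = a * 2.5 + 1.0               # elementwise, broadcasted arithmetic (no Python loop)
print(row, block, col_sums, scaled, sep="\n")
```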
USAAIO 2025 R1P3 - Logistic Regression Implementation
This problem focuses on implementing logistic regression from scratch using the Titanic dataset. You will work through data pre-processing, mathematical derivations, and implement both gradient descent and Newton's method for logistic regression. The dataset contains passenger information from the Titanic, and your goal is to predict survival based on various features.
10.00 48 pts Easy 93 data-loading pandas data-exploration +7
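As a reference point for the gradient-descent part, a minimal sketch of batch gradient descent for logistic regression on toy data; the Titanic features, preprocessing, and the Newton's-method step are left to the actual problem, and the shapes and learning rate here are assumptions.

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def gradient_descent(X, y, lr=0.1, n_iters=1000):
    """Fit logistic-regression weights by batch gradient descent.

    X: (n_samples, n_features) design matrix (assumes a bias column of ones is included).
    y: (n_samples,) binary labels in {0, 1}.
    """
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        p = sigmoid(X @ w)                 # predicted probabilities
        grad = X.T @ (p - y) / len(y)      # gradient of the mean negative log-likelihood
        w -= lr * grad
    return w


# Toy data, purely illustrative (not the Titanic dataset).
rng = np.random.default_rng(0)
X = np.hstack([np.ones((200, 1)), rng.normal(size=(200, 2))])
y = (X[:, 1] + 0.5 * X[:, 2] > 0).astype(float)
print(gradient_descent(X, y))
```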
USAAIO 2025 R1P2 - Basics of Neural Network - From Linear Regression to DNN Training
This problem covers the basics of neural networks. Each part has a specific purpose and deliberately tests a particular skill; do not attempt to find a shortcut that circumvents the rules. All coding tasks must run on CPUs, **not GPUs** (a minimal CPU-only training-loop sketch follows this entry).
10.00 36 pts Easy 96 learning-rate-scheduler pytorch optimization +12
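To illustrate the kind of machinery these parts exercise (a linear model trained with an optimizer and a learning-rate scheduler, CPU-only), here is a hedged minimal training loop on synthetic data; none of the problem's actual specifications are reproduced, and the model, scheduler, and hyperparameters are placeholders.

```python
import torch
from torch import nn

torch.manual_seed(0)
X = torch.randn(256, 3)                                        # synthetic inputs (CPU tensors)
y = X @ torch.tensor([1.5, -2.0, 0.5]) + 0.1 * torch.randn(256)

model = nn.Linear(3, 1)                                        # linear regression as the simplest "network"
opt = torch.optim.SGD(model.parameters(), lr=0.1)
sched = torch.optim.lr_scheduler.StepLR(opt, step_size=20, gamma=0.5)  # halve lr every 20 epochs
loss_fn = nn.MSELoss()

for epoch in range(60):
    opt.zero_grad()
    loss = loss_fn(model(X).squeeze(-1), y)
    loss.backward()
    opt.step()
    sched.step()                                               # scheduler steps once per epoch

print(f"final loss {loss.item():.4f}, lr {sched.get_last_lr()[0]:.4f}")
```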
USAAIO 2025 R1P1 - Fibonacci Matrix Form
Let us consider the following sequence: $$ F_n = F_{n-1} + F_{n-2},\ \forall\ n \ge 2. $$
8.00 27 pts Medium 96 fibonacci sequence linear algebra matrix form +7
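The matrix form the title refers to is the standard identity below (stated with the usual seeds $F_0 = 0$, $F_1 = 1$, which the full problem statement may define differently); it converts the recurrence into matrix powers and allows $O(\log n)$ evaluation by fast exponentiation:

$$ \begin{pmatrix} F_{n+1} \\ F_n \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} F_n \\ F_{n-1} \end{pmatrix}, \qquad \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}^{n} = \begin{pmatrix} F_{n+1} & F_n \\ F_n & F_{n-1} \end{pmatrix} \quad (n \ge 1). $$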
IAIO 2024 Part 2 - Machine Learning Algorithms and Deep Learning
This problem covers the remaining categories of the 2024 International Artificial Intelligence Olympiad (IAIO), focusing on machine learning algorithms and deep learning. You'll work through practical implementations of k-means clustering, deep learning architectures, and advanced machine learning theory, including kernel methods and the Perceptron algorithm. The problems cover:
- K-means clustering algorithm implementation and convergence (a minimal sketch follows this entry)
- Deep learning architectures (DALL-E, Transformers)
- Perceptron algorithm and kernel methods
- Mathematical proofs and theoretical analysis
- Parameter counting and computational complexity
10.00 44 pts Hard 99 k-means clustering euclidean distance machine learning +7
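For the first bullet, a minimal sketch of Lloyd's algorithm for k-means with Euclidean distances; the random initialization, fixed iteration cap, and toy data are simplifying assumptions, and the convergence analysis itself is what the problem asks for.

```python
import numpy as np


def kmeans(X, k, n_iters=100, seed=0):
    """Lloyd's algorithm: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]   # random initial centroids
    for _ in range(n_iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Update step: mean of the points assigned to each cluster.
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):   # converged: centroids stopped moving
            break
        centroids = new_centroids
    return labels, centroids


# Tiny illustrative run on two well-separated Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
labels, centroids = kmeans(X, k=2)
print(centroids)
```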

Knowledge Graphs

USA AI Olympiad

Explore competitive programming and AI contest preparation concepts

Grade 5 Math

Discover elementary mathematics concepts and learning paths

Featured PDFs

System Design Interview: An Insider's Guide Volume 2
116 questions 348 pts
System Design Interview: An Insider's Guide
108 questions 317 pts
UNICALLI: A UNIFIED DIFFUSION FRAMEWORK FOR COLUMN-LEVEL GENERATION AND RECOGNITION OF CHINESE CALLIGRAPHY
10 questions 38 pts
The Principles of Deep Learning Theory
107 questions 418 pts

Featured Books

Acing the System Design Interview
153 questions 456 pts
Numerical Python: Scientific Computing and Data Science Applications with Numpy, SciPy and Matplotlib
190 questions 543 pts
Hands-On Machine Learning with Scikit-Learn and PyTorch
200 questions 554 pts
Deep Reinforcement Learning Hands-On - Third Edition
222 questions 720 pts

Featured Videos

Flow-Matching vs Diffusion Models explained side by side
10 questions 29 pts
Attention in transformers, step-by-step | Deep Learning Chapter 6
10 questions 30 pts
Knowledge Distillation: How LLMs train each other
10 questions 27 pts
Diffusion Model
10 questions 32 pts