My reviews of some courses that I took in 2023 and 2024.
Covers many interesting papers in the field: Raft, Zookeeper, Spanner, …
Lab exercises were fun, and mostly involved implementing Raft in Go.
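The labs implement the full protocol in Go; as a flavor of what's involved, here is a minimal Python sketch of just Raft's RequestVote rule (a hypothetical simplification: it omits the log up-to-date check and all RPC plumbing).

```python
class RaftNode:
    """Toy sketch of Raft's voting rule (simplified: no log checks, no RPCs)."""

    def __init__(self):
        self.current_term = 0
        self.voted_for = None  # candidate id voted for in current_term

    def request_vote(self, candidate_id, candidate_term):
        """Grant a vote only if the candidate's term is current and we
        haven't already voted for a different candidate this term."""
        if candidate_term < self.current_term:
            return False  # stale candidate
        if candidate_term > self.current_term:
            # Newer term observed: advance our term and reset our vote.
            self.current_term = candidate_term
            self.voted_for = None
        if self.voted_for in (None, candidate_id):
            self.voted_for = candidate_id
            return True
        return False
```

The "at most one vote per term" invariant above is what guarantees at most one leader can be elected in any given term.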
Covers many simple but clever tricks for estimating values, inferring relationships between variables, and proving statements.
Almost no prerequisite knowledge needed.
Covers convexity, duality, Newton’s method, barrier method, and many applications.
Prof. Stephen Boyd’s lectures were surprisingly engaging, as he often shares personal experiences from practical applications.
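Newton’s method is one of the course’s workhorses; a minimal sketch of the unconstrained 1-D case (the example function and tolerances are my own, not from the course):

```python
import math

def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=50):
    """Newton's method for minimizing a smooth convex function:
    iterate x <- x - f'(x) / f''(x) until the step is tiny."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# f(x) = e^x - 2x is convex with minimizer x* = ln 2.
x_star = newton_minimize(lambda x: math.exp(x) - 2, math.exp, 1.0)
```

Near the minimizer the iterates converge quadratically, which is why the barrier method can afford to run Newton at every centering step.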
Discrete math portion covers logic, proofs, stable matching, graphs, modular arithmetic, coding theory, combinatorics and computability*.
Probability portion covers random variables, covariance, correlation, concentration inequalities, conditional probability and Markov chains.
No-frills and well paced.
*I recommend reading The Annotated Turing on computability (and its history).
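To illustrate the Markov chain material, here is a small sketch (my own toy example, not from the course) that finds a chain’s stationary distribution by repeatedly applying the transition matrix:

```python
def stationary(P, pi0, steps=200):
    """Iterate pi <- pi P for a row-stochastic transition matrix P
    (given as lists of lists); for an ergodic chain this converges
    to the stationary distribution."""
    pi = pi0[:]
    n = len(P)
    for _ in range(steps):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Two-state weather chain: sunny stays sunny w.p. 0.9,
# rainy stays rainy w.p. 0.5.
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary(P, [1.0, 0.0])  # converges to [5/6, 1/6]
```

Solving pi = pi P by hand gives the same answer (pi_rainy = 0.2 * pi_sunny), which is a nice sanity check on the iteration.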
An introduction to DL that is rich in both theory and practice (with PyTorch), with lots of pretty visualizations.
Covers the basics* (backpropagation, CNNs, optimizers, …), RNNs, transformers, autoencoders, GANs, self-supervised learning and energy-based models.
Taught in part by Yann LeCun.
*Alternatively, the Deep Learning Book is great for learning DL basics.
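Backpropagation is really just the chain rule applied systematically; a minimal sketch for a single sigmoid neuron with squared loss (my own toy example, verifiable against finite differences):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward_backward(w, b, x, y):
    """One neuron, squared loss L = (sigmoid(w*x + b) - y)^2.
    Returns (loss, dL/dw, dL/db) computed via the chain rule."""
    # Forward pass.
    z = w * x + b
    a = sigmoid(z)
    loss = (a - y) ** 2
    # Backward pass: chain rule through loss -> sigmoid -> affine.
    dL_da = 2 * (a - y)
    da_dz = a * (1 - a)   # sigmoid'(z) expressed via its output
    dL_dz = dL_da * da_dz
    return loss, dL_dz * x, dL_dz
```

PyTorch’s autograd does exactly this bookkeeping automatically over arbitrary computation graphs.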
Covers MDPs, Bellman equations, DP, TD-learning, SARSA, Q-learning, actor-critic methods and MCTS.
The final assignment applies RL to the Easy21 game. Taught by David Silver.
I took an old (2015) iteration of this course, which is outdated by now (e.g. it didn’t cover the now very well-known PPO).
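The core Q-learning update fits in a few lines; a minimal sketch on a toy chain environment of my own (not the course’s Easy21 assignment):

```python
import random

def q_learning(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.2):
    """Tabular Q-learning on a toy chain: states 0..n-1, actions
    0 = left, 1 = right; reward 1 only on reaching the last state."""
    Q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # Epsilon-greedy action selection.
            if random.random() < eps:
                a = random.randrange(2)
            else:
                a = 0 if Q[s][0] > Q[s][1] else 1
            s2 = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
            r = 1.0 if s2 == n_states - 1 else 0.0
            # Q-learning update: bootstrap from the greedy next-state value.
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q
```

After training, the greedy policy moves right from every state, and the learned values decay by roughly a factor of gamma per step away from the goal.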
Covers building a computer (which eventually plays Tetris) from a bunch of NAND gates. The Hardware Description Language (HDL) code was written and tested in Nand2Tetris’ online IDE.
Very interesting how just a few primitives — NAND gates, Data Flip-Flops (for registers), ROM, and peripherals (screen and keyboard) — are sufficient to build a computer.
I also found Algorithmica and CS:APP to be great resources for learning more about low-level CS.