Tinygrad is a machine learning framework written in Python, similar to PyTorch, TensorFlow, and Keras. It was created by George Hotz.
Tinygrad is developed and hosted on GitHub; the repository is tinygrad/tinygrad.
Documentation
The official Tinygrad documentation is hosted at docs.tinygrad.org.
Performance
Aside from supporting many backends, another goal of Tinygrad is good performance for both training and inference.
Beam search
A given operation can be executed with very different launch "dimensions" on a GPU. Tinygrad has an option called BEAM, set as an environment variable, that tries different combinations and picks the most performant one. Once a kernel has been searched and executed, the result is cached, so subsequent runs don't pay the search penalty.
TinyJit
TinyJit is similar in spirit to torch.compile. It JITs a given function so that repeated calls run the captured GPU kernels directly, without the slow round trips between Python and the GPU. Everything inside the JIT'ed function avoids that back-and-forth.
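A toy sketch of the capture-and-replay idea (plain Python, not the real TinyJit): the first call executes the Python function and records the "kernel launches" it produces; later calls replay the recorded list directly instead of re-running the Python code. The class and function names here are invented for illustration.

```python
class ToyJit:
    """Capture what a function launches once, then replay it on later calls."""
    def __init__(self, fn):
        self.fn = fn
        self.captured = None  # recorded launches after the trace run

    def __call__(self, launches: list[str]) -> list[str]:
        if self.captured is None:
            # Trace run: execute the Python function and record its launches.
            self.captured = self.fn(launches)
        # Replay: skip the Python-level work entirely on subsequent calls.
        return self.captured

@ToyJit
def forward(launches: list[str]) -> list[str]:
    # Pretend each list element dispatches a GPU kernel.
    return ["matmul"] + launches + ["relu"]
```

In tinygrad the real decorator is `TinyJit` (importable from `tinygrad`), typically applied to a training or inference step whose tensor shapes stay fixed between calls.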