
Tinygrad



Tinygrad is a machine learning framework written in Python, similar to PyTorch, TensorFlow and Keras. It was created by George Hotz.

Tinygrad is developed and hosted on GitHub; the repository is tinygrad/tinygrad.
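
To give a feel for the PyTorch-like API, here is a minimal sketch. It assumes a recent tinygrad release; the import path and the exact methods used are my assumptions, not something stated on this page.

from tinygrad import Tensor

x = Tensor([[1.0, 2.0], [3.0, 4.0]])
w = Tensor.randn(2, 2, requires_grad=True)   # random weights, analogous to torch.randn
y = x.matmul(w).relu().sum()                 # forward pass builds a lazy computation graph
y.backward()                                 # autograd, much like PyTorch
print(w.grad.numpy())                        # results come back as NumPy arrays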

Documentation

The official Tinygrad documentation is hosted at docs.tinygrad.org.

Performance

Aside from supporting many backends, another goal of Tinygrad is good performance for both training and inference.

A given operation can be executed with very different launch dimensions on a GPU, and the choice has a large impact on speed. Tinygrad has an option called BEAM that searches different combinations and picks the most performant one. Once a kernel has been searched and executed, the result is cached, so subsequent runs don't pay the search penalty.
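
As a rough sketch of how this is used (the exact knob is my assumption): BEAM is typically set as an environment variable, e.g. BEAM=2 python train.py, or from Python before the first kernel is compiled:

import os
os.environ["BEAM"] = "2"   # beam width; higher values search more candidate kernels
# Run the tinygrad model as usual: the first execution of each kernel searches
# configurations and is slower, then the winners are cached for subsequent runs.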

TinyJit

This is similar in spirit to torch.compile. It JITs a given function so it executes faster, avoiding the slow back-and-forth between Python and the GPU; everything inside the JIT'ed function runs without that round trip.
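
A minimal sketch of how this looks in user code, assuming the TinyJit decorator exported by recent tinygrad releases (the names here are my assumption):

from tinygrad import Tensor, TinyJit

@TinyJit
def step(x: Tensor) -> Tensor:
    # the first calls trace the GPU kernels; later calls replay them without Python overhead
    return (x @ x.T).relu().realize()

for _ in range(5):
    out = step(Tensor.randn(8, 8))   # inputs should keep the same shape across calls

As far as I understand, the initial calls capture the kernels, and after that the whole function is replayed as a batch of cached GPU launches.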

Citation

If you find this work useful, please cite it as:
@article{yaltirakli,
  title   = "Tinygrad",
  author  = "Yaltirakli, Gokberk",
  journal = "gkbrk.com",
  year    = "2025",
  url     = "https://www.gkbrk.com/tinygrad"
}

