As the capabilities of large language models (LLMs) have continued to progress, AI research has generally become less accessible to those outside of frontier labs. Although a variety of open-source LLMs are publicly available, there are two key issues that have consistently impeded progress in open research:
1. The performance gap between closed and open models.
2. The prevalence of open-weight models and the scarcity of fully-open models.
Saved: January 12, 2026
Tags: ai, llm, open source
This post introduces all of the core hardware concepts and programming techniques that underpin state-of-the-art (SOTA) NVIDIA GPU matrix-multiplication (matmul) kernels.
Saved: January 12, 2026
Tags: ai, gpu
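
For context on the matmul post above, here is a minimal sketch of the naive CUDA kernel that hardware-aware matmul discussions typically use as a baseline. This is illustrative only and not taken from the post; the kernel name, the M/N/K dimension parameters, and the row-major layout are assumptions.

```cuda
#include <cuda_runtime.h>

// Naive matmul baseline: C = A * B, with A (MxK), B (KxN), C (MxN), row-major.
// Each thread computes exactly one element of C.
__global__ void naive_matmul(const float* A, const float* B, float* C,
                             int M, int N, int K) {
    int row = blockIdx.y * blockDim.y + threadIdx.y;  // output row
    int col = blockIdx.x * blockDim.x + threadIdx.x;  // output column
    if (row < M && col < N) {
        float acc = 0.0f;
        for (int k = 0; k < K; ++k) {
            acc += A[row * K + k] * B[k * N + col];
        }
        C[row * N + col] = acc;
    }
}

// Example launch configuration: 16x16 thread blocks tiling the MxN output.
// dim3 block(16, 16);
// dim3 grid((N + block.x - 1) / block.x, (M + block.y - 1) / block.y);
// naive_matmul<<<grid, block>>>(dA, dB, dC, M, N, K);
```

SOTA kernels improve on this baseline with techniques such as shared-memory tiling and tensor-core instructions; the sketch is only a reference point for what those optimizations replace.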