HeadlinesBriefing.com

C‑Only Transformer Engine TRiP Offers Full Training and Vision

Hacker News

Developer carlovalenti released TRiP, a self‑contained C engine that implements transformer inference, training, tokenization, chat, and vision in a handful of source files. Built during evenings and weekends over 18 months, from March 2024 to August 2025, the project targets education rather than performance competition. It supports Gemma 1.x, Llama 2, PaliGemma, and GPT‑2, and handles SafeTensors, Karpathy’s formats, and multiple weight precisions.

The codebase stays under 10 KB per file, with explicit forward and backward kernels for matmul, softmax, RMSNorm, and attention. Training runs full backpropagation with AdamW, cosine‑annealed learning rates, and gradient clipping, while inference offers greedy, top‑k, and nucleus sampling. Memory‑mapped weights let machines with modest RAM host larger models, though bfloat16 yields negligible speedups on typical modern CPU‑only x86 systems.

Installation requires only gcc 13+, OpenMP, libjpeg, and libx11; no CMake, Python, or external frameworks are needed. Users can compile with a single make command, then run chat, decode prompts, or launch multimodal inference on JPEG images via the PaliGemma interface. Distributed under CC BY‑NC 4.0, TRiP serves as a hands‑on reference for researchers and anyone dissecting transformer internals.