AI agents call git constantly: status, diff, log, show. I pulled data from 3,156 real coding sessions; git made up 7.4% of all shell commands and produced roughly 459,000 tokens of output. Codex is even worse: over 10% of its bash calls are git.
Makes sense, though, right? git’s output was designed for humans: verbose headers, instructional text, column padding, decorative formatting. It’s the informational equivalent of wrapping every answer in a gift bag with tissue paper. Machines don’t need the tissue paper or the gift bag. Every extra token costs money and adds latency.
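To see how much of git’s default output is wrapping, compare a typical human-oriented `git status` banner with git’s own machine-oriented `git status --porcelain` form for the same repo state. The snippet below uses hardcoded sample output and whitespace word count as a crude proxy for LLM tokens (real BPE tokenizers differ, but the ratio is similar):

```python
# Typical human-oriented `git status` output for one modified file.
human = """\
On branch main
Your branch is up to date with 'origin/main'.

Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git restore <file>..." to discard changes in working directory)
	modified:   src/main.zig

no changes added to commit (use "git add" and/or "git commit -a")
"""

# The same repo state in git's machine format (`git status --porcelain`).
porcelain = " M src/main.zig\n"

def rough_tokens(s: str) -> int:
    # Whitespace split: a crude stand-in for an LLM tokenizer.
    return len(s.split())

print(rough_tokens(human), "vs", rough_tokens(porcelain))  # → 50 vs 2
```

git already ships plumbing flags like `--porcelain` for exactly this reason; nit’s bet is that the compact form should be the default, not an opt-in.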
So I built nit. A native git replacement written in Zig that talks directly to the git object database via libgit2. Defaults tuned for machines.
The Numbers
Token savings (nit compact vs git default):
| Command | git tokens | nit tokens | Savings |
|---|---|---|---|
| status | ~125 | ~36 | 71% |
| log -20 | ~2,273 | ~301 | 87% |
| diff | ~1,016 | ~657 | 35% |
| show --stat | ~260 | ~118 | 55% |
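The savings column follows directly from the token counts (savings = 1 − nit/git). A quick sanity check, using the approximate counts from the table above:

```python
# Recompute the savings column from the table's approximate token counts.
rows = {
    "status":      (125, 36),
    "log -20":     (2273, 301),
    "diff":        (1016, 657),
    "show --stat": (260, 118),
}

for cmd, (git_toks, nit_toks) in rows.items():
    savings = 1 - nit_toks / git_toks
    print(f"{cmd:12s} {savings:.0%}")
# status 71%, log -20 87%, diff 35%, show --stat 55%
```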
Projected across the real session data, nit’s compact defaults would have saved 150-250K tokens. And it’s faster, too.
100 hyperfine runs on a real repo:
| Command | git | nit | Speedup |
|---|---|---|---|
| status | 13.7ms | 8.4ms | 1.64x |
| diff | 14.3ms | 9.9ms | 1.44x |
| show | 10.2ms | 7.3ms | 1.39x |
How It Works