I shipped gni-compression to npm two days ago. One of the first questions I got (from myself, running benchmarks at midnight): does it work on anything other than chat data? Short answer: not yet. Long answer: I found out exactly why, and it led me somewhere more interesting than I expected. After the npm launch I ran GN against the Silesia corpus — the standard general-purpose compression benchmark. Dick
TL;DR: Mistral Medium 3.5 is a 128B open-weight model released April 29, 2026, with a 256K context window, configurable reasoning, and native multimodal input. It scores 77.6% on SWE-Bench Verified — close but not ahead of Claude Sonnet 4.6 — and ships alongside Vibe, a cloud coding agent that submits pull requests directly to GitHub without you babysitting it. API pricing is $1.50 per million inp