It started at midnight. I had 24 hours, a free Replit subscription, and an idea: what if I could build something like Miro, but actually understand every line of code in it? The core problem I had to solve first: multiplayer sync sounds simple until you actually build it. The hard part isn't sending a canvas update, it's figuring out what to send. canvas.on('object:modified', (e) => { socket.emi
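The snippet above is cut off mid-call, but the idea it gestures at, emitting only what changed rather than the whole canvas, can be sketched without any canvas or socket library. Everything here (`diffState`, the rectangle shape) is a hypothetical illustration, not code from the post:

```typescript
// Hypothetical sketch: instead of broadcasting the entire canvas on
// every 'object:modified' event, compute the delta between an
// object's last-synced state and its current state, and send only
// the changed fields over the wire.
type ObjectState = Record<string, number | string>;

function diffState(prev: ObjectState, next: ObjectState): ObjectState {
  const delta: ObjectState = {};
  for (const key of Object.keys(next)) {
    // Keep only fields whose value actually changed.
    if (prev[key] !== next[key]) delta[key] = next[key];
  }
  return delta;
}

// Example: a rectangle was dragged; only left/top changed.
const before = { id: "rect-1", left: 100, top: 50, fill: "red" };
const after = { id: "rect-1", left: 140, top: 80, fill: "red" };
const payload = diffState(before, after);
// payload holds just { left: 140, top: 80 }, far smaller than the full object
```

In a real setup the `payload` (plus the object id) is what would go into the `socket.emit(...)` call the excerpt truncates.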
Some time ago I shipped a desktop app to generate LLM fine-tuning datasets. It worked: my Qwen2.5-Coder-7B fine-tune jumped from 55.5% → 72.3% on HumanEval. The whole pipeline ran on OpenRouter: pick a model, click Generate, get JSONL. v1.0.3-beta ships multi-provider LLM support: Ollama, LM Studio, llama.cpp, or any custom OpenAI-compatible endpoint, plus the original OpenRouter. Mix and match: g
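The "any OpenAI-compatible endpoint" part is what makes mix-and-match cheap to support: every provider listed speaks the same chat-completions request shape, so one request builder covers all of them. A minimal sketch, assuming the providers' usual default base URLs (the names and `buildChatRequest` helper are illustrative, not from the release):

```typescript
// Sketch: one request builder for any OpenAI-compatible provider.
// Base URLs below are common defaults (an assumption; check your setup).
interface Provider {
  baseUrl: string;
  apiKey?: string; // local servers typically ignore auth
}

const providers: Record<string, Provider> = {
  openrouter: { baseUrl: "https://openrouter.ai/api/v1", apiKey: process.env.OPENROUTER_API_KEY },
  ollama: { baseUrl: "http://localhost:11434/v1" },
  lmstudio: { baseUrl: "http://localhost:1234/v1" },
};

function buildChatRequest(name: string, model: string, prompt: string) {
  const p = providers[name];
  return {
    url: `${p.baseUrl}/chat/completions`,
    headers: {
      "Content-Type": "application/json",
      // Only cloud providers need the bearer token.
      ...(p.apiKey ? { Authorization: `Bearer ${p.apiKey}` } : {}),
    },
    body: { model, messages: [{ role: "user", content: prompt }] },
  };
}

const req = buildChatRequest("ollama", "qwen2.5-coder:7b", "Generate one JSONL sample");
// req.url is "http://localhost:11434/v1/chat/completions"
```

Swapping providers is then just a config change; the generation loop never needs to know which backend it is talking to.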
A beautiful personal tribute to the practice of programming, interrupted by the switch to LLMs.
FutureMe has 15 million letters in its database. They've been there since 2002. Some of them will be there in 2050. Evengood will have zero. This week I shipped The Quiet Letter — a feature where you write to your future self today, we email it on a date you pick, and we hard-delete the row from our database within 24 hours of sending it. The email is the only artifact. We don't keep a copy. Every
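The mechanism described, send the email on the chosen date, then hard-delete the row so the email is the only artifact, can be sketched with an in-memory store standing in for the database. All names here (`scheduleLetter`, `deliverDue`) are hypothetical, not Evengood's actual code:

```typescript
// Hypothetical sketch of the delivery loop: once a letter's send
// date arrives, email it, then hard-delete the row. No soft-delete
// flag, no archive table: after delivery, the email is the only copy.
interface Letter {
  id: string;
  to: string;
  body: string;
  sendAt: number; // delivery time (epoch-style number for simplicity)
}

const letters = new Map<string, Letter>(); // stands in for the DB table

function scheduleLetter(l: Letter) {
  letters.set(l.id, l);
}

function deliverDue(now: number, sendEmail: (l: Letter) => void) {
  for (const l of [...letters.values()]) {
    if (l.sendAt <= now) {
      sendEmail(l); // the only surviving artifact
      letters.delete(l.id); // hard delete the row
    }
  }
}

const sent: string[] = [];
scheduleLetter({ id: "1", to: "me@example.com", body: "hi, future me", sendAt: 10 });
deliverDue(20, (l) => sent.push(l.to));
// after the run, `sent` holds the recipient and `letters` is empty
```

In a real database the send and the delete would want to be tied together (e.g. delete only after the mail provider confirms acceptance) so a crash between the two steps can't silently drop a letter.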
Most of my team got laid off because "AI can do their jobs now." I'm probably the last one standing. And every day I use the same tools that replaced them, fix their mistakes, and write in the standup that AI helped me move faster. Nobody was being honest about this. So I built AIHallucination — a community for real, unfiltered AI experiences. The fails, the wins, the absurd outputs, the expectati
TL;DR. The job: take typia's existing TS files, translate the contents line by line into Go, and change the extensions to .go. Keep the algorithms and compiler logic intact. Iterate until 80,000 lines of e2e tests pass. What the AI actually did: a half-assed implementation, then it deleted all the failing tests. Burned 8 billion tokens to hardcode every output into a 168-case lookup table, and call