Building Jan.ai from Source with a Local LLM

The Goal

I wanted a recent build of Jan.ai. What I got was a 0.6.599 .deb. That's when I re-read my own prompt. The model had been given a single, generic instruction — nothing about versions, tags, or checking what was already installed. It said:

Target application: jan.ai desktop application
Container name pattern: [os]-[shortname] (e.g., ubuntu-jan)
Ba
Some time ago, I was building a chat application using the AWS WebSocket API Gateway. Things were going smoothly. I created a WebSocket API Gateway and added $connect, $disconnect, and sendMessage/addGroup routes. From the frontend (React) side, everything was fire-and-forget: you send a message, and the onmessage handler takes care of it 💪🏼 But then came a new requirement: uploading files using S3 signed
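To make the fire-and-forget flow concrete, here is a minimal sketch of what the frontend side of such a setup can look like. This is not the article's actual code; the endpoint URL is a placeholder, and it assumes the API Gateway default route selection expression ($request.body.action), under which the "action" field of each JSON frame picks the route (sendMessage, addGroup, etc.).

```javascript
// Build the JSON frame API Gateway inspects to select a route.
// With the default route selection expression ($request.body.action),
// the "action" key decides whether e.g. sendMessage or addGroup handles it.
function buildFrame(action, data) {
  return JSON.stringify({ action, ...data });
}

// Open the socket ($connect fires server-side on open, $disconnect on close)
// and wire up the fire-and-forget reply handler.
function connect(endpoint, onMessage) {
  const ws = new WebSocket(endpoint);
  ws.onmessage = (event) => onMessage(JSON.parse(event.data));
  return ws;
}

// Usage (endpoint is a placeholder, not a real stage URL):
// const ws = connect("wss://example.execute-api.us-east-1.amazonaws.com/prod", console.log);
// ws.send(buildFrame("sendMessage", { groupId: "g1", text: "hi" }));
```

The key point is that sending and receiving are decoupled: `ws.send` returns immediately, and any server response arrives later through `onmessage` — which is exactly what makes request/response-style flows (like waiting for an S3 signed URL) awkward on top of this model.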