Installing is quite a hassle for many people, I guess. It uses LM Studio or Ollama plus a specific model, and it needs Python and specific packages/libraries. Sigh.
I will see if I can make it easier. Perhaps a proper installer script, or maybe even compile it.
Depending on which platforms it will support, perhaps a compiled standalone desktop app (like Electron) that includes a pre-launcher/pre-loader on startup, which checks that all required add-ons are properly installed (if something is missing, the main program won't run).
I used this method for the Vanilla version, and it was user-friendly/hassle-free for the most part. Of course your application seems way more complex, so I'm not sure how many add-ons are needed to fully run your project. Well, good luck! o7
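A pre-launcher like that can be surprisingly small. Here's a minimal sketch of the idea as a shell check run before the main program; the tool names (`python3`, `ollama`) are assumptions based on what's mentioned in this thread, not the project's actual dependency list.

```shell
#!/bin/sh
# Hypothetical pre-launcher sketch: report any missing command-line tools
# before starting the main program, instead of letting it crash mid-run.
check_deps() {
  missing=""
  for tool in "$@"; do
    # command -v exits non-zero when the tool is not on PATH
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
  done
  echo "$missing"   # possibly empty; one leading space per entry
}

# Example gate (tool names are assumptions for this project):
#   [ -z "$(check_deps python3 ollama)" ] || { echo "Install the missing dependencies first." >&2; exit 1; }
```

The same idea extends to Python packages (e.g. `python3 -c "import somepackage"` per requirement) if the app needs more than binaries on PATH.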
Just dropped in to say this repo is 100% runnable on macOS.
I’m playing with it now and swapping out Ollama for Claude Sonnet 4.6 - I think it’s going to perform a lot better. OP - you might consider allowing users to drop in an API key and select from a set of popular frontier models.
Looking forward to trying it! Looks very intriguing!
Just FYI in case it wasn’t realized: the video link doesn’t work, and the screenshots are really small / low-res (500x241), so you can’t see any details at this time, even when zooming in. They’re just blurry and the text is unreadable.
Wanted to chime in and echo what @funfunfun mentioned - I’ve been using this since fall of last year on an Apple Silicon Mac, and I found the installation process to be super simple. Git clone, download Ollama, download the model, Python venv, install requirements, run app. Six commands, copy-pasted.
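For anyone curious, those six steps look roughly like the following; the repository URL, model name, and entry-point script are placeholders I've filled in for illustration, not the project's actual values.

```shell
# Sketch of the six-step install described above; <...> values are placeholders.
git clone https://github.com/<user>/<repo>.git && cd <repo>   # 1. clone the repo
# 2. install Ollama (e.g. download it from https://ollama.com)
ollama pull <model-name>                        # 3. download the model
python3 -m venv .venv && . .venv/bin/activate   # 4. create and activate a venv
pip install -r requirements.txt                 # 5. install Python requirements
python app.py                                   # 6. run the app (entry point assumed)
```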
It’s extremely lightweight: the installation package is only 19.3 MB. You can even use it from a mobile phone; as long as you’re on the same local network, you can scan a QR code to open the Chat page. And it can connect to Ollama, so everything stays 100% private.