TextGen (formerly text-generation-webui) is now a native desktop app: an open-source alternative to LM Studio.
TextGen (oobabooga's text-generation-webui) relaunches as a native desktop app for Windows, Linux, and macOS, replacing its browser-based UI with an Electron-based portable build.
Excerpt
Hi all,
I have been making a lot of updates to my project, and I wanted to share them here.
TextGen (previously text-generation-webui, also known by my username oobabooga or ooba) has been in development since December 2022, before LLaMA and llama.cpp existed.
In the last two months, the project has evolved from a web UI to a **no-install desktop app** for Windows, Linux, and macOS with a polished UI. I have created a very minimal and elegant Electron integration for that. (Did you know LM Studio is also a web UI running over Electron? Not sure many people know that.)
https://preview.redd.it/tk8oibhgjw0h1.png?width=1686&format=png&auto=webp&s=95c70f769766466885c8fdc6e7211525a371a920
It works like this:
1. You download a *portable build* from the [releases page](https://github.com/oobabooga/textgen/releases)
2. Unzip it
3. Double-click textgen
4. A window appears
There is no installation, and no files are ever created outside the extracted folder. It's fully self-contained. All your chat histories and settings are stored in a `user_data` folder shipped with the build.
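In shell terms, the steps above boil down to the following (Linux example; the archive name here is illustrative, so substitute the actual file from the releases page for your platform and backend):

```shell
# Hypothetical archive name -- pick the real one from
# https://github.com/oobabooga/textgen/releases for your OS/backend.
ARCHIVE=textgen-portable-linux-cuda.zip

unzip "$ARCHIVE"
cd "${ARCHIVE%.zip}"   # the app is fully contained in this one folder
./textgen              # a window appears; chats and settings live in ./user_data
```

Deleting that folder removes the app and all its data, since nothing is ever written outside it.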
There are builds for CUDA, Vulkan, CPU-only, Mac (Apple Silicon and Intel), and ROCm.
Some differentiating features:
* Full privacy. Unlike LM Studio, it doesn't phone home on every launch with your OS, CPU architecture, app version, and inference backend choices. Zero outbound requests.
* ik_llama.cpp builds (LM Studio and Ollama only ship vanilla llama.cpp). ik_llama.cpp has new quant ty
Read at source: https://www.reddit.com/r/LocalLLaMA/comments/1tbyyee/textgen_is_now_a_native_desktop_app_opensource/