Locally Uncensored - AGPL-3.0 local AI app built with Tauri v2 instead of Electron


Been working on this for a while and figured r/opensource would appreciate the approach. Locally Uncensored is a desktop app for running AI locally - chat, image generation, video generation - all without cloud dependencies.

The part that might actually interest people here is the tech stack and the licensing decisions. It's built with Tauri v2 (Rust backend) instead of Electron, and the difference is night and day: the whole app is about 15MB instead of 200MB+, it uses far less RAM, and Rust handles the backend work that would otherwise be a node.js mess. No bundled Chromium - it just uses your system webview.

It connects to 12 different backends (ollama, llama.cpp, koboldcpp, vllm, tabbyapi, etc.) and has plug-and-play ComfyUI integration for image and video generation. Basically I tried to make local AI less painful to set up.

It's licensed under AGPL-3.0 because I wanted to make sure any modifications stay open: if someone builds on this, they have to share back. MIT felt too permissive for something that could easily get wrapped into a proprietary product; the AGPL ensures the community benefits from any improvements.

v2.3.0 just dropped with ComfyUI integration, image-to-image, image-to-video (runs on 6GB VRAM), and support for newer models like GLM, Qwen, Gemma 4. Contributions welcome - there are plenty of rough edges left, and I'd rather have community input than build everything in isolation.

github: https://github.com/PurpleDoubleD/locally-uncensored

submitted by /u/GroundbreakingMall54
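For anyone curious how "connects to 12 backends" can work in practice: most of these servers expose an OpenAI-compatible HTTP API, so the plumbing largely reduces to knowing each backend's conventional default port and chat endpoint. A rough Rust sketch of that routing table - the function name, the port defaults, and the endpoint paths here are my assumptions about how such an app might do it, not code from the actual repo:

```rust
/// Hypothetical helper: map a backend name to its conventional local
/// base URL and chat endpoint, so the app can probe for a running server.
/// Ports are the common upstream defaults (assumptions, not guarantees -
/// users can run any of these on any port).
fn default_endpoint(backend: &str) -> Option<(String, &'static str)> {
    let (port, path) = match backend {
        // Ollama also exposes an OpenAI-compatible API alongside its native one
        "ollama" => (11434, "/v1/chat/completions"),
        "llama.cpp" => (8080, "/v1/chat/completions"),
        // koboldcpp's native generate endpoint differs from the OpenAI shape
        "koboldcpp" => (5001, "/api/v1/generate"),
        "vllm" => (8000, "/v1/chat/completions"),
        "tabbyapi" => (5000, "/v1/chat/completions"),
        _ => return None,
    };
    Some((format!("http://127.0.0.1:{port}"), path))
}

fn main() {
    if let Some((base, path)) = default_endpoint("ollama") {
        println!("probing {base}{path}");
    }
}
```

The nice part of converging on the OpenAI-compatible shape is that one request/response codepath covers most backends, with only a couple of special cases (like koboldcpp's native API) needing their own adapters.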