ChatterUI - A simple app for LLMs
https://github.com/Vali-98/ChatterUI
https://t.me/chatterui

ChatterUI is a native mobile frontend for LLMs. Run LLMs on device or connect to various commercial or open source APIs. ChatterUI aims to provide a mobile-friendly interface with fine-grained control over chat structuring.

Features:
- Run LLMs on-device in Local Mode
- Connect to various APIs in Remote Mode
- Chat with characters (supports the Character Card v2 specification; see the sketch after this post)
- Create and manage multiple chats per character
- Customize Sampler fields and Instruct formatting
- Integrates with your device's text-to-speech (TTS) engine

Usage:
Download and install the latest APK from the releases page: https://github.com/Vali-98/ChatterUI/releases/latest
iOS is currently unavailable due to a lack of iOS hardware for development.

Local Mode:
ChatterUI uses llama.cpp under the hood to run gguf files on device: https://github.com/ggerganov/llama.cpp
A custom adapter, cui-llama.rn, is used to integrate it with React Native: https://github.com/Vali-98/cui-llama.rn

To use on-device inference, first enable Local Mode, then go to Models > Import Model / Use External Model and choose a gguf model that fits in your device's memory. The importing functions are as follows:
- Import Model: copies the model file into ChatterUI, potentially speeding up startup time.
- Use External Model: uses a model from your device storage directly, removing the need to copy large files into ChatterUI, but with a slight delay in load times.
After that, you can load the model and begin chatting!

Note: for devices with a Snapdragon 8 Gen 1 and above or an Exynos 2200+, the Q4_0 quantization is recommended for optimized performance.

Remote Mode:
Remote Mode allows you to connect to a few common APIs from both commercial and open source projects.

Open source backends:
- koboldcpp
- text-generation-webui
- Ollama

Dedicated APIs:
- OpenAI
- Claude (with the ability to use a proxy)
- Cohere
- Open Router
- Mancer
- AI Horde

Generic backends:
- Generic Text Completions
- Generic Chat Completions
These should be compliant with any Text Completion/Chat Completion backend, such as Groq or Infermatic (see the request sketch below).

Custom APIs:
Is your API provider missing? ChatterUI allows you to define APIs using its template system. Read more about it here: https://github.com/Vali-98/ChatterUI/discussions/126
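For context on the "Character Card v2 specification" mentioned under Features, here is an abbreviated TypeScript sketch of what such a card looks like. The field list follows the public Character Card V2 spec but is trimmed for brevity, and the example card is invented for illustration; treat it as a sketch rather than a complete definition.

```typescript
// Abbreviated sketch of a Character Card v2 payload (the card format
// ChatterUI's character support refers to). Trimmed to core fields.
interface CharacterCardV2 {
  spec: "chara_card_v2";
  spec_version: "2.0";
  data: {
    name: string;
    description: string;
    personality: string;
    scenario: string;
    first_mes: string;         // the character's opening message
    mes_example: string;       // example dialogue
    system_prompt: string;
    alternate_greetings: string[];
    tags: string[];
    creator: string;
    character_version: string;
  };
}

// A minimal, made-up example card:
const card: CharacterCardV2 = {
  spec: "chara_card_v2",
  spec_version: "2.0",
  data: {
    name: "Ada",
    description: "A curious research assistant.",
    personality: "inquisitive, patient",
    scenario: "Helping the user debug a mobile app.",
    first_mes: "Hi! What are we building today?",
    mes_example: "<START>\n{{user}}: Hello\n{{char}}: Hello there!",
    system_prompt: "",
    alternate_greetings: [],
    tags: ["assistant"],
    creator: "example",
    character_version: "1.0",
  },
};
```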
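To illustrate what "Generic Chat Completions" compatibility means, below is a minimal sketch of the OpenAI-style request such a backend is expected to accept. This is not ChatterUI code; the endpoint URL, model name, and API key are placeholders for whatever your provider (e.g. a Groq- or Infermatic-compatible server) exposes.

```typescript
// Minimal sketch of an OpenAI-style Chat Completions request, the wire format
// that a "Generic Chat Completions" backend is assumed to speak.
// BASE_URL, MODEL, and API_KEY are placeholders, not real values.
const BASE_URL = "https://api.example.com/v1";
const MODEL = "my-model";
const API_KEY = "sk-placeholder";

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model: MODEL,
      messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: prompt },
      ],
      temperature: 0.7,  // sampler fields like this are what ChatterUI exposes for tuning
      max_tokens: 256,
    }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const data = await res.json();
  // OpenAI-compatible servers return the reply in choices[0].message.content
  return data.choices[0].message.content;
}

chat("Hello!").then(console.log).catch(console.error);
```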
2025-09-06 06:47:03 from 1 relay