Textgen is a local LLM interface for text and image generation, tool calling, and model training. It operates entirely offline, preserving user privacy with no telemetry or external requests.
Key features:
- Multiple backends - Supports several inference backends, including llama.cpp and Hugging Face Transformers.
- Chat and instruct modes - Facilitates conversation with custom characters and instruction-following.
- Vision capabilities - Attach images to messages for models that can interpret them.
- File handling - Users can upload text files and documents for discussion.
- Training options - Fine-tune models on custom datasets with support for resuming interrupted runs.
- Image generation - Dedicated features for generating images using diffusers models.
- Privacy-focused - Operates 100% offline with no external requests.
- User-friendly setup - Quick installation, including zero-setup options.
- Custom tool-calling - Models can execute functions during chat, enhancing interactivity.
- Dark/light themes - User interface supports customizable themes and syntax highlighting.
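Tool-calling typically works by having the model emit a structured call that the frontend parses, executes, and feeds back into the chat. A minimal, generic sketch of that dispatch loop (the registry decorator and JSON call format here are illustrative assumptions, not Textgen's actual API):

```python
import json

# Hypothetical tool registry: maps a tool name to a Python function.
TOOLS = {}

def tool(fn):
    """Register a function so the model can invoke it by name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def add(a, b):
    """Example tool: add two numbers."""
    return a + b

def dispatch(model_output: str) -> str:
    """Parse a JSON tool call emitted by the model and run the matching function.

    The result is serialized back to JSON so it can be appended to the
    conversation for the model's next turn.
    """
    call = json.loads(model_output)
    fn = TOOLS[call["name"]]
    result = fn(**call["arguments"])
    return json.dumps({"name": call["name"], "result": result})

# The model emits a structured call; the frontend executes it and returns the result.
print(dispatch('{"name": "add", "arguments": {"a": 2, "b": 3}}'))
# prints {"name": "add", "result": 5}
```

In a real chat loop, the serialized result would be inserted into the context as a tool message so the model can use it when composing its reply.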
Textgen is suitable for developers and researchers looking for a robust, private solution for LLM tasks.