Shimmy is a Rust-based inference server that provides local, OpenAI-compatible endpoints for machine learning models.
oMLX is an LLM inference server optimized for Apple Silicon, enabling efficient model management from the macOS menu bar.
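Both projects speak the OpenAI chat-completions wire format, so any OpenAI-style client can point at them by swapping the base URL. A minimal sketch of the request body such a local endpoint accepts; the base URL and model name below are assumptions, not values documented by either project:

```python
import json

# OpenAI-compatible chat-completions request body, as accepted by
# local servers like Shimmy or oMLX. The endpoint URL and model
# identifier are hypothetical; substitute those from your install.
base_url = "http://localhost:8080/v1"  # assumed local endpoint
endpoint = f"{base_url}/chat/completions"

payload = {
    "model": "local-model",  # hypothetical model identifier
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}

# Serialize exactly as an HTTP client would before POSTing.
body = json.dumps(payload)
print(endpoint)
print(body)
```

Because the format matches the hosted OpenAI API, existing SDKs generally work unchanged once their base URL is overridden to the local server.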