Community-curated code
Shimmy is a Rust-based inference server providing local, OpenAI-compatible endpoints for machine learning models.
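Since Shimmy exposes OpenAI-compatible endpoints, an existing OpenAI-style client can point at the local server. A minimal sketch of building and sending a chat-completion request with the standard library follows; the port, path, and model name are illustrative assumptions, not documented Shimmy defaults.

```python
import json
import urllib.request

# Assumed local endpoint; check Shimmy's docs for the actual port and path.
SHIMMY_URL = "http://localhost:11435/v1/chat/completions"


def build_chat_request(model: str, prompt: str) -> bytes:
    """Assemble an OpenAI-style chat-completion payload as JSON bytes."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(payload).encode("utf-8")


def send_chat_request(body: bytes) -> dict:
    """POST the payload to the local server (requires Shimmy to be running)."""
    req = urllib.request.Request(
        SHIMMY_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# The model identifier here is hypothetical.
body = build_chat_request("llama-3.2-1b", "Say hello in one word.")
```

Because the wire format matches OpenAI's, official SDKs can also be used by overriding their base URL to the local server.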

Translate books and documents using AI with no size limits, preserving formatting and context.

LangExtract is a Python library for extracting structured data from unstructured text using LLMs, offering precise source grounding and visualization.

Prompt Optimizer refines AI prompts for better results, supporting multiple platforms and offering features for secure, efficient use.

Pydantic AI is a Python framework for building robust generative AI applications with seamless observability and type safety.

Memori is a memory infrastructure that enhances AI interactions by providing persistent state management and seamless integration.

MLflow is an open-source platform for managing the machine learning lifecycle, enabling teams to track, optimize, and monitor production-quality models.

llmfit optimizes large language models for your hardware, aiming for efficient performance and compatibility.

RTK is a CLI proxy that reduces LLM token consumption by 60-90%, enhancing developer productivity with minimal overhead.

GoModel is a lightweight AI gateway that integrates multiple AI services through a unified API, enhancing accessibility and usability.