chat-ui

The open source codebase powering HuggingChat

open-source · agent-frameworks
10.6k Stars · +23 Stars/month · 1 Release (last 6 months)

Star Growth: +5 (0.0%) between Mar 27 and Apr 1 (10.4k–10.8k range)

Overview

chat-ui is an open-source LLM chat interface built with SvelteKit that powers Hugging Face's HuggingChat. It focuses on OpenAI-compatible API integration and supports many LLM providers, including Hugging Face Inference Providers, llama.cpp servers, Ollama, OpenRouter, and Poe. chat-ui provides a complete chat feature set, including user management, chat history, settings, and file handling, with all data stored in MongoDB. As a mature open-source solution, it simplifies deploying LLM chat applications and lets developers stand up their own AI chat service quickly. With more than 10,000 GitHub stars, the project has demonstrated its popularity and reliability in the open-source community.

Deep Analysis

Key Differentiator

Powers HuggingChat (huggingface.co/chat), making it the most production-proven open-source chat UI. It ships a unique LLM Router for automatic model selection and native MCP tool support, and unlike simpler UIs it handles multi-user, multi-model deployments.

Capabilities

  • Chat interface for LLMs (powers HuggingChat)
  • OpenAI-compatible API support for any provider
  • LLM Router with smart model selection (Omni)
  • MCP tool integration
  • MongoDB-backed chat history and user management
  • Theming and customization
  • Docker deployment with bundled MongoDB

🔗 Integrations

Any OpenAI-compatible API · Hugging Face Inference Providers · Ollama · llama.cpp · OpenRouter · MongoDB · MCP servers

Best For

  • Self-hosting a ChatGPT-like interface for open-source models
  • Organizations wanting HuggingChat-quality UI for their own LLMs

Not Ideal For

  • Embedding AI chat into existing React/Vue applications
  • Teams needing integrated authentication out of the box

Languages

TypeScript · Svelte

Deployment

Docker · npm (self-hosted) · SvelteKit build
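For the Docker option with bundled MongoDB, a minimal docker-compose sketch might look like the following; the image name/tag and port are assumptions for illustration, not verified values:

```yaml
# Hypothetical compose sketch — image tag and port are assumptions.
services:
  chat-ui:
    image: ghcr.io/huggingface/chat-ui-db:latest  # variant assumed to bundle MongoDB
    ports:
      - "3000:3000"                               # expose the web UI
    env_file:
      - .env.local                                # OPENAI_BASE_URL, OPENAI_API_KEY, etc.
```

Using the bundled-database image avoids running a separate MongoDB service, at the cost of keeping chat history inside the container unless a volume is mounted.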

Pricing Detail

Free: Fully open source (Apache 2.0)
Paid: N/A — free

Known Limitations

  • Only supports OpenAI-compatible APIs — legacy provider integrations removed
  • Requires MongoDB for persistence (embedded fallback for dev only)
  • SvelteKit-based — less familiar than React for many teams
  • No built-in authentication system

Pros

  • + Strong OpenAI-protocol compatibility, supporting many LLM providers, both local and cloud-hosted
  • + Battle-tested in production environments such as HuggingChat, so stability is high
  • + Fully open source and self-hostable, giving complete data control and customization

Cons

  • - Only supports OpenAI-compatible APIs; LLM services using other protocol formats are not supported
  • - Requires configuring a MongoDB database, which adds deployment complexity
  • - Provider-specific integrations have been removed, which may limit access to some advanced features

Use Cases

  • Deploying a private AI chat service inside an enterprise, ensuring data security and compliance
  • Building prototypes or products for LLM-based chat applications
  • Providing a web interface for locally deployed LLMs (e.g. llama.cpp, Ollama)

Getting Started

1. Create a .env.local file and configure OPENAI_BASE_URL and OPENAI_API_KEY.
2. Run npm install to install dependencies, then npm run dev to start the dev server.
3. Open the app in your browser and start chatting with the LLM.
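As a sketch of step 1, a minimal .env.local pointing chat-ui at a local OpenAI-compatible server might look like this; the Ollama URL and placeholder key are assumptions for illustration:

```env
# Any OpenAI-compatible endpoint works; this example assumes a local Ollama server.
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_API_KEY=sk-placeholder   # many local servers accept any non-empty key
```

Because the integration is protocol-level, swapping providers (llama.cpp, OpenRouter, etc.) is just a matter of changing these two values.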
