Configure LLM - Large Language Model
In this app, the LLM is used for several purposes:
- Extracting knowledge from docs;
- Generating responses to user queries.
Configure the LLM
After logging in with an admin account, you can configure the LLM in the admin panel.
- Go to the admin panel;
- Click on the LLMs tab;
- Click on the + New button to add a new LLM;
- Input your LLM information and click the Create LLM button.
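For orientation, the information a new LLM entry needs generally amounts to a display name, the provider, a model identifier, a credential (such as an API key), and an optional provider-specific config. The sketch below is purely illustrative; the field names are assumptions, not the exact labels shown in the form:

```json
{
  "name": "my-openai-llm",
  "provider": "openai",
  "model": "gpt-4o-mini",
  "credentials": "sk-...",
  "config": {}
}
```

The optional config accepts the provider-specific JSON shown in the default configs listed below.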
Supported LLMs
Currently we support the following LLMs:
- OpenAI
- Gemini
- OpenAI Like
  - OpenRouter
    - Default config: `{"api_base": "https://openrouter.ai/api/v1/"}`
  - BigModel
    - Default config: `{"api_base": "https://open.bigmodel.cn/api/paas/v4/", "is_chat_model": true}`
- Bedrock
- Anthropic Vertex AI
- Ollama
  - Default config: `{"base_url": "http://localhost:11434"}`
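If you run an OpenAI-compatible server yourself, the OpenAI Like provider can point at it through the same config keys shown above. The URL below is only an assumption for a locally hosted endpoint; adjust it to wherever your server actually listens:

```json
{
  "api_base": "http://localhost:8000/v1/",
  "is_chat_model": true
}
```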