# Configure LLM (Large Language Model)
In this app, the LLM is used for several purposes:
- Extracting knowledge from docs;
- Generating responses to user queries.
## Configure the LLM
After logging in with an admin account, you can configure the LLM in the admin panel:
- Go to the admin panel;
- Click on the **LLMs** tab;
- Click on the **+ New** button to add a new LLM;
- Input your LLM information and click the **Create LLM** button;
- Done!
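To make the "Input your LLM information" step concrete, the entry you create bundles a few pieces of information together. The sketch below is illustrative only; the field names and values are assumptions, not the app's actual schema:

```python
# Hypothetical shape of the information entered when creating a new LLM.
# All field names and values are illustrative assumptions, not the app's
# real schema.
new_llm = {
    "name": "my-llm",        # display name shown in the admin panel
    "provider": "openai",    # one of the supported providers
    "model": "gpt-4o",       # model identifier at the chosen provider
    "credentials": "<api-key>",  # secret stored by the app
    "config": {},            # optional provider-specific JSON config
}

print(sorted(new_llm))
```

Providers that need a non-default endpoint (see the supported list below) would carry that in the `config` field.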
💡 If you want to use the new LLM when answering user queries, you need to switch to the **Chat Engines** tab and set the new LLM as the chat engine's LLM.
## Supported LLMs
Currently we support the following LLMs:
- OpenAI
- Gemini
- OpenAI Like
- OpenRouter
  - Default config: `{"api_base": "https://openrouter.ai/api/v1/"}`
- BigModel
  - Default config: `{"api_base": "https://open.bigmodel.cn/api/paas/v4/", "is_chat_model": true}`
- Bedrock
- Anthropic Vertex AI
- Ollama
  - Default config: `{"base_url": "http://localhost:11434"}`