0509 faq #693

Merged · 2 commits · May 9, 2024
4 changes: 2 additions & 2 deletions README.md
@@ -58,12 +58,12 @@

## 📌 Latest Features

- 2024-05-08 Integrates LLM DeepSeek.
- 2024-05-08 Integrates LLM DeepSeek-V2.
- 2024-04-26 Adds file management.
- 2024-04-19 Supports conversation API ([detail](./docs/conversation_api.md)).
- 2024-04-16 Integrates an embedding model 'bce-embedding-base_v1' from [BCEmbedding](https://github.com/netease-youdao/BCEmbedding), and [FastEmbed](https://github.com/qdrant/fastembed), which is designed specifically for light and speedy embedding.
- 2024-04-11 Supports [Xinference](./docs/xinference.md) for local LLM deployment.
- 2024-04-10 Adds a new layout recognition model for analyzing Laws documentation.
- 2024-04-10 Adds a new layout recognition model for analyzing legal documents.
- 2024-04-08 Supports [Ollama](./docs/ollama.md) for local LLM deployment.
- 2024-04-07 Supports Chinese UI.

15 changes: 8 additions & 7 deletions docs/faq.md
@@ -367,11 +367,11 @@ You can use Ollama to deploy local LLM. See [here](https://github.com/infiniflow
2. Right-click the desired knowledge base to display the **Configuration** dialogue.
3. Choose **Q&A** as the chunk method and click **Save** to confirm your change.

### 7 Do I need to connect to Redis?
### 7. Do I need to connect to Redis?

No, connecting to Redis is not required.

### 8 `Error: Range of input length should be [1, 30000]`
### 8. `Error: Range of input length should be [1, 30000]`

This error occurs because there are too many chunks matching your search criteria. Try reducing the **TopN** and increasing **Similarity threshold** to fix this issue:

@@ -382,14 +382,15 @@ This error occurs because there are too many chunks matching your search criteri

![topn](https://github.com/infiniflow/ragflow/assets/93570324/7ec72ab3-0dd2-4cff-af44-e2663b67b2fc)

### 9 How to upgrade RAGFlow?
### 9. How to upgrade RAGFlow?

You can upgrade RAGFlow to either the dev version or the latest version
You can upgrade RAGFlow to either the dev version or the latest version:

- Dev versions are for developers and contributors. They are published on a nightly basis and may crash because they are not fully tested. We cannot guarantee their validity, and you try out the latest, untested features at your own risk.
- The latest version is stable and reliable, and works best with RAGFlow users.
- The latest version refers to the most recent, officially published release. It is stable and best suited for regular users.


Update RAGFlow to dev version:
To upgrade RAGFlow to the dev version:

1. Pull the latest source code
```bash
@@ -411,7 +412,7 @@
docker compose -f docker-compose-CN.yml up -d
```
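
Most of the dev-upgrade steps sit in the collapsed region of this diff, so only the final compose command is visible above. The sketch below pulls the visible pieces into one hedged sequence; the checkout directory, the `infiniflow/ragflow:dev` image tag, and the `docker` subdirectory are illustrative assumptions, while the `docker compose -f docker-compose-CN.yml up -d` line is taken from the diff itself.

```bash
# Hedged sketch of a dev upgrade, not a verbatim copy of the collapsed steps.
cd ragflow                                      # assumed checkout directory
git pull                                        # step 1: pull the latest source code
docker pull infiniflow/ragflow:dev              # assumed nightly/dev image tag
cd docker                                       # assumed location of the compose files
docker compose -f docker-compose-CN.yml up -d   # restart the stack, as shown in the diff
```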

To upgrade RAGFlow to latest version:
To upgrade RAGFlow to the latest version:

1. Update **ragflow/docker/.env** as follows:
```bash
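   # NOTE: the concrete .env edit is collapsed in this diff view, so the line
   # below is a hedged, illustrative sketch only. The variable name
   # RAGFLOW_VERSION is an assumption, not confirmed here; check
   # ragflow/docker/.env for the actual key that pins the image tag.
   RAGFLOW_VERSION=latest
   ```

After the edit, restarting the stack (for example with `docker compose up -d`, mirroring the compose commands shown for the dev upgrade) lets the services pick up the newly pinned image tag.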