
SQLite unable to get context on wiki with more than 2,400 pages #25

Open
eduardomozart opened this issue Nov 21, 2024 · 3 comments
Labels
bug Something isn't working

Comments

@eduardomozart
Contributor

Issue Description

Hello splitbrain,
Thank you for your plugin. I have two wikis: one with 1,500 pages and another with 2,462 pages. I'm not sure whether the problem is related to the number of pages or to the length of their contents. The one with 2,462 pages generated a SQLite file of about 250 MB, the other one about 150 MB. Both wikis contain the same content; the difference is that some namespaces aren't synced between them.
Both are using the same engines:
[screenshot: both wikis configured with the same embedding and chat engines]
The 1,500-page wiki generates responses using SQLite as expected (apart from some "slow query" warnings from the SQLite plugin in the DokuWiki log viewer), but the other one doesn't.
Debugging the helper.log file, it seems that no context is found on the larger wiki for the same question ("O que é o ClearPass?", i.e. "What is ClearPass?"), even though both wikis have the "clearpass" namespace synced with the same contents.
I believe it's related to SQLite, because when I switched the storage to Qdrant both wikis started working as expected.
I can provide the SQLite database and the data/ folder of both wikis for troubleshooting; please let me know a secure channel if needed, since I can't share their contents publicly.

eduardomozart added the bug label on Nov 21, 2024
@splitbrain
Member

Did the clustering step actually succeed for the second wiki? Have a look at the raw database and check if your chunks have clusters assigned.

TBH, at these sizes I would recommend going with a proper vector storage instead of sqlite.
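For reference, a minimal sketch of that raw-database check from the command line. Only the clusters table is confirmed in this thread; the database path and the "embeddings" table name below are assumptions, so list the actual schema with .tables first and adjust accordingly.

```bash
# Sketch of a raw-database check. Assumptions: the database path and the
# "embeddings" table name are illustrative; only "clusters" is mentioned
# in this thread. Inspect the real schema before relying on these names.
sqlite3 data/meta/aichat.sqlite3 <<'SQL'
.tables
-- were chunks/embeddings created at all?
SELECT COUNT(*) FROM embeddings;
-- did the clustering step produce anything?
SELECT COUNT(*) FROM clusters;
SQL
```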

@eduardomozart
Contributor Author

It doesn't seem to have clusters assigned (SELECT * FROM clusters; returns nothing, the clusters table is empty) and I don't know why. In the working wiki there are clusters assigned. I may try to wipe and regenerate the SQLite database in the non-working wiki.

@splitbrain
Member

Try running the maintenance command again. Most likely it ran out of memory.
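In case it helps the next reader: a rough sketch of such a retry from the wiki root, run through DokuWiki's bin/plugin.php CLI entry point. The maintenance subcommand name is assumed from the comment above (run the plugin CLI without arguments to see its actual commands); the -d memory_limit=-1 option lifts PHP's memory limit for this one invocation so the clustering step is less likely to die from out-of-memory.

```bash
# Sketch: re-run the aichat maintenance step with PHP's memory limit lifted
# for this invocation only. The "maintenance" subcommand name is taken from
# the comment above; check the plugin's CLI help for the exact command.
php -d memory_limit=-1 bin/plugin.php aichat maintenance
```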
