
Error: Maximum update depth exceeded. #397

Open
cognitiveRobot opened this issue Oct 28, 2024 · 3 comments
@cognitiveRobot
I am building a RAG application, and the following error is blocking my deployment.
Error message received:

react-dom.development.js:26793 Uncaught (in promise) Error: Maximum update depth exceeded. This can happen when a component repeatedly calls setState inside componentWillUpdate or componentDidUpdate. React limits the number of nested updates to prevent infinite loops.
    at throwIfInfiniteUpdateLoopDetected (react-dom.development.js:26793:11)
    at getRootForUpdatedFiber (react-dom.development.js:7627:3)
    at enqueueConcurrentRenderForLane (react-dom.development.js:7549:10)
    at forceStoreRerender (react-dom.development.js:12049:14)
    at handleStoreChange (react-dom.development.js:12028:7)
    at eval (index.mjs:155:42)
    at Array.setter (index.mjs:456:21)
    at eval (index.mjs:141:25)
    at mutateByKey (index.mjs:399:21)
    at internalMutate (index.mjs:302:12)
    at eval (index.mjs:380:76)
    at onUpdate (index.mjs:260:7)
    at processDataProtocolResponse (index.mjs:939:5)
    at async callChatApi (index.mjs:1031:14)
    at async getStreamedResponse (index.mjs:222:10)
    at async processChatStream (index.mjs:1243:42)
    at async eval (index.mjs:338:9)
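The error above is React's guard against infinite nested updates: per the stack trace, SWR's `mutate` fires inside the stream's `onUpdate` callback, which re-notifies the store, which mutates again, recursing until react-dom's depth limit throws. A minimal sketch of that pattern in plain JavaScript (a simplified model of the guard, not React's actual internals; the limit of 50 matches react-dom's constant):

```javascript
// Simplified model of React's nested-update guard (assumption: not the real internals).
const MAX_UPDATE_DEPTH = 50;

function createStore() {
  const listeners = [];
  let depth = 0;
  return {
    subscribe(listener) {
      listeners.push(listener);
    },
    // Each update synchronously notifies listeners; a listener that triggers
    // another update grows `depth` until the guard fires.
    update() {
      depth += 1;
      if (depth > MAX_UPDATE_DEPTH) {
        throw new Error("Maximum update depth exceeded.");
      }
      try {
        listeners.forEach((listener) => listener());
      } finally {
        depth -= 1;
      }
    },
  };
}

// A subscriber that mutates the store on every change (like mutate() being
// called from the stream's onUpdate callback) recurses until the limit throws.
const store = createStore();
store.subscribe(() => store.update());

let caught = null;
try {
  store.update();
} catch (err) {
  caught = err.message;
}
console.log(caught); // "Maximum update depth exceeded."
```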

Setup details:

Next.js 14.2.14
"@llamaindex/core": "^0.2.6"
"ai": "3.3.42"
"ajv": "^8.12.0"

Model: `llama-3.1-8b-instant`
Model provider: `groq`

To reproduce:

  1. Run `npx create-llama@latest` (using v0.3.8).
  2. Follow the default setup with the Python back-end.
  3. Send a query a couple of times.

Can you please help me fix this issue?

@marcusschiesser
Collaborator

It's caused by this vercel/ai issue: vercel/ai#1610. We'll add a workaround to the https://github.com/run-llama/chat-ui components.

@cognitiveRobot
Author

@marcusschiesser Thanks for the reply. What is the expected date of this workaround?

@marcusschiesser
Collaborator

It only occurs in dev mode, so it's not a blocker, but we should have the workaround next week.
