
Add support for thinking LLMs directly in gr.ChatInterface #12552

Triggered via pull request January 9, 2025 23:36
@abidlabs
synchronize #10305
thought-ci
Status: Cancelled
Total duration: 1m 21s
Artifacts: 1

test-functional.yml

on: pull_request
Matrix: test

Annotations

2 errors and 5 warnings
functional-test-SSR=true
Canceling since a higher priority waiting request for '10305-10305/merge-functional' exists
functional-test-SSR=false
Canceling since a higher priority waiting request for '10305-10305/merge-functional' exists
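The two errors above are GitHub Actions concurrency cancellations: when a newer commit is pushed to the same pull request, any queued or in-progress run in the same concurrency group is cancelled in favor of the higher-priority run. A minimal sketch of the kind of `concurrency` block that yields a group name like `10305-10305/merge-functional` (the exact expression used in gradio's workflow is an assumption):

```yaml
# Hypothetical sketch of a workflow-level concurrency group.
# With cancel-in-progress enabled, a new push to the same PR
# cancels the run that is already executing for that group.
concurrency:
  group: "${{ github.event.number }}-${{ github.event.number }}/merge-functional"
  cancel-in-progress: true
```

This explains the run's overall "Cancelled" status: the functional tests were superseded by a later push rather than failing.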
changes
Unexpected input(s) 'sha', 'source_repo', 'source_branch', 'pr_number', 'should_run', 'labels', 'run_id', 'gradio_version', valid inputs are ['path']
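This warning from the `changes` job means the workflow passed inputs via `with:` that the invoked action does not declare; the action's metadata accepts only `path`. A hedged sketch of what such an `action.yml` might declare (the description text is an assumption):

```yaml
# Hypothetical action.yml for the 'changes' action.
# Only 'path' is declared, so passing 'sha', 'pr_number', etc.
# via 'with:' triggers the "Unexpected input(s)" warning.
inputs:
  path:
    description: "Path to the change-filter configuration"
    required: true
```

The warning is non-fatal: undeclared inputs are still exposed to the action as environment variables, but they suggest the workflow and the action's interface have drifted apart.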
functional-test-SSR=true
No files were found with the provided path: ./test-results. No artifacts will be uploaded.
functional-test-SSR=true
No files were found with the provided path: ./test-results. No artifacts will be uploaded.
functional-test-SSR=false
No files were found with the provided path: ./test-results. No artifacts will be uploaded.
functional-test-SSR=false
No files were found with the provided path: ./test-results. No artifacts will be uploaded.
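These four warnings come from an artifact-upload step whose `path` matched nothing, which is expected here since the test jobs were cancelled before producing `./test-results`. A sketch of the kind of step involved (step name and artifact name are assumptions; `if-no-files-found: warn` is the default for `actions/upload-artifact`):

```yaml
# Hypothetical upload step. With the default 'warn' behavior,
# an empty or missing ./test-results directory produces the
# "No files were found" warning instead of failing the job.
- uses: actions/upload-artifact@v4
  with:
    name: test-results
    path: ./test-results
    if-no-files-found: warn   # alternatives: 'error', 'ignore'
```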

Artifacts

Produced during runtime
Name     Size
changes  319 Bytes