feat(llmobs): submit spans for streamed calls #10908

Open: wants to merge 3 commits into base: main from sabrenner/langchain-streamed-responses-llmobs

Conversation

@sabrenner (Contributor) commented Oct 2, 2024

What does this PR do?

Builds off of #10672, extending support for llm.stream, chat_model.stream, and chain.stream to LLM Observability. This PR tags those streamed-call spans appropriately and marks them for submission to LLM Observability.
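
For reference, here is a minimal sketch (not part of this PR) of the kind of streamed LangChain usage these spans now cover, assuming langchain is instrumented via ddtrace's patch() and LLM Observability is enabled with LLMObs.enable(); the ml_app name, model, and prompt are illustrative placeholders:

    # Sketch only: exercise chain.stream() so the integration can submit
    # LLM Observability spans for the streamed call.
    from ddtrace import patch
    from ddtrace.llmobs import LLMObs

    patch(langchain=True)                 # instrument langchain before using it
    LLMObs.enable(ml_app="example-app")   # enable LLM Observability span submission

    from langchain_core.output_parsers import StrOutputParser
    from langchain_core.prompts import ChatPromptTemplate
    from langchain_openai import ChatOpenAI

    llm = ChatOpenAI(model="gpt-4o-mini")
    prompt = ChatPromptTemplate.from_template("Tell me a joke about {topic}")
    chain = prompt | llm | StrOutputParser()

    # llm.stream, chat_model.stream, and chain.stream are the covered calls;
    # consuming the generator is what produces the streamed span.
    for chunk in chain.stream({"topic": "tracing"}):
        print(chunk, end="", flush=True)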

Checklist

  • PR author has checked that all the criteria below are met
  • The PR description includes an overview of the change
  • The PR description articulates the motivation for the change
  • The change includes tests OR the PR description describes a testing strategy
  • The PR description notes risks associated with the change, if any
  • Newly-added code is easy to change
  • The change follows the library release note guidelines
  • The change includes or references documentation updates if necessary
  • Backport labels are set (if applicable)

Reviewer Checklist

  • Reviewer has checked that all the criteria below are met
  • Title is accurate
  • All changes are related to the pull request's stated goal
  • Avoids breaking API changes
  • Testing strategy adequately addresses listed risks
  • Newly-added code is easy to change
  • Release note makes sense to a user of the library
  • If necessary, author has acknowledged and discussed the performance implications of this PR as reported in the benchmarks PR comment
  • Backport labels are set in a manner that is consistent with the release branch maintenance policy

github-actions bot (Contributor) commented Oct 2, 2024

CODEOWNERS have been resolved as:

releasenotes/notes/llmobs-langchain-streamed-calls-23a13029ac5d8907.yaml  @DataDog/apm-python
ddtrace/contrib/internal/langchain/patch.py                             @DataDog/ml-observability
ddtrace/contrib/internal/langchain/utils.py                             @DataDog/ml-observability
ddtrace/llmobs/_integrations/langchain.py                               @DataDog/ml-observability
tests/contrib/langchain/test_langchain_llmobs.py                        @DataDog/ml-observability
tests/snapshots/tests.contrib.langchain.test_langchain_community.test_streamed_chain.json  @DataDog/apm-python
tests/snapshots/tests.contrib.langchain.test_langchain_community.test_streamed_chat.json  @DataDog/apm-python
tests/snapshots/tests.contrib.langchain.test_langchain_community.test_streamed_json_output_parser.json  @DataDog/apm-python
tests/snapshots/tests.contrib.langchain.test_langchain_community.test_streamed_llm.json  @DataDog/apm-python

@datadog-dd-trace-py-rkomorn

Datadog Report

Branch report: sabrenner/langchain-streamed-responses-llmobs
Commit report: b2eef10
Test service: dd-trace-py

✅ 0 Failed, 1196 Passed, 0 Skipped, 30m 23.23s Total duration (6m 25.45s time saved)


pr-commenter bot commented Oct 2, 2024

Benchmarks

Benchmark execution time: 2024-10-02 20:31:35

Comparing candidate commit b2eef10 in PR branch sabrenner/langchain-streamed-responses-llmobs with baseline commit 7e8eb0b in branch main.

Found 0 performance improvements and 0 performance regressions! Performance is the same for 371 metrics, 53 unstable metrics.

@sabrenner sabrenner marked this pull request as ready for review October 2, 2024 20:51
@sabrenner sabrenner requested review from a team as code owners October 2, 2024 20:51