(docs) Use Presidio across Anthropic, Bedrock, VertexAI, Azure OpenAI, etc. w/ LiteLLM Proxy #1421
Conversation
Thanks! We'll review this soon
Looks great, left a few nits
Thanks for the contribution!
@SharonHart @omri374 updated based on feedback - any remaining blockers for merge?
Thanks! Great addition.
I would suggest adding an explanation at the top specifying the use cases for using Presidio with LiteLLM, and how this flow works at a high level.
/azp run
Azure Pipelines successfully started running 1 pipeline(s).
Hey @omri374 @SharonHart updated based on feedback - is this okay to merge?
Thanks! Great contribution and addition
Hey @omri374 @SharonHart, I still don't see LiteLLM on the docs - https://microsoft.github.io/presidio/samples/. Did I put this somewhere wrong?
Hey @krrishdholakia, the website is now updated and contains the LiteLLM docs. Thanks for the note!
Change Description
Adds a sample doc for setting up Presidio (Docker image) with LiteLLM Proxy to implement PII masking across 100+ LLMs.
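For reference, a rough sketch of the flow the sample describes. The container images and ports follow the Presidio docs; the `PRESIDIO_*_API_BASE` variables, the `presidio_pii_masking` callback name, and the example model entry are assumptions that should be checked against the current LiteLLM Proxy documentation.

```bash
# Start the Presidio analyzer and anonymizer containers
# (default images/ports per the Presidio docs; adjust as needed).
docker run -d -p 5002:3000 mcr.microsoft.com/presidio-analyzer:latest
docker run -d -p 5001:3000 mcr.microsoft.com/presidio-anonymizer:latest

# Tell the LiteLLM Proxy where the Presidio services live.
export PRESIDIO_ANALYZER_API_BASE="http://localhost:5002"
export PRESIDIO_ANONYMIZER_API_BASE="http://localhost:5001"

# Minimal proxy config: one example model plus the Presidio PII-masking callback.
cat > config.yaml <<'EOF'
model_list:
  - model_name: my-claude            # illustrative alias; any supported provider works
    litellm_params:
      model: anthropic/claude-2
      api_key: os.environ/ANTHROPIC_API_KEY

litellm_settings:
  callbacks: ["presidio_pii_masking"]
EOF

# Run the proxy; prompts are masked by Presidio before reaching the LLM provider.
litellm --config config.yaml
```

With the proxy running, any client pointed at it gets the same masking regardless of which backend model (Anthropic, Bedrock, VertexAI, Azure OpenAI, etc.) handles the request.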
Issue reference
n/a
Checklist
cc: @omri374 @SharonHart