This repository has been archived by the owner on Nov 21, 2024. It is now read-only.

# LLM Endpoint load testing using Locust

This repository load-tests LLM API endpoints using [Locust](https://locust.io/).

## Clients Implemented

- OpenAI
- OctoAI
- Amazon Bedrock

Each client uses its provider's SDK to make requests to various model endpoints.
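The common pattern behind such clients is to wrap each SDK call so its latency and outcome can be reported back to Locust. The sketch below illustrates that pattern with standard-library code only; the function and event names in the comments are assumptions, not the repo's actual implementation:

```python
import time


def timed_request(name, fn, *args, **kwargs):
    """Call an SDK function and record its latency and outcome.

    This mirrors the likely shape of each client in this repo: wrap the
    provider SDK call, time it, and report the result. Reporting here is
    a plain dict so the sketch runs without Locust installed; in a real
    Locust client it would instead fire the request event, roughly:
    events.request.fire(request_type="llm", name=name,
                        response_time=latency_ms, exception=exc, ...)
    """
    start = time.perf_counter()
    try:
        response = fn(*args, **kwargs)
        exc = None
    except Exception as e:  # report failures instead of crashing the test
        response, exc = None, e
    latency_ms = (time.perf_counter() - start) * 1000
    return {
        "name": name,
        "response_time_ms": latency_ms,
        "exception": exc,
        "response": response,
    }


# Hypothetical stand-in for a real SDK call such as
# openai.chat.completions.create(...)
def fake_completion(prompt):
    return {"text": "ok"}


result = timed_request("openai-chat", fake_completion, "Hello")
```

With a real SDK installed, `fake_completion` would be replaced by the provider call and the returned metrics forwarded to Locust's event hooks.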

## Installation

1. Clone the repo:

   ```shell
   git clone https://github.com/ergodicio/locustllms.git
   ```

2. Create and activate a new virtual environment using venv, then install the dependencies:

   ```shell
   python -m venv venv
   source venv/bin/activate
   pip install -r requirements.txt
   ```

## Usage

1. Make sure the environment variables for each service (OpenAI, OctoAI, and boto3/AWS) are configured.
2. Start Locust from the terminal:

   ```shell
   locust
   ```

3. Open http://0.0.0.0:8089 in your browser.
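The exact variable names each client reads are not listed in this README. As a hedged example, the variables conventionally read by each provider's SDK look like this (the OctoAI name in particular is an assumption; check the client code for the names actually used):

```shell
# OpenAI SDK: read automatically by the openai client
export OPENAI_API_KEY="sk-..."

# OctoAI SDK: variable name assumed, verify against the OctoAI client
export OCTOAI_TOKEN="..."

# boto3 (Amazon Bedrock): standard AWS credential variables
export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_DEFAULT_REGION="us-east-1"
```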