zhipuai_go Project

zhipuai_go is a Go-based application that provides an interface for interacting with the GLM language model API. It is designed to be lightweight, efficient, and easy to use for handling text-based requests and generating responses.

Features

  • Integration with the GLM language model API for generating responses.
  • JSON-based request and response format.
  • Easy configuration through an INI file.

Requirements

  • Go 1.16 or later.
  • A valid API key for the Language Model API.

Installation

  1. Clone the repository:
    git clone https://github.com/vvxf/zhipuai_go.git
    cd zhipuai_go
  2. Install dependencies:
    go mod tidy
  3. Configure the application by creating a config.ini file in the conf directory with the following content (see the sketch after these steps for how it might be read):
    [api]
    url = "YOUR_API_URL"
    key = "YOUR_API_KEY"
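
The repository does not spell out how this file is loaded, but a minimal sketch of reading it with the gopkg.in/ini.v1 package might look like the code below. The package choice and the APIConfig/LoadConfig names are assumptions for illustration; the section and key names mirror the example above.

package config

import (
    "gopkg.in/ini.v1"
)

// APIConfig holds the values from the [api] section of conf/config.ini.
// The struct and function names here are illustrative assumptions; the
// project's actual configuration code may differ.
type APIConfig struct {
    URL string
    Key string
}

// LoadConfig parses the INI file at path and returns the API settings.
func LoadConfig(path string) (*APIConfig, error) {
    cfg, err := ini.Load(path)
    if err != nil {
        return nil, err
    }
    sec := cfg.Section("api")
    return &APIConfig{
        URL: sec.Key("url").String(),
        Key: sec.Key("key").String(),
    }, nil
}

A caller would then pass the loaded URL and Key to api.NewLLMApplicationService, as shown in the Usage section below.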

Usage

package main

import (
    "fmt"
    "log"

    "github.com/vvxf/zhipuai_go/api"
)

func main() {
    // Endpoint of the GLM chat completions API and your API key.
    apiURL := "https://open.bigmodel.cn/api/paas/v4/chat/completions"
    apiKey := "YOUR_API_KEY"

    // Initialize the application service.
    appService := api.NewLLMApplicationService(apiURL, apiKey)

    // Send a single user message to the glm-4-flash model.
    resp, err := appService.HandleRequest("glm-4-flash", []api.Message{
        {
            Role:    "user",
            Content: "Hello, llm!",
        },
    })
    if err != nil {
        log.Fatalf("request failed: %v", err)
    }

    // Print the content of the first generated choice.
    fmt.Println(resp.Choices[0].Message.Content)
}
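
Under the hood, HandleRequest corresponds to a JSON exchange with the chat completions endpoint: a POST body carrying the model name and messages, and a JSON response whose choices hold the generated text. The sketch below illustrates that exchange with the standard library only; the struct shapes and the Bearer-token authorization header follow the public GLM chat completions format and are assumptions here, not the package's internal types.

package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "log"
    "net/http"
)

// message mirrors the role/content pairs used in the Usage example above.
type message struct {
    Role    string `json:"role"`
    Content string `json:"content"`
}

// chatRequest is the JSON body posted to the chat completions endpoint.
type chatRequest struct {
    Model    string    `json:"model"`
    Messages []message `json:"messages"`
}

// chatResponse captures only the fields read here: the generated choices.
type chatResponse struct {
    Choices []struct {
        Message message `json:"message"`
    } `json:"choices"`
}

// sendChat posts a chat request and decodes the JSON response.
func sendChat(apiURL, apiKey, model string, msgs []message) (*chatResponse, error) {
    body, err := json.Marshal(chatRequest{Model: model, Messages: msgs})
    if err != nil {
        return nil, err
    }

    req, err := http.NewRequest(http.MethodPost, apiURL, bytes.NewReader(body))
    if err != nil {
        return nil, err
    }
    req.Header.Set("Content-Type", "application/json")
    req.Header.Set("Authorization", "Bearer "+apiKey)

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        return nil, err
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        return nil, fmt.Errorf("unexpected status: %s", resp.Status)
    }

    var out chatResponse
    if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
        return nil, err
    }
    return &out, nil
}

func main() {
    resp, err := sendChat(
        "https://open.bigmodel.cn/api/paas/v4/chat/completions",
        "YOUR_API_KEY",
        "glm-4-flash",
        []message{{Role: "user", Content: "Hello, llm!"}},
    )
    if err != nil {
        log.Fatalf("request failed: %v", err)
    }
    fmt.Println(resp.Choices[0].Message.Content)
}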

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

For any questions or suggestions, please open an issue on GitHub or reach out to us at zhipuaigo@outlook.com. Thank you for using zhipuai_go!
