✨ Add profiles support #119

Merged (4 commits, Mar 28, 2024)
2 changes: 1 addition & 1 deletion go.mod
@@ -13,7 +13,7 @@ require (
github.com/dave/jennifer v1.7.0
github.com/go-go-golems/bobatea v0.0.5
github.com/go-go-golems/clay v0.1.9
-	github.com/go-go-golems/glazed v0.5.9
+	github.com/go-go-golems/glazed v0.5.10
github.com/huandu/go-clone v1.7.2
github.com/iancoleman/strcase v0.3.0
github.com/invopop/jsonschema v0.12.0
4 changes: 2 additions & 2 deletions go.sum
@@ -119,8 +119,8 @@ github.com/go-go-golems/bobatea v0.0.5 h1:JYlgmMeG5A3/rM+AyTPfAbhIbfFUpL7U/A7ZgV
github.com/go-go-golems/bobatea v0.0.5/go.mod h1:SG1cXuzm0Bp48EJ8UAuANSBYYw++FQ4ImvHj7tXEzJQ=
github.com/go-go-golems/clay v0.1.9 h1:CveV2T+HdCIzPMeoGb8btcsXgKZuc8XUYNEJX+jn9YQ=
github.com/go-go-golems/clay v0.1.9/go.mod h1:ovEpMRRJQ3ndnoc9qKh9T+TTPAtWYvyHjupwo191GpU=
-github.com/go-go-golems/glazed v0.5.9 h1:Ih9t9g4WEIYLcZ35mhkx0dPTUqGFMG51QvPaoUgowlU=
-github.com/go-go-golems/glazed v0.5.9/go.mod h1:K1600pUk7xB/LKmvIafRWyfAdxE1sboruqQ9Jia8V9M=
+github.com/go-go-golems/glazed v0.5.10 h1:+aNrpcV/MapKbczWy3gUEbtsrKCCRdv4sBGwe/Aob50=
+github.com/go-go-golems/glazed v0.5.10/go.mod h1:K1600pUk7xB/LKmvIafRWyfAdxE1sboruqQ9Jia8V9M=
github.com/go-go-golems/sqleton v0.2.4 h1:qsgX0RxBXdjOC/+zmRrVvlsbBT8HCM2otLh/bV/f5uU=
github.com/go-go-golems/sqleton v0.2.4/go.mod h1:GAGCz4/wsFwzN5mUA3ARmJfqanYu1k5yP4rCUiTeWgs=
github.com/go-openapi/errors v0.20.3 h1:rz6kiC84sqNQoqrtulzaL/VERgkoCyB6WdEkc2ujzUc=
7 changes: 7 additions & 0 deletions misc/profiles.yaml
@@ -0,0 +1,7 @@
test:
ai-chat:
ai-engine: gpt-4-turbo-preview-yolo

test2:
ai-chat:
ai-engine: gpt-4-turbo-preview-yolo2
32 changes: 32 additions & 0 deletions pkg/cmds/cobra.go
@@ -1,6 +1,7 @@
package cmds

import (
"fmt"
"github.com/go-go-golems/geppetto/pkg/steps/ai/settings"
"github.com/go-go-golems/geppetto/pkg/steps/ai/settings/claude"
"github.com/go-go-golems/geppetto/pkg/steps/ai/settings/openai"
@@ -10,6 +11,7 @@ import (
"github.com/go-go-golems/glazed/pkg/cmds/middlewares"
"github.com/go-go-golems/glazed/pkg/cmds/parameters"
"github.com/spf13/cobra"
"os"
)

func BuildCobraCommandWithGeppettoMiddlewares(
@@ -29,6 +31,9 @@ func GetCobraCommandGeppettoMiddlewares(
cmd *cobra.Command,
args []string,
) ([]middlewares.Middleware, error) {
// if we want profile support here, we would have to check for a --profile and --profile-file flag,
// then load the file (or the default file), check for the profile values, then apply them before load-parameters-from-file

middlewares_ := []middlewares.Middleware{
middlewares.ParseFromCobraCommand(cmd,
parameters.WithParseStepSource("cobra"),
@@ -43,6 +48,33 @@
middlewares.LoadParametersFromFile(commandSettings.LoadParametersFromFile))
}

xdgConfigPath, err := os.UserConfigDir()
if err != nil {
return nil, err
}

// TODO(manuel, 2024-03-20) I wonder if we should just use a custom layer for the profiles, as we want to load
// the profile from the environment as well. So the sequence would be defaults -> viper -> command line
defaultProfileFile := fmt.Sprintf("%s/pinocchio/profiles.yaml", xdgConfigPath)
if commandSettings.ProfileFile == "" {
commandSettings.ProfileFile = defaultProfileFile
}
if commandSettings.Profile == "" {
commandSettings.Profile = "default"
}
middlewares_ = append(middlewares_,
middlewares.GatherFlagsFromProfiles(
defaultProfileFile,
commandSettings.ProfileFile,
commandSettings.Profile,
parameters.WithParseStepSource("profiles"),
parameters.WithParseStepMetadata(map[string]interface{}{
"profileFile": commandSettings.ProfileFile,
"profile": commandSettings.Profile,
}),
),
)

middlewares_ = append(middlewares_,
middlewares.WrapWithWhitelistedLayers(
[]string{
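The defaulting logic added in this hunk (an explicit `--profile-file` wins, otherwise fall back to `profiles.yaml` under the user config directory) can be sketched in isolation. This is an illustrative standalone version, not the actual pinocchio code; in the real middleware the directory comes from `os.UserConfigDir()`:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// resolveProfileFile mirrors the fallback in GetCobraCommandGeppettoMiddlewares:
// an explicit --profile-file flag is used as-is; otherwise the default path
// under the user config directory is constructed.
func resolveProfileFile(flagValue, configDir string) string {
	if flagValue != "" {
		return flagValue
	}
	return filepath.Join(configDir, "pinocchio", "profiles.yaml")
}

func main() {
	// No flag given: fall back to the default location.
	fmt.Println(resolveProfileFile("", "/home/user/.config"))
	// Explicit flag: used verbatim.
	fmt.Println(resolveProfileFile("/tmp/profiles.yaml", "/home/user/.config"))
}
```

Using `filepath.Join` instead of `fmt.Sprintf("%s/...")` (as the diff does) keeps the path separator portable.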
87 changes: 87 additions & 0 deletions pkg/doc/topics/profiles.md
@@ -0,0 +1,87 @@
---
Title: Using profiles in Pinocchio
Slug: profiles
Short: |
Configure and use different profiles in Pinocchio to override layer parameters.
Topics:
- configuration
Commands:
- pinocchio
Flags:
- profile
IsTopLevel: true
ShowPerDefault: true
SectionType: GeneralTopic
---

# Profile Configuration in Pinocchio

Pinocchio allows users to override layer parameters using profiles. This makes it
easy to switch between different configurations, such as different API keys or URL
endpoints.

Among other things, this lets you use the openai api-type with different base URLs in order to target providers
like ollama or anyscale.

## Configuring Profiles

Profiles are defined in a YAML configuration file, typically located at `~/.config/pinocchio/profiles.yaml` on Linux
systems (or the equivalent path on macOS). Each profile specifies a set of parameters that can be used to override the
default settings.

Here's an example `profiles.yaml` file:

```yaml
mixtral:
openai-chat:
openai-base-url: https://api.endpoints.anyscale.com/v1
openai-api-key: XXX
ai-chat:
ai-engine: mistralai/Mixtral-8x7B-Instruct-v0.1
ai-api-type: openai

mistral:
openai-chat:
openai-base-url: https://api.endpoints.anyscale.com/v1
openai-api-key: XXX
ai-chat:
ai-engine: mistralai/Mistral-7B-Instruct-v0.1
ai-api-type: openai

zephir:
openai-chat:
openai-base-url: https://api.endpoints.anyscale.com/v1
openai-api-key: XXX
ai-chat:
ai-engine: HuggingFaceH4/zephyr-7b-beta
ai-api-type: openai
```

## Selecting a Profile

To select a profile for use, you can set the `PINOCCHIO_PROFILE` environment variable, use the `--profile` flag on the
command line, or set the profile value in `~/.pinocchio/config.yaml`.

### Using the Environment Variable

```bash
export PINOCCHIO_PROFILE=mistral
pinocchio [command]
```

### Using the Command Line Flag

```bash
pinocchio --profile mistral [command]
```

### Setting in `config.yaml`

Add the following to your `~/.pinocchio/config.yaml`:

```yaml
profile: mistral
```

After setting the desired profile, Pinocchio will use the parameters defined within that profile for all operations.
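The precedence that profile support establishes (defaults first, then profile values, then explicit flags) can be sketched with plain maps. The types and functions below are illustrative assumptions, not the actual glazed middleware API:

```go
package main

import "fmt"

// Profile maps layer name -> parameter name -> value,
// mirroring the structure of profiles.yaml.
type Profile map[string]map[string]string

// mergeLayers applies each source in order; later sources win.
// This mirrors the resolution order: defaults, then the selected
// profile, then explicit command-line flags.
func mergeLayers(sources ...map[string]string) map[string]string {
	out := map[string]string{}
	for _, src := range sources {
		for k, v := range src {
			out[k] = v
		}
	}
	return out
}

func main() {
	profiles := map[string]Profile{
		"mistral": {
			"ai-chat": {"ai-engine": "mistralai/Mistral-7B-Instruct-v0.1"},
		},
	}
	defaults := map[string]string{"ai-engine": "gpt-4", "ai-api-type": "openai"}
	flags := map[string]string{} // no --ai-engine passed on the command line

	resolved := mergeLayers(defaults, profiles["mistral"]["ai-chat"], flags)
	// The profile overrides the default engine; untouched keys survive.
	fmt.Println(resolved["ai-engine"], resolved["ai-api-type"])
}
```

Because the profile source is applied after the defaults but before the flags, a parameter set in `profiles.yaml` overrides the built-in default yet still yields to anything passed explicitly on the command line.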

1 change: 1 addition & 0 deletions pkg/steps/ai/settings/openai/chat.yaml
@@ -17,6 +17,7 @@ flags:
- name: openai-logit-bias
# TODO(manuel, 2023-03-28) We currently only have map[string]string for keyValue, but we need map[string]int
# See https://github.com/go-go-golems/geppetto/issues/48

type: keyValue
help: OpenAI chat completion logit bias
default: {}