feat: various fixes #2320

Merged: 42 commits, Dec 31, 2024

Commits
4d91c6f
chore: docs update
Dec 23, 2024
b4be44b
add: helpful commands to justfile
Dec 23, 2024
aef012b
Try force rebuilding system prompt on block update
mattzh72 Dec 23, 2024
9ab394e
Run lint
mattzh72 Dec 23, 2024
87772d1
chore: next
Dec 23, 2024
c79eeac
fix: add tests to cypress
Dec 23, 2024
4359463
Merge branch 'main' into matt/let-649-fix-updating-agent-refresh-blocks
4shub Dec 23, 2024
966d33b
feat: update nx to latest
Dec 26, 2024
c452bd8
run isort on apps/core
carenthomas Dec 27, 2024
461ad00
run black, add isort config to pyproject.toml
carenthomas Dec 27, 2024
4417c56
change delete agent return type to none
carenthomas Dec 26, 2024
6c6faf9
remove lettaresponse spec hardcode and add references to pydantic
carenthomas Dec 26, 2024
6a6dab0
rebase, run formatter
carenthomas Dec 27, 2024
eae74f0
fix example
sarahwooders Dec 27, 2024
0269649
Merge branch 'main' into remove-example
sarahwooders Dec 27, 2024
63ecbee
add e2b envs for unit test yml
carenthomas Dec 27, 2024
7a2c266
update install args in yml
carenthomas Dec 27, 2024
67b0525
fix response args
carenthomas Dec 27, 2024
3228ca9
fix: change delete agent response type
carenthomas Dec 27, 2024
10603be
fix: add e2b envs and install args for unit test yml
carenthomas Dec 27, 2024
6b82b40
feat: add desktop ui app
Dec 27, 2024
cff2e7d
chore: init
Dec 27, 2024
46d2150
chore: I did not change those
Dec 27, 2024
a2decd5
make docs generation idempotent
carenthomas Dec 28, 2024
52f8049
add project id to create agent request
carenthomas Dec 29, 2024
c858c80
Finish
mattzh72 Dec 30, 2024
3205a71
fix: patch bug in json generator for composio
cpacker Dec 30, 2024
3d5fd03
fix: strip print
cpacker Dec 30, 2024
87d039b
chore: convert testing back
cpacker Dec 30, 2024
141e5e2
fix: test
cpacker Dec 30, 2024
36805b7
fix: test
cpacker Dec 30, 2024
b6321b1
Run lint
mattzh72 Dec 30, 2024
7a799c2
fix: added extra asserts to tests to make clear what the expected beh…
cpacker Dec 30, 2024
feedb52
Merge branch 'main' into matt/let-671-for-local-sandbox-using-local-e…
cpacker Dec 30, 2024
6557981
Mock e2b api none for test_function_return_limit
mattzh72 Dec 30, 2024
09dd49e
Mock more e2b none
mattzh72 Dec 30, 2024
397dd7a
wip debug
mattzh72 Dec 31, 2024
00c3ab5
Update
mattzh72 Dec 31, 2024
d3d7d57
Log out sandbox details for e2b
mattzh72 Dec 31, 2024
e8a1c6d
Add back more e2b
mattzh72 Dec 31, 2024
9187b99
Merge pull request #491 from letta-ai/matt/let-671-for-local-sandbox-…
mattzh72 Dec 31, 2024
0e3168f
Merge commit '2ee4f842dc73bea12dbaa2b126ef7a5c9fd7e995'
sarahwooders Dec 31, 2024
29 changes: 3 additions & 26 deletions .env.example
@@ -2,43 +2,20 @@
Example environment variable configurations for the Letta
Docker container. Uncomment the sections you want to
configure.

Hint: You don't need to have the same LLM and
Embedding model backends (can mix and match).
##########################################################


##########################################################
OpenAI configuration
##########################################################
## LLM Model
#LETTA_LLM_ENDPOINT_TYPE=openai
#LETTA_LLM_MODEL=gpt-4o-mini
## Embeddings
#LETTA_EMBEDDING_ENDPOINT_TYPE=openai
#LETTA_EMBEDDING_MODEL=text-embedding-ada-002

# OPENAI_API_KEY=sk-...

##########################################################
Ollama configuration
##########################################################
## LLM Model
#LETTA_LLM_ENDPOINT=http://host.docker.internal:11434
#LETTA_LLM_ENDPOINT_TYPE=ollama
#LETTA_LLM_MODEL=dolphin2.2-mistral:7b-q6_K
#LETTA_LLM_CONTEXT_WINDOW=8192
## Embeddings
#LETTA_EMBEDDING_ENDPOINT=http://host.docker.internal:11434
#LETTA_EMBEDDING_ENDPOINT_TYPE=ollama
#LETTA_EMBEDDING_MODEL=mxbai-embed-large
#LETTA_EMBEDDING_DIM=512

# OLLAMA_BASE_URL="http://host.docker.internal:11434"

##########################################################
vLLM configuration
##########################################################
## LLM Model
#LETTA_LLM_ENDPOINT=http://host.docker.internal:8000
#LETTA_LLM_ENDPOINT_TYPE=vllm
#LETTA_LLM_MODEL=ehartford/dolphin-2.2.1-mistral-7b
#LETTA_LLM_CONTEXT_WINDOW=8192
# VLLM_API_BASE="http://host.docker.internal:8000"
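The .env.example header points out that the LLM and embedding backends can be mixed and matched (for example, an Ollama LLM with OpenAI embeddings). As a rough illustration, here is a minimal sketch of a stand-alone sanity check for such a file; it uses only the Python standard library, and the helper names (load_env_file, check_letta_env) are hypothetical, not part of the Letta codebase:

```python
from pathlib import Path


def load_env_file(path: str) -> dict:
    """Parse KEY=VALUE lines from a .env-style file, skipping comments and blanks."""
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env


def check_letta_env(env: dict) -> list[str]:
    """Return warnings for obviously incomplete LLM/embedding configuration."""
    warnings = []
    if env.get("LETTA_LLM_ENDPOINT_TYPE") in ("ollama", "vllm") and not env.get("LETTA_LLM_ENDPOINT"):
        warnings.append("LETTA_LLM_ENDPOINT is required for self-hosted LLM backends")
    if env.get("LETTA_EMBEDDING_ENDPOINT_TYPE") == "openai" and not env.get("OPENAI_API_KEY"):
        warnings.append("OPENAI_API_KEY is needed for OpenAI embeddings")
    return warnings


if __name__ == "__main__":
    print(check_letta_env(load_env_file(".env")))
```

Run against the sections above, this would flag, for instance, an uncommented Ollama LLM block that is missing LETTA_LLM_ENDPOINT.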
42 changes: 0 additions & 42 deletions .github/workflows/letta-web-openapi-saftey.yml

This file was deleted.

4 changes: 3 additions & 1 deletion .github/workflows/tests.yml
@@ -6,6 +6,8 @@ env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
GROQ_API_KEY: ${{ secrets.GROQ_API_KEY }}
E2B_API_KEY: ${{ secrets.E2B_API_KEY }}
E2B_SANDBOX_TEMPLATE_ID: ${{ secrets.E2B_SANDBOX_TEMPLATE_ID }}

on:
push:
@@ -61,7 +63,7 @@ jobs:
with:
python-version: "3.12"
poetry-version: "1.8.2"
install-args: "-E dev -E postgres -E external-tools -E tests"
install-args: "-E dev -E postgres -E external-tools -E tests -E cloud-tool-sandbox"
- name: Migrate database
env:
LETTA_PG_PORT: 5432
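The workflow change feeds the E2B credentials (E2B_API_KEY, E2B_SANDBOX_TEMPLATE_ID) into the test job and adds the cloud-tool-sandbox extra to the Poetry install, which is what the "add e2b envs for unit test yml" and "Mock e2b api none" commits above revolve around. For local runs without those secrets, here is a hedged pytest sketch of how sandbox-dependent tests might be gated or forced onto the local path; the test and fixture names are illustrative and not taken from the Letta test suite:

```python
import os

import pytest

# Skip cloud-sandbox tests when no E2B credentials are available.
requires_e2b = pytest.mark.skipif(
    not os.getenv("E2B_API_KEY"),
    reason="E2B_API_KEY not set; skipping cloud-sandbox tests",
)


@requires_e2b
def test_cloud_sandbox_roundtrip():
    # Placeholder body; the real sandbox tests live in the Letta test suite.
    assert os.getenv("E2B_SANDBOX_TEMPLATE_ID") is not None


@pytest.fixture
def no_e2b(monkeypatch):
    # Hide the E2B key so code that checks for it falls back to the local sandbox,
    # similar in spirit to the "Mock e2b api none" commits above.
    monkeypatch.delenv("E2B_API_KEY", raising=False)
    yield


def test_local_sandbox_path(no_e2b):
    # With the key removed, the environment looks like a machine without E2B access.
    assert os.getenv("E2B_API_KEY") is None
```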
41 changes: 23 additions & 18 deletions alembic/versions/08b2f8225812_adding_toolsagents_orm.py
@@ -5,40 +5,45 @@
Create Date: 2024-12-05 16:46:51.258831

"""

from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa

from alembic import op

# revision identifiers, used by Alembic.
revision: str = '08b2f8225812'
down_revision: Union[str, None] = '3c683a662c82'
revision: str = "08b2f8225812"
down_revision: Union[str, None] = "3c683a662c82"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('tools_agents',
sa.Column('agent_id', sa.String(), nullable=False),
sa.Column('tool_id', sa.String(), nullable=False),
sa.Column('tool_name', sa.String(), nullable=False),
sa.Column('id', sa.String(), nullable=False),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=True),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=True),
sa.Column('is_deleted', sa.Boolean(), server_default=sa.text('FALSE'), nullable=False),
sa.Column('_created_by_id', sa.String(), nullable=True),
sa.Column('_last_updated_by_id', sa.String(), nullable=True),
sa.ForeignKeyConstraint(['agent_id'], ['agents.id'], ),
sa.ForeignKeyConstraint(['tool_id'], ['tools.id'], name='fk_tool_id'),
sa.PrimaryKeyConstraint('agent_id', 'tool_id', 'tool_name', 'id'),
sa.UniqueConstraint('agent_id', 'tool_name', name='unique_tool_per_agent')
op.create_table(
"tools_agents",
sa.Column("agent_id", sa.String(), nullable=False),
sa.Column("tool_id", sa.String(), nullable=False),
sa.Column("tool_name", sa.String(), nullable=False),
sa.Column("id", sa.String(), nullable=False),
sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.text("now()"), nullable=True),
sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.text("now()"), nullable=True),
sa.Column("is_deleted", sa.Boolean(), server_default=sa.text("FALSE"), nullable=False),
sa.Column("_created_by_id", sa.String(), nullable=True),
sa.Column("_last_updated_by_id", sa.String(), nullable=True),
sa.ForeignKeyConstraint(
["agent_id"],
["agents.id"],
),
sa.ForeignKeyConstraint(["tool_id"], ["tools.id"], name="fk_tool_id"),
sa.PrimaryKeyConstraint("agent_id", "tool_id", "tool_name", "id"),
sa.UniqueConstraint("agent_id", "tool_name", name="unique_tool_per_agent"),
)
# ### end Alembic commands ###


def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('tools_agents')
op.drop_table("tools_agents")
# ### end Alembic commands ###
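The changes to this migration are formatting only (the "run black, add isort config to pyproject.toml" commit); the tools_agents table definition itself is unchanged. For reference, applying a specific revision programmatically with Alembic's command API looks roughly like this; it assumes an alembic.ini at the repository root, which is the usual layout but not verified here:

```python
from alembic import command
from alembic.config import Config


def upgrade_to(revision: str = "head") -> None:
    """Apply pending Alembic migrations up to the given revision."""
    cfg = Config("alembic.ini")  # assumption: alembic.ini sits at the repository root
    command.upgrade(cfg, revision)


if __name__ == "__main__":
    upgrade_to("08b2f8225812")  # or "head" to run every pending migration
```

The equivalent CLI invocation is alembic upgrade 08b2f8225812, or alembic upgrade head to run everything pending.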
151 changes: 79 additions & 72 deletions alembic/versions/54dec07619c4_divide_passage_table_into_.py
@@ -5,101 +5,108 @@
Create Date: 2024-12-14 17:23:08.772554

"""

from typing import Sequence, Union

from alembic import op
from pgvector.sqlalchemy import Vector
import sqlalchemy as sa
from pgvector.sqlalchemy import Vector
from sqlalchemy.dialects import postgresql

from alembic import op
from letta.orm.custom_columns import EmbeddingConfigColumn

# revision identifiers, used by Alembic.
revision: str = '54dec07619c4'
down_revision: Union[str, None] = '4e88e702f85e'
revision: str = "54dec07619c4"
down_revision: Union[str, None] = "4e88e702f85e"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
'agent_passages',
sa.Column('id', sa.String(), nullable=False),
sa.Column('text', sa.String(), nullable=False),
sa.Column('embedding_config', EmbeddingConfigColumn(), nullable=False),
sa.Column('metadata_', sa.JSON(), nullable=False),
sa.Column('embedding', Vector(dim=4096), nullable=True),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=True),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=True),
sa.Column('is_deleted', sa.Boolean(), server_default=sa.text('FALSE'), nullable=False),
sa.Column('_created_by_id', sa.String(), nullable=True),
sa.Column('_last_updated_by_id', sa.String(), nullable=True),
sa.Column('organization_id', sa.String(), nullable=False),
sa.Column('agent_id', sa.String(), nullable=False),
sa.ForeignKeyConstraint(['agent_id'], ['agents.id'], ondelete='CASCADE'),
sa.ForeignKeyConstraint(['organization_id'], ['organizations.id'], ),
sa.PrimaryKeyConstraint('id')
"agent_passages",
sa.Column("id", sa.String(), nullable=False),
sa.Column("text", sa.String(), nullable=False),
sa.Column("embedding_config", EmbeddingConfigColumn(), nullable=False),
sa.Column("metadata_", sa.JSON(), nullable=False),
sa.Column("embedding", Vector(dim=4096), nullable=True),
sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.text("now()"), nullable=True),
sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.text("now()"), nullable=True),
sa.Column("is_deleted", sa.Boolean(), server_default=sa.text("FALSE"), nullable=False),
sa.Column("_created_by_id", sa.String(), nullable=True),
sa.Column("_last_updated_by_id", sa.String(), nullable=True),
sa.Column("organization_id", sa.String(), nullable=False),
sa.Column("agent_id", sa.String(), nullable=False),
sa.ForeignKeyConstraint(["agent_id"], ["agents.id"], ondelete="CASCADE"),
sa.ForeignKeyConstraint(
["organization_id"],
["organizations.id"],
),
sa.PrimaryKeyConstraint("id"),
)
op.create_index('agent_passages_org_idx', 'agent_passages', ['organization_id'], unique=False)
op.create_index("agent_passages_org_idx", "agent_passages", ["organization_id"], unique=False)
op.create_table(
'source_passages',
sa.Column('id', sa.String(), nullable=False),
sa.Column('text', sa.String(), nullable=False),
sa.Column('embedding_config', EmbeddingConfigColumn(), nullable=False),
sa.Column('metadata_', sa.JSON(), nullable=False),
sa.Column('embedding', Vector(dim=4096), nullable=True),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=True),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=True),
sa.Column('is_deleted', sa.Boolean(), server_default=sa.text('FALSE'), nullable=False),
sa.Column('_created_by_id', sa.String(), nullable=True),
sa.Column('_last_updated_by_id', sa.String(), nullable=True),
sa.Column('organization_id', sa.String(), nullable=False),
sa.Column('file_id', sa.String(), nullable=True),
sa.Column('source_id', sa.String(), nullable=False),
sa.ForeignKeyConstraint(['file_id'], ['files.id'], ondelete='CASCADE'),
sa.ForeignKeyConstraint(['organization_id'], ['organizations.id'], ),
sa.ForeignKeyConstraint(['source_id'], ['sources.id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id')
"source_passages",
sa.Column("id", sa.String(), nullable=False),
sa.Column("text", sa.String(), nullable=False),
sa.Column("embedding_config", EmbeddingConfigColumn(), nullable=False),
sa.Column("metadata_", sa.JSON(), nullable=False),
sa.Column("embedding", Vector(dim=4096), nullable=True),
sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.text("now()"), nullable=True),
sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.text("now()"), nullable=True),
sa.Column("is_deleted", sa.Boolean(), server_default=sa.text("FALSE"), nullable=False),
sa.Column("_created_by_id", sa.String(), nullable=True),
sa.Column("_last_updated_by_id", sa.String(), nullable=True),
sa.Column("organization_id", sa.String(), nullable=False),
sa.Column("file_id", sa.String(), nullable=True),
sa.Column("source_id", sa.String(), nullable=False),
sa.ForeignKeyConstraint(["file_id"], ["files.id"], ondelete="CASCADE"),
sa.ForeignKeyConstraint(
["organization_id"],
["organizations.id"],
),
sa.ForeignKeyConstraint(["source_id"], ["sources.id"], ondelete="CASCADE"),
sa.PrimaryKeyConstraint("id"),
)
op.create_index('source_passages_org_idx', 'source_passages', ['organization_id'], unique=False)
op.drop_table('passages')
op.drop_constraint('files_source_id_fkey', 'files', type_='foreignkey')
op.create_foreign_key(None, 'files', 'sources', ['source_id'], ['id'], ondelete='CASCADE')
op.drop_constraint('messages_agent_id_fkey', 'messages', type_='foreignkey')
op.create_foreign_key(None, 'messages', 'agents', ['agent_id'], ['id'], ondelete='CASCADE')
op.create_index("source_passages_org_idx", "source_passages", ["organization_id"], unique=False)
op.drop_table("passages")
op.drop_constraint("files_source_id_fkey", "files", type_="foreignkey")
op.create_foreign_key(None, "files", "sources", ["source_id"], ["id"], ondelete="CASCADE")
op.drop_constraint("messages_agent_id_fkey", "messages", type_="foreignkey")
op.create_foreign_key(None, "messages", "agents", ["agent_id"], ["id"], ondelete="CASCADE")
# ### end Alembic commands ###


def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.drop_constraint(None, 'messages', type_='foreignkey')
op.create_foreign_key('messages_agent_id_fkey', 'messages', 'agents', ['agent_id'], ['id'])
op.drop_constraint(None, 'files', type_='foreignkey')
op.create_foreign_key('files_source_id_fkey', 'files', 'sources', ['source_id'], ['id'])
op.drop_constraint(None, "messages", type_="foreignkey")
op.create_foreign_key("messages_agent_id_fkey", "messages", "agents", ["agent_id"], ["id"])
op.drop_constraint(None, "files", type_="foreignkey")
op.create_foreign_key("files_source_id_fkey", "files", "sources", ["source_id"], ["id"])
op.create_table(
'passages',
sa.Column('id', sa.VARCHAR(), autoincrement=False, nullable=False),
sa.Column('text', sa.VARCHAR(), autoincrement=False, nullable=False),
sa.Column('file_id', sa.VARCHAR(), autoincrement=False, nullable=True),
sa.Column('agent_id', sa.VARCHAR(), autoincrement=False, nullable=True),
sa.Column('source_id', sa.VARCHAR(), autoincrement=False, nullable=True),
sa.Column('embedding', Vector(dim=4096), autoincrement=False, nullable=True),
sa.Column('embedding_config', postgresql.JSON(astext_type=sa.Text()), autoincrement=False, nullable=False),
sa.Column('metadata_', postgresql.JSON(astext_type=sa.Text()), autoincrement=False, nullable=False),
sa.Column('created_at', postgresql.TIMESTAMP(timezone=True), autoincrement=False, nullable=False),
sa.Column('updated_at', postgresql.TIMESTAMP(timezone=True), server_default=sa.text('now()'), autoincrement=False, nullable=True),
sa.Column('is_deleted', sa.BOOLEAN(), server_default=sa.text('false'), autoincrement=False, nullable=False),
sa.Column('_created_by_id', sa.VARCHAR(), autoincrement=False, nullable=True),
sa.Column('_last_updated_by_id', sa.VARCHAR(), autoincrement=False, nullable=True),
sa.Column('organization_id', sa.VARCHAR(), autoincrement=False, nullable=False),
sa.ForeignKeyConstraint(['agent_id'], ['agents.id'], name='passages_agent_id_fkey'),
sa.ForeignKeyConstraint(['file_id'], ['files.id'], name='passages_file_id_fkey', ondelete='CASCADE'),
sa.ForeignKeyConstraint(['organization_id'], ['organizations.id'], name='passages_organization_id_fkey'),
sa.PrimaryKeyConstraint('id', name='passages_pkey')
"passages",
sa.Column("id", sa.VARCHAR(), autoincrement=False, nullable=False),
sa.Column("text", sa.VARCHAR(), autoincrement=False, nullable=False),
sa.Column("file_id", sa.VARCHAR(), autoincrement=False, nullable=True),
sa.Column("agent_id", sa.VARCHAR(), autoincrement=False, nullable=True),
sa.Column("source_id", sa.VARCHAR(), autoincrement=False, nullable=True),
sa.Column("embedding", Vector(dim=4096), autoincrement=False, nullable=True),
sa.Column("embedding_config", postgresql.JSON(astext_type=sa.Text()), autoincrement=False, nullable=False),
sa.Column("metadata_", postgresql.JSON(astext_type=sa.Text()), autoincrement=False, nullable=False),
sa.Column("created_at", postgresql.TIMESTAMP(timezone=True), autoincrement=False, nullable=False),
sa.Column("updated_at", postgresql.TIMESTAMP(timezone=True), server_default=sa.text("now()"), autoincrement=False, nullable=True),
sa.Column("is_deleted", sa.BOOLEAN(), server_default=sa.text("false"), autoincrement=False, nullable=False),
sa.Column("_created_by_id", sa.VARCHAR(), autoincrement=False, nullable=True),
sa.Column("_last_updated_by_id", sa.VARCHAR(), autoincrement=False, nullable=True),
sa.Column("organization_id", sa.VARCHAR(), autoincrement=False, nullable=False),
sa.ForeignKeyConstraint(["agent_id"], ["agents.id"], name="passages_agent_id_fkey"),
sa.ForeignKeyConstraint(["file_id"], ["files.id"], name="passages_file_id_fkey", ondelete="CASCADE"),
sa.ForeignKeyConstraint(["organization_id"], ["organizations.id"], name="passages_organization_id_fkey"),
sa.PrimaryKeyConstraint("id", name="passages_pkey"),
)
op.drop_index('source_passages_org_idx', table_name='source_passages')
op.drop_table('source_passages')
op.drop_index('agent_passages_org_idx', table_name='agent_passages')
op.drop_table('agent_passages')
op.drop_index("source_passages_org_idx", table_name="source_passages")
op.drop_table("source_passages")
op.drop_index("agent_passages_org_idx", table_name="agent_passages")
op.drop_table("agent_passages")
# ### end Alembic commands ###
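This migration splits the single passages table into agent_passages and source_passages, each with an organization index and CASCADE foreign keys. Code that previously queried passages now has to pick the right table; below is a hedged SQLAlchemy Core sketch of what an agent-scoped read could look like, with lightweight table handles built inline for illustration (the real models live in Letta's ORM layer and may differ):

```python
import sqlalchemy as sa
from sqlalchemy.engine import Connection

# Lightweight handle for the new agent-scoped table (illustrative only;
# the authoritative ORM models are defined in letta.orm).
agent_passages = sa.table(
    "agent_passages",
    sa.column("id", sa.String),
    sa.column("text", sa.String),
    sa.column("agent_id", sa.String),
    sa.column("organization_id", sa.String),
)


def passages_for_agent(conn: Connection, agent_id: str):
    """Fetch passages that previously lived in the unified 'passages' table."""
    stmt = (
        sa.select(agent_passages.c.id, agent_passages.c.text)
        .where(agent_passages.c.agent_id == agent_id)
        .order_by(agent_passages.c.id)
    )
    return conn.execute(stmt).all()
```

Source-scoped reads would target source_passages the same way, joining through source_id or file_id as needed.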