RubyLLM::MCP

Aiming to make using MCPs with RubyLLM and Ruby as easy as possible.

This project is a Ruby client for the Model Context Protocol (MCP), designed to work seamlessly with RubyLLM. This gem enables Ruby applications to connect to MCP servers and use their tools, resources and prompts as part of LLM conversations.

For a more detailed guide, see the RubyLLM::MCP docs.

Currently provides full support for MCP protocol versions up to 2025-06-18.


RubyLLM::MCP Features

  • 🔌 Multiple Transport Types: Streamable HTTP, STDIO, and legacy SSE transports
  • 🛠️ Tool Integration: Automatically converts MCP tools into RubyLLM-compatible tools
  • 📄 Resource Management: Access and include MCP resources (files, data) and resource templates in conversations
  • 🎯 Prompt Integration: Use predefined MCP prompts with arguments for consistent interactions
  • 🎛️ Client Features: Support for sampling and roots
  • 🎨 Enhanced Chat Interface: Extended RubyLLM chat methods for seamless MCP integration
  • 🔄 Multiple Client Management: Create and manage multiple MCP clients simultaneously for different servers and purposes (see the sketch after this list)
  • 📚 Simple API: Easy-to-use interface that integrates seamlessly with RubyLLM
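
Because clients are ordinary Ruby objects, you can hold several at once and hand all of their tools to a single chat. Below is a minimal sketch of the multiple-client feature, reusing only the client and chat APIs shown later in this README; the server names, commands, and URLs are placeholders.

# Create clients for two different MCP servers (names, commands, and URLs are placeholders)
files_client = RubyLLM::MCP.client(
  name: "files-server",
  transport_type: :stdio,
  config: { command: "node", args: ["path/to/files-server.js"] }
)

search_client = RubyLLM::MCP.client(
  name: "search-server",
  transport_type: :streamable,
  config: { url: "http://localhost:8080/mcp" }
)

# Give a single chat access to both tool sets
chat = RubyLLM.chat(model: "gpt-4")
chat.with_tools(*files_client.tools, *search_client.tools)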

Installation

bundle add ruby_llm-mcp

or add this line to your application's Gemfile:

gem 'ruby_llm-mcp'

And then execute:

bundle install

Or install it yourself as:

gem install ruby_llm-mcp

Usage

Basic Setup

First, configure your RubyLLM client and create an MCP connection:

require 'ruby_llm/mcp'

# Configure RubyLLM
RubyLLM.configure do |config|
  config.openai_api_key = "your-api-key"
end

# Connect to an MCP server via SSE
client = RubyLLM::MCP.client(
  name: "my-mcp-server",
  transport_type: :sse,
  config: {
    url: "http://localhost:9292/mcp/sse"
  }
)

# Or connect via stdio
client = RubyLLM::MCP.client(
  name: "my-mcp-server",
  transport_type: :stdio,
  config: {
    command: "node",
    args: ["path/to/mcp-server.js"],
    env: { "NODE_ENV" => "production" }
  }
)

# Or connect via streamable HTTP
client = RubyLLM::MCP.client(
  name: "my-mcp-server",
  transport_type: :streamable,
  config: {
    url: "http://localhost:8080/mcp",
    headers: { "Authorization" => "Bearer your-token" }
  }
)

Using MCP Tools with RubyLLM

# Get available tools from the MCP server
tools = client.tools
puts "Available tools:"
tools.each do |tool|
  puts "- #{tool.name}: #{tool.description}"
end

# Create a chat session with MCP tools
chat = RubyLLM.chat(model: "gpt-4")
chat.with_tools(*client.tools)

# Ask a question that will use the MCP tools
response = chat.ask("Can you help me search for recent files in my project?")
puts response
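
If you want output as it is generated, RubyLLM also accepts a block on ask for streaming. This sketch assumes RubyLLM's standard streaming interface (chunk objects that respond to content) and works the same way when MCP tools are attached.

chat = RubyLLM.chat(model: "gpt-4")
chat.with_tools(*client.tools)

# Stream the answer as it arrives; tool calls still happen under the hood
chat.ask("Summarize the most recently modified files in my project") do |chunk|
  print chunk.content
end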

Manual Tool Execution

You can also execute MCP tools directly:

# Tools Execution
tool = client.tool("search_files")

# Execute a specific tool
result = tool.execute(
  name: "search_files",
  parameters: {
    query: "*.rb",
    directory: "/path/to/search"
  }
)

puts result

Working with Resources

MCP servers can provide access to resources - structured data that can be included in conversations. Resources come in two types: normal resources and resource templates.

Normal Resources

# Get available resources from the MCP server
resources = client.resources
puts "Available resources:"
resources.each do |resource|
  puts "- #{resource.name}: #{resource.description}"
end

# Access a specific resource by name
file_resource = client.resource("project_readme")
content = file_resource.content
puts "Resource content: #{content}"

# Include a resource in a chat conversation for reference with an LLM
chat = RubyLLM.chat(model: "gpt-4")
chat.with_resource(file_resource)

# Or add a resource directly to the conversation
file_resource.include(chat)

response = chat.ask("Can you summarize this README file?")
puts response

Resource Templates

Resource templates are parameterized resources that can be dynamically configured:

# Get available resource templates
templates = client.resource_templates
log_template = client.resource_template("application_logs")

# Use a template with parameters
chat = RubyLLM.chat(model: "gpt-4")
chat.with_resource_template(log_template, arguments: {
  date: "2024-01-15",
  level: "error"
})

response = chat.ask("What errors occurred on this date?")
puts response

# You can also get templated content directly
content = log_template.to_content(arguments: {
  date: "2024-01-15",
  level: "error"
})
puts content

Working with Prompts

MCP servers can provide predefined prompts that can be used in conversations:

# Get available prompts from the MCP server
prompts = client.prompts
puts "Available prompts:"
prompts.each do |prompt|
  puts "- #{prompt.name}: #{prompt.description}"
  prompt.arguments.each do |arg|
    puts "  - #{arg.name}: #{arg.description} (required: #{arg.required})"
  end
end

# Use a prompt in a conversation
greeting_prompt = client.prompt("daily_greeting")
chat = RubyLLM.chat(model: "gpt-4")

# Method 1: Ask prompt directly
response = chat.ask_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" })
puts response

# Method 2: Add prompt to chat and then ask
chat.with_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" })
response = chat.ask("Continue with the greeting")

Development

After checking out the repo, run bundle install to install dependencies. Then run bundle exec rake to run the tests. Tests currently use bun to run the test MCP servers. You can also run bin/console for an interactive prompt that will allow you to experiment.

There are also examples you can run to verify the gem is working as expected:

bundle exec ruby examples/tools/local_mcp.rb

Contributing

We welcome contributions! Bug reports and pull requests are accepted on GitHub at https://github.com/patvice/ruby_llm-mcp.

License

Released under the MIT License.
