Releases: pipecat-ai/pipecat-flows
v0.0.10
Changed
- Nodes now have two message types to better delineate defining the role or
  persona of the bot from the task it needs to accomplish. The message types are:
  - `role_messages`, which defines the personality or role of the bot
  - `task_messages`, which defines the task to be completed for a given node
- `role_messages` can be defined for the initial node and then inherited by
  subsequent nodes. You can treat this as an LLM "system" message.
- Simplified FlowManager initialization by removing the need for manual context
  setup in both static and dynamic flows. Now, you need to create a `FlowManager`
  and initialize it to start the flow.
- All examples have been updated to align with the API changes.
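As a sketch of the new shape, a node configuration might separate the two message types like this. Only the `role_messages`/`task_messages` split comes from the release notes; the node contents and the `functions` field shown here are illustrative:

```python
# Hypothetical node configuration; the message contents are made up
# for illustration.
greeting_node = {
    # Persona of the bot; inherited by subsequent nodes when set on the
    # initial node (acts like an LLM "system" message)
    "role_messages": [
        {"role": "system", "content": "You are a friendly booking assistant."}
    ],
    # Task for this specific node
    "task_messages": [
        {"role": "system", "content": "Greet the user and ask for their travel dates."}
    ],
    "functions": [],  # functions the LLM may call at this node
}
```

With the simplified initialization, you then create a `FlowManager` with your flow configuration and initialize it to start the flow, per the notes above.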
Fixed
- Fixed an issue where importing the Flows module would require OpenAI,
Anthropic, and Google LLM modules.
v0.0.9
Changed
- Fixed function handler registration in FlowManager to handle
  `__function__:` tokens:
  - Previously, the handler string was used directly, causing "not callable" errors
  - Now correctly looks up and uses the actual function object from the main module
  - Supports both direct function references and function names exported from the Flows editor
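The lookup described above can be sketched roughly as follows. `resolve_handler` and its explicit `namespace` parameter are illustrative stand-ins, not the library's actual internals (pipecat-flows resolves names against the main module):

```python
def resolve_handler(handler, namespace):
    """Resolve a handler to a callable (simplified sketch).

    `namespace` stands in for the main module's attributes; the library
    itself looks the name up in the main module.
    """
    if callable(handler):
        # Direct function reference: use it as-is
        return handler
    if isinstance(handler, str) and handler.startswith("__function__:"):
        # Editor-exported token: look up the named function object
        # instead of using the string directly
        name = handler.split(":", 1)[1]
        return namespace[name]
    raise TypeError(f"handler is not callable: {handler!r}")
```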
v0.0.8
Changed
- Improved type safety in FlowManager by requiring keyword arguments for initialization
- Enhanced error messages for LLM service type validation
v0.0.7
Added
- New `transition_to` field for static flows:
  - Combines function handlers with state transitions
  - Supports all LLM providers (OpenAI, Anthropic, Gemini)
  - Static examples updated to use this new transition
Changed
- Static flow transitions now use `transition_to` instead of matching function names:
  - Before: the function name had to match the target node name
  - After: the function explicitly declares its target via `transition_to`
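For illustration, a static-flow function declaration using the new field might look like this. The function name, parameters, and target node are hypothetical; only the `transition_to` key reflects the release notes:

```python
# Hypothetical function declaration for a static flow
select_movie = {
    "type": "function",
    "function": {
        "name": "select_movie",
        "description": "Record the user's movie choice",
        "parameters": {
            "type": "object",
            "properties": {"title": {"type": "string"}},
        },
        # Explicit target node, instead of relying on the function name
        # matching a node name
        "transition_to": "movie_details",
    },
}
```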
Fixed
- Duplicate LLM responses during transitions
v0.0.6
Added
- New FlowManager supporting both static and dynamic conversation flows
- Provider-specific examples demonstrating dynamic flows:
  - OpenAI: `insurance_openai.py`
  - Anthropic: `insurance_anthropic.py`
  - Gemini: `insurance_gemini.py`
- Type safety improvements:
  - `FlowArgs`: type-safe function arguments
  - `FlowResult`: type-safe function returns
Changed
- Simplified function handling:
- Automatic LLM function registration
- Optional handlers for edge nodes
- Updated all examples to use unified FlowManager interface
v0.0.5
Added
- Added LLM support for:
  - Anthropic
  - Google Gemini
- Added `LLMFormatParser`, a format parser to handle LLM provider-specific
  messages and function call formats
- Added new examples:
  - `movie_explorer_anthropic.py` (Claude 3.5)
  - `movie_explorer_gemini.py` (Gemini 1.5 Flash)
  - `travel_planner_gemini.py` (Gemini 1.5 Flash)
v0.0.4
Added
- New example `movie_explorer.py` demonstrating:
  - Real API integration with TMDB
  - Node functions for API calls
  - Edge functions for state transitions
  - Proper function registration pattern
Changed
- Renamed function types to use graph terminology:
  - "Terminal functions" are now "node functions" (operations within a state)
  - "Transitional functions" are now "edge functions" (transitions between states)
- Updated function registration process:
  - Node functions must be registered directly with the LLM before flow initialization
  - Edge functions are automatically registered by FlowManager during initialization
  - LLM instance is now required in the FlowManager constructor
- Added flexibility to node naming with the Editor:
  - Start nodes can now use any descriptive name (e.g., "greeting")
  - End nodes conventionally use "end" but support custom names
  - Flow configuration's `initial_node` property determines the starting state
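The registration split for node functions can be sketched as follows. `StubLLM` is a made-up stand-in for a pipecat LLM service so the sketch is self-contained; only the register-before-initialization pattern reflects the notes above:

```python
class StubLLM:
    """Minimal stand-in for a pipecat LLM service (illustrative only)."""

    def __init__(self):
        self.functions = {}

    def register_function(self, name, handler):
        self.functions[name] = handler

async def fetch_movies(args):
    """Node function: an operation within a state (e.g., a TMDB call)."""
    return {"movies": []}

llm = StubLLM()
# Node functions: registered directly with the LLM before flow initialization
llm.register_function("fetch_movies", fetch_movies)
# Edge functions (state transitions) are registered automatically by
# FlowManager during initialization, which now requires the LLM instance
# in its constructor.
```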
Updated
- All examples updated to use new function registration pattern
- Documentation updated to reflect new terminology and patterns
- Editor updated to support flexible node naming