Install
go get github.com/hupe1980/agentmesh@latest
Requires Go 1.24+ and appropriate model credentials (for example,
OPENAI_API_KEY).
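go get only downloads the module; nothing checks your credentials until a model is actually called. If you want a program to fail fast when the key is missing, a plain-Go check (no agentmesh APIs involved) is enough:

// Optional: fail fast instead of hitting an opaque API error later.
if os.Getenv("OPENAI_API_KEY") == "" {
	log.Fatal("OPENAI_API_KEY is not set")
}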
Quick start
Create a simple ReAct agent that can call tools to answer questions:
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/hupe1980/agentmesh/pkg/agent"
	"github.com/hupe1980/agentmesh/pkg/message"
	"github.com/hupe1980/agentmesh/pkg/model/openai"
	"github.com/hupe1980/agentmesh/pkg/tool"
)

// WeatherArgs describes the tool's input; the JSON tags drive schema generation.
type WeatherArgs struct {
	Location string `json:"location"`
}

func main() {
	// Define a tool from a plain Go function
	weatherTool, err := tool.NewFuncTool(
		"get_weather",
		"Get current weather for a location",
		func(ctx context.Context, args WeatherArgs) (map[string]any, error) {
			return map[string]any{
				"location":    args.Location,
				"temperature": 22,
				"conditions":  "Sunny",
			}, nil
		},
	)
	if err != nil {
		log.Fatal(err)
	}

	// Create a ReAct agent backed by the OpenAI model
	reactAgent, err := agent.NewReAct(
		openai.NewModel(),
		agent.WithTools(weatherTool),
	)
	if err != nil {
		log.Fatal(err)
	}

	// Prepare the conversation
	ctx := context.Background()
	messages := []message.Message{
		message.NewSystemMessageFromText("You are a helpful weather assistant."),
		message.NewHumanMessageFromText("What's the weather in Paris?"),
	}

	// Execute the agent with the iterator pattern
	for msg, err := range reactAgent.Run(ctx, messages) {
		if err != nil {
			log.Fatal(err)
		}
		// Print AI response content
		if aiMsg, ok := msg.(*message.AIMessage); ok {
			fmt.Println(aiMsg.Content())
		}
	}
}
Run it:
export OPENAI_API_KEY="your-key"
go run main.go
The agent will automatically reason about the user's query, decide to call the weather tool, and generate a natural language response based on the tool's output.
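Tools can also report failures. The function passed to tool.NewFuncTool returns (map[string]any, error), so validation can live inside the tool itself; how agentmesh reacts to a returned error is framework-specific, but a validating variant of the tool above might look like this sketch:

weatherTool, err := tool.NewFuncTool(
	"get_weather",
	"Get current weather for a location",
	func(ctx context.Context, args WeatherArgs) (map[string]any, error) {
		// Reject empty input rather than returning a made-up result.
		if args.Location == "" {
			return nil, fmt.Errorf("location must not be empty")
		}
		return map[string]any{
			"location":    args.Location,
			"temperature": 22,
			"conditions":  "Sunny",
		}, nil
	},
)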
Explore examples
The repository includes comprehensive examples demonstrating key features:
- examples/basic_agent – Simple ReAct agent with tool calling
- examples/supervisor_agent – Multi-agent coordination with supervisor pattern
- examples/state_builder – Simplified state initialization with fluent API
- examples/mcp_tools – Model Context Protocol integration
- examples/streaming – Real-time event streaming for responsive UIs
- examples/conditional_flow – Dynamic routing based on agent outputs
- examples/parallel_tasks – Parallel execution of independent graph nodes
- examples/human_pause – Human-in-the-loop workflows
- examples/human_approval – Approval workflows with conditional guards and audit trails
- examples/time_travel – Debug workflows by replaying from checkpoints
- examples/checkpointing – Automatic state persistence and recovery
- examples/middleware – Middleware system demonstration
- examples/circuit_breaker – Fault tolerance patterns
- examples/guardrails – Content filtering and PII protection
- examples/observability – OpenTelemetry metrics and distributed tracing
- examples/subgraph – Compose complex workflows from reusable graphs
- examples/message_retention – Conversation history management
- examples/semantic_caching – Text embeddings for semantic search
- examples/a2a_integration – Agent-to-Agent protocol
Browse all examples: github.com/hupe1980/agentmesh/tree/main/examples
Run any example with:
go run ./examples/<dir>/main.go
Local docs preview
The devcontainer ships with everything you need to iterate on this documentation site; a local Ruby install works as well.
cd docs
bundle config set --local path 'vendor/bundle'
bundle install
bundle exec jekyll serve --livereload --host 0.0.0.0 --config _config.yml,_config.dev.yml
Prefer a single command in the devcontainer? Run the just docs-serve recipe from the repo root.
Once the server is running, open http://localhost:4000 for the rendered docs with live reload.
Graph introspection
Debug and visualize your graphs with the introspection API:
// Inspect graph structure
nodes := compiled.GetNodes()
topo := compiled.GetTopology()
fmt.Printf("Entry points: %v\n", topo.EntryPoints)
fmt.Printf("Max depth: %d\n", topo.MaxDepth)

// Get execution paths
paths := compiled.GetExecutionPath(10)

// Generate a Mermaid flowchart with a top-down ("TD") layout and save it
flowchart := compiled.GenerateMermaidFlowchart("TD")
if err := os.WriteFile("graph.mmd", []byte(flowchart), 0o644); err != nil {
	log.Fatal(err)
}
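The flowchart string is plain Mermaid source, so you can also wrap it in a fenced mermaid block and drop it into a Markdown page for rendering (stdlib only; the file name here is just an example):

// Wrap the Mermaid source so Markdown renderers such as GitHub display it.
md := "```mermaid\n" + flowchart + "\n```\n"
if err := os.WriteFile("graph.md", []byte(md), 0o644); err != nil {
	log.Fatal(err)
}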
See the graph_introspection example for complete usage.
Next steps
- Architecture – Understand the Pregel BSP execution model and graph builder pattern
- Agents – Learn about ReAct agents, RAG agents, and custom graph workflows
- Tools – Create function tools with automatic JSON schema generation
- Models – Connect OpenAI, Anthropic, LangChainGo, or custom LLM providers