smallnest/langgraphgo

Agent Score: 80
💡 Summary

LangGraphGo is a Go library for building AI workflows with advanced state management and parallel execution capabilities.

🎯 Target Audience

AI developers, data scientists, software engineers, product managers, and technical project leads

🤖 AI Roast: Powerful, but the setup might scare off the impatient.

Security Analysis: Medium Risk

Risk: Medium. Before enabling in production, review: shell/CLI command execution; outbound network access (SSRF, data egress); API key/token handling and storage; and filesystem read/write scope (path traversal). Run with least privilege and audit first.

LangGraphGo

Abbreviated as lango; Chinese name: 懒狗 (literally "lazy dog").


English | 简体中文

Website: http://lango.rpcx.io

🔀 Forked from paulnegz/langgraphgo - Enhanced with streaming, visualization, observability, and production-ready features.

This fork aims for feature parity with the Python LangGraph library, adding support for parallel execution, persistence, advanced state management, pre-built agents, and human-in-the-loop workflows.


🌐 Websites Built with LangGraphGo

Real-world applications built with LangGraphGo:


Insight - An AI-powered knowledge management and insight generation platform that uses LangGraphGo to build intelligent analysis workflows, helping users extract key insights from massive amounts of information.

NoteX - An intelligent note-taking and knowledge organization tool that leverages AI for automatic categorization, tag extraction, and content association, making knowledge management more efficient.

📦 Installation

```shell
go get github.com/smallnest/langgraphgo
```

Note: This repository uses Git submodules for the showcases directory. When cloning, use one of the following methods:

```shell
# Method 1: Clone with submodules
git clone --recurse-submodules https://github.com/smallnest/langgraphgo

# Method 2: Clone first, then initialize submodules
git clone https://github.com/smallnest/langgraphgo
cd langgraphgo
git submodule update --init --recursive
```

🚀 Features

  • Core Runtime:

    • Parallel Execution: Concurrent node execution (fan-out) with thread-safe state merging.
    • Runtime Configuration: Propagate callbacks, tags, and metadata via RunnableConfig.
    • Generic Types: Type-safe state management with generic StateGraph implementations.
    • LangChain Compatible: Works seamlessly with langchaingo.
  • Persistence & Reliability:

    • Checkpointers: Redis, Postgres, SQLite, and File implementations for durable state.
    • File Checkpointing: Lightweight file-based checkpointing without external dependencies.
    • State Recovery: Pause and resume execution from checkpoints.
  • Advanced Capabilities:

    • State Schema: Granular state updates with custom reducers (e.g., AppendReducer).
    • Smart Messages: Intelligent message merging with ID-based upserts (AddMessages).
    • Command API: Dynamic control flow and state updates directly from nodes.
    • Ephemeral Channels: Temporary state values that clear automatically after each step.
    • Subgraphs: Compose complex agents by nesting graphs within graphs.
    • Enhanced Streaming: Real-time event streaming with multiple modes (updates, values, messages).
    • Pre-built Agents: Ready-to-use ReAct, CreateAgent, and Supervisor agent factories.
    • Programmatic Tool Calling (PTC): LLM generates code that calls tools programmatically, reducing latency and token usage by 10x.
  • Developer Experience:

    • Visualization: Export graphs to Mermaid, DOT, and ASCII with conditional edge support.
    • Human-in-the-loop (HITL): Interrupt execution, inspect state, edit history (UpdateState), and resume.
    • Observability: Built-in tracing and metrics support.
    • Tools: Integrated Tavily and Exa search tools.

🎯 Quick Start

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/smallnest/langgraphgo/graph"
	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	ctx := context.Background()

	model, err := openai.New()
	if err != nil {
		log.Fatal(err)
	}

	// 1. Create Graph
	g := graph.NewStateGraph[map[string]any]()

	// 2. Add Nodes
	g.AddNode("generate", "generate", func(ctx context.Context, state map[string]any) (map[string]any, error) {
		input, ok := state["input"].(string)
		if !ok {
			return nil, fmt.Errorf("invalid input")
		}
		response, err := model.Call(ctx, input)
		if err != nil {
			return nil, err
		}
		state["output"] = response
		return state, nil
	})

	// 3. Define Edges
	g.AddEdge("generate", graph.END)
	g.SetEntryPoint("generate")

	// 4. Compile
	runnable, err := g.Compile()
	if err != nil {
		log.Fatal(err)
	}

	// 5. Invoke
	initialState := map[string]any{
		"input": "Hello, LangGraphGo!",
	}
	result, err := runnable.Invoke(ctx, initialState)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(result)
}
```

📚 Examples

This project includes 90+ comprehensive examples organized into categories:

View All 90+ Examples →

🔧 Key Concepts

Parallel Execution

LangGraphGo automatically executes nodes in parallel when they share the same starting node. Results are merged using the graph's state merger or schema.

```go
g.AddEdge("start", "branch_a")
g.AddEdge("start", "branch_b")
// branch_a and branch_b run concurrently
```

Human-in-the-loop (HITL)

Pause execution to allow for human approval or input.

```go
config := &graph.Config{
	InterruptBefore: []string{"human_review"},
}

// Execution stops before the "human_review" node
state, err := runnable.InvokeWithConfig(ctx, input, config)

// Resume execution
resumeConfig := &graph.Config{
	ResumeFrom: []string{"human_review"},
}
runnable.InvokeWithConfig(ctx, state, resumeConfig)
```

Pre-built Agents

Quickly create complex agents using factory functions.

```go
// Create a ReAct agent
agent, err := prebuilt.CreateReactAgent(model, tools)

// Create an agent with options
agent, err := prebuilt.CreateAgent(model, tools,
	prebuilt.WithSystemMessage("System prompt"))

// Create a Supervisor agent
supervisor, err := prebuilt.CreateSupervisor(model, agents)
```

Programmatic Tool Calling (PTC)

Generate code that calls tools directly, reducing API round-trips and token usage.

```go
// Create a PTC agent
agent, err := ptc.CreatePTCAgent(ptc.PTCAgentConfig{
	Model:         model,
	Tools:         toolList,
	Language:      ptc.LanguagePython, // or ptc.LanguageGo
	ExecutionMode: ptc.ModeDirect,     // subprocess (default) or ptc.ModeServer
	MaxIterations: 10,
})

// The LLM generates code that calls tools programmatically
result, err := agent.Invoke(ctx, initialState)
```

See the PTC README for detailed documentation.

🎨 Graph Visualization

```go
exporter := runnable.GetGraph()
fmt.Println(exporter.DrawMermaid()) // Generates a Mermaid flowchart
```

📈 Performance

  • Graph Operations: ~14-94μs depending on format
  • Tracing Overhead: ~4μs per execution
  • Event Processing: 1000+ events/second
  • Streaming Latency: <100ms

🧪 Testing

```shell
go test ./... -v
```

Contributors

This project is open to contributions.

5-Dim Analysis
  • Clarity: 8/10
  • Novelty: 8/10
  • Utility: 9/10
  • Completeness: 8/10
  • Maintainability: 7/10
Pros & Cons

Pros

  • Supports parallel execution for efficiency.
  • Offers advanced state management features.
  • Integrates well with existing AI tools.

Cons

  • May have a steep learning curve for beginners.
  • Documentation could be more detailed.
  • Dependency on Go may limit user base.


Disclaimer: This content is sourced from GitHub open source projects for display and rating purposes only.

Copyright belongs to the original author smallnest.