Hello everyone, I am Coffee Brother.
In today's rapidly developing field of artificial intelligence, agents are becoming an important bridge connecting AI with the real world. They not only understand user intentions but also take proactive actions to complete complex tasks by invoking various tools.
This article will delve into the core concepts of AI Agents, their implementation methods, and their applications in real-world scenarios, especially how to build a “Programmer Encourager”—an AI assistant that not only provides technical advice but also offers psychological support when programmers feel down.
What is an AI Agent?
An AI Agent is a system that can perceive its environment, make autonomous decisions, and execute actions. It combines the understanding capabilities of large language models (LLMs) with the execution capabilities of predefined tools, allowing it to complete complex tasks autonomously. This capability positions AI Agents as a primary form of future AI applications in daily life and production.
In the open-source Eino framework by ByteDance, building an AI Agent mainly relies on two core components:
- ChatModel: the “brain” of the Agent, responsible for understanding user input, analyzing task requirements, and deciding whether to invoke specific tools.
- Tool: the “hands” of the Agent, providing concrete functional implementations such as calling external APIs or executing local logic.
1. ChatModel: The Brain of the Agent
ChatModel is a core component of the AI Agent, processing user natural language input through its powerful language understanding capabilities. When a user makes a request, ChatModel deeply understands the user’s intent, analyzes task requirements, and decides whether specific tools need to be invoked to complete the task.
In the Eino framework, ChatModel provides a unified interface that supports various large language model services, such as:
- OpenAI: supports models such as GPT-3.5 and GPT-4
- Ollama: supports locally deployed open-source models (such as Llama2)
- Volcano Engine (Ark): supports ByteDance's Doubao models, among others
Example: Initializing ChatModel
The following demonstrates how to initialize a ChatModel using the Eino framework; generating a response from user input is sketched right after the function:
// ai-agent/chat.go
func NewChatModel(ctx context.Context, modelType ChatModelType) (model.ChatModel, error) {
    switch modelType {
    case OpenAIModel:
        return openai.NewChatModel(ctx, &openai.ChatModelConfig{
            Model:  "gpt-4",
            APIKey: os.Getenv("OPENAI_API_KEY"),
        })
    case OllamaModel:
        return ollama.NewChatModel(ctx, &ollama.ChatModelConfig{
            BaseURL: "http://localhost:11434",
            Model:   "llama2",
        })
    default:
        return nil, fmt.Errorf("unsupported chat model type: %s", modelType)
    }
}
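Once a ChatModel is created, generating a reply is a single Generate call on a message list. Below is a minimal usage sketch; the AskOnce helper is a name chosen here for illustration, and it assumes a valid API key in the environment:

// Hypothetical usage sketch: ask the model one question and return its reply.
func AskOnce(ctx context.Context, question string) (string, error) {
    chatModel, err := NewChatModel(ctx, OpenAIModel)
    if err != nil {
        return "", err
    }
    // Generate takes the full message list and returns a single assistant message.
    result, err := chatModel.Generate(ctx, []*schema.Message{
        schema.SystemMessage("You are a helpful assistant."),
        schema.UserMessage(question),
    })
    if err != nil {
        return "", err
    }
    return result.Content, nil
}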
2. Tool: The Hands of the Agent
Tool is the executor of the AI Agent, providing specific functional implementations. Each Tool has clear functional definitions and parameter specifications, allowing ChatModel to accurately invoke them. Tools can encapsulate various functionalities, from simple data operations to complex external service calls.
The Eino framework provides multiple ways to implement Tools:
- Method 1: Using NewTool
Suitable for simple tool implementations: create a Tool by defining the tool information and a processing function:
// ai-agent/todoagent.go
func GetAddTodoTool() tool.InvokableTool {
    info := &schema.ToolInfo{
        Name: "add_todo",
        Desc: "Add a todo item",
        ParamsOneOf: schema.NewParamsOneOfByParams(map[string]*schema.ParameterInfo{
            "content": {
                Desc:     "The content of the todo item",
                Type:     schema.String,
                Required: true,
            },
        }),
    }
    return utils.NewTool(info, AddTodoFunc)
}
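The processing function AddTodoFunc referenced above is not shown in the snippet. A minimal sketch of what it might look like is given below; the TodoAddParams struct and the returned JSON are assumptions for illustration, and a real implementation would persist the item somewhere:

// Hypothetical sketch of the processing function bound by utils.NewTool.
type TodoAddParams struct {
    Content string `json:"content"` // matches the "content" parameter declared in ToolInfo
}

func AddTodoFunc(ctx context.Context, params *TodoAddParams) (string, error) {
    // A real implementation would write to a database or file;
    // here we simply echo a confirmation as the tool's output.
    return fmt.Sprintf(`{"status": "ok", "content": %q}`, params.Content), nil
}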
- Method 2: Using InferTool
Define the parameter information through struct tags, so the parameter struct and its description stay in sync in the source:
// ai-agent/todoagent.go
func GetUpdateTodoTool() (tool.InvokableTool, error) {
    return utils.InferTool(
        "update_todo",
        "Update a todo item, eg: content, deadline...",
        UpdateTodoFunc,
    )
}
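InferTool derives the parameter schema from the input struct of the bound function, so the struct tags act as the single source of truth. The struct and function below are only a sketch of what UpdateTodoFunc might look like; the field names and the jsonschema-style tags are assumptions, and the exact tag syntax may differ between Eino versions:

// Hypothetical sketch of the parameter struct and function behind InferTool.
type TodoUpdateParams struct {
    ID       string  `json:"id" jsonschema:"description=id of the todo item,required"`
    Content  *string `json:"content,omitempty" jsonschema:"description=new content of the todo item"`
    Deadline *int64  `json:"deadline,omitempty" jsonschema:"description=new deadline as a unix timestamp"`
    Done     *bool   `json:"done,omitempty" jsonschema:"description=whether the item is finished"`
}

func UpdateTodoFunc(ctx context.Context, params *TodoUpdateParams) (string, error) {
    // A real implementation would update the stored todo item; here we only acknowledge the call.
    return `{"status": "ok"}`, nil
}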
- Method 3: Implementing the Tool interface
Suitable for scenarios that require more custom logic:
// ai-agent/todoagent.go
type ListTodoTool struct{}

func (lt *ListTodoTool) Info(ctx context.Context) (*schema.ToolInfo, error) {
    return &schema.ToolInfo{
        Name: "list_todo",
        Desc: "List all todo items",
        ParamsOneOf: schema.NewParamsOneOfByParams(map[string]*schema.ParameterInfo{
            "finished": {
                Desc:     "filter todo items if finished",
                Type:     schema.Boolean,
                Required: false,
            },
        }),
    }, nil
}

func (lt *ListTodoTool) InvokableRun(ctx context.Context, argumentsInJSON string, opts ...tool.Option) (string, error) {
    return `{"todos": [{"id": "1", "content": "Complete the preparation of the Eino project presentation by May 12, 2025", "started_at": 1717401600, "deadline": 1717488000, "done": false}]}`, nil
}
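Because ListTodoTool implements the interface directly, it can also be invoked on its own, which is convenient for testing outside of any Agent. A quick sketch (the helper name is arbitrary):

// Hypothetical sketch: calling the tool directly with a JSON argument string.
func listUnfinishedTodos(ctx context.Context) (string, error) {
    lt := &ListTodoTool{}
    // The arguments are passed as JSON, matching the schema declared in Info.
    return lt.InvokableRun(ctx, `{"finished": false}`)
}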
- Method 4: Using officially packaged tools
For example, the DuckDuckGo search tool can be directly integrated into the project:
// ai-agent/agent.go
searchTool, err := duckduckgo.NewTool(ctx, &duckduckgo.Config{})
3. Building an Agent with Chain
When building an AI Agent, ToolsNode is a core component responsible for managing and executing tool calls. ToolsNode can integrate multiple tools and provide a unified calling interface. It supports both synchronous (Invoke) and streaming (Stream) calls, flexibly handling different kinds of tool execution needs.
Example: Building a Complete Agent
The following code demonstrates how to use Chain to build a complete Agent and invoke multiple tools to complete complex tasks:
// ai-agent/agent.go
func RunAgent(ctx context.Context) {
    // Initialize Todo tools
    addTool := GetAddTodoTool()
    updateTool, err := GetUpdateTodoTool()
    if err != nil {
        log.Fatal(err)
    }
    listTool := &ListTodoTool{}

    // Initialize the DuckDuckGo search tool
    searchTool, err := duckduckgo.NewTool(ctx, &duckduckgo.Config{})
    if err != nil {
        log.Fatal(err)
    }

    // Create and configure the ChatModel
    chatModel, err := openai.NewChatModel(ctx, &openai.ChatModelConfig{
        Model:  "gpt-4",
        APIKey: os.Getenv("OPENAI_API_KEY"),
    })
    if err != nil {
        log.Fatal(err)
    }

    // Collect tool information and bind it to the ChatModel
    tools := []tool.BaseTool{addTool, updateTool, listTool, searchTool}
    toolInfos := make([]*schema.ToolInfo, 0, len(tools))
    for _, t := range tools {
        info, err := t.Info(ctx)
        if err != nil {
            log.Fatal(err)
        }
        toolInfos = append(toolInfos, info)
    }
    err = chatModel.BindTools(toolInfos)
    if err != nil {
        log.Fatal(err)
    }

    // Create the ToolsNode
    toolsNode, err := compose.NewToolNode(ctx, &compose.ToolsNodeConfig{
        Tools: tools,
    })
    if err != nil {
        log.Fatal(err)
    }

    // Build the complete processing chain
    chain := compose.NewChain[[]*schema.Message, []*schema.Message]()
    chain.
        AppendChatModel(chatModel, compose.WithNodeName("chat_model")).
        AppendToolsNode(toolsNode, compose.WithNodeName("tools"))

    // Compile and run the chain
    agent, err := chain.Compile(ctx)
    if err != nil {
        log.Fatal(err)
    }

    // Run example: add a TODO to learn Eino and search for the cloudwego/eino repository address
    resp, err := agent.Invoke(ctx, []*schema.Message{
        {
            Role:    schema.User,
            Content: "Add a TODO to learn Eino and search for the cloudwego/eino repository address",
        },
    })
    if err != nil {
        log.Fatal(err)
    }

    // Output the results
    for _, msg := range resp {
        fmt.Println(msg.Content)
    }
}
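The example above uses the synchronous Invoke call. As mentioned earlier, the compiled chain also supports streaming; the sketch below assumes the Stream/Recv/Close API of Eino's compiled Runnable and schema.StreamReader (plus the standard io import), so treat it as illustrative rather than definitive:

// Hypothetical sketch: streaming the agent's output instead of waiting for the full result.
func runAgentStreaming(ctx context.Context, agent compose.Runnable[[]*schema.Message, []*schema.Message]) error {
    reader, err := agent.Stream(ctx, []*schema.Message{
        schema.UserMessage("Add a TODO to learn Eino"),
    })
    if err != nil {
        return err
    }
    defer reader.Close()
    for {
        msgs, err := reader.Recv()
        if err == io.EOF { // the stream has finished
            return nil
        }
        if err != nil {
            return err
        }
        for _, msg := range msgs {
            fmt.Print(msg.Content)
        }
    }
}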
4. Programmer Encourager (AI Agent)
In addition to executing tasks, you can also think of the AI Agent as a “Programmer Encourager”. It not only provides technical advice but also offers psychological support when programmers feel down, haha.
Example: Building a Programmer Encourager
The following code demonstrates how to use ChatModel and message templates to build a Programmer Encourager:
// ai-agent/encourager.go
func EncouragerPromptTemplate() *prompt.PromptTemplate {
    return prompt.FromMessages(schema.FString,
        schema.SystemMessage("You are a {role}. You need to answer questions in a {style} tone. Your goal is to help programmers maintain a positive and optimistic mindset while providing technical advice and also paying attention to their mental health."),
        schema.MessagesPlaceholder("chat_history", true),
        schema.UserMessage("Question: {question}"),
    )
}

func RunEncourager(ctx context.Context) {
    // Create the ChatModel (using OpenAI)
    chatModel, err := NewChatModel(ctx, "openai")
    if err != nil {
        log.Fatal(err)
    }

    // Create the conversation template
    template := EncouragerPromptTemplate()

    // Generate messages using the template
    messages, err := template.Format(ctx, map[string]interface{}{
        "role":     "Programmer Encourager",
        "style":    "positive, warm, and professional",
        "question": "My code keeps throwing errors, I feel so frustrated, what should I do?",
        "chat_history": []*schema.Message{
            schema.UserMessage("Hello"),
            schema.AssistantMessage("Hey! I am your Programmer Encourager! Remember, every great programmer grows from debugging. How can I help you?", nil),
            schema.UserMessage("I feel like my code is terrible"),
            schema.AssistantMessage("Every programmer goes through this stage! The important thing is that you are constantly learning and improving. Let's take a look at the code together, I believe through refactoring and optimization, it will get better. Remember, Rome wasn't built in a day, and code quality is improved through continuous enhancement.", nil),
        },
    })
    if err != nil {
        log.Fatal(err)
    }

    // Run the ChatModel and get the result
    result, err := chatModel.Generate(ctx, messages)
    if err != nil {
        log.Fatal(err)
    }

    // Output the result
    fmt.Println("Encourager's reply:")
    fmt.Println(result.Content)
}
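To keep the conversation going, append each exchange to chat_history and format the template again for the next question. The helper below is a minimal sketch of a follow-up turn, reusing the types declared above; the function name and flow are illustrative, not part of the project:

// Hypothetical sketch: asking a follow-up question while preserving the chat history.
func askFollowUp(ctx context.Context, chatModel model.ChatModel, template *prompt.PromptTemplate,
    history []*schema.Message, question string) ([]*schema.Message, error) {
    messages, err := template.Format(ctx, map[string]interface{}{
        "role":         "Programmer Encourager",
        "style":        "positive, warm, and professional",
        "question":     question,
        "chat_history": history,
    })
    if err != nil {
        return nil, err
    }
    result, err := chatModel.Generate(ctx, messages)
    if err != nil {
        return nil, err
    }
    fmt.Println(result.Content)
    // Record both sides of the exchange so the next turn has full context.
    history = append(history, schema.UserMessage(question), schema.AssistantMessage(result.Content, nil))
    return history, nil
}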
Final Thoughts
We introduced the basic methods for building AI Agents with ByteDance's AI framework Eino, including:
- The core components of an Agent: ChatModel and Tool
- Ways to implement a Tool: NewTool, InferTool, implementing the Tool interface, and using officially packaged tools
- Ways to build an Agent: using Chain (with ReAct Agent and Multi Agent as further approaches)
- The Programmer Encourager (AI Agent): combining ChatModel and message templates to provide technical advice and emotional support
As the capabilities of large language models continue to improve, AI Agents will play an increasingly important role in the future. We look forward to Eino providing users with more powerful and user-friendly Agent building solutions, driving more innovations based on Agents.
If you have any questions, feel free to discuss in the comments, or you can add me on WeChat tianxingjianlrk for in-depth discussions.