How-To Guides

Practical guides for solving specific problems with gomcptest

How-to guides are problem-oriented recipes that guide you through the steps involved in addressing key problems and use cases. They are practical and goal-oriented.

These guides will help you solve specific tasks and customize gomcptest for your needs.

1 - How to Create a Custom MCP Tool

Build your own Model Context Protocol (MCP) compatible tools

This guide shows you how to create a new custom tool that’s compatible with the Model Context Protocol (MCP) in gomcptest.

Prerequisites

  • A working installation of gomcptest
  • Go programming knowledge
  • Understanding of the MCP protocol basics

Steps to create a custom tool

1. Create the tool directory structure

mkdir -p tools/YourToolName/cmd

2. Create the README.md file

Create a README.md in the tool directory with documentation:

touch tools/YourToolName/README.md

Include the following sections:

  • Tool description
  • Parameters
  • Usage notes
  • Example

3. Create the main.go file

Create a main.go file in the cmd directory:

touch tools/YourToolName/cmd/main.go

4. Implement the tool functionality

Here’s a template to start with:

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/mark3labs/mcp-go/mcp"
	"github.com/mark3labs/mcp-go/server"
)

func main() {
	// Create the MCP server with a name and version
	s := server.NewMCPServer("YourToolName", "1.0.0")

	// Declare the tool and its parameters
	tool := mcp.NewTool("YourToolName",
		mcp.WithDescription("Describe what your tool does"),
		mcp.WithString("input_param",
			mcp.Required(),
			mcp.Description("Describe this parameter"),
		),
	)

	// Register the tool handler
	s.AddTool(tool, func(ctx context.Context, request mcp.CallToolRequest) (*mcp.CallToolResult, error) {
		inputParam, ok := request.Params.Arguments["input_param"].(string)
		if !ok {
			return nil, fmt.Errorf("input_param must be a string")
		}

		// Implement your tool's logic here
		result := doSomethingWithParam(inputParam)

		return mcp.NewToolResultText(result), nil
	})

	// Serve MCP requests over stdin/stdout
	if err := server.ServeStdio(s); err != nil {
		log.Fatalf("Server error: %v", err)
	}
}

func doSomethingWithParam(input string) string {
	// Your tool's core functionality
	return "Processed: " + input
}

5. Add the tool to the Makefile

Open the Makefile in the root directory and add your tool:

YourToolName:
	go build -o bin/YourToolName tools/YourToolName/cmd/main.go

Also add it to the all target.

6. Build your tool

make YourToolName

7. Test your tool

Test the tool directly. MCP servers speak JSON-RPC 2.0 over stdin/stdout, so a minimal smoke test is to pipe in an initialize request and check that the server answers with its capabilities:

echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}' | ./bin/YourToolName

8. Use with the CLI

Add your tool to the CLI command:

./bin/cliGCP -mcpservers "./GlobTool;./GrepTool;./LS;./View;./YourToolName;./dispatch_agent;./Bash;./Replace"

Tips for effective tool development

  • Focus on a single, well-defined purpose
  • Provide clear error messages
  • Include meaningful response formatting
  • Implement proper parameter validation
  • Handle edge cases gracefully
  • Consider adding unit tests in a _test.go file
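The parameter-validation tip above can be made concrete. The sketch below is self-contained and uses a hypothetical validate helper (the names are illustrative, not part of gomcptest); it shows the kind of checks a tool handler should run before doing any work:

```go
package main

import (
	"errors"
	"fmt"
)

// validate checks the raw arguments an MCP client might send.
// Hypothetical helper: a tool would run checks like these before
// executing its core logic.
func validate(args map[string]any) (string, error) {
	v, ok := args["input_param"]
	if !ok {
		return "", errors.New("missing required parameter: input_param")
	}
	s, ok := v.(string)
	if !ok {
		return "", fmt.Errorf("input_param must be a string, got %T", v)
	}
	if s == "" {
		return "", errors.New("input_param must not be empty")
	}
	return s, nil
}

func main() {
	// A bad call produces a clear error message...
	if _, err := validate(map[string]any{"input_param": 42}); err != nil {
		fmt.Println("error:", err)
	}
	// ...and a good call passes through.
	if s, err := validate(map[string]any{"input_param": "test"}); err == nil {
		fmt.Println("ok:", s)
	}
}
```

Checks like these translate directly into table-driven cases in a _test.go file.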

2 - How to Configure the OpenAI-Compatible Server

Customize the OpenAI-compatible server for different use cases

This guide shows you how to configure and customize the OpenAI-compatible server in gomcptest for different use cases.

Prerequisites

  • A working installation of gomcptest
  • Basic familiarity with the OpenAI server from the tutorial
  • Understanding of environment variables and configuration

Environment Variables Configuration

Basic Server Configuration

The OpenAI server can be configured using the following environment variables:

# Server port (default: 8080)
export PORT=8080

# Log level: DEBUG, INFO, WARN, ERROR (default: INFO)
export LOG_LEVEL=INFO

# Directory to store images (required)
export IMAGE_DIR=/path/to/image/directory

GCP Configuration

Configure the Google Cloud Platform integration:

# GCP Project ID (required)
export GCP_PROJECT=your-gcp-project-id

# GCP Region (default: us-central1)
export GCP_REGION=us-central1

# Comma-separated list of Gemini models (default: gemini-1.5-pro,gemini-2.0-flash)
export GEMINI_MODELS=gemini-1.5-pro,gemini-2.0-flash

# Comma-separated list of Imagen models (optional)
export IMAGEN_MODELS=imagen-3.0-generate-002

Setting Up a Production Environment

For a production environment, create a proper systemd service file:

sudo nano /etc/systemd/system/gomcptest-openai.service

Add the following content:

[Unit]
Description=gomcptest OpenAI Server
After=network.target

[Service]
User=yourusername
WorkingDirectory=/path/to/gomcptest/host/openaiserver
ExecStart=/path/to/gomcptest/host/openaiserver/openaiserver -mcpservers "/path/to/gomcptest/bin/GlobTool;/path/to/gomcptest/bin/GrepTool;/path/to/gomcptest/bin/LS;/path/to/gomcptest/bin/View;/path/to/gomcptest/bin/Bash;/path/to/gomcptest/bin/Replace"
Environment=PORT=8080
Environment=LOG_LEVEL=INFO
Environment=IMAGE_DIR=/path/to/image/directory
Environment=GCP_PROJECT=your-gcp-project-id
Environment=GCP_REGION=us-central1
Environment=GEMINI_MODELS=gemini-1.5-pro,gemini-2.0-flash
Restart=on-failure

[Install]
WantedBy=multi-user.target

Then enable and start the service:

sudo systemctl enable gomcptest-openai
sudo systemctl start gomcptest-openai

Configuring MCP Tools

Adding Custom Tools

To add custom MCP tools to the server, include them in the -mcpservers parameter when starting the server:

go run . -mcpservers "../bin/GlobTool;../bin/GrepTool;../bin/LS;../bin/View;../bin/YourCustomTool;../bin/Bash;../bin/Replace"

Tool Parameters and Arguments

Some tools require additional parameters. You can specify these after the tool path:

go run . -mcpservers "../bin/GlobTool;../bin/dispatch_agent -glob-path ../bin/GlobTool -grep-path ../bin/GrepTool -ls-path ../bin/LS -view-path ../bin/View"

API Usage Configuration

Enabling CORS

For web applications, you may need to enable CORS. Add a middleware to the main.go file:

package main

import (
    "net/http"
    // other imports
)

// CORS middleware
func corsMiddleware(next http.Handler) http.Handler {
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        w.Header().Set("Access-Control-Allow-Origin", "*")
        w.Header().Set("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
        w.Header().Set("Access-Control-Allow-Headers", "Content-Type, Authorization")
        
        if r.Method == "OPTIONS" {
            w.WriteHeader(http.StatusOK)
            return
        }
        
        next.ServeHTTP(w, r)
    })
}

func main() {
    // existing code...
    
    http.Handle("/", corsMiddleware(openAIHandler))
    
    // existing code...
}

Setting Rate Limits

Add a simple rate limiting middleware:

package main

import (
    "net/http"
    "sync"
    "time"
    // other imports
)

type RateLimiter struct {
    requests     map[string][]time.Time
    maxRequests  int
    timeWindow   time.Duration
    mu           sync.Mutex
}

func NewRateLimiter(maxRequests int, timeWindow time.Duration) *RateLimiter {
    return &RateLimiter{
        requests:    make(map[string][]time.Time),
        maxRequests: maxRequests,
        timeWindow:  timeWindow,
    }
}

func (rl *RateLimiter) Middleware(next http.Handler) http.Handler {
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        ip := r.RemoteAddr
        
        rl.mu.Lock()
        
        // Clean up old requests
        now := time.Now()
        if reqs, exists := rl.requests[ip]; exists {
            var validReqs []time.Time
            for _, req := range reqs {
                if now.Sub(req) <= rl.timeWindow {
                    validReqs = append(validReqs, req)
                }
            }
            rl.requests[ip] = validReqs
        }
        
        // Check if rate limit is exceeded
        if len(rl.requests[ip]) >= rl.maxRequests {
            rl.mu.Unlock()
            http.Error(w, "Rate limit exceeded", http.StatusTooManyRequests)
            return
        }
        
        // Add current request
        rl.requests[ip] = append(rl.requests[ip], now)
        rl.mu.Unlock()
        
        next.ServeHTTP(w, r)
    })
}

func main() {
    // existing code...
    
    rateLimiter := NewRateLimiter(10, time.Minute) // 10 requests per minute
    http.Handle("/", rateLimiter.Middleware(corsMiddleware(openAIHandler)))
    
    // existing code...
}

Performance Tuning

Adjusting Memory Usage

For high-load scenarios, tune Go's garbage collector via the GOGC environment variable:

export GOGC=200  # Default is 100; higher values trade memory for less frequent GC, lower values the opposite

Increasing Concurrency

If handling many concurrent requests, adjust the server’s concurrency limits:

package main

import (
    "net/http"
    "strconv"
    "time"
    // other imports
)

func main() {
    // existing code...
    
    server := &http.Server{
        Addr:           ":" + strconv.Itoa(cfg.Port),
        Handler:        openAIHandler,
        ReadTimeout:    30 * time.Second,
        WriteTimeout:   120 * time.Second,
        IdleTimeout:    120 * time.Second,
        MaxHeaderBytes: 1 << 20, // 1 MiB
    }
    
    err := server.ListenAndServe()
    
    // existing code...
}

Troubleshooting Common Issues

Debugging Connection Problems

If you’re experiencing connection issues, set the log level to DEBUG:

export LOG_LEVEL=DEBUG

Common Error Messages

  • Failed to create MCP client: Ensure the tool path is correct and the tool is executable
  • Failed to load GCP config: Check your GCP environment variables
  • Error in LLM request: Verify your GCP credentials and project access

Checking Tool Registration

To verify tools are registered correctly, look for log messages like:

INFO server0 Registering command=../bin/GlobTool
INFO server1 Registering command=../bin/GrepTool

3 - How to Configure the cliGCP Command Line Interface

Customize the cliGCP tool with environment variables and command-line options

This guide shows you how to configure and customize the cliGCP command line interface for various use cases.

Prerequisites

  • A working installation of gomcptest
  • Basic familiarity with the cliGCP tool from the tutorial
  • Understanding of environment variables and configuration

Command Line Arguments

The cliGCP tool accepts the following command line arguments:

# Specify the MCP servers to use (required)
-mcpservers "tool1;tool2;tool3"

# Example with tool arguments
./cliGCP -mcpservers "./GlobTool;./GrepTool;./dispatch_agent -glob-path ./GlobTool -grep-path ./GrepTool -ls-path ./LS -view-path ./View;./Bash"

Environment Variables Configuration

GCP Configuration

Configure the Google Cloud Platform integration with these environment variables:

# GCP Project ID (required)
export GCP_PROJECT=your-gcp-project-id

# GCP Region (default: us-central1)
export GCP_REGION=us-central1

# Comma-separated list of Gemini models (required)
export GEMINI_MODELS=gemini-1.5-pro,gemini-2.0-flash

# Directory to store images (required for image generation)
export IMAGE_DIR=/path/to/image/directory

Advanced Configuration

You can customize the behavior of the cliGCP tool with these additional environment variables:

# Set a custom system instruction for the model
export SYSTEM_INSTRUCTION="You are a helpful assistant specialized in Go programming."

# Adjust the model's temperature (0.0-1.0, default is 0.2)
# Lower values make output more deterministic, higher values more creative
export MODEL_TEMPERATURE=0.3

# Set a maximum token limit for responses
export MAX_OUTPUT_TOKENS=2048

Creating Shell Aliases

To simplify usage, create shell aliases in your .bashrc or .zshrc:

# Add to ~/.bashrc or ~/.zshrc
alias gpt='cd /path/to/gomcptest/bin && ./cliGCP -mcpservers "./GlobTool;./GrepTool;./LS;./View;./Bash;./Replace"'

# Create specialized aliases for different tasks
alias code-assistant='cd /path/to/gomcptest/bin && GCP_PROJECT=your-project GEMINI_MODELS=gemini-2.0-flash ./cliGCP -mcpservers "./GlobTool;./GrepTool;./LS;./View;./Bash;./Replace"'

alias security-scanner='cd /path/to/gomcptest/bin && SYSTEM_INSTRUCTION="You are a security expert focused on finding vulnerabilities in code" ./cliGCP -mcpservers "./GlobTool;./GrepTool;./LS;./View;./Bash"'

Customizing the System Instruction

To modify the default system instruction, edit the agent.go file:

// In host/cliGCP/cmd/agent.go
genaimodels[model].SystemInstruction = &genai.Content{
    Role: "user",
    Parts: []genai.Part{
        genai.Text("You are a helpful agent with access to tools. " +
            "Your job is to help the user by performing tasks using these tools. " +
            "You should not make up information. " +
            "If you don't know something, say so and explain what you would need to know to help. " +
            "If no directory is indicated, use the current working directory, which is " + cwd),
    },
}

Creating Task-Specific Configurations

For different use cases, you can create specialized configuration scripts:

Code Review Helper

Create a file called code-reviewer.sh:

#!/bin/bash

export GCP_PROJECT=your-gcp-project-id
export GCP_REGION=us-central1
export GEMINI_MODELS=gemini-2.0-flash
export IMAGE_DIR=/tmp/images
export SYSTEM_INSTRUCTION="You are a code review expert. Analyze code for bugs, security issues, and areas for improvement. Focus on providing constructive feedback and detailed explanations."

cd /path/to/gomcptest/bin
./cliGCP -mcpservers "./GlobTool;./GrepTool;./LS;./View;./Bash"

Make it executable:

chmod +x code-reviewer.sh

Documentation Generator

Create a file called doc-generator.sh:

#!/bin/bash

export GCP_PROJECT=your-gcp-project-id
export GCP_REGION=us-central1
export GEMINI_MODELS=gemini-2.0-flash
export IMAGE_DIR=/tmp/images
export SYSTEM_INSTRUCTION="You are a documentation specialist. Your task is to help create clear, comprehensive documentation for code. Analyze code structure and create appropriate documentation following best practices."

cd /path/to/gomcptest/bin
./cliGCP -mcpservers "./GlobTool;./GrepTool;./LS;./View;./Bash;./Replace"

Advanced Tool Configurations

Configuring dispatch_agent

When using the dispatch_agent tool, you can configure its behavior with additional arguments:

./cliGCP -mcpservers "./GlobTool;./GrepTool;./LS;./View;./dispatch_agent -glob-path ./GlobTool -grep-path ./GrepTool -ls-path ./LS -view-path ./View -timeout 30s;./Bash;./Replace"

Creating Tool Combinations

You can create specialized tool combinations for different tasks:

# Web development toolset
./cliGCP -mcpservers "./GlobTool -include '*.{html,css,js}';./GrepTool;./LS;./View;./Bash;./Replace"

# Go development toolset
./cliGCP -mcpservers "./GlobTool -include '*.go';./GrepTool;./LS;./View;./Bash;./Replace"

Troubleshooting Common Issues

Model Connection Issues

If you’re having trouble connecting to the Gemini model:

  1. Verify your GCP credentials:

     gcloud auth application-default print-access-token

  2. Check that the Vertex AI API is enabled:

     gcloud services list --enabled | grep aiplatform

  3. Verify your project has access to the models you’re requesting

Tool Execution Failures

If tools are failing to execute:

  1. Ensure the tool paths are correct
  2. Verify the tools are executable
  3. Check for permission issues in the directories you’re accessing

Performance Optimization

For better performance:

  1. Use more specific tool patterns to reduce search scope
  2. Consider creating specialized agents for different tasks
  3. Set a lower temperature for more deterministic responses

4 - How to Use the OpenAI Server with big-AGI

Configure the gomcptest OpenAI-compatible server as a backend for big-AGI

This guide shows you how to set up and configure the gomcptest OpenAI-compatible server to work with big-AGI, a popular open-source web client for AI assistants.

Prerequisites

  • A working installation of gomcptest with the OpenAI-compatible server configured
  • Node.js and npm installed to run big-AGI
  • Basic familiarity with the OpenAI server from the tutorial

Why Use big-AGI with gomcptest?

big-AGI provides a polished, feature-rich web interface for interacting with AI models. By connecting it to the gomcptest OpenAI-compatible server, you get:

  • A professional web interface for your AI interactions
  • Support for tools/function calling
  • Conversation history management
  • Persona management
  • Image generation capabilities
  • Multiple user support

Setting Up big-AGI

  1. Clone the big-AGI repository:

    git clone https://github.com/enricoros/big-agi.git
    cd big-agi
    
  2. Install dependencies:

    npm install
    
  3. Create a .env.local file for configuration:

    cp .env.example .env.local
    
  4. Edit the .env.local file to configure your gomcptest server connection:

    # big-AGI configuration
    
    # Your gomcptest OpenAI-compatible server URL
    OPENAI_API_HOST=http://localhost:8080
    
    # This can be any string since the gomcptest server doesn't use API keys
    OPENAI_API_KEY=gomcptest-local-server
    
    # Set this to true to enable the custom server
    OPENAI_API_ENABLE_CUSTOM_PROVIDER=true
    
  5. Start big-AGI:

    npm run dev
    
  6. Open your browser and navigate to http://localhost:3000 to access the big-AGI interface.

Configuring big-AGI to Use Your Models

The gomcptest OpenAI-compatible server exposes Google Cloud models through an OpenAI-compatible API. In big-AGI, you’ll need to configure the models:

  1. Open big-AGI in your browser
  2. Click on the Settings icon (gear) in the top right
  3. Go to the Models tab
  4. Under “OpenAI Models”:
    • Click “Add Models”
    • Add your models by ID (e.g., gemini-1.5-pro, gemini-2.0-flash)
    • Set context length appropriately (8K-32K depending on the model)
    • Set function calling capability to true for models that support it

Enabling Function Calling with Tools

To use the MCP tools through big-AGI’s function calling interface:

  1. In big-AGI, click on the Settings icon
  2. Go to the Advanced tab
  3. Enable “Function Calling” under the “Experimental Features” section
  4. In a new chat, click on the “Functions” tab (plugin icon) in the chat interface
  5. The available tools from your gomcptest server should be listed

Configuring CORS for big-AGI

If you’re running big-AGI on a different domain or port than your gomcptest server, you’ll need to enable CORS on the server side. Edit the OpenAI server configuration:

  1. Create or edit a CORS middleware for the OpenAI server:

    // CORS middleware with specific origin allowance
    func corsMiddleware(next http.Handler) http.Handler {
        return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            // Allow requests from big-AGI origin
            w.Header().Set("Access-Control-Allow-Origin", "http://localhost:3000")
            w.Header().Set("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
            w.Header().Set("Access-Control-Allow-Headers", "Content-Type, Authorization")
    
            if r.Method == "OPTIONS" {
                w.WriteHeader(http.StatusOK)
                return
            }
    
            next.ServeHTTP(w, r)
        })
    }
    
  2. Apply this middleware to your server routes

Troubleshooting Common Issues

Model Not Found

If big-AGI reports that models cannot be found:

  1. Verify your gomcptest server is running and accessible
  2. Check the server logs to ensure models are properly registered
  3. Make sure the model IDs in big-AGI match exactly the ones provided by your gomcptest server

Function Calling Not Working

If tools aren’t working properly:

  1. Ensure the tools are properly registered in your gomcptest server
  2. Check that function calling is enabled in big-AGI settings
  3. Verify the model you’re using supports function calling

Connection Issues

If big-AGI can’t connect to your server:

  1. Verify the OPENAI_API_HOST value in your .env.local file
  2. Check for CORS issues in your browser’s developer console
  3. Ensure your server is running and accessible from the browser

Production Deployment

For production use, consider:

  1. Securing your API:

    • Add proper authentication to your gomcptest OpenAI server
    • Update the OPENAI_API_KEY in big-AGI accordingly
  2. Deploying big-AGI:

    • Build a production bundle with npm run build and host it following the big-AGI project’s own deployment documentation

  3. Setting up HTTPS:

    • For production, both big-AGI and your gomcptest server should use HTTPS
    • Consider using a reverse proxy like Nginx with Let’s Encrypt certificates

Example: Basic Chat Interface

Once everything is set up, you can use big-AGI’s interface to interact with your AI models:

  1. Start a new chat
  2. Select your model from the model dropdown (e.g., gemini-1.5-pro)
  3. Enable function calling if you want to use tools
  4. Begin chatting with your AI assistant, powered by gomcptest

The big-AGI interface provides a much richer experience than a command-line interface, with features like conversation history, markdown rendering, code highlighting, and more.