How to Use the OpenAI Server with big-AGI
This guide shows you how to set up and configure the gomcptest OpenAI-compatible server to work with big-AGI, a popular open-source web client for AI assistants.
Prerequisites
- A working installation of gomcptest
- The OpenAI-compatible server running (see the OpenAI Server tutorial)
- Node.js (version 18.17.0 or newer)
- Git
Why Use big-AGI with gomcptest?
big-AGI provides a polished, feature-rich web interface for interacting with AI models. By connecting it to the gomcptest OpenAI-compatible server, you get:
- A professional web interface for your AI interactions
- Support for tools/function calling
- Conversation history management
- Persona management
- Image generation capabilities
- Multiple user support
Setting Up big-AGI
Clone the big-AGI repository:
```shell
git clone https://github.com/enricoros/big-agi.git
cd big-agi
```
Install dependencies:

```shell
npm install
```
Create a `.env.local` file for configuration:

```shell
cp .env.example .env.local
```
Edit the `.env.local` file to configure your gomcptest server connection:

```shell
# big-AGI configuration

# Your gomcptest OpenAI-compatible server URL
OPENAI_API_HOST=http://localhost:8080

# This can be any string since the gomcptest server doesn't use API keys
OPENAI_API_KEY=gomcptest-local-server

# Set this to true to enable the custom server
OPENAI_API_ENABLE_CUSTOM_PROVIDER=true
```
Start big-AGI:

```shell
npm run dev
```
Open your browser and navigate to `http://localhost:3000` to access the big-AGI interface.
Configuring big-AGI to Use Your Models
The gomcptest OpenAI-compatible server exposes Google Cloud models through an OpenAI-compatible API. In big-AGI, you’ll need to configure the models:
- Open big-AGI in your browser
- Click on the Settings icon (gear) in the top right
- Go to the Models tab
- Under “OpenAI Models”:
- Click “Add Models”
- Add your models by ID (e.g., `gemini-1.5-pro`, `gemini-2.0-flash`)
- Set the context length appropriately (8K-32K, depending on the model)
- Set function calling capability to `true` for models that support it
Enabling Function Calling with Tools
To use the MCP tools through big-AGI’s function calling interface:
- In big-AGI, click on the Settings icon
- Go to the Advanced tab
- Enable “Function Calling” under the “Experimental Features” section
- In a new chat, click on the “Functions” tab (plugin icon) in the chat interface
- The available tools from your gomcptest server should be listed
Configuring CORS for big-AGI
If you’re running big-AGI on a different domain or port than your gomcptest server, you’ll need to enable CORS on the server side. Create or edit a CORS middleware for the OpenAI server:
```go
// CORS middleware with specific origin allowance
func corsMiddleware(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// Allow requests from the big-AGI origin
		w.Header().Set("Access-Control-Allow-Origin", "http://localhost:3000")
		w.Header().Set("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
		w.Header().Set("Access-Control-Allow-Headers", "Content-Type, Authorization")

		// Answer preflight requests directly
		if r.Method == "OPTIONS" {
			w.WriteHeader(http.StatusOK)
			return
		}

		next.ServeHTTP(w, r)
	})
}
```
Apply this middleware to your server routes
Troubleshooting Common Issues
Model Not Found
If big-AGI reports that models cannot be found:
- Verify your gomcptest server is running and accessible
- Check the server logs to ensure models are properly registered
- Make sure the model IDs in big-AGI match exactly the ones provided by your gomcptest server
Function Calling Not Working
If tools aren’t working properly:
- Ensure the tools are properly registered in your gomcptest server
- Check that function calling is enabled in big-AGI settings
- Verify the model you’re using supports function calling
Connection Issues
If big-AGI can’t connect to your server:
- Verify the `OPENAI_API_HOST` value in your `.env.local` file
- Check for CORS issues in your browser’s developer console
- Ensure your server is running and accessible from the browser
Production Deployment
For production use, consider:
Securing your API:
- Add proper authentication to your gomcptest OpenAI server
- Update the `OPENAI_API_KEY` in big-AGI accordingly
Deploying big-AGI:
- Follow the big-AGI deployment guide
- Configure the environment variables to point to your production gomcptest server
Setting up HTTPS:
- For production, both big-AGI and your gomcptest server should use HTTPS
- Consider using a reverse proxy like Nginx with Let’s Encrypt certificates
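A reverse-proxy sketch for Nginx, assuming big-AGI on port 3000 and the gomcptest server on port 8080 on the same host; the domain name and certificate paths are placeholders you would replace with your own:

```nginx
# TLS termination for big-AGI and the gomcptest OpenAI-compatible server.
# your-domain.example and the certificate paths are placeholders.
server {
    listen 443 ssl;
    server_name your-domain.example;

    ssl_certificate     /etc/letsencrypt/live/your-domain.example/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/your-domain.example/privkey.pem;

    # big-AGI frontend
    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
    }

    # gomcptest OpenAI-compatible API
    location /v1/ {
        proxy_pass http://localhost:8080;
        proxy_set_header Host $host;
    }
}
```

Serving both under one origin this way also sidesteps the CORS configuration described earlier.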
Example: Basic Chat Interface
Once everything is set up, you can use big-AGI’s interface to interact with your AI models:
- Start a new chat
- Select your model from the model dropdown (e.g., `gemini-1.5-pro`)
- Enable function calling if you want to use tools
- Begin chatting with your AI assistant, powered by gomcptest
The big-AGI interface provides a much richer experience than a command-line interface, with features like conversation history, markdown rendering, code highlighting, and more.