Update install-vertex-ai.md #34

Status: Closed (wants to merge 1 commit)

`.ruru/modes/agent-mcp-manager/kb/install-vertex-ai.md`: 25 changes (14 additions, 11 deletions)

````diff
@@ -36,15 +36,15 @@ All configuration for the `vertex-ai-mcp-server` is now managed centrally within
 
 Locate or add the `vertex-ai-mcp-server` entry within the `servers` object in `.roo/mcp.json`. Configure it as follows, ensuring you replace placeholder values with your actual Google Cloud Platform (GCP) project details and credentials path.
 
 **IMPORTANT:** The following JSON structure is **mandatory**. You **MUST** ensure the configuration entry for `vertex-ai-mcp-server` in `.roo/mcp.json` exactly matches this structure, including all specified keys within the `env` object. If an entry already exists, update it to conform to this example, replacing only the placeholder values (like `YOUR_PROJECT_ID`, `YOUR_GCP_REGION`, `/path/to/your/gcp_credentials.json`) with your actual details.
 
 ```json
 // Example within .roo/mcp.json
 {
-  "servers": {
+  "mcpServers": {
     "vertex-ai-mcp-server": {
       "command": "node",
       "args": [
         // Ensure this path correctly points to the executable within the installed package
         "node_modules/vertex-ai-mcp-server/build/index.js"
       ],
       "env": {
         // --- Required GCP/Vertex Config ---
@@ -57,19 +57,19 @@ Locate or add the `vertex-ai-mcp-server` entry within the `servers` object in `.
 
         // --- Vertex AI Model Config ---
         // Specify the desired Vertex AI model ID
-        "VERTEX_AI_MODEL_ID": "gemini-2.5-pro-exp-03-25", // Or your preferred model like gemini-1.5-pro-preview-0409 etc.
+        "VERTEX_MODEL_ID": "gemini-2.5-pro-exp-03-25", // Or your preferred model like gemini-1.5-pro-preview-0409 etc.
         // Controls randomness (0.0 = deterministic)
-        "VERTEX_AI_TEMPERATURE": "0.0",
+        "AI_TEMPERATURE": "0.0",
         // Enable/disable streaming responses
-        "VERTEX_AI_USE_STREAMING": "true",
+        "AI_USE_STREAMING": "true",
         // Maximum tokens for the model's response
-        "VERTEX_AI_MAX_OUTPUT_TOKENS": "65535", // Adjust based on model limits/needs
+        "AI_MAX_OUTPUT_TOKENS": "65535", // Adjust based on model limits/needs
 
         // --- Optional Retry Config ---
         // Number of times to retry failed API calls
-        "VERTEX_AI_MAX_RETRIES": "3",
+        "AI_MAX_RETRIES": "3",
         // Initial delay between retries in milliseconds
-        "VERTEX_AI_RETRY_DELAY_MS": "1000"
+        "AI_RETRY_DELAY_MS": "1000"
       },
       // Set to true to temporarily disable this server without removing the config
       "disabled": false,
@@ -115,6 +115,9 @@ Locate or add the `vertex-ai-mcp-server` entry within the `servers` object in `.
 * **`alwaysAllow`**: Carefully consider which tools should bypass per-call user confirmation.
 * **`timeout`**: Adjust if you expect very long-running tool operations.
 
-## 3. Restart Roo Commander
+## 3. Activating
 
+To activate the server:
+
-After modifying `.roo/mcp.json`, restart Roo Commander to ensure it picks up the new configuration and attempts to connect to the Vertex AI MCP server. Check the Roo Commander logs or MCP status indicators for successful connection.
+1. Ensure the configuration in `.roo/mcp.json` is correct and `disabled` is set to `false`.
+2. **Restart or Reload Roo Commander:** If Roo Commander is already running, you may need to restart it or trigger a configuration reload (if available) for it to pick up the changes in `.roo/mcp.json` and launch the server process.
````
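
For reference, below is a rough sketch of how the full `vertex-ai-mcp-server` entry could look once this change is applied. The model and retry key names are taken from the additions above; the required GCP keys (`GOOGLE_CLOUD_PROJECT`, `GOOGLE_CLOUD_LOCATION`, `GOOGLE_APPLICATION_CREDENTIALS`) sit in a collapsed part of the diff and are assumptions here, as are the example values for `alwaysAllow` and `timeout`.

```json
// Sketch of the post-merge entry in .roo/mcp.json (illustrative, not the exact file contents)
{
  "mcpServers": {
    "vertex-ai-mcp-server": {
      "command": "node",
      "args": [
        "node_modules/vertex-ai-mcp-server/build/index.js"
      ],
      "env": {
        // Required GCP settings (key names assumed; this block is collapsed in the diff above)
        "GOOGLE_CLOUD_PROJECT": "YOUR_PROJECT_ID",
        "GOOGLE_CLOUD_LOCATION": "YOUR_GCP_REGION",
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/your/gcp_credentials.json",

        // Model settings, using the key names introduced by this PR
        "VERTEX_MODEL_ID": "gemini-2.5-pro-exp-03-25",
        "AI_TEMPERATURE": "0.0",
        "AI_USE_STREAMING": "true",
        "AI_MAX_OUTPUT_TOKENS": "65535",

        // Retry settings, also renamed by this PR
        "AI_MAX_RETRIES": "3",
        "AI_RETRY_DELAY_MS": "1000"
      },
      // Optional keys mentioned in the notes above; values are placeholders
      "disabled": false,
      "alwaysAllow": [], // list tool names here to skip per-call confirmation
      "timeout": 3600 // illustrative value; units depend on Roo Commander's MCP settings
    }
  }
}
```

Only the key renames and the reworked activation section are visible in the hunks above, so anything else in this sketch (placeholders, optional keys, exact indentation) should be checked against the actual file before relying on it.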