Gemini + Manglai

Option 1: Cursor with Gemini model

If you use Cursor with a Gemini model, configure the Manglai MCP server in .cursor/mcp.json:
{
  "mcpServers": {
    "manglai": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.manglai.io/sse",
        "--header",
        "Authorization:Bearer ${MANGLAI_TOKEN}"
      ],
      "env": {
        "MANGLAI_TOKEN": "your_token_here"
      }
    }
  }
}
Cursor handles the MCP calls itself, so this configuration works regardless of the underlying model.

Option 2: Gemini API (programmatic)

If you use the Gemini API directly, you can integrate Manglai by building a backend that combines both APIs:
import { GoogleGenerativeAI } from '@google/generative-ai';

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY);
const model = genAI.getGenerativeModel({ model: 'gemini-1.5-pro' });

// When the user asks about emissions, fetch the data from the Manglai REST API.
// (Node 18+ provides a global fetch.)
const manglaiResponse = await fetch(
  'https://www.manglai.io/api/v1/emissions/dashboard?companyId=UUID&startDate=2024-01-01&endDate=2024-12-31',
  { headers: { Authorization: `Bearer ${process.env.MANGLAI_TOKEN}` } }
);
if (!manglaiResponse.ok) {
  throw new Error(`Manglai API error: ${manglaiResponse.status}`);
}
const data = await manglaiResponse.json();

// Pass the emissions data to Gemini for analysis.
const result = await model.generateContent(
  `Manglai emissions data: ${JSON.stringify(data)}. Analyze and summarize the key insights.`
);
console.log(result.response.text());
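Instead of fetching Manglai data on every request, you can let Gemini decide when it needs the data via function calling. A minimal sketch is shown below; the getEmissionsDashboard function name and its parameter schema are illustrative choices, not part of the Manglai API:

```javascript
// Hypothetical tool declaration for Gemini function calling. The function
// name and parameter schema are illustrative -- adapt them to your needs.
const manglaiTool = {
  functionDeclarations: [
    {
      name: 'getEmissionsDashboard',
      description:
        'Fetch emissions dashboard data from Manglai for a company and date range',
      parameters: {
        type: 'object',
        properties: {
          companyId: { type: 'string', description: 'Manglai company UUID' },
          startDate: { type: 'string', description: 'ISO date, e.g. 2024-01-01' },
          endDate: { type: 'string', description: 'ISO date, e.g. 2024-12-31' },
        },
        required: ['companyId', 'startDate', 'endDate'],
      },
    },
  ],
};

// Build the Manglai dashboard URL from the model's function-call arguments.
function buildDashboardUrl({ companyId, startDate, endDate }) {
  const params = new URLSearchParams({ companyId, startDate, endDate });
  return `https://www.manglai.io/api/v1/emissions/dashboard?${params}`;
}
```

Pass the tool when creating the model (genAI.getGenerativeModel({ model: 'gemini-1.5-pro', tools: [manglaiTool] })), then inspect result.response.functionCalls() and answer each call with the JSON you fetch from buildDashboardUrl using your bearer token.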

Option 3: Vertex AI

In enterprise environments with Vertex AI, you can use the Manglai REST API as an external data source for grounding or function calling.
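As a sketch of what function calling could look like on Vertex AI, the same kind of tool declaration can be included in the generateContent request body. The endpoint path below follows the Vertex AI REST pattern, but the project, region, and model ID are placeholders, and the getEmissionsDashboard function is an illustrative name, not part of the Manglai API:

```javascript
// Hypothetical Vertex AI generateContent request that registers the Manglai
// dashboard endpoint as a callable function. Project, region, and model ID
// are placeholders -- adjust for your environment.
function buildVertexRequest({ project, region, prompt }) {
  return {
    url:
      `https://${region}-aiplatform.googleapis.com/v1/projects/${project}` +
      `/locations/${region}/publishers/google/models/gemini-1.5-pro:generateContent`,
    body: {
      contents: [{ role: 'user', parts: [{ text: prompt }] }],
      tools: [
        {
          functionDeclarations: [
            {
              name: 'getEmissionsDashboard', // illustrative name
              description: 'Fetch emissions data from the Manglai REST API',
              parameters: {
                type: 'object',
                properties: {
                  companyId: { type: 'string' },
                  startDate: { type: 'string' },
                  endDate: { type: 'string' },
                },
                required: ['companyId', 'startDate', 'endDate'],
              },
            },
          ],
        },
      ],
    },
  };
}
```

When the model responds with a functionCall part, your backend executes the Manglai request with its bearer token and returns the JSON in a follow-up functionResponse part.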

Generate token

Get your Manglai token