Integrating MCP Server in a custom application or chatbot

If you are building a chatbot client with your own agent and orchestration logic, you can use the MCP Server to call MCP tools behind a custom web experience and integrate it with other systems or services as needed.

When integrated, the agent in your custom application can:

  • Automatically discover ThoughtSpot MCP tools.

  • Support natural language conversation sessions for data questions.

  • Generate embeddable visualizations and programmatically create a Liveboard.

Before you begin

Before you begin, review the following prerequisites:

  • Node.js version 22 or later is installed and available in your environment.

  • Ensure that your setup has access to a ThoughtSpot instance running release 10.11.0.cl or later. For MCP Server with Spotter 3 capabilities, your ThoughtSpot instance must be on release 26.2.0.cl or later.

  • Ensure that users have the necessary permissions to view data from the relevant models and tables in ThoughtSpot. Existing RLS/CLS rules on tables are enforced automatically in data source responses. To create charts or Liveboards from a conversation session, users also need data download and content creation privileges.

Authenticating users

If your own application or backend service manages user identities, and you want to implement a seamless authentication experience without redirecting users to an external OAuth flow from the chatbot host, use the trusted authentication method.

Trusted authentication flow

In a typical trusted authentication flow, your backend service calls the /api/rest/2.0/auth/token/full REST API endpoint to obtain a full access token (TS_AUTH_TOKEN) for a ThoughtSpot user or service account.

The token generated for the user session is used as a bearer token when your backend calls ThoughtSpot APIs or when it brokers MCP tool calls.
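The token request can be sketched as follows. The endpoint path comes from the section above; the host, username, and secret key values are placeholders, and the request field names (`secret_key`, `validity_time_in_sec`) follow the ThoughtSpot REST API v2 token request, so verify them against your instance's API documentation:

```javascript
// Sketch: build the trusted-authentication token request for a user.
// Run this from your backend service only; never expose the secret key
// to the browser or chatbot client.
function buildTokenRequest(tsHost, username, secretKey) {
  return {
    url: `https://${tsHost}/api/rest/2.0/auth/token/full`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "Accept": "application/json",
      },
      body: JSON.stringify({
        username,                   // the ThoughtSpot user to authenticate
        secret_key: secretKey,      // trusted-auth secret from Developer settings
        validity_time_in_sec: 300,  // token lifetime in seconds
      }),
    },
  };
}

// Usage (network call commented out; placeholder host, user, and key):
const { url, options } = buildTokenRequest(
  "my-instance.thoughtspot.cloud", "analyst@example.com", "SECRET_KEY");
// const res = await fetch(url, options);
// const { token } = await res.json();  // use as the TS_AUTH_TOKEN bearer token
```

Keep the token lifetime short and request a fresh token per user session rather than reusing one long-lived token across users.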

Connecting clients

If your custom chatbot implementation uses Claude, OpenAI, or Gemini LLM APIs to call MCP tools, ensure that your MCP Server endpoint, authentication token, and ThoughtSpot host are included in the API request.

When integrating with apps using custom workflows, ThoughtSpot recommends using the MCP server URL with the date-based api-version parameter. This allows you to pin a specific version and avoid introducing unintended changes to custom workflows that rely on tool responses.

Claude MCP connector

If your application uses the Claude MCP connector, use the following API request format to connect Claude to the MCP Server:

curl https://api.anthropic.com/v1/messages \
  -H "Content-Type: application/json" \
  -H "X-API-Key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "anthropic-beta: mcp-client-2025-04-04" \
  -d '{
    "model": "claude-3-5-sonnet-latest",
    "max_tokens": 1000,
    "messages": [{
      "role": "user",
      "content": "How do I increase my sales?"
    }],
    "mcp_servers": [
      {
        "type": "url",
        "url": "https://agent.thoughtspot.app/token/mcp?api-version={YYYY-MM-DD}",
        "name": "thoughtspot",
        "authorization_token": "[email protected]"
      }
    ]
  }'

In the above example, the API call includes:

  • The user’s message.

  • ThoughtSpot’s MCP Server endpoint https://agent.thoughtspot.app/token/mcp?api-version={YYYY-MM-DD}. Replace YYYY-MM-DD with an actual date string.

  • An authorization_token that encodes which ThoughtSpot instance and user/token to use.

Claude uses the configured MCP Server to call ThoughtSpot MCP tools as needed, using the bearer token you provided.

OpenAI Responses API

If your application uses an OpenAI LLM, use the following API request format to connect OpenAI to the MCP Server:

curl https://api.openai.com/v1/responses \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-4.1",
    "tools": [
      {
        "type": "mcp",
        "server_label": "thoughtspot",
        "server_url": "https://agent.thoughtspot.app/token/mcp?api-version={YYYY-MM-DD}",
        "headers": {
          "Authorization": "Bearer TS_AUTH_TOKEN",
          "x-ts-host": "my-instance.thoughtspot.cloud"
        }
      }
    ],
    "input": "How can I increase my sales?"
  }'

In the above example, the API call includes the following parameters:

  • MCP as the tool type.

  • ThoughtSpot MCP Server URL. Replace YYYY-MM-DD in the URL with an actual date string.

  • Authentication token and ThoughtSpot host URL.

The OpenAI LLM uses the configured MCP Server, sends the provided headers on each MCP tool call, and gets the requested data from your ThoughtSpot instance under that token’s identity.

Gemini API

If your application is the MCP host and Gemini is the LLM provider, use the following code example to connect Gemini to the ThoughtSpot MCP Server.

import {
  GoogleGenAI,
  mcpToTool,
} from '@google/genai';
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const transport = new StreamableHTTPClientTransport(
  new URL("https://agent.thoughtspot.app/token/mcp?api-version={YYYY-MM-DD}"),
  {
    requestInit: {
      headers: {
        "Authorization": "Bearer TS_AUTH_TOKEN",
        "x-ts-host": "my-instance.thoughtspot.cloud"
      },
    }
  }
);

const mcpClient = new Client({
  name: "example-client",
  version: "1.0.0",
});

await mcpClient.connect(transport);

const ai = new GoogleGenAI({});

const response = await ai.models.generateContent({
  model: "gemini-2.5-flash",
  contents: `Show me last quarter's sales by region`,
  config: {
    tools: [mcpToTool(mcpClient)],
  },
});

console.log(response.text);
await mcpClient.close();

The above example:

  • Creates an MCP client and connects it to the ThoughtSpot MCP Server using StreamableHTTPClientTransport. Ensure that you replace YYYY-MM-DD in the URL with an actual date string.

  • Sends the required headers with the authentication token and ThoughtSpot host URL in MCP requests.

  • Wraps the MCP client as a tool and passes it into GoogleGenAI so Gemini can call ThoughtSpot tools as part of answering a user’s query.

Verifying the integration

To verify the integration:

  1. Start a chat session by asking a question and verify whether your chatbot’s LLM is calling the ThoughtSpot MCP tools to generate a response.

  2. Verify the tool calls. For information about tool calls and responses, see MCP tool reference guide.

  3. Verify the Liveboard creation workflow and check whether a Liveboard is created in ThoughtSpot.

  4. Verify whether the metadata in the output includes the URL to embed a visualization in an iframe or HTML snippet.
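To check connectivity independently of an LLM, you can send a JSON-RPC tools/list request directly to the MCP endpoint. The helper below is a hypothetical sketch that only builds the request; the header names mirror the OpenAI and Gemini examples above, the api-version value is a placeholder, and a Streamable HTTP server may require an initialize handshake before tools/list succeeds, so treat this as a reachability check:

```javascript
// Sketch: build a JSON-RPC request to list the MCP Server's tools.
// buildToolsListRequest is a hypothetical helper for verification only.
function buildToolsListRequest(apiVersion, tsAuthToken, tsHost) {
  return {
    url: `https://agent.thoughtspot.app/token/mcp?api-version=${apiVersion}`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Streamable HTTP servers may answer with JSON or an SSE stream.
        "Accept": "application/json, text/event-stream",
        "Authorization": `Bearer ${tsAuthToken}`,
        "x-ts-host": tsHost,
      },
      body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "tools/list" }),
    },
  };
}

// Usage (network call commented out; placeholder date, token, and host):
const { url, options } = buildToolsListRequest(
  "2025-01-01", "TS_AUTH_TOKEN", "my-instance.thoughtspot.cloud");
// const res = await fetch(url, options);
// A 2xx status indicates the endpoint is reachable and the token was accepted.
```

If this check fails while the LLM integration appears configured correctly, the problem is usually in the token or host header rather than in the LLM-side configuration.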

Displaying visualizations in an iframe

Visualizations are returned as ThoughtSpot Answers that are displayed via iframes. Use the startAutoMCPFrameRenderer function to render these iframes.

An example project showing how this is used in the app.jsx file with a code sample is available in the developer-examples GitHub repository.

Troubleshooting errors

Cannot connect to MCP Server
  • Verify that the MCP Server is reachable.

  • Ensure that the correct MCP Server URL is used in API requests.

  • If the issue persists, verify the logs and contact ThoughtSpot Support for assistance.

Authentication failure
  • Ensure that the correct ThoughtSpot host URL and authentication token are included in the API requests.

  • Verify whether the token used for authorizing MCP requests has expired. If the token is invalid, generate a new token and retry the API calls.

  • Verify whether the MCP Server and ThoughtSpot host are reachable.

  • Verify whether the user has the necessary privileges to view data or create content.

MCP tools and response

For information about tool calls and responses, refer to the MCP tool reference guide.

© 2026 ThoughtSpot Inc. All Rights Reserved.