Credible’s MCP tools don’t just give your LLM access to data—they use the Intelligent Context Engine to find the right data for each query. When you ask a natural language question, our retrieval system searches your model’s metadata (documentation tags, indexed field values) and uses embedding models to find semantically relevant fields—not just exact matches. Your LLM gets suggested queries and related entities, so it can construct accurate queries without hallucinating field names or misunderstanding your data structure.
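To illustrate the idea of semantic matching, here is a minimal sketch of embedding-based retrieval: fields are ranked by vector similarity to the question, so a query about "revenue" can surface a field named `total_sales` even with no textual overlap. The field names and vectors below are made up for illustration; Credible's actual embedding models and metadata index are internal to the Intelligent Context Engine.

```python
# Toy sketch of embedding-based field retrieval (illustrative only; the
# real Intelligent Context Engine uses its own models and indexed metadata).
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical field embeddings: "total_sales" sits near revenue-related
# questions in vector space even though the strings share no characters.
field_embeddings = {
    "total_sales": [0.9, 0.1, 0.0],
    "customer_name": [0.1, 0.9, 0.1],
    "order_date": [0.0, 0.2, 0.9],
}

# Embedding of the question "what was our revenue last quarter?"
query_embedding = [0.85, 0.15, 0.05]

ranked = sorted(
    field_embeddings,
    key=lambda f: cosine(query_embedding, field_embeddings[f]),
    reverse=True,
)
print(ranked[0])  # semantically closest field, not an exact string match
```

This is why the LLM receives relevant fields rather than guessing names: ranking happens in embedding space, not by substring match.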
LLMs and MCP clients change frequently. If you encounter any issues with these integrations, please contact support.

Connecting LLMs

Use your organization’s MCP server URL: https://<your-org>.mcp.credibledata.com/mcp
  • Claude (web)
  • ChatGPT (web)
  • VSCode
  • Cursor
  • Claude Code
  • Gemini CLI
  • Windsurf
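For editor-based clients (VSCode, Cursor, Windsurf), the connection is typically declared in a JSON configuration file rather than through a UI. As a rough sketch only (the file location and key names vary by client and version, so check your client's own MCP documentation), a Cursor-style `mcp.json` entry might look like:

```json
{
  "mcpServers": {
    "credible": {
      "url": "https://<your-org>.mcp.credibledata.com/mcp"
    }
  }
}
```

The server name (`credible` here) is arbitrary; the URL is your organization's MCP server URL from above.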
Claude (web) requires a Pro or Max subscription, or a Business/Enterprise account with custom connectors enabled.
  1. Click your user icon (bottom-left)
  2. Go to Connectors
  3. Click Browse
  4. Click “add a custom one” (note: this is a link, not a button)
  5. Enter a Name and the URL: https://<your-org>.mcp.credibledata.com/mcp
  6. Leave the advanced settings at their defaults
  7. Click “Connect” on the next page
If you don’t see the option to add custom connectors on an enterprise account, contact your Claude administrator to enable this feature.

Available Tools

Once connected, your LLM has access to two tools:
  • suggestAnalysis - Searches your semantic models based on natural language and returns suggested queries plus related data entities
  • executeQuery - Executes Malloy queries against your semantic models and returns JSON results
For complete technical details on parameters and responses, see the MCP Reference.
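For a sense of what executeQuery receives, here is a hedged example of a Malloy query an LLM might construct after calling suggestAnalysis. The source (`orders`) and its fields are hypothetical; your own semantic models define the actual names:

```malloy
// Hypothetical query: `orders`, `region`, and `amount` are placeholders
// for whatever your semantic model actually defines.
run: orders -> {
  group_by: region
  aggregate: total_sales is sum(amount)
  order_by: total_sales desc
  limit: 5
}
```

executeQuery runs a query like this against your semantic models and returns the result set as JSON for the LLM to interpret.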