The get_context tool parses your input into semantic phrases and matches each phrase to data entities (dimensions, measures, views) in your semantic model, searching against the #(doc) descriptions and #(index_values) annotations declared in your model. Your LLM receives ranked entity matches plus Malloy syntax guidance, so it can construct accurate queries without hallucinating field names or misreading your data structure.
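Conceptually, the matching step works like the toy sketch below. This is not the actual implementation (the real tool's phrase segmentation and scoring are internal); it only illustrates the idea of ranking model entities against a phrase using their names, #(doc) text, and #(index_values).

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str                 # field name in the semantic model
    kind: str                 # "dimension", "measure", or "view"
    doc: str                  # text from the #(doc) annotation
    index_values: list[str] = field(default_factory=list)  # #(index_values)

def score(phrase: str, entity: Entity) -> float:
    """Toy relevance score: token overlap between the phrase and the
    entity's name, doc text, and indexed values."""
    tokens = set(phrase.lower().split())
    haystack = set(entity.name.lower().split("_"))
    haystack |= set(entity.doc.lower().split())
    haystack |= {v.lower() for v in entity.index_values}
    return len(tokens & haystack) / max(len(tokens), 1)

def match_phrases(phrases: list[str], model: list[Entity], top_k: int = 3):
    """Return the top_k entity matches for each phrase, ranked by score."""
    results = {}
    for phrase in phrases:
        ranked = sorted(model, key=lambda e: score(phrase, e), reverse=True)
        results[phrase] = [(e.name, round(score(phrase, e), 2))
                           for e in ranked[:top_k]]
    return results

# Hypothetical two-entity model for illustration only.
model = [
    Entity("carrier", "dimension", "airline carrier code", ["AA", "UA", "WN"]),
    Entity("flight_count", "measure", "total number of flights"),
]
print(match_phrases(["number of flights", "carrier"], model))
```

In the real tool the ranked matches (rather than a printed dict) are returned to the LLM alongside syntax guidance, so the model can pick the right fields before writing a query.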
LLMs and MCP clients change frequently. If you encounter any issues with these integrations, please contact support.
Connecting LLMs
Use your organization’s MCP server URL: https://<your-org>.mcp.credibledata.com/mcp

Supported clients:
- Claude (web)
- ChatGPT (web)
- VS Code Copilot
- Cursor
- Claude Code
- Gemini CLI
- Windsurf
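Setup details vary by client. Several of the clients above (for example Cursor, Windsurf, and VS Code Copilot) read MCP servers from a JSON config file; a minimal entry, assuming the common mcpServers convention (check your client's documentation for the exact file location and key names), might look like:

```json
{
  "mcpServers": {
    "credibledata": {
      "url": "https://<your-org>.mcp.credibledata.com/mcp"
    }
  }
}
```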
Claude (web)
Requires a Pro or Max subscription, or a Business/Enterprise account with custom connectors enabled.
- Click your user icon (bottom left)
- Open Connectors
- Click Browse
- Click the “add a custom one” link (note: this is a link, not a button)
- Enter a Name and the URL: https://<your-org>.mcp.credibledata.com/mcp
- Leave the advanced settings at their defaults
- Click “connect” on the next page
If you don’t see the option to add custom connectors on an enterprise account, contact your Claude administrator to enable this feature.
Available Tools
Once connected, your LLM has access to two tools:
- get_context — Parses your question into phrases, matches each phrase to data entities (dimensions, measures, views) in your semantic model, and returns ranked matches grounded in your model’s #(doc) and #(index_values) annotations
- execute_query — Executes Malloy queries against your semantic models and returns JSON results
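For context, here is a hypothetical Malloy source sketching where the #(doc) and #(index_values) annotations referenced above might live. The table, field names, and annotation placement are invented for illustration; consult your model for the exact syntax in use.

```malloy
source: flights is duckdb.table('flights.parquet') extend {
  #(doc) Two-letter airline carrier code
  #(index_values) AA, UA, WN
  dimension: carrier is carrier_code

  #(doc) Total number of flights
  measure: flight_count is count()
}
```

Given a question like “how many flights per carrier?”, get_context would surface the carrier dimension and flight_count measure, and the LLM could then issue a query such as `run: flights -> { group_by: carrier; aggregate: flight_count }` through execute_query, receiving the results as JSON.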