Building an MCP Server for Snowflake Cortex Agents
Step-by-step guide to building a Model Context Protocol (MCP) server for Cortex Agents with Snowflake, enabling AI applications like Cursor to connect to your Snowflake data and documentation.

Step-by-Step Guide: Building an MCP Server for Cortex Agents with Snowflake
This guide walks you through setting up a Model Context Protocol (MCP) server for Cortex Agents, enabling you to connect AI applications like Cursor to your data in Snowflake.
I. Setting Up Your Snowflake Environment
First, prepare your Snowflake account.
1. Sign Up for a Snowflake Trial Account:
- Go to the Snowflake website.
- Sign up for a free 30-day trial account.
- Provide your name, email, company details, and preferred region.
- Complete the sign-up process and activate your account via email.
2. Get Data via Cortex Knowledge Extension:
- Log in to your Snowflake account.
- Navigate to Data Products in the Marketplace.
- Search for "Snowflake Documentation".
- Select the extension and click Get.
- Confirm by clicking Get again. This makes your Snowflake documentation searchable by Cortex agents.
3. Verify the Cortex Search Service:
- In your Snowflake account, go to AI & ML > Cortex Search.
- Select the database "Snowflake Documentation".
- You should see a "doc service" already created with an attached search service.
4. Create a Programmatic Access Token (PAT):
- Go to Admin > Users & Roles.
- Select your user.
- Click Generate Token.
- Enter a name for the token (e.g., "MCP agents").
- Set an expiration period (e.g., 1 day).
- Ensure the token can access Any of your roles.
- Click Generate.
- Important: Copy the generated token to your clipboard. You will need it later.
- (Optional) Temporarily Bypass Network Policy:
- If you anticipate network policy restrictions, you can temporarily bypass them for this PAT.
- Click Edit next to the network policy setting for the token.
- Choose a temporary bypass duration (e.g., 8 hours).
- Click Grant Access.
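Once generated, the PAT is presented to Snowflake as a bearer token on every HTTP request the MCP server makes. A minimal, network-free sketch of the header shape (the header names here are an assumption for illustration; the repository's `cortex_agents.py` is authoritative for the exact headers it sends):

```python
# Sketch: how a PAT is typically attached to requests against
# Snowflake's REST APIs. Header names are illustrative assumptions.

def build_pat_headers(pat: str) -> dict:
    """Build HTTP headers that authenticate a request with a PAT."""
    return {
        "Authorization": f"Bearer {pat}",
        "Content-Type": "application/json",
        "Accept": "application/json",
    }

headers = build_pat_headers("my-secret-token")
print(headers["Authorization"])  # Bearer my-secret-token
```

Keep the token out of source control; the repository reads it from a `.env` file, which is configured in a later step.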
II. Building the MCP Server
Next, set up the MCP server on your local machine.
1. Install uv (Astral's fast Python package and environment manager):
- Open your terminal or command prompt.
- Run the following command:

  ```sh
  curl -LsSf https://astral.sh/uv/install.sh | sh
  ```

- Follow any on-screen prompts during the installation.
2. Clone the Repository:
   - In your terminal, run the following command to clone the necessary files:

     ```sh
     git clone https://github.com/AkhilGurrapu/snowflake-mpc-cortex-agent.git
     ```

   - Navigate into the cloned repository directory:

     ```sh
     cd snowflake-mpc-cortex-agent
     ```

3. Activate the uv Environment and Install Dependencies:
   - Create the uv virtual environment:

     ```sh
     uv venv
     ```

   - Activate the environment:
     - On macOS/Linux: `source .venv/bin/activate`
     - On Windows: `.venv\Scripts\activate`
   - Install the required Python packages (the MCP SDK and an HTTP client):

     ```sh
     uv add "mcp[cli]" httpx
     ```
4. Open the Repository in a Code Editor:
   - Open the `snowflake-mpc-cortex-agent` folder in your preferred code editor (e.g., VS Code, Sublime Text).
5. Modify the Agent Setup (Removing Cortex Analyst components):
   - In your code editor, open the `cortex_agents.py` file.
   - Locate and comment out (by adding a `#` at the beginning of the line) the line referencing `semantic_model_file`.
   - Locate and remove or comment out the sections of code related to `analyst_tool_resources`.
   - Locate and remove or comment out the sections of code related to `SQL_EXECUTION`.
   - Save the `cortex_agents.py` file.
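After these edits, the agent should reference only the Cortex Search tool. As a hypothetical sketch of what the trimmed configuration looks like (the tool type names follow the Cortex Agents REST API conventions, but treat the exact payload shape as an assumption; `cortex_agents.py` in the repository is authoritative):

```python
# Hypothetical sketch of the trimmed tool configuration: only the
# Cortex Search tool remains; Cortex Analyst / SQL execution are gone.

CORTEX_SEARCH_SERVICE = "MY_DB.MY_SCHEMA.DOC_SERVICE"  # example value

tools = [
    {"tool_spec": {"type": "cortex_search", "name": "docs_search"}},
    # Removed: the cortex_analyst_text_to_sql tool (semantic_model_file)
    # Removed: the SQL execution tool
]

tool_resources = {
    "docs_search": {"name": CORTEX_SEARCH_SERVICE, "max_results": 5},
    # Removed: analyst_tool_resources
}

tool_types = [t["tool_spec"]["type"] for t in tools]
print(tool_types)  # ['cortex_search']
```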
6. Configure Environment Variables:
   - In the `snowflake-mpc-cortex-agent` directory, find the `.env.template` file.
   - Create a copy of this file and rename the copy to `.env`.
   - Open the `.env` file in your code editor and fill in the following details:
     - `SNOWFLAKE_ACCOUNT_URL`:
       - Go back to your Snowflake account in your web browser.
       - Click on your username in the bottom left corner.
       - Select Connect to Tool.
       - Copy your account identifier (it looks like `youraccount.snowflakecomputing.com`).
       - Paste this value into the `.env` file for `SNOWFLAKE_ACCOUNT_URL`.
     - `SNOWFLAKE_PAT`: Paste the Programmatic Access Token (PAT) you copied earlier from Snowflake.
     - `CORTEX_SEARCH_SERVICE`:
       - In Snowflake, navigate to AI & ML > Cortex Search.
       - Find your doc service (related to the Snowflake documentation extension).
       - Copy the key of your doc service (it might be something like `KEY_DOC_SERVICES`).
       - Paste this value into the `.env` file in the format `<database>.<schema>.<service_name>`.
   - Save the `.env` file.
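A common misconfiguration is a `CORTEX_SEARCH_SERVICE` value that is not fully qualified. A quick sanity check you can run before starting the server (the variable name matches the `.env.template` described above; the example value is illustrative):

```python
# Sanity check: CORTEX_SEARCH_SERVICE must be fully qualified as
# <database>.<schema>.<service_name>, i.e. exactly three non-empty parts.

def validate_search_service(value: str) -> tuple:
    """Split and validate a fully qualified Cortex Search service name."""
    parts = value.split(".")
    if len(parts) != 3 or not all(parts):
        raise ValueError(
            "Expected <database>.<schema>.<service_name>, got: " + value
        )
    return tuple(parts)

db, schema, service = validate_search_service(
    "SNOWFLAKE_DOCUMENTATION.SHARED.DOC_SERVICE"  # example value
)
print(db)  # SNOWFLAKE_DOCUMENTATION
```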
7. Run the MCP Server:
   - Go back to your terminal, ensuring you are still in the `snowflake-mpc-cortex-agent` directory and the uv virtual environment is active.
   - Run the MCP server:

     ```sh
     uv run cortex_agents.py
     ```

   - You should see output indicating that the server is running. Keep this terminal window open and the server running.
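Under the hood, the server forwards your questions to Snowflake's Cortex Agents REST endpoint. A simplified, network-free sketch of the request it might build (the endpoint path, model name, and field names are assumptions for illustration; the repository's `cortex_agents.py` is authoritative):

```python
import json

# Hypothetical sketch of a Cortex Agents request body. Field names and
# the endpoint path are assumptions; consult the repo and Snowflake docs.

def build_agent_request(account_url: str, question: str, search_service: str):
    """Return (url, json_body) for an illustrative Cortex Agents call."""
    url = f"https://{account_url}/api/v2/cortex/agent:run"
    body = {
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": question}]}
        ],
        "tools": [{"tool_spec": {"type": "cortex_search", "name": "docs"}}],
        "tool_resources": {"docs": {"name": search_service}},
    }
    return url, json.dumps(body)

url, body = build_agent_request(
    "myaccount.snowflakecomputing.com",
    "How do I create a virtual warehouse?",
    "SNOWFLAKE_DOCUMENTATION.SHARED.DOC_SERVICE",  # example value
)
print(url)
```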
III. Integrating with Cursor
Finally, connect your running MCP server to the Cursor application.
1. Add an MCP Host in Cursor:
- Open the Cursor application.
- Go to Cursor Settings (often found via a gear icon or in the application menu).
- Select the MCP section.
- Click on Add new global MCP server.
2. Edit the MCP JSON Configuration:
   - A JSON configuration file (usually `mcp.json`) will open in Cursor or your default JSON editor.
   - Add a new entry for your Cortex agent MCP server. It should look similar to this:
     ```json
     {
       "mcpServers": {
         "snowflake_docs_agent": {
           "command": "uv",
           "args": [
             "--directory",
             "/Users/akhilgurrapu/Documents/Projects/snowflake-mpc-cortex-agent",
             "run",
             "cortex_agents.py"
           ]
         }
       }
     }
     ```

   - **Important Notes:**
     - Replace `"snowflake_docs_agent"` with your desired server name. **Crucially, use underscores (`_`) instead of dashes (`-`) in this name** due to a potential compatibility issue in Cursor.
     - Replace `"/Users/akhilgurrapu/Documents/Projects/snowflake-mpc-cortex-agent"` with the **absolute path** to the `snowflake-mpc-cortex-agent` directory on your computer.
   - Save the `mcp.json` file.
3. Verify the MCP Server in Cursor:
   - Go back to the MCP section in Cursor settings.
   - You should now see your newly added server (e.g., "snowflake_docs_agent") listed.
   - It should have a green status indicator, signifying that Cursor can connect to your running MCP server.
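If the indicator stays red, a quick way to catch the two most common mistakes (a dash in the server name, or a relative `--directory` path) is to lint your config before restarting Cursor. A minimal sketch, assuming the `mcpServers` layout shown above (the example path is a placeholder):

```python
import json

# Lint an mcp.json entry: server names should use underscores, and the
# --directory argument should be an absolute path.

config_text = """
{
  "mcpServers": {
    "snowflake_docs_agent": {
      "command": "uv",
      "args": ["--directory", "/abs/path/to/snowflake-mpc-cortex-agent",
               "run", "cortex_agents.py"]
    }
  }
}
"""

config = json.loads(config_text)
for name, server in config["mcpServers"].items():
    assert "-" not in name, f"use underscores, not dashes, in {name!r}"
    args = server["args"]
    directory = args[args.index("--directory") + 1]
    assert directory.startswith("/"), "use an absolute path for --directory"
print("mcp.json looks OK")
```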
4. Add a Rule to Use MCP (Optional but Recommended):
- In Cursor settings, navigate to the Rules section.
- Click Add new rule.
- Create an instruction that tells Cursor when to use your MCP server. For example:
- Instruction Trigger: "When I ask about Snowflake documentation" or "Use snowflake agent for Snowflake questions."
    - MCP Server to Use: Select your `snowflake_docs_agent` (or whatever you named it) from the dropdown.
- Save the rule.
5. Test the Integration:
- In Cursor, switch to the AI pane or chat interface.
- Ensure you are in Agents mode or a mode that allows agent interaction.
- Ask a question related to the Snowflake documentation (e.g., "How do I create a virtual warehouse in Snowflake?").
- Cursor should now utilize your MCP server (which connects to Snowflake Cortex Search) to help answer your question.
You have now successfully built an MCP server for Cortex Agents and integrated it with Cursor!