Setup Guide · 8 min read · May 5, 2026

Best MCPs for Windsurf in 2026 (Cascade-Ready Setup)

Windsurf wires MCP servers into Cascade — its agent that runs multi-step work without supervision. The right MCP stack turns Cascade from an autocomplete assistant into a true coding agent that can read files, query databases, drive a browser, and ship a PR. Here are the six worth installing first.

Why Windsurf + MCPs?

Cascade is built for autonomous, multi-step coding work — propose an approach, plan the changes, edit files, run a check, fix what broke, repeat. That loop is only as good as the tools Cascade can call. Out of the box it has the editor, the working tree, and shell commands; everything else is on you to wire up.

MCP servers are how Cascade reaches the rest of the world. With the stack below, Cascade can pull current library docs (no more hallucinated APIs), query Postgres or Supabase to verify a schema before writing migrations, drive a real browser to test a UI flow it just edited, and open a PR on GitHub when the work is done. Each MCP runs as a separate process and surfaces its tools to Cascade automatically — no plugins, no proxies.

Setup time

15–25 min for all 6 MCPs

Impact

Cascade chains tool calls across files, DB, and browser

Cost

All 6 are free or open source

The 6 MCPs every Windsurf user should install

These are picked specifically for Cascade's multi-step style. The first four cover the most common tool calls; the last two unlock production-app workflows where Cascade builds, tests, and verifies end to end.

#1

Filesystem

2 min setup

Most AI workflows involve reading or modifying files. This MCP is the standard way to give models that access without exposing the full system.

npx -y @modelcontextprotocol/server-filesystem /path/to/allowed/dir

Reading and editing local code · Generating files from AI output · Navigating project structures
#2

GitHub

5 min setup

GitHub is where most code lives. This MCP lets agents interact with that code directly, without copy-pasting between interfaces.

npx @github/github-mcp-server

Automated issue creation · PR review and management · Code search across repos
#3

Context7

3 min setup

Models constantly hallucinate outdated APIs. Context7 counters this by grounding every answer in real, current documentation.

npx -y @upstash/context7-mcp

Library API lookup · Framework usage patterns · Version-specific code examples
#4

Supabase

5 min setup

Supabase is a popular backend platform. This MCP lets AI models interact with every layer of a Supabase project without switching interfaces.

npx @supabase/mcp-server-supabase@latest

Supabase project management · Schema design and migrations · RLS policy creation
#5

Puppeteer

5 min setup

Many modern pages require JavaScript to render. Puppeteer is one of the most reliable ways to interact with SPAs, dashboards, and login-gated pages.

npx -y @modelcontextprotocol/server-puppeteer

Web scraping with JavaScript · Form automation · Screenshot capture
#6

PostgreSQL

3 min setup

Database access is one of the most requested AI capabilities. This MCP provides it safely, with schema context that improves query quality.

npx -y @modelcontextprotocol/server-postgres postgresql://user:pass@localhost/db

Database schema exploration · SQL query generation · Data analysis

Cascade rewards tool-rich MCPs

Cascade plans across many tool calls per turn. The more discrete operations an MCP exposes, the more leverage Cascade gets — Filesystem (read, write, list, search), GitHub (PRs, issues, search, comments), and Puppeteer (navigate, click, fill, screenshot) all expose enough surface area for Cascade to chain confidently. Single-tool MCPs work but rarely change Cascade's behaviour.

How to configure MCPs in Windsurf

Windsurf reads MCP configuration from a single JSON file; the Configure button in the Cascade panel opens it. The four-step setup:

1

Open the Cascade MCP panel

Inside Windsurf, click the hammer icon on the Cascade panel and choose Configure. This opens (or creates) ~/.codeium/windsurf/mcp_config.json in your default editor. You can also edit the file by hand — both paths work.

Cascade → hammer icon → Configure
2

Add a server under mcpServers

The JSON shape is identical to Claude Desktop and Cursor — paste any install snippet from a top-mcps.com detail page directly into mcpServers. Each entry becomes a tool group Cascade can call during agent work.

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/you/code"
      ]
    }
  }
}
3

Set tokens via env

Tokens go in the env block, never in args. Windsurf passes them to the server process at launch but never echoes them in the chat transcript. Scope tokens narrowly — repo-only access for GitHub, read-only for databases.

"github": {
  "command": "npx",
  "args": ["@github/github-mcp-server"],
  "env": {
    "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_..."
  }
}
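As a quick sanity check, a jq scan can flag tokens that slipped into an args array instead of env. This is a sketch, assuming jq is installed; the ghp_/sbp_ prefixes cover GitHub and Supabase token formats:

```shell
# List every mcpServers entry whose "args" contain a GitHub- or Supabase-style
# token. Anything this prints should be moved into that server's "env" block.
find_leaked_tokens() {
  jq -r '.mcpServers | to_entries[]
         | select(.value.args // [] | any(test("^(ghp_|sbp_)")))
         | .key' "$1"
}
```

Run it as `find_leaked_tokens ~/.codeium/windsurf/mcp_config.json`; no output means no leaks.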
4

Refresh Cascade and verify

Click the Refresh button in the Cascade MCP panel. Connected servers show a green dot; expand any server to see the exact tools it exposes. Red dot = the server failed to start — hover for the underlying error.

Full mcp_config.json with all 6 MCPs

Copy this into ~/.codeium/windsurf/mcp_config.json. Replace the placeholder paths and tokens for your environment, click Refresh in the Cascade panel, and every server appears as a tool group Cascade can call.

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/you/code"
      ]
    },
    "github": {
      "command": "npx",
      "args": ["@github/github-mcp-server"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_your_token_here"
      }
    },
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    },
    "supabase": {
      "command": "npx",
      "args": ["-y", "@supabase/mcp-server-supabase@latest"],
      "env": {
        "SUPABASE_ACCESS_TOKEN": "sbp_your_token_here"
      }
    },
    "puppeteer": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-puppeteer"]
    },
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://user:pass@localhost/mydb"
      ]
    }
  }
}
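A malformed config is the most common reason a server shows a red dot, so it is worth validating the JSON before clicking Refresh. A minimal sketch, assuming jq is available (python3 -m json.tool works too):

```shell
# Exit non-zero (and say so) when the MCP config is not valid JSON.
validate_mcp_config() {
  if jq empty "$1" 2>/dev/null; then
    echo "config OK: $1"
  else
    echo "config BROKEN: fix the JSON in $1 before clicking Refresh" >&2
    return 1
  fi
}
```

Run it as `validate_mcp_config ~/.codeium/windsurf/mcp_config.json` after every edit.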

Scope database tokens to non-production

Cascade is autonomous — it composes tool calls without asking permission for each one. A Postgres or Supabase MCP wired to a production database is the fastest way to have an exploratory query lock a table you care about. Use a non-production project token for Cascade work, or use the Postgres MCP's read-only mode if you must point at production.
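A dedicated read-only database role keeps this failure mode contained. As a sketch (the role name cascade_ro is hypothetical, and creating the role itself is up to your usual DB tooling), a small helper can build the connection string, percent-encoding the password so URL-reserved characters do not break it:

```shell
# Build a postgresql:// URL for a read-only role. Usage:
#   pg_ro_url USER PASSWORD HOST DBNAME
pg_ro_url() {
  local pass
  # jq's @uri percent-encodes URL-reserved characters in the password.
  pass=$(printf '%s' "$2" | jq -sRr '@uri')
  printf 'postgresql://%s:%s@%s/%s\n' "$1" "$pass" "$3" "$4"
}
```

For example, `pg_ro_url cascade_ro 'p@ss:word' localhost mydb` prints a URL you can paste into the postgres entry's args.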

Quick comparison

MCP          Primary use in Windsurf                     Setup   API key?
Filesystem   Read/write files outside the open project   2 min   No
GitHub       PRs, issues, comments — Cascade can ship    5 min   GitHub token
Context7     Version-pinned library documentation        3 min   No
Supabase     Schema, queries, RLS-aware DB access        5 min   Supabase token
Puppeteer    Drive a real browser to verify UI flows     5 min   No
PostgreSQL   Read-only schema introspection              3 min   Connection string

Common gotchas

Refresh, not Restart

Editing mcp_config.json does not require a full Windsurf restart — click Refresh in the Cascade MCP panel and the new config is picked up. Restarting works too but is slower; it is only needed when a server is wedged.

Cascade may use any tool autonomously

Unlike a chat client where the user picks each tool, Cascade composes tool calls during agent work. Audit which MCPs you give it write access to — a misconfigured GitHub token or a Postgres MCP with write scope can lead to surprising changes.

Windsurf shares MCP config across all projects

There is no project-scoped mcp_config.json today — every Windsurf workspace sees the same servers. If you need different MCPs per project, swap mcp_config.json files manually or keep multiple copies and symlink the active one.
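The manual swap can be scripted. A sketch, assuming per-project configs live under a hypothetical ~/.mcp-configs directory:

```shell
# Symlink one of several saved configs into place as the active mcp_config.json.
use_mcp_config() {
  local dir="$HOME/.codeium/windsurf"
  mkdir -p "$dir"
  ln -sf "$HOME/.mcp-configs/$1.json" "$dir/mcp_config.json"
  echo "active MCP config: $1"
}
```

After `use_mcp_config work-api`, click Refresh in the Cascade panel to pick up the new set of servers.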

Browser MCPs spawn many processes

Puppeteer launches Chromium, and every Chromium instance fans out into many helper processes. On laptops, watch memory usage during long Cascade runs — and consider closing the browser MCP when you are not actively testing UI flows.
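A rough way to see what the browser is costing you (a sketch; it matches any process whose name contains "chrom", so a regular Chrome window counts too):

```shell
# Sum the resident memory, in MB, of all Chromium/Chrome processes.
chromium_mem_mb() {
  ps axo rss=,comm= | awk '/[Cc]hrom/ { sum += $1 } END { printf "%d\n", sum / 1024 }'
}
```

It prints 0 when no browser processes are running, which is also a quick way to confirm the MCP's Chromium actually exited.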

Frequently asked questions

Does Windsurf support MCP natively?

Yes. Windsurf wires MCP servers directly into Cascade — its agent. Drop a server into mcp_config.json and Cascade can call it autonomously during multi-step work, the same way it uses its built-in tools.

Where is Windsurf's MCP config stored?

At ~/.codeium/windsurf/mcp_config.json. The Cascade hammer icon → "Configure" button opens the file directly. You can also edit it in any text editor — Windsurf re-reads it when you click Refresh in the MCP panel.

Do I need to restart Windsurf after editing mcp_config.json?

No. Click the Refresh button in the Cascade MCP panel and Windsurf re-reads the config without a full app restart. A complete restart is only needed if a server is wedged or you suspect a stale process.

Will my existing Claude Desktop or Cursor config work in Windsurf?

Yes — the mcpServers JSON shape is identical. Copy the mcpServers block from claude_desktop_config.json or .cursor/mcp.json into mcp_config.json and it works unchanged. Only the file path differs.

Which MCPs work best with Cascade?

Tool-rich MCPs that let Cascade chain multiple steps: Filesystem for the working tree, GitHub for repo operations, Puppeteer for in-browser flows, and a database MCP (Postgres or Supabase) when the agent is building or testing an app. Cascade thrives on MCPs with lots of operations to compose.

How do I see what tools an MCP exposes in Windsurf?

Open the Cascade MCP panel (hammer icon). Each connected server expands to show every tool it provides, with descriptions. A green dot = ready. A red dot = the server failed to start; hover for the error message.

More for Windsurf

See the full Windsurf client page for the complete config reference, the latest top picks, and one-click install snippets per MCP.
