How to Give Your AI Coding Assistant Access to Your Webhooks via MCP

Published February 12, 2026 · 8 min read

You're deep in a debugging session. Your AI coding assistant is helping you trace a payment flow bug. It asks what the last Stripe webhook payload looked like. So you leave your editor, open the Hooklistener dashboard, find the endpoint, locate the request, copy the JSON body, and paste it back into your conversation.

The whole thing takes 30 seconds, but it breaks your focus completely. And you do it dozens of times a day.

There's a better way. The Model Context Protocol (MCP) lets your coding assistant talk directly to Hooklistener. No tab-switching. No copy-pasting. Just ask "show me the last webhook on my Stripe endpoint" and get the answer inline. This guide shows you how to set it up in under two minutes.

What we cover: connecting Hooklistener's MCP server to Claude Code, Cursor, Windsurf, and OpenAI Codex CLI, what you can do with it, and the gotchas we ran into while building it. We won't cover building your own MCP server from scratch—just using ours.

What is MCP, and Why Should You Care?

Think of MCP as a USB-C port for AI assistants. Before USB-C, every device had its own charger. Before MCP, every AI tool needed custom integrations to access external data.

MCP standardizes how AI assistants discover and use external tools. Here's the mental model:

  • MCP Server = a service that exposes "tools" (functions the AI can call). Hooklistener runs one at app.hooklistener.com/api/mcp.
  • MCP Client = your AI assistant (Claude Code, Cursor, etc.). It connects to the server, discovers the tools, and calls them when relevant.
  • Tools = specific actions like "list endpoints", "get a captured request", or "create an uptime monitor". The AI decides when to call them based on your conversation.

The AI doesn't just fetch raw data—it understands the tool schemas, picks the right one, formats the arguments, and presents the result in context. You ask a question in plain English; it handles the plumbing.
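
To make that concrete, here's roughly what a single tool entry looks like when the assistant calls tools/list. The name/description/inputSchema shape comes from the MCP spec; the parameter name shown here is a hypothetical placeholder, not necessarily what Hooklistener's get_request actually expects.

Illustrative tool descriptor (parameter names are hypothetical)

{
  "name": "get_request",
  "description": "Gets the full captured request: headers, body, query params, remote IP",
  "inputSchema": {
    "type": "object",
    "properties": {
      "request_id": {
        "type": "string",
        "description": "ID of the captured request to fetch"
      }
    },
    "required": ["request_id"]
  }
}

That inputSchema is plain JSON Schema—it's how the assistant knows which arguments a tool needs before it calls it.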

Prerequisites

You need two things:

  1. A Hooklistener account with API access (included on the free plan)
  2. An API key generated from Organization Settings > API Keys

Your API key starts with hklst_. Keep it somewhere safe—you'll need it in the next step.

Important: Never commit your API key to version control. For team setups, each developer should use their own key, or store it in an environment variable.

Setup: Claude Code

Claude Code has first-class MCP support. One command, and you're connected.

claude mcp add --transport http hooklistener \
  https://app.hooklistener.com/api/mcp \
  --header "Authorization: Bearer hklst_your_api_key_here"

This registers the server for the current project. Use --scope user to make it available across all your projects, or --scope project to write it to .mcp.json so your whole team gets it.
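
If you'd rather not paste the raw key into the command, you can read it from an environment variable—plain shell substitution, nothing MCP-specific. Note that the shell expands the variable before Claude Code saves the server entry, so the resolved key still ends up in the local config; this keeps it out of the command you type, not out of the file.

# Define the key once, reuse it across setup commands
export HOOKLISTENER_API_KEY=hklst_your_api_key_here

claude mcp add --transport http hooklistener \
  https://app.hooklistener.com/api/mcp \
  --header "Authorization: Bearer $HOOKLISTENER_API_KEY"

The same variable gets reused by the Codex CLI setup below.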

Verify it's working by typing /mcp in Claude Code. You should see "hooklistener" listed with 8 tools.

Manual alternative

If you prefer editing config files directly, add this to .mcp.json (project-level) or ~/.claude/mcp.json (global):

.mcp.json
{
  "mcpServers": {
    "hooklistener": {
      "type": "streamable-http",
      "url": "https://app.hooklistener.com/api/mcp",
      "headers": {
        "Authorization": "Bearer hklst_your_api_key_here"
      }
    }
  }
}

Pitfall: If you add the server while Claude Code is running, you need to restart it. The MCP connection is established at startup. A common frustration is editing the config and wondering why nothing changed.

Setup: Cursor

Add this to .cursor/mcp.json in your project root:

.cursor/mcp.json
{
  "mcpServers": {
    "hooklistener": {
      "type": "streamable-http",
      "url": "https://app.hooklistener.com/api/mcp",
      "headers": {
        "Authorization": "Bearer hklst_your_api_key_here"
      }
    }
  }
}

Restart Cursor or reload MCP servers from the settings panel. The tools will appear in Cursor's agent mode.

Setup: OpenAI Codex CLI

Codex CLI uses environment variables for bearer tokens, which is a nice security pattern:

export HOOKLISTENER_API_KEY=hklst_your_api_key_here

codex mcp add hooklistener --transport streamable-http \
  --url https://app.hooklistener.com/api/mcp \
  --bearer-token-env-var HOOKLISTENER_API_KEY

Or add it manually to ~/.codex/config.toml:

~/.codex/config.toml
[mcp_servers.hooklistener]
url = "https://app.hooklistener.com/api/mcp"
bearer_token_env_var = "HOOKLISTENER_API_KEY"
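
Because Codex resolves HOOKLISTENER_API_KEY from the environment when it connects, the variable has to exist in whatever shell you launch codex from. Adding the export to your shell profile takes care of that—a sketch assuming zsh:

# Persist the key for new shells (use ~/.bashrc or equivalent for other shells)
echo 'export HOOKLISTENER_API_KEY=hklst_your_api_key_here' >> ~/.zshrc
source ~/.zshrc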

Setup: Windsurf

Add this to ~/.codeium/windsurf/mcp_config.json:

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "hooklistener": {
      "serverUrl": "https://app.hooklistener.com/api/mcp",
      "headers": {
        "Authorization": "Bearer hklst_your_api_key_here"
      }
    }
  }
}

Note: Windsurf uses serverUrl instead of url. Small difference, but it'll silently fail if you use the wrong key.

What You Can Actually Do With It

Once connected, your AI assistant has access to 8 tools across three categories. You don't call these tools directly—the assistant picks the right one based on what you ask.

Debug Endpoints

  • list_endpoints: Lists all your debug endpoints with their webhook URLs
  • get_endpoint: Gets details for a specific endpoint
  • create_endpoint: Creates a new debug endpoint and returns its URL

Captured Requests

  • list_requests: Lists captured webhooks with filtering by method and path
  • get_request: Gets the full request: headers, body, query params, remote IP

Uptime Monitors

  • list_monitors: Lists all your uptime monitors
  • get_monitor_status: Gets uptime percentage, avg response time, and recent checks
  • create_monitor: Creates a new monitor with configurable interval and timeout

Real Workflow Examples

Here's where it clicks. These aren't hypothetical—they're the workflows that made us build this in the first place.

"Create a debug endpoint for Stripe and give me the URL"

The assistant calls create_endpoint with the name "Stripe Webhooks" and hands you the public URL. You paste it into Stripe's dashboard. No context switch, no clicking around. Ten seconds.

"Show me the last webhook that came in on my Stripe endpoint"

It calls list_endpoints to find your Stripe endpoint, then list_requests to grab the most recent capture, then get_request to fetch the full payload. You see the headers, body, and metadata right in your conversation. If the body contains a checkout.session.completed event, the assistant can immediately help you write the handler.

"Is my production API healthy? What's the uptime this month?"

The assistant calls get_monitor_status and reports back: "99.95% uptime over the last 30 days, average response time 125ms. The last check was 2 minutes ago, status 200." If something looks off, you're already in the right context to investigate.

"Set up a health check for our new staging environment"

It calls create_monitor with the URL, a 5-minute check interval, and a 30-second timeout. Done. You didn't leave your terminal.

How It Works Under the Hood

You don't need to understand the protocol to use it, but knowing the basics helps when things go wrong.

Hooklistener's MCP server uses the Streamable HTTP transport. Your AI assistant communicates with it via JSON-RPC 2.0 over plain HTTP POST requests to a single endpoint: /api/mcp.

The flow looks like this:

  1. Your assistant sends an initialize request with its client info
  2. The server responds with its capabilities (what tools it supports)
  3. The assistant calls tools/list to discover available tools and their parameter schemas
  4. When you ask a question, the assistant picks the right tool and calls tools/call with the arguments
  5. The server validates your API key, runs the query, and returns the result

Every request after initialize includes an mcp-session-id header to maintain session context. Authentication happens on every tool call via your Authorization: Bearer header—there's no separate login step.

Testing It Manually (For the Curious)

If you want to see what your AI assistant is doing behind the scenes, you can hit the MCP server directly with curl. This is also useful for debugging connection issues.

Initialize a session

curl -i -X POST https://app.hooklistener.com/api/mcp \
  -H "Authorization: Bearer hklst_your_key" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
      "protocolVersion": "2025-03-26",
      "capabilities": {},
      "clientInfo": {
        "name": "my-test",
        "version": "1.0"
      }
    }
  }'

Grab the mcp-session-id from the response headers (the -i flag prints them), then list tools:

List available tools

curl -X POST https://app.hooklistener.com/api/mcp \
  -H "Authorization: Bearer hklst_your_key" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "mcp-session-id: SESSION_ID_FROM_ABOVE" \
  -d '{
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/list",
    "params": {}
  }'

You'll see all 8 tools with their names, descriptions, and parameter schemas. This is exactly what your AI assistant sees when it connects.
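
You can also invoke a tool yourself with tools/call. The request shape below (params.name plus params.arguments) is standard JSON-RPC per the MCP spec; we're assuming list_endpoints takes no required arguments—if the server rejects the call, check the schema returned by tools/list.

Call a tool directly

curl -X POST https://app.hooklistener.com/api/mcp \
  -H "Authorization: Bearer hklst_your_key" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "mcp-session-id: SESSION_ID_FROM_ABOVE" \
  -d '{
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
      "name": "list_endpoints",
      "arguments": {}
    }
  }'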

Troubleshooting

"Authentication required" error

Your API key is missing or invalid. Double-check that it starts with hklst_ and that the Authorization header is formatted as Bearer hklst_... with a space after "Bearer". Generate a fresh key from Organization Settings if needed.

Server not showing up in your assistant

Most MCP clients only load servers at startup. Restart your editor or CLI after editing the config. In Claude Code, run /mcp to check the connection status. Verify your config file is valid JSON—a trailing comma will silently break it.

Tools appear but calls fail

This usually means your plan doesn't include the feature you're trying to use. Debug endpoints and request inspection are available on all plans. Uptime monitors require a paid plan. If you're hitting a limit (e.g., max endpoints), the error message will tell you.

Windsurf silently fails

Check that you're using serverUrl, not url. Windsurf uses a different key name than Cursor and Claude Code. This is the most common mistake we see.

A Note on Security

Every tool call is authenticated with your API key and scoped to your organization. There's no way to access another organization's data through the MCP server—every database query is filtered by your organization ID.

Your API key is hashed with SHA-256 before storage. We never store it in plaintext. When you make a request, we match on a prefix, then verify the full hash.

For team environments, avoid putting API keys directly in .mcp.json if it's committed to version control. Use environment variables or keep the key in a local config file that's gitignored. Codex CLI's bearer-token-env-var pattern is a good example of how to do this right.
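
Some clients can also expand environment variables inside their MCP config—Claude Code documents ${VAR} expansion in .mcp.json, for example—which lets you commit the file without the secret. A sketch assuming that support (verify against your client's docs before relying on it):

.mcp.json (key supplied via environment variable)

{
  "mcpServers": {
    "hooklistener": {
      "type": "streamable-http",
      "url": "https://app.hooklistener.com/api/mcp",
      "headers": {
        "Authorization": "Bearer ${HOOKLISTENER_API_KEY}"
      }
    }
  }
}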

What This Unlocks

The real value isn't any single tool call. It's the compound effect of your AI assistant having live context about your webhooks while it helps you write code.

When you're writing a Stripe webhook handler and the assistant can see the actual payload that just came in, it writes better code. When you're debugging a failed integration and the assistant can inspect the headers and body, it finds the bug faster. When you ask "is my API healthy?" and get a real answer with numbers, you make better decisions.

The setup takes two minutes. The time you save compounds every day.