MCP Server

Connect your AI assistant to ralph-gpu documentation using the Model Context Protocol.

Setup

Add the ralph-gpu MCP server to Cursor.

Or add manually to your .cursor/mcp.json:

```json
{
  "mcpServers": {
    "ralph-gpu-docs": {
      "url": "https://ralph-gpu.labs.vercel.dev/mcp/mcp"
    }
  }
}
```

What is MCP?

The Model Context Protocol (MCP) is an open standard that allows AI assistants to access external tools and data sources. With the ralph-gpu MCP server, your AI can:

Quick Start Guide

Get a comprehensive guide with all patterns and best practices.

Access Documentation

Get complete API reference, concepts, and getting started guides.

Browse Examples

List and retrieve full code for all shader examples.
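Under the hood, each of these capabilities is exposed as an MCP tool invoked via JSON-RPC 2.0. The sketch below shows the request envelope a client sends for a tool call; `buildToolCall` is a hypothetical helper for illustration, not part of any SDK.

```typescript
// Sketch of the JSON-RPC 2.0 envelope MCP uses for tool calls.
// `buildToolCall` is a hypothetical helper, not part of any SDK.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown> = {}
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// A client would POST this body to the server endpoint configured above.
const req = buildToolCall(1, "get_started");
console.log(JSON.stringify(req));
```

The server replies with a matching JSON-RPC response whose result carries the tool's output (here, the documentation text).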

Available Tools

get_started

Returns the comprehensive quickstart guide (~1600 lines) with all patterns, code examples, and best practices. Recommended first call.

get_documentation

Parameters: topic: "getting-started" | "concepts" | "api"

Get full documentation for a specific topic.

list_examples

List all available examples with slug, title, and description.

get_example

Parameters: slug: string

Get full code and shader for a specific example.
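The four tools above and their parameters can be captured as a discriminated union. This is a sketch for illustration: the tool names and parameter types come from this page, while the wrapper functions are hypothetical.

```typescript
// Hypothetical typed wrappers for the four documented tools. The union type
// captures each tool's expected arguments; only the tool names and
// parameters are from the docs, the helpers are illustrative.
type Topic = "getting-started" | "concepts" | "api";

type RalphGpuToolCall =
  | { name: "get_started"; arguments: Record<string, never> }
  | { name: "get_documentation"; arguments: { topic: Topic } }
  | { name: "list_examples"; arguments: Record<string, never> }
  | { name: "get_example"; arguments: { slug: string } };

function getDocumentation(
  topic: Topic
): Extract<RalphGpuToolCall, { name: "get_documentation" }> {
  return { name: "get_documentation", arguments: { topic } };
}

function getExample(
  slug: string
): Extract<RalphGpuToolCall, { name: "get_example" }> {
  return { name: "get_example", arguments: { slug } };
}

console.log(getExample("some-example"));
```

Constraining `topic` to the three documented string literals lets the type checker reject unsupported topics before a request is ever sent.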

How It Works

When you ask your AI assistant about ralph-gpu, it can now call the MCP server to fetch relevant information:

You: "Create a particle system with ralph-gpu"
AI: [calls get_started]
AI: [receives comprehensive guide]
AI: "Here's how to create a particle system with ralph-gpu..."
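At the wire level, that tool call is an HTTP POST of a JSON-RPC body to the server endpoint. The sketch below builds such a request; the header details reflect my reading of the MCP Streamable HTTP transport and should be treated as assumptions, not a reference.

```typescript
// Sketch of the HTTP request an MCP client sends for a tool call over the
// Streamable HTTP transport. Headers are assumptions based on the MCP spec.
function buildHttpRequest(endpoint: string, body: object) {
  return {
    url: endpoint,
    method: "POST" as const,
    headers: {
      "Content-Type": "application/json",
      // Streamable HTTP clients accept both plain JSON and SSE responses.
      Accept: "application/json, text/event-stream",
    },
    body: JSON.stringify(body),
  };
}

const call = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "get_started", arguments: {} },
};

const httpReq = buildHttpRequest(
  "https://ralph-gpu.labs.vercel.dev/mcp/mcp",
  call
);
console.log(httpReq.method, httpReq.url);
```

Your MCP client handles this exchange for you; the sketch only shows what travels over the network when the assistant invokes a tool.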

Next Steps