A native Slate-based plugin that integrates the world's most advanced language models directly into the Unreal Editor. Seamlessly introspect your projects, analyze blueprints, search code, and accelerate development with real-time AI assistance powered by Claude, GPT-4, Gemini, or local models.
Choose from the world's most advanced language models, or run locally for complete privacy
Best-in-class reasoning with native tool use. The Anthropic Messages API provides the most robust streaming, structured tool responses, and content block architecture. Claude consistently delivers the most accurate project analysis and code understanding across complex codebases. Default model: claude-sonnet-4-5-20250929, with extended thinking for advanced problem-solving.
Powerful function calling with Chat Completions API. Full support for GPT-4o with multimodal capabilities and advanced reasoning. Streaming responses with chunked message delivery. Perfect balance of capability and speed for rapid iteration and creative problem-solving in development workflows.
Fast inference with Generative AI API and function declarations. Excellent for context understanding and project introspection. Competitive pricing with strong performance on code analysis and technical documentation tasks. Supports long context windows for comprehensive project analysis.
Run any OpenAI-compatible endpoint: Ollama, LM Studio, or your own inference server. Zero API costs, complete privacy, and offline capability. Perfect for proprietary projects or teams with data sovereignty requirements. Full streaming support with any local model.
Everything you need to revolutionize your development workflow
Access 11 specialized tools that introspect your entire project. Read asset metadata, analyze blueprint graphs, enumerate world actors, search source code, and retrieve class hierarchies. Complete visibility into your project structure.
Unified interface for Claude, GPT-4, Gemini, and local models. Hot-swap providers without closing the editor. Extensible provider pattern makes adding custom endpoints effortless. Choose the best model for each task.
See responses appear instantly as they stream. Native implementation for Claude/OpenAI with chunked Gemini support. Progressive UI updates create a responsive, interactive experience. No waiting for complete responses.
Integrated function calling system. The AI automatically selects and chains tools to answer complex questions. Supports parameter inference and structured JSON responses. Tools work seamlessly with conversation context.
Native Slate widget docks anywhere in the editor interface. Open with Ctrl+Shift+M or from the Tools menu. Persists position and state across editor sessions. Fully integrated with UE5's UI paradigms.
Deep inspection of blueprint variables, functions, event graphs, node connections, components, and interfaces. Understand blueprint logic, identify issues, and get optimization suggestions from the AI assistant.
Full-text search across all .h and .cpp files in your project. Read source files with configurable line limits for efficient context. Automatically scoped to your project directory for safety and performance.
Add new LLM providers, create custom tools, build custom UI panels, and hook into delegates. Well-documented extension points and a clean architecture make extending functionality straightforward for C++ developers.
Comprehensive introspection capabilities for complete project understanding
Search the Asset Registry by name and class. Returns matched assets with paths and metadata. Essential for finding blueprints, materials, and other project assets by partial name matching or filter criteria.
Retrieve complete information about a blueprint including variables, functions, event graph connections, components, and interfaces. Essential for understanding blueprint structure and logic flow.
Read C++ source files (.h, .cpp) with configurable line limits. Returns code with line numbers and proper context. Automatically enforces safety boundaries to prevent reading outside the project.
Full-text search across all C++ source files in your project. Returns matching lines with file paths and line numbers. Perfect for finding functions, classes, and specific code patterns across large codebases.
Retrieve high-level project information including name, version, target engine version, and paths. Get summary statistics about blueprints, C++ classes, and project plugins.
Enumerate all actors in the current level with their classes, locations, and components. Understand the level composition and actor relationships for the current working context.
Retrieve inheritance hierarchies for C++ classes and blueprints. Understand parent-child relationships and see what classes inherit from a given base. Essential for API exploration.
Inspect material instances and master materials. Retrieve parameters, scalar values, vector values, and texture assignments. Understand material shader graphs and parameter relationships.
Get information about currently selected actors in the editor. Retrieve properties, components, and transforms. Provides context about what you're actively working on in the viewport.
Retrieve information about the current level including name, actor count, static mesh count, and other statistics. Get a comprehensive overview of level composition and streaming volumes.
Execute editor console commands safely with whitelist validation. Useful for gameplay debugging, profiling commands, and editor state manipulation. Prevents dangerous or destructive commands.
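The whitelist check itself can be very small. A minimal sketch of how such validation might look (the allowed commands and the function shown here are illustrative, not the plugin's actual implementation):

```cpp
// Illustrative whitelist check -- the plugin's real command list and
// matching rules may differ.
static const TSet<FString> AllowedCommands = {
    TEXT("stat fps"), TEXT("stat unit"), TEXT("showflag.wireframe")
};

bool IsCommandAllowed(const FString& Command)
{
    // Exact-match validation prevents destructive commands such as
    // "quit" or "obj gc" from being issued by the model.
    return AllowedCommands.Contains(Command.TrimStartAndEnd().ToLower());
}
```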
A clean, extensible architecture built for maximum flexibility
Mudflood AI is structured as two complementary modules working in perfect harmony. The MudfloodAICore runtime module contains all provider implementations, conversation management, streaming logic, and tool execution systems. The MudfloodAIEditor module provides the native Slate UI, tool handler, project introspection systems, and editor integration hooks.
The abstract ULLMProvider base class defines the interface for all LLM integrations. Each provider implements streaming, tool handling, and response parsing. The plugin ships with four providers (Claude, OpenAI, Gemini, Local), but you can easily add custom providers by subclassing ULLMProvider.
Tools are registered in the MudfloodToolHandler as structured metadata. When the LLM requests a tool call, the handler validates parameters and executes the appropriate function. Results are serialized to JSON and returned to the conversation context for multi-turn reasoning.
All messages are tracked in a conversation history with automatic context management. The system includes configurable message limits, system prompt generation, and conversation state persistence. Tools are automatically included in the system context so the AI knows what it can do.
Press Ctrl+Shift+M or go to Tools → Mudflood AI → Open Chat. The panel will dock in your editor interface. Pin it to keep it visible while you work.
Add custom tools by subclassing UMudfloodToolHandler and registering them in the constructor. The AI will automatically include your tools in its context and use them when appropriate.
All settings are configured in Project Settings → Mudflood AI. Changes take effect immediately without restarting.
Select which LLM provider to use: Claude (recommended), OpenAI, Gemini, or Local.
| Setting | Type | Default | Description |
|---|---|---|---|
| SelectedProvider | ELLMProvider | Claude | Active LLM provider for conversations |
| ClaudeAPIKey | FString | Empty | API key from console.anthropic.com |
| OpenAIAPIKey | FString | Empty | API key from platform.openai.com |
| GeminiAPIKey | FString | Empty | API key from aistudio.google.com |
| LocalBaseURL | FString | http://localhost:11434 | Base URL for local/custom OpenAI-compatible endpoints |
| Setting | Type | Default | Description |
|---|---|---|---|
| ModelName | FString | Empty (uses provider default) | Specific model to use (e.g., gpt-4o for OpenAI) |
| MaxTokens | int32 | 8192 | Maximum tokens per response (256-128000) |
| Temperature | float | 0.7 | Response randomness (0.0-2.0, higher = more creative) |
| Setting | Type | Default | Description |
|---|---|---|---|
| bEnableStreaming | bool | true | Show responses in real-time as they stream |
| bEnableToolUse | bool | true | Allow the AI to use introspection tools |
| bAutoIncludeProjectContext | bool | true | Automatically include project info in system prompt |
| MaxContextMessages | int32 | 50 | Maximum conversation history to maintain (5-200) |
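Unreal project settings like these are typically backed by a config file, so they can also be set directly in ini. A sketch assuming a hypothetical settings section name (copy the real [/Script/...] path from your project's saved config after changing a setting in the editor):

```ini
; Hypothetical section name -- verify against your generated config.
[/Script/MudfloodAICore.MudfloodAISettings]
SelectedProvider=Claude
ModelName=
MaxTokens=8192
Temperature=0.7
bEnableStreaming=True
bEnableToolUse=True
bAutoIncludeProjectContext=True
MaxContextMessages=50
```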
Claude provides the best reasoning capability, native tool use, and most reliable code analysis. Go to console.anthropic.com, create an API key, and enter it in the settings.
Get your API key from platform.openai.com. Supports GPT-4o with multimodal capabilities and function calling.
Get a free API key from aistudio.google.com. Fast inference with strong code understanding.
Run any OpenAI-compatible endpoint locally. Use Ollama or LM Studio for complete privacy and offline capability.
Use any OpenAI-compatible endpoint. Many services offer this (Groq, Together, Fireworks, etc.).
The ProjectContextGatherer scans your entire Unreal project and provides detailed information to the LLM. This enables the AI to understand your project structure, architecture, and code.
The introspection system uses Unreal's Asset Registry to enumerate all assets in your project. This includes blueprints, materials, meshes, textures, sounds, and all other project assets. The system extracts metadata like asset type, class, and path.
For blueprint assets, the system reflects on the blueprint graph structure, variables (with types and defaults), functions, event implementations, and component hierarchies. This allows the AI to understand your blueprint logic and architecture.
When a level is open, the system enumerates all actors in the world using TActorIterator. For each actor, it retrieves the class, location, rotation, components, and other relevant properties. This provides context about what's in the current level.
The system has safe, validated access to all C++ source files in your project's Source folder. Files are read with configurable line limits to prevent excessive context sizes. All file paths are validated to ensure they're within the project folder.
All gathered context is intelligently formatted into a system prompt that tells the LLM about your project. This includes project metadata, available assets, blueprints, classes, and tools. The system prompt is automatically updated as you work.
The tool system allows the LLM to call specialized functions to introspect your project. Tools are defined as structured metadata and executed by the tool handler.
Tools are registered in the MudfloodToolHandler constructor. Each tool maps a name to an execution function. When the LLM requests a tool, the handler validates the request, executes the function, and returns the result as JSON.
When the LLM decides to use a tool, this sequence occurs:

1. The LLM emits a tool call with a tool name and JSON parameters.
2. The tool handler validates the request and its parameters.
3. The handler executes the function mapped to that tool name.
4. The result is serialized to JSON and returned to the conversation context.
5. The LLM continues reasoning with the result, chaining further tool calls if needed.
To add a custom tool, subclass FMudfloodToolHandler and register your tool:
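A minimal sketch of what that registration might look like, assuming the handler exposes a RegisterTool call taking metadata plus an execution lambda (the names here are illustrative, not the plugin's verified API; mirror the built-in tools in the handler constructor for the real pattern):

```cpp
void FMyToolHandler::RegisterCustomTools()
{
    // Hypothetical metadata struct and registration call.
    FMudfloodToolMetadata Meta;
    Meta.Name        = TEXT("count_skeletal_meshes");
    Meta.Description = TEXT("Counts skeletal mesh assets in the project.");

    RegisterTool(Meta, [](const TSharedPtr<FJsonObject>& Params) -> FString
    {
        // Execute the tool and return the result as a JSON string,
        // which the handler feeds back into the conversation context.
        int32 Count = 0; /* query the Asset Registry here */
        return FString::Printf(TEXT("{\"count\": %d}"), Count);
    });
}
```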
The SMudfloodChatPanel is a native Slate widget with this hierarchy:
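The exact widget tree ships with the plugin source; as an illustrative sketch only (every name below other than SMudfloodChatPanel and SMudfloodChatMessage is an assumption), a typical Slate chat layout looks like:

```
SMudfloodChatPanel
└─ SVerticalBox
   ├─ SScrollBox                   (message history)
   │  └─ SMudfloodChatMessage × N  (one widget per user/assistant message)
   ├─ SHorizontalBox               (input row)
   │  ├─ SMultiLineEditableTextBox (message input)
   │  └─ SButton                   (send)
   └─ (streaming/status indicator)
```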
SMudfloodChatMessage handles displaying both user and assistant messages. It automatically detects code blocks and renders them with syntax highlighting. Streaming messages update in real-time as tokens arrive.
Text is displayed progressively as it streams from the LLM. The widget uses an FString that grows with each update and automatically reformats for word wrapping. Streaming status is shown to the user.
When the AI uses a tool, the panel shows which tool is being called and with what parameters. Tool results are displayed in a collapsible format so you can understand what data the AI is working with.
FMudfloodAIStyle defines all colors, fonts, and visual properties. Extend or override the style by subclassing FMudfloodAIStyle and registering your custom style.
FMudfloodAICommands defines editor shortcuts:
- Ctrl+Shift+M - Toggle Mudflood AI panel
- Ctrl+Enter - Send message from input box
- Shift+Enter - New line in input box

Subclass ULLMProvider and implement the required virtual functions:
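The authoritative interface lives in the ULLMProvider header; as a sketch under assumed names (the method signatures, delegate types, and structs below are illustrative, not the plugin's actual declarations):

```cpp
#include "LLMProvider.h" // assumed plugin header path

UCLASS()
class UMyCustomProvider : public ULLMProvider
{
    GENERATED_BODY()

public:
    // Assumed overrides -- check ULLMProvider for the real signatures.
    virtual FString GetProviderName() const override { return TEXT("MyCustom"); }

    virtual void SendMessage(const TArray<FChatMessage>& History,
                             FOnStreamChunk OnChunk,
                             FOnResponseComplete OnComplete) override
    {
        // Build an HTTP request to your endpoint, parse streamed chunks,
        // forward each token via OnChunk, and call OnComplete at the end.
    }

    virtual void ParseToolCalls(const FString& ResponseJson,
                                TArray<FToolCallRequest>& OutCalls) override
    {
        // Map your provider's tool-call format to the plugin's request struct.
    }
};
```

After subclassing, register the provider with the plugin so it appears alongside the built-in Claude/OpenAI/Gemini/Local options.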
Register tools by subclassing the tool handler or by adding to the global tool registry:
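For the global-registry route, a compact sketch (the registry accessor and its signature are assumptions; verify them in the plugin source before relying on these names):

```cpp
// Hypothetical global-registry variant.
FMudfloodToolRegistry& Registry = FMudfloodToolRegistry::Get();
Registry.Register(TEXT("my_tool"), TEXT("Describe what my_tool does"),
    [](const TSharedPtr<FJsonObject>& Params)
    {
        // Return the tool result as a JSON string.
        return FString(TEXT("{\"ok\": true}"));
    });
```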
Create a custom Slate widget that inherits from SMudfloodChatPanel. Override the layout, styling, or message rendering to match your preferences.
The system provides delegates for key events:
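For example, binding to a hypothetical response-completed event (the delegate and object names below are assumptions; check the plugin headers for the actual delegate list):

```cpp
// Assumed delegate name -- the plugin's actual events may differ.
UMudfloodConversation* Conversation = /* obtained from the plugin */ nullptr;
if (Conversation)
{
    Conversation->OnResponseComplete.AddLambda([](const FString& FullText)
    {
        UE_LOG(LogTemp, Log, TEXT("AI response finished: %d chars"), FullText.Len());
    });
}
```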
Override ProjectContextGatherer to add custom project information to the system prompt. This allows the AI to know about custom project structures or metadata.