A Burp Suite extension for analyzing HTTP requests and responses with AI.
This extension allows security testers to quickly analyze requests for potential vulnerabilities and understand complex responses.
- Analyze HTTP requests for security issues with the "AI Suggest" tab
- Understand HTTP responses with the "AI Explain" tab
- Support for multiple AI providers (OpenRouter or local Ollama)
- Caching system to avoid repeated analysis of identical requests
- Configurable prompt templates
- API Key management
- Download the extension files
- In Burp Suite, go to Extensions > Add
- Select the main.py file as the Python extension entry point
- The extension will appear as a new tab called "AI Request Analyzer"
The extension supports two AI providers:
- Requires an API key (get one from openrouter.ai)
- Supports a wide range of powerful models like Claude, GPT, etc.
- Select a model from the list after fetching available models
- For best results, prefer "instruct" models
- Requires a local Ollama installation (get from ollama.ai)
- Run models locally with no API keys needed
- Enter your Ollama URL (default: http://localhost:11434/api/generate)
- Click "Fetch Models" to see available models on your Ollama server
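Ollama advertises its installed models over a small HTTP API, so "Fetch Models" boils down to a single GET request against the server. A standalone sketch of that lookup (helper names are illustrative; the extension's own code lives in core/api_handlers.py):

```python
import json
from urllib.request import urlopen


def extract_model_names(tags_payload):
    """Pull the model names out of the JSON returned by Ollama's /api/tags."""
    return [m["name"] for m in tags_payload.get("models", [])]


def fetch_ollama_models(base_url="http://localhost:11434"):
    """List the models installed on a local Ollama server."""
    with urlopen(base_url + "/api/tags") as resp:  # GET /api/tags lists installed models
        return extract_model_names(json.load(resp))
```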
The extension works with default values without any custom configuration. However, if you want to customize its behavior:
- A template file .env-dist is provided
- Copy or rename .env-dist to .env to start using custom settings
- Edit the .env file to change settings according to your needs
If no .env file is found, the extension will use these default values:
# Cache configuration
CACHE_MAX_SIZE_MB=100 # Maximum cache size in MB
CACHE_MAX_AGE_DAYS=30 # Maximum days to keep cache entries
CACHE_MAX_ENTRIES=1000 # Maximum number of cache entries
# Message configuration
MAX_MESSAGE_LENGTH=4000 # Maximum length before truncating HTTP messages
# OpenRouter configuration
OPENROUTER_MAX_TOKENS=800 # Maximum tokens for OpenRouter responses
OPENROUTER_API_URL= # Custom API URL (leave empty for default)
OPENROUTER_DEFAULT_MODEL= # Default model to use (leave empty to select manually)
OPENROUTER_TEMPERATURE=0.3 # Temperature for response generation (0.0-1.0)
# Ollama configuration
OLLAMA_API_URL= # Custom API URL (leave empty for default)
OLLAMA_DEFAULT_MODEL= # Default model to use (leave empty to select manually)
OLLAMA_TEMPERATURE=0.3 # Temperature for response generation (0.0-1.0)
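If a .env file is present, it simply overrides these defaults line by line. A minimal sketch of that merge logic (illustrative only, not necessarily how core/config_loader.py implements it):

```python
import os


def parse_env_lines(lines, defaults):
    """Merge KEY=VALUE lines over a defaults dict, ignoring comments and blanks."""
    settings = dict(defaults)
    for line in lines:
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if "=" in line:
            key, value = line.split("=", 1)
            settings[key.strip()] = value.strip()
    return settings


def load_env(path=".env", defaults=None):
    """Read the .env file if it exists; otherwise fall back to the defaults."""
    defaults = defaults or {}
    if not os.path.exists(path):
        return dict(defaults)
    with open(path) as fh:
        return parse_env_lines(fh, defaults)
```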
- Configure your preferred AI provider in the "AI Request Analyzer" tab
- Browse the application you're testing in Burp Suite
- When you find an interesting request:
- Use the "AI Suggest" tab to get vulnerability suggestions for that request
- Use the "AI Explain" tab to understand complex responses
- Enable/disable automatic analysis in the options panel
- Clear the cache as needed
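The cache mentioned above keys each entry on the message content, so re-analyzing an identical request returns the stored answer instead of triggering a new API call. A minimal sketch of the idea (hypothetical class, not the extension's actual core/cache.py):

```python
import hashlib
from collections import OrderedDict


class AnalysisCache:
    """Bounded cache keyed by a hash of the HTTP message plus the analysis type."""

    def __init__(self, max_entries=1000):
        self.max_entries = max_entries
        self._entries = OrderedDict()

    @staticmethod
    def key(message, analysis_type):
        # Identical requests hash to the same key, so results are reused.
        digest = hashlib.sha256(message.encode("utf-8")).hexdigest()
        return analysis_type + ":" + digest

    def get(self, message, analysis_type):
        return self._entries.get(self.key(message, analysis_type))

    def put(self, message, analysis_type, result):
        self._entries[self.key(message, analysis_type)] = result
        if len(self._entries) > self.max_entries:
            self._entries.popitem(last=False)  # evict the oldest entry
```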
- After fetching models, you can type keywords to filter them:
- "claude" for Claude models
- "gpt" for GPT models
- "free" for models without credit requirements in OpenRouter
- "Instruct" models typically provide better security-focused responses
- Large HTTP messages are automatically truncated to save tokens
- Analysis results are cached to improve performance
- You can customize the prompts in the "AI Request Analyzer" tab
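Two of the behaviors above, keyword filtering and message truncation, can be sketched with a pair of illustrative helpers (function names are hypothetical, not the extension's API):

```python
def filter_models(models, keyword):
    """Case-insensitive substring filter, like typing a keyword into the model list."""
    keyword = keyword.lower()
    return [m for m in models if keyword in m.lower()]


def truncate_message(message, max_length=4000):
    """Cut long HTTP messages to MAX_MESSAGE_LENGTH characters to save tokens."""
    if len(message) <= max_length:
        return message
    return message[:max_length] + "\n[... truncated ...]"
```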
The extension uses prompt templates to guide the AI in analyzing requests and responses. There are two ways to customize these prompts:
- Temporary Customization (UI):
  - You can modify prompts directly in the "AI Request Analyzer" tab
  - These changes apply immediately but are not persistent
  - Changes will be lost when Burp Suite is restarted
- Permanent Customization (Files):
  - For persistent custom prompts, modify the text files in the prompts/ directory:
    - suggest_prompt.txt: Template for request vulnerability suggestions
    - explain_prompt.txt: Template for response analysis
  - These files are loaded each time the extension starts
  - Changes to these files will persist across Burp Suite restarts
When customizing prompts, focus on providing clear instructions for the AI about what to analyze and how to format the results.
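A prompt template is just a text file with a placeholder for the HTTP message, so loading and rendering one might look like this sketch (the {message} token is an assumption for illustration; check the shipped template files for the actual placeholder):

```python
import os


def load_prompt(prompts_dir, analysis_type):
    """Read e.g. prompts/suggest_prompt.txt or prompts/explain_prompt.txt."""
    path = os.path.join(prompts_dir, analysis_type + "_prompt.txt")
    with open(path) as fh:
        return fh.read()


def render_prompt(template, http_message):
    # {message} is a hypothetical placeholder name used for this sketch.
    return template.replace("{message}", http_message)
```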
- This extension is written in Python for Burp Suite's Jython environment
- The project uses a modular architecture
- Contribute via issues and pull requests
The project is organized into the following modules:
- main.py: Main extension entry point
- core/: Core functionality
- api_handlers.py: API communication with OpenRouter and Ollama
- cache.py: Caching system with performance optimizations
- config_loader.py: Configuration and .env file handling
- models.py: Model management
- ui/: UI components and layout
- analyzer_tabs.py: Request and response analysis tabs
- config_tab.py: Configuration tab interface
- components.py: Reusable UI components
- utils/: Utility functions
- helpers.py: Helper functions
- listeners.py: Event listeners
- prompt_manager.py: Prompt template management
- settings.py: Settings handling
- prompts/: Template prompts for different analysis types
- explain_prompt.txt: Template for response analysis
- suggest_prompt.txt: Template for request vulnerability suggestions
This project is open source and available under the MIT License.

