# MCP Prompts Server

An MCP (Model Context Protocol) server for managing, storing, and providing prompts and prompt templates for LLM interactions.

## Features

- Store and retrieve prompts and templates
- Apply variable substitution to prompt templates
- Tag-based organization and search
- MCP Prompts protocol support
- Tools for prompt management
- Import and process prompts from various sources
- Export prompts in different formats (JSON, Markdown)
- Share and import prompt collections
- Docker and devcontainer support
- Comprehensive test client
- New: PGAI database integration with semantic search
- New: High-quality professional prompt collection
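Variable substitution means replacing named placeholders in a template with supplied values. A minimal sketch of the idea, assuming a double-curly-brace placeholder syntax (the server's actual delimiter and escaping rules may differ):

```typescript
// Sketch of template variable substitution; illustrative only, not the
// server's actual implementation.
function applyTemplate(
  template: string,
  variables: Record<string, string>,
): string {
  // Replace each {{name}} with its value, leaving unknown names intact.
  return template.replace(/\{\{(\w+)\}\}/g, (match, name) =>
    name in variables ? variables[name] : match,
  );
}

const prompt = applyTemplate("Review this {{language}} code:\n{{code}}", {
  language: "python",
  code: "def example(): ...",
});
// prompt === "Review this python code:\ndef example(): ..."
```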
## Installation

### Quick Install

The simplest way to install is using the unified installer script:

```bash
./install.sh
```

The installer supports multiple installation modes:

```bash
# Install locally (default)
./install.sh --mode=local

# Install globally with npm
./install.sh --mode=npm

# Build and install as a Docker image
./install.sh --mode=docker --docker-user=yourusername

# Process raw prompts during installation
./install.sh --process-prompts

# Install in development mode
./install.sh --dev
```

For more options, run:

```bash
./install.sh --help
```
### Manual Installation

1. Clone this repository:

   ```bash
   git clone https://github.com/yourusername/mcp-prompts.git
   cd mcp-prompts
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Build the project:

   ```bash
   npm run build
   ```
### Docker

Build and run with Docker:

```bash
# Build the image
docker build -t yourusername/mcp-prompts .

# Run the container
docker run -it --rm yourusername/mcp-prompts
```

## Usage

Run the server with:

```bash
npm start
```
### Prompt Processing Pipeline

The easiest way to process, tag, and organize prompts is with the complete pipeline:

```bash
# Create your rawprompts.txt file first, then run:
npm run prompt:pipeline
```

This runs the complete process:

- Extracts prompts from the raw file and generates metadata
- Performs intelligent tagging based on content
- Organizes prompts into appropriate category directories
- Summarizes the results

Additional options:

```bash
# Preview without making changes
npm run prompt:pipeline:dry

# Show detailed output from each step
npm run prompt:pipeline:verbose

# Run the pipeline but keep the raw prompts file
npm run prompt:pipeline:keep
```
### Processing Raw Prompts

To process raw prompts from a file:

```bash
# First create a rawprompts.txt file with your prompts
npm run prompt:process
```

This will:

- Extract prompts from the raw file
- De-duplicate similar prompts
- Generate metadata, tags, and descriptions
- Export them in both JSON and Markdown formats
- Place them in the `processed_prompts` directory
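A processed prompt file might look like the following. The field names here are illustrative, not the server's guaranteed schema; inspect your generated files for the real structure:

```json
{
  "id": "code-review",
  "name": "Code Review",
  "content": "Review this {{language}} code: {{code}}",
  "isTemplate": true,
  "tags": ["development", "coding"],
  "description": "A template for requesting code reviews."
}
```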
Additional options include:

```bash
# Process prompts and create a backup of the raw file
npm run prompt:process:backup

# Process prompts without removing the raw file
npm run prompt:process:keep
```
### Organizing Prompts

To organize prompts into a structured directory hierarchy based on their tags:

```bash
# Organize prompts into category directories
npm run prompt:organize

# Preview what would be organized without making changes
npm run prompt:organize:dry

# Force reorganization (overwrite existing files)
npm run prompt:organize:force
```
This organizes prompts into categories like:

- `development/` - Programming, coding, and debugging prompts
- `analysis/` - Data analysis, research, and insights prompts
- `content/` - Translation and language prompts
- `planning/` - Future planning and decision-making prompts
- `productivity/` - Workflow and organization prompts
- `ai/` - General AI and language model prompts
- `templates/` - Reusable prompt templates
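Tag-based categorization of this kind can be sketched as a lookup from tags to category directories. The mapping table and function below are illustrative assumptions, not the server's actual rules:

```typescript
// Hypothetical tag-to-category mapping; the real organizer's rules may differ.
const CATEGORY_TAGS: Record<string, string[]> = {
  development: ["coding", "debugging", "programming"],
  analysis: ["data-analysis", "research", "insights"],
  content: ["translation", "language"],
  planning: ["planning", "decision-making"],
  productivity: ["workflow", "organization"],
  templates: ["template"],
};

// Pick the first category whose tag list overlaps the prompt's tags,
// falling back to a general "ai" bucket.
function categorize(tags: string[]): string {
  for (const [category, catTags] of Object.entries(CATEGORY_TAGS)) {
    if (tags.some((t) => catTags.includes(t))) return category;
  }
  return "ai";
}

console.log(categorize(["coding", "python"])); // "development"
console.log(categorize(["chat"])); // "ai"
```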
### Managing Tags

You can manage tags across all prompts with the tag management tool:

```bash
# List all unique tags and their usage statistics
npm run prompt:tags:list

# Add a tag to all prompts matching a search term
npm run prompt:tags:add ai-assistant "You are a"

# Remove a tag from all prompts
npm run prompt:tags:remove outdated-tag

# Rename a tag across all prompts
npm run prompt:tags:rename old-tag new-tag
```
### Exporting Prompts

You can export your prompts for sharing or backup in different formats:

```bash
# Export all prompts to a JSON file
npm run prompt:export

# Export as a ZIP archive with an organized folder structure
npm run prompt:export:zip

# Export as a Markdown documentation file
npm run prompt:export:md
```

Additional options:

```bash
# Export only prompts with specific tags
npm run prompt:export -- --tags=ai,coding

# Specify a custom output filename
npm run prompt:export -- --out=my-collection.zip
```

All exports are saved to the `exports/` directory with a timestamp in the filename.
### Importing Prompts

You can import prompts from various sources:

```bash
# Import from a previously exported JSON file
npm run prompt:import -- --source=exports/mcp-prompts-export.json

# Import from a directory containing JSON prompt files
npm run prompt:import -- --source=path/to/prompts/

# Import from a ZIP archive
npm run prompt:import -- --source=path/to/prompts.zip

# Preview what would be imported without making changes
npm run prompt:import:dry -- --source=path/to/prompts.zip

# Force import (overwrite existing prompts)
npm run prompt:import:force -- --source=path/to/prompts.zip
```

Imported prompts are automatically organized into the appropriate category directories based on their tags.
## Testing

The project includes a test client that can verify server functionality:

```bash
npm test
```

This connects to the MCP server and tests all of its functionality.
## Claude Desktop Integration

To integrate with Claude Desktop, add the following to your Claude Desktop configuration file:

```json
{
  "mcpServers": {
    "prompt-manager": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-prompts/build/index.js"]
    }
  }
}
```
## Available Tools

The server provides the following MCP tools:

- `add_prompt`: Add a new prompt to the collection
- `edit_prompt`: Edit an existing prompt
- `get_prompt`: Retrieve a prompt by ID
- `list_prompts`: List all prompts, optionally filtered by tags
- `apply_template`: Apply a template prompt with variable substitution
- `delete_prompt`: Delete a prompt from the collection
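For example, a new prompt might be added through the `add_prompt` tool. The argument fields shown here are illustrative; consult the tool's input schema for the exact shape:

```
use_mcp_tool({
  server_name: "prompt-manager",
  tool_name: "add_prompt",
  arguments: {
    // Illustrative fields; check the tool's input schema for the real ones.
    name: "Code Review",
    content: "Review this {{language}} code: {{code}}",
    tags: ["development", "coding"]
  }
});
```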
## Using Prompts with Claude

Prompts can be accessed through Claude using MCP tools or the standard MCP prompts protocol:

```
I need to review some code.

use_mcp_tool({
  server_name: "prompt-manager",
  tool_name: "apply_template",
  arguments: {
    id: "code-review",
    variables: {
      language: "python",
      code: "def example():\n    return 'Hello, World!'"
    }
  }
});
```
## Development

### DevContainer

The project includes a VS Code DevContainer configuration for consistent development environments. To use it:

1. Open the project in VS Code
2. When prompted, click "Reopen in Container"
3. VS Code will build the container and set up the environment
### Project Structure

The project has been consolidated to reduce file count and improve maintainability:

- `src/core/` - Core functionality consolidated into modular components
  - `index.ts` - Central type definitions and core functions
  - `prompt-management.ts` - Unified prompt management functionality
  - `test.ts` - Test utilities
- `bin/` - Command-line tools
  - `prompt-cli.js/ts` - Unified CLI for all prompt management tasks
  - `cli.js` - Main MCP server CLI
Prompts can be managed using the consolidated prompt-cli tool:

```bash
# Process raw prompts
npm run prompt:process

# Import prompts
npm run prompt:import -- --source=path/to/prompts

# Export prompts
npm run prompt:export -- --format=markdown

# Manage tags
npm run prompt:tags:list
npm run prompt:tags:add -- --tag=example --prompts=id1,id2
```
## PGAI Database Integration

The MCP Prompts Server now supports PostgreSQL-based storage using PGAI (PostgreSQL AI) for enhanced prompt management capabilities. For detailed setup instructions, see the PGAI Setup Guide.

Quick setup:

```bash
# Install PostgreSQL
sudo apt-get install postgresql postgresql-contrib

# Install the vector extension for embeddings
psql -U postgres -c 'CREATE EXTENSION vector;'

# Install the PGAI extension
psql -U postgres -c 'CREATE EXTENSION pgai;'

# Create a database for MCP Prompts
createdb -U postgres mcp_prompts

# Install the required Node.js dependencies
npm run install:deps
```
### Configuration

To use PGAI for prompt storage, update your configuration in `config/pgai.json`:

```json
{
  "server": {
    "port": 3000,
    "host": "localhost",
    "logLevel": "info"
  },
  "storage": {
    "type": "pgai",
    "options": {
      "connectionString": "postgresql://username:password@localhost:5432/mcp_prompts"
    }
  }
}
```
### Migration

The system provides tools to migrate prompts from file storage to the PGAI database:

```bash
# Preview migration without making changes
npm run pgai:migrate:dry

# Migrate selected prompts to PGAI
npm run pgai:migrate

# Migrate with a custom connection string
npm run pgai:migrate -- --connection=postgresql://username:password@localhost:5432/mcp_prompts
```
### Improved Prompt Collection

The system now includes a collection of high-quality, professionally crafted prompts designed for various specialized use cases:

```bash
# Migrate the improved prompts collection to PGAI
npm run pgai:migrate:improved

# Preview the improved prompts without migrating
npm run pgai:migrate:improved:dry
```

The improved prompts collection includes:
- Enhanced Code Review Assistant - Comprehensive code review with security focus
- Advanced Code Refactoring Assistant - Structured approach to code improvement
- Intelligent Debugging Assistant - Systematic problem diagnosis and resolution
- Comprehensive Data Analysis Assistant - Advanced analytics for complex datasets
- Advanced Content Analysis Assistant - Sophisticated content structure analysis
- System Architecture Designer - Professional software architecture planning
- Comprehensive Research Assistant - Research methodology and synthesis
- Topic Modeling Specialist - Hierarchical theme identification and organization
- Contextual Translation Assistant - Context-aware translation between languages
- Strategic Foresight Planner - Decision analysis and scenario planning
- Question Generation Specialist - Creating diverse, thought-provoking questions
- Follow-up Question Generator - Targeted questions for deeper conversations
### Testing the Integration

A test script is provided to verify the PGAI integration:

```bash
# Run PGAI integration tests
npm run pgai:test

# Test with a custom connection string
npm run pgai:test -- --connection=postgresql://username:password@localhost:5432/mcp_prompts
```
### Benefits

Using PGAI for prompt storage provides several advantages:

- **Semantic Search**: Find prompts by meaning rather than just keywords
- **Scalability**: Support for larger prompt collections
- **Database Features**: Transactions, concurrent access, and data integrity
- **AI Capabilities**: Leverage PostgreSQL AI features for advanced prompt management
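Semantic search typically works by comparing embedding vectors, commonly with cosine similarity (the measure behind pgvector's cosine-distance ranking). A self-contained sketch of the underlying comparison, independent of the database:

```typescript
// Cosine similarity between two embedding vectors: 1 means the vectors
// point the same way (semantically similar), 0 means they are orthogonal.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([1, 0], [2, 0])); // 1 (same direction)
console.log(cosineSimilarity([1, 0], [0, 1])); // 0 (orthogonal)
```

In PGAI/pgvector-backed storage, this comparison happens inside PostgreSQL, so the server can rank stored prompts by similarity to a query embedding without loading them all into memory.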
## License

MIT