# AI MCP Service
Oh-my-rime provides a knowledge base query service based on MCP (Model Context Protocol). With MCP, you can let AI editors (such as Cursor, Windsurf, VS Code, etc.) directly access the oh-my-rime documentation knowledge base, getting accurate configuration tutorials and usage guides during AI conversations.
In short: After configuring MCP, you can ask questions about oh-my-rime directly in your AI editor, and the AI will automatically query our knowledge base to answer you.
## What is MCP?
MCP (Model Context Protocol) is an open protocol proposed by Anthropic to standardize the interaction between AI models and external data sources and tools. Think of it as a "plugin system" for AI assistants — through MCP, AI can call external tools to retrieve real-time, accurate information instead of relying solely on training data.
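Under the hood, MCP messages are JSON-RPC 2.0, and tool invocations use the standard `tools/call` method. As a rough illustration (a sketch of the message shape, not a wire-accurate trace of this service), building such a request looks like:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request (MCP messages are JSON-RPC 2.0)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Ask the knowledge base a question, as an AI editor would on our behalf:
msg = make_tool_call(1, "query_oh-my-rime", {"query": "How to configure oh-my-rime"})
print(msg)
```

The editor handles this exchange for you; the sketch only shows what travels over the Streamable HTTP transport.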
## Service Information
| Item | Description |
|---|---|
| Endpoint | `https://www.mintimate.cc/mcp` |
| Transport | Streamable HTTP |
| Protocol Version | 2025-03-26 |
| Available Tools | `query_oh-my-rime` — semantic search of the oh-my-rime knowledge base<br>`get_download_links` — get client and config package download links<br>`get_schema_list` — get the supported input schema list<br>`get_author_info` — get author information |
## Configuration in AI Editors

### Cursor

Create a `.cursor/mcp.json` file in the project root directory, or add the entry in Cursor's global settings:

```json
{
  "mcpServers": {
    "oh-my-rime-knowledge": {
      "url": "https://www.mintimate.cc/mcp",
      "transport": "streamable-http"
    }
  }
}
```

### Windsurf
Add the following to Windsurf's MCP configuration:
```json
{
  "mcpServers": {
    "oh-my-rime-knowledge": {
      "serverUrl": "https://www.mintimate.cc/mcp",
      "transport": "streamable-http"
    }
  }
}
```

### VS Code (GitHub Copilot)
Create a `.vscode/mcp.json` file in the project root directory:

```json
{
  "servers": {
    "oh-my-rime-knowledge": {
      "type": "http",
      "url": "https://www.mintimate.cc/mcp"
    }
  }
}
```

### Claude Desktop
Add the following to the Claude Desktop configuration file (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\Claude\claude_desktop_config.json` on Windows):

```json
{
  "mcpServers": {
    "oh-my-rime-knowledge": {
      "url": "https://www.mintimate.cc/mcp",
      "transport": "streamable-http"
    }
  }
}
```

### Cherry Studio
In Cherry Studio's settings, go to the MCP Servers configuration page and add a new Streamable HTTP type server:
```json
{
  "mcpServers": {
    "oh-my-rime-knowledge": {
      "isActive": true,
      "name": "Oh My Rime Knowledge",
      "type": "streamable-http",
      "description": "Oh My Rime knowledge base query",
      "baseUrl": "https://www.mintimate.cc/mcp"
    }
  }
}
```

> **Note**
> Cherry Studio uses a different configuration format from the other clients: use `baseUrl` instead of `url` and `type` instead of `transport`, and note that the `name` and `type` fields are required.
> **Note**
> The configuration format may vary slightly across AI editors/clients; refer to your tool's official documentation. The examples above are for reference only. The key point is to set the MCP endpoint `https://www.mintimate.cc/mcp` in the corresponding MCP settings.
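The per-client variations boil down to different key names around the same endpoint. A sketch that condenses them (the `mcp_config` helper is hypothetical, but the key names come straight from the examples on this page):

```python
import json

ENDPOINT = "https://www.mintimate.cc/mcp"

def mcp_config(client: str) -> dict:
    """Emit the oh-my-rime MCP server entry in the shape each client expects.

    Key names mirror the examples above; the helper itself is only an
    illustration, not an official tool.
    """
    entry_by_client = {
        "cursor":   {"url": ENDPOINT, "transport": "streamable-http"},
        "windsurf": {"serverUrl": ENDPOINT, "transport": "streamable-http"},
        "vscode":   {"type": "http", "url": ENDPOINT},
        "claude":   {"url": ENDPOINT, "transport": "streamable-http"},
        "cherry":   {"isActive": True, "name": "Oh My Rime Knowledge",
                     "type": "streamable-http", "baseUrl": ENDPOINT},
    }
    # VS Code nests servers under "servers"; the others use "mcpServers".
    top_key = "servers" if client == "vscode" else "mcpServers"
    return {top_key: {"oh-my-rime-knowledge": entry_by_client[client]}}

print(json.dumps(mcp_config("cherry"), indent=2))
```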
## Usage
Once configured, you can directly ask questions about oh-my-rime in your AI conversation, and the AI will automatically choose the appropriate tool to answer. For example:
- "How do I install oh-my-rime on macOS?" → calls
query_oh-my-rime - "How to configure fuzzy pinyin in oh-my-rime?" → calls
query_oh-my-rime - "What input schemas are available?" → calls
get_schema_list - "Give me the download links" → calls
get_download_links - "Who made Oh My Rime?" → calls
get_author_info
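The routing above is done by the model itself, but it behaves roughly like a dispatch on the question's intent. A toy sketch (the keyword heuristics are invented purely for illustration):

```python
def pick_tool(question: str) -> str:
    """Toy router: map a question to one of the four MCP tools.

    Real assistants decide this with the model; these keyword
    heuristics are invented purely for illustration.
    """
    q = question.lower()
    if "download" in q:
        return "get_download_links"
    if "schema" in q:
        return "get_schema_list"
    if "who made" in q or "author" in q:
        return "get_author_info"
    return "query_oh-my-rime"  # default: search the knowledge base

print(pick_tool("Give me the download links"))   # → get_download_links
print(pick_tool("How do I install oh-my-rime?")) # → query_oh-my-rime
```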
## Tool Parameters
### `query_oh-my-rime` — Knowledge Base Semantic Search
| Parameter | Type | Required | Description |
|---|---|---|---|
| `query` | string | ✅ | Natural language query; supports both Chinese and English. e.g. `How to configure oh-my-rime` |
| `keyword` | string | ❌ | Keyword filter; separate multiple keywords with semicolons. e.g. `macOS;install;Rime` |
| `top_k` | number | ❌ | Maximum number of results to return (default: 5, range: 1–10) |
### `get_download_links` — Get Download Links
| Parameter | Type | Required | Description |
|---|---|---|---|
| `resource` | string | ❌ | Resource name. Options: `all`, `oh-my-rime`, `squirrel`, `weasel`, `fcitx5-rime`, `wanxiang-model`, `oh-my-rime-cli`. Default: `all` |
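Since `resource` accepts a fixed set of values, a caller could guard it like this (the helper is hypothetical; the option names are taken from the table above):

```python
# Option names per the get_download_links table above.
ALLOWED_RESOURCES = {
    "all", "oh-my-rime", "squirrel", "weasel",
    "fcitx5-rime", "wanxiang-model", "oh-my-rime-cli",
}

def download_args(resource: str = "all") -> dict:
    """Arguments for get_download_links, rejecting unknown resource names."""
    if resource not in ALLOWED_RESOURCES:
        raise ValueError(f"unknown resource: {resource!r}")
    return {"resource": resource}

print(download_args("squirrel"))  # → {'resource': 'squirrel'}
```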
### `get_schema_list` — Get Input Schema List
No parameters required. Returns all supported input schemas and their activation status.
### `get_author_info` — Get Author Info
No parameters required. Returns author Mintimate's profile, blog, social media links, and open-source project information.
> **Note**
> MCP tool invocations are handled automatically by the AI assistant. Just ask in natural language; there is no need to pass these parameters manually.
## Technical Implementation
This MCP service is deployed on EdgeOne Pages Cloud Functions, leveraging the vector semantic search capabilities of the CNB knowledge base. The Markdown content from the documentation repository is automatically vectorized and indexed, enabling high-precision semantic search.
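Vector semantic search, in miniature: documents and queries are embedded as vectors, and results are ranked by similarity (commonly cosine similarity). A toy sketch with hand-made 3-dimensional "embeddings"; the real service relies on the CNB knowledge base's own embedding and indexing:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 3-d "embeddings"; a real index uses a proper embedding model.
docs = {
    "install on macOS":   [0.9, 0.1, 0.0],
    "fuzzy pinyin setup": [0.1, 0.9, 0.2],
    "author profile":     [0.0, 0.2, 0.9],
}
query_vec = [0.8, 0.2, 0.1]  # pretend embedding of "how to install"

best = max(docs, key=lambda name: cosine(query_vec, docs[name]))
print(best)  # → install on macOS
```

The same idea scales up: the documentation chunks are vectorized once at index time, and each `query_oh-my-rime` call embeds the query and retrieves the nearest chunks.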
The overall architecture is as follows: