Lindow committed
Commit 119ae02 · 1 Parent(s): a9fc269
remove print statements
README.md CHANGED
@@ -1,6 +1,6 @@
 ---
 title: Common Core MCP
-emoji:
+emoji: π
 colorFrom: blue
 colorTo: red
 sdk: gradio
@@ -144,7 +144,7 @@ _Screenshots showing the Gradio interface, MCP client integration, and example r
 ### Prerequisites
 
 - Python 3.12+
-- Pinecone account with API key
+- Pinecone account with API key ([Get started with Pinecone](https://www.pinecone.io/))
 - Hugging Face account with token (for chat interface)
 - Common Standards Project API key (for downloading standards locally)
 
@@ -153,13 +153,21 @@ _Screenshots showing the Gradio interface, MCP client integration, and example r
 ```bash
 git clone <repository-url>
 cd common_core_mcp
+
+# Create and activate virtual environment
+python3.12 -m venv .venv
+source .venv/bin/activate  # On Windows: .venv\Scripts\activate
+
+# Install dependencies
 pip install -r requirements.txt
+
+# Copy environment file template
 cp .env.example .env
 ```
 
 Edit `.env` and set:
 
-- `PINECONE_API_KEY`: Your Pinecone API key
+- `PINECONE_API_KEY`: Your Pinecone API key ([Get started with Pinecone](https://www.pinecone.io/))
 - `PINECONE_INDEX_NAME`: Pinecone index name (default: `common-core-standards`)
 - `PINECONE_NAMESPACE`: Pinecone namespace (default: `standards`)
 - `HF_TOKEN`: Hugging Face token for chat interface
@@ -177,6 +185,12 @@ The Gradio interface runs at `http://localhost:7860`. The MCP server endpoint is
 
 Before the MCP server can return any results, you must set up your Pinecone database and load standards into it. This section guides you through the complete workflow using the tools CLI.
 
+> **Note**: Make sure your virtual environment is activated before running CLI commands:
+>
+> ```bash
+> source .venv/bin/activate  # On Windows: .venv\Scripts\activate
+> ```
+
 ### Step 1: Initialize Pinecone Index
 
 First, create and configure your Pinecone index:
@@ -368,7 +382,7 @@ Connect from Claude Desktop, Cursor, or other MCP clients.
 
 **MCP Server URL**:
 
-- Hugging Face Space: `https://
+- Hugging Face Space: `https://lindowxyz-common-core-mcp.hf.space/gradio_api/mcp/`
 - Local: `http://localhost:7860/gradio_api/mcp/`
 
 **Claude Desktop Configuration**:
@@ -385,7 +399,7 @@ Add:
 {
   "mcpServers": {
     "common-core": {
-      "url": "https://
+      "url": "https://lindowxyz-common-core-mcp.hf.space/gradio_api/mcp/"
     }
   }
 }
@@ -399,7 +413,7 @@ Edit your Cursor MCP config and add:
 {
   "mcpServers": {
     "common-core": {
-      "url": "https://
+      "url": "https://lindowxyz-common-core-mcp.hf.space/gradio_api/mcp/"
     }
   }
 }
@@ -419,23 +433,20 @@ The deployed Hugging Face Space connects to a pre-existing Pinecone database wit
 - `PINECONE_INDEX_NAME` - Index name with Wyoming standards
 - `PINECONE_NAMESPACE` - Namespace containing the standards
 - `HF_TOKEN` - Hugging Face token for chat interface
-5. The Space builds and deploys. The MCP server is available at `https://
+5. The Space builds and deploys. The MCP server is available at `https://lindowxyz-common-core-mcp.hf.space/gradio_api/mcp/`
 
 > **Note**: If you're deploying your own Space with different standards, follow the [Local Setup with Pinecone](#local-setup-with-pinecone) workflow to populate your Pinecone database before deploying.
 
 ## Team Information
 
-Built by [@
-
-> [!NOTE]
-> Update with your Hugging Face username or team member usernames if working in a team.
+Built by [@lindowXYZ](https://huggingface.co/lindowXYZ)
 
 ## Architecture
 
 Built with:
 
 - **Gradio 6.0+**: Web interface and MCP server functionality
-- **Pinecone**: Vector database for semantic search
+- **Pinecone**: Vector database for semantic search ([Pinecone](https://www.pinecone.io/))
 - **Hugging Face Inference API**: Chat interface with tool calling (Qwen/Qwen2.5-7B-Instruct via Together AI provider)
 - **Pydantic**: Data validation and settings management
 
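For context on the environment variables listed in the README hunk above: app.py (see the app.py diff below) loads them with python-dotenv, so the values in `.env` end up as ordinary environment variables. A minimal sketch of how they are typically read follows; the variable names and defaults come from the README, while the helper code itself is illustrative rather than the repository's actual settings module.

```python
# Illustrative only: shows how the .env values documented above are usually
# consumed after load_dotenv(). Variable names and defaults are taken from the
# README; the rest is an assumption, not the repository's actual code.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the working directory

PINECONE_API_KEY = os.getenv("PINECONE_API_KEY")  # required
PINECONE_INDEX_NAME = os.getenv("PINECONE_INDEX_NAME", "common-core-standards")
PINECONE_NAMESPACE = os.getenv("PINECONE_NAMESPACE", "standards")
HF_TOKEN = os.getenv("HF_TOKEN")  # chat interface only

if not PINECONE_API_KEY:
    raise RuntimeError("PINECONE_API_KEY is not set; the MCP server cannot query Pinecone")
```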
app.py CHANGED
@@ -2,11 +2,11 @@
 
 import os
 import json
-from typing import Any
 
 from dotenv import load_dotenv
 import gradio as gr
 from huggingface_hub import InferenceClient
+from loguru import logger
 
 # Load environment variables from .env file
 load_dotenv()
@@ -138,6 +138,7 @@ def find_relevant_standards(
     score (highest first). Each result includes the full standard content, metadata, and
     relevance score. On error, success is false and an error message describes the issue.
     """
+    logger.info(f"Finding relevant standards for activity: {activity}")
     # Handle empty string from dropdown (convert to None)
     if grade == "":
         grade = None
@@ -195,6 +196,7 @@ def get_standard_details(standard_id: str) -> str:
     This function does not raise exceptions. All errors are returned as JSON responses
     with success=false and appropriate error messages.
     """
+    logger.info(f"Getting standard details for standard ID: {standard_id}")
     return get_standard_details_impl(standard_id)
 
 
@@ -281,14 +283,12 @@ def chat_with_standards(message: str, history: list):
 
     # Execute the function
     if function_name == "find_relevant_standards":
-        print(f"Finding relevant standards for activity: {function_args.get('activity', '')}")
         result = find_relevant_standards_impl(
             activity=function_args.get("activity", ""),
             max_results=function_args.get("max_results", 5),
             grade=function_args.get("grade"),
         )
     elif function_name == "get_standard_details":
-        print(f"Getting standard details for standard ID: {function_args.get('standard_id', '')}")
         result = get_standard_details_impl(
             standard_id=function_args.get("standard_id", "")
         )
@@ -415,5 +415,7 @@ demo = gr.TabbedInterface(
 )
 
 if __name__ == "__main__":
+    logger.info("Starting Common Core MCP server")
     demo.launch(mcp_server=True)
+    logger.info("Common Core MCP server started")
 
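The change itself is a straightforward print-to-loguru migration: the diff adds `from loguru import logger` and replaces the two `print()` calls with `logger.info()` calls at the top of the underlying tool functions. A minimal standalone sketch of the same pattern is below; the explicit sink configuration is an assumption, since the diff only shows the import and the `logger.info()` calls.

```python
# Sketch of the print() -> loguru migration this commit applies.
# Only the import and the logger.info() calls appear in the diff;
# the sink setup and the stub function body are illustrative.
import sys

from loguru import logger

logger.remove()                       # drop loguru's default sink
logger.add(sys.stderr, level="INFO")  # log INFO and above to stderr


def find_relevant_standards(activity: str) -> list:
    # Before: print(f"Finding relevant standards for activity: {activity}")
    logger.info(f"Finding relevant standards for activity: {activity}")
    return []  # placeholder for the real Pinecone-backed search


if __name__ == "__main__":
    find_relevant_standards("example classroom activity")
```

Unlike bare `print()` calls, loguru records a timestamp and log level with each message by default, which keeps the server output structured without extra formatting code.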