---
title: FixMyEnv Agent
emoji: π
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: 6.0.1
app_file: app.py
pinned: false
license: mit
short_description: MCP for agents that plan your Python package upgrades
hf_oauth: true
hf_oauth_scopes:
- read-repos
tags:
- building-mcp-track-enterprise
- building-mcp-track-customer
- mcp-in-action-track-customer
- mcp-in-action-track-enterprise
---

# FixMyEnv: Package Upgrade Advisor
An AI-powered Gradio app (and MCP server) that analyzes your Python project, finds outdated or vulnerable dependencies, and recommends upgrades. Attach a `pyproject.toml` or `requirements.txt`, chat with the agent, and it will pull package data via GitHub MCP and run `uv` resolution to suggest safe versions (sketched below).
Demo video: https://www.youtube.com/watch?v=u1-gZqPu0R0
Social post: LinkedIn
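For a feel of the resolution step, here is a minimal sketch of how the `uv` check could be driven from Python; the exact invocation in this repo may differ, and it assumes `uv` is on your `PATH`:

```python
# Minimal sketch of the uv resolution step (assumed invocation, not the
# app's exact wiring). Requires uv to be installed and on PATH.
import subprocess

result = subprocess.run(
    ["uv", "pip", "compile", "requirements.txt"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # fully resolved, pinned dependency set
```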
## What you get
- Gradio chat UI with file uploads for dependency manifests.
- Smolagents-based reasoning backed by the Hugging Face Inference API (see the sketch after this list).
- GitHub MCP client for package metadata; `uv` for dependency resolution.
- Runs locally with your own tokens; can also be served from Hugging Face Spaces.
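As a rough illustration, a smolagents agent backed by the Inference API can be wired up like this. This is a hedged sketch, not the repo's actual agent code (which lives under `src/upgrade_advisor/agents/`); the model ID and provider mirror the `.env` defaults below:

```python
# Hedged sketch of the smolagents wiring (not the repo's exact code).
# Note: InferenceClientModel was named HfApiModel in older smolagents releases.
from smolagents import CodeAgent, InferenceClientModel

model = InferenceClientModel(
    model_id="Qwen/Qwen3-Next-80B-A3B-Thinking",  # AGENT_MODEL default
    provider="together",                          # HF_INFERENCE_PROVIDER default
)
agent = CodeAgent(tools=[], model=model)  # the real app also attaches GitHub MCP tools
print(agent.run("Is gradio pinned to its latest release here? gradio==4.44.0"))
```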
## Prerequisites
- Python 3.10+
- `git` and a virtual environment tool (`python -m venv` works fine)
- Hugging Face access token with Inference API rights (`HF_TOKEN`)
- GitHub Personal Access Token with public repo read scope (`GITHUB_PAT`)
- Optional: Podman or Docker if you want to run the GitHub MCP server locally instead of using the hosted Copilot MCP endpoint.
## Setup
- Clone and enter the repo:
  ```bash
  git clone <your-fork-url> upgrade-advisor
  cd upgrade-advisor
  ```
- Create and activate a virtual environment:
  ```bash
  python -m venv .venv
  source .venv/bin/activate
  ```
- Install dependencies (editable mode so local changes are picked up):
  ```bash
  pip install -e .
  ```
  Alternatively: `pip install -r requirements.txt`.
- Create a `.env` in the project root:
  ```bash
  GITHUB_PAT=ghp_********************************
  HF_TOKEN=hf_***********************************
  # Optional tweaks
  GITHUB_TOOLSETS="repos"  # or "default,discussions,experiments"
  GITHUB_READ_ONLY=1
  AGENT_MODEL=Qwen/Qwen3-Next-80B-A3B-Thinking
  HF_INFERENCE_PROVIDER=together
  GRADIO_SERVER_NAME=0.0.0.0
  GRADIO_SERVER_PORT=7860
  ```
  The app will warn on missing tokens but will not function fully without them (see the sketch below).
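A minimal sketch of that token check, assuming `python-dotenv` is used to read the file (the app's real checks may differ):

```python
# Sketch of .env handling (assumes python-dotenv is installed).
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the project root
for var in ("GITHUB_PAT", "HF_TOKEN"):
    if not os.getenv(var):
        print(f"Warning: {var} is not set; the agent will not function fully.")
```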
## Run the app
```bash
python app.py
```
- Gradio starts at `http://127.0.0.1:7860` by default.
- Sign in with your Hugging Face account when prompted (or rely on `HF_TOKEN`).
- Ask upgrade questions and optionally upload `pyproject.toml` or `requirements.txt`.
- Uploaded files are placed in `uploads/` for the session and cleaned up on exit (sketched below).
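The upload lifecycle in the last bullet could look roughly like this; names such as `stash_upload` are illustrative, not the repo's API:

```python
# Illustrative sketch of per-session upload handling with cleanup on exit.
import atexit
import shutil
from pathlib import Path

UPLOADS = Path("uploads")
UPLOADS.mkdir(exist_ok=True)
atexit.register(lambda: shutil.rmtree(UPLOADS, ignore_errors=True))  # cleaned up on exit

def stash_upload(tmp_path: str) -> Path:
    """Copy a Gradio temp upload (e.g. pyproject.toml) into the session folder."""
    dest = UPLOADS / Path(tmp_path).name
    shutil.copy(tmp_path, dest)
    return dest
```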
## Optional: run the GitHub MCP server locally
The app defaults to the hosted Copilot MCP endpoint. To use a local MCP server instead:
```bash
podman run -i --rm \
  -e GITHUB_PERSONAL_ACCESS_TOKEN=$GITHUB_PAT \
  -e GITHUB_READ_ONLY=1 \
  -e GITHUB_TOOLSETS="default" \
  ghcr.io/github/github-mcp-server
```
Update `app.py` to point to your local MCP server URL/transport if you take this route; a minimal connection sketch follows.
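For example, a stdio connection to the container via smolagents' MCP client might look like this. It is a sketch under assumed defaults, not the repo's exact code:

```python
# Hedged sketch: pointing the agent at a local github-mcp-server over stdio.
import os
from mcp import StdioServerParameters
from smolagents import MCPClient

server = StdioServerParameters(
    command="podman",
    args=["run", "-i", "--rm",
          "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
          "ghcr.io/github/github-mcp-server"],
    # Forward the token through the container's environment.
    env={**os.environ, "GITHUB_PERSONAL_ACCESS_TOKEN": os.environ["GITHUB_PAT"]},
)
with MCPClient(server) as tools:
    print([t.name for t in tools])  # GitHub tools exposed over MCP
```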
Read more about GitHub MCP server setup in the github/github-mcp-server repository: https://github.com/github/github-mcp-server
## Developing and testing
- Code lives in `src/upgrade_advisor/`; the Gradio entry point is `app.py`.
- Tooling and prompts for the agent are under `src/upgrade_advisor/agents/`.
- Basic sample dependency files are in `tests/`.
- Run checks (none yet by default): `pytest`. A starter test is sketched below.
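If you want a starting point, a self-contained test like this runs under `pytest`; the helper and file name are illustrative, not part of the repo:

```python
# tests/test_requirements_parsing.py — illustrative starter test, not shipped with the repo.
def parse_requirements(text: str) -> list[str]:
    """Extract package names from simple `name==version` requirement lines."""
    return [
        line.split("==")[0].strip()
        for line in text.splitlines()
        if line.strip() and not line.lstrip().startswith("#")
    ]

def test_parse_requirements():
    sample = "gradio==6.0.1\n# a comment\nsmolagents==1.9.0\n"
    assert parse_requirements(sample) == ["gradio", "smolagents"]
```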
## Troubleshooting
- Missing tokens: ensure `GITHUB_PAT` and `HF_TOKEN` are in `.env` or your shell.
- Model choice: set `AGENT_MODEL`/`CHAT_MODEL` if you want to swap the default Qwen model.
- Port conflicts: override `GRADIO_SERVER_PORT` in `.env`.