MCP server exposing local Ollama models via LiteLLM proxy to Claude Code. Tools: query_local_model, review_code, summarize, generate_boilerplate, list_models. Deployed to k8s ai-inference namespace via ArgoCD. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
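A minimal sketch of how one of these tools might be wired up, assuming the mcp SDK's FastMCP helper and an OpenAI-compatible LiteLLM proxy endpoint; the proxy URL, environment variable, default model name, and server name below are illustrative assumptions, not values taken from this repo:

# Sketch only: query_local_model forwarding a prompt to a LiteLLM proxy.
# LITELLM_BASE_URL, the service DNS name, and the default model are assumptions.
import os

import httpx
from mcp.server.fastmcp import FastMCP

LITELLM_BASE_URL = os.environ.get(
    "LITELLM_BASE_URL", "http://litellm.ai-inference.svc.cluster.local:4000"
)

mcp = FastMCP("local-ollama")


@mcp.tool()
async def query_local_model(prompt: str, model: str = "ollama/llama3.1") -> str:
    """Send a prompt to a local Ollama model through the LiteLLM proxy."""
    async with httpx.AsyncClient(timeout=120.0) as client:
        resp = await client.post(
            f"{LITELLM_BASE_URL}/v1/chat/completions",
            json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    mcp.run(transport="sse")  # SSE transport so Claude Code can connect over HTTP

The other tools named above (review_code, summarize, generate_boilerplate, list_models) would follow the same pattern: each an @mcp.tool()-decorated coroutine that builds a different prompt or hits a different proxy route. The pinned dependencies for such a server are listed below.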
mcp>=1.0.0
httpx>=0.27.0
starlette>=0.41.0
uvicorn[standard]>=0.32.0
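As a guess at why starlette and uvicorn are pinned alongside mcp and httpx: FastMCP's SSE transport is built on a Starlette ASGI app, which can be handed to uvicorn directly when a pod needs an explicit bind address and port. A rough, self-contained sketch under that assumption (server name, host, and port are illustrative):

# Sketch only: serving the MCP server's SSE transport with uvicorn.
import uvicorn
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("local-ollama")

# sse_app() returns a Starlette application exposing the SSE endpoints,
# which is what pulls starlette into the dependency list.
app = mcp.sse_app()

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8080)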