Initial commit: Ollama MCP server
MCP server exposing local Ollama models to Claude Code through a LiteLLM proxy. Tools: query_local_model, review_code, summarize, generate_boilerplate, list_models. Deployed to the k8s ai-inference namespace via ArgoCD.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
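As a rough illustration of the tool surface named above, here is a minimal sketch of how the query_local_model tool might be wired up, assuming the FastMCP helper from the mcp SDK and an OpenAI-compatible LiteLLM proxy; the LITELLM_URL environment variable, the in-cluster URL, and the default model name are illustrative assumptions, not taken from the repo:

```python
import os

import httpx
from mcp.server.fastmcp import FastMCP

# Assumed location of the LiteLLM proxy (hypothetical env var and in-cluster URL).
LITELLM_URL = os.environ.get("LITELLM_URL", "http://litellm.ai-inference.svc:4000")

mcp = FastMCP("ollama-local")


@mcp.tool()
async def query_local_model(prompt: str, model: str = "ollama/llama3") -> str:
    """Send a prompt to a local Ollama model through the LiteLLM proxy."""
    async with httpx.AsyncClient(timeout=120.0) as client:
        resp = await client.post(
            # LiteLLM exposes an OpenAI-compatible chat completions endpoint.
            f"{LITELLM_URL}/v1/chat/completions",
            json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # stdio transport by default; the deployed server presumably runs over HTTP/SSE instead.
    mcp.run()
```

The other tools (review_code, summarize, generate_boilerplate, list_models) would follow the same pattern of a decorated async function forwarding to the proxy.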
 requirements.txt | 4 ++++
 1 file changed, 4 insertions(+)
@@ -0,0 +1,4 @@
+mcp>=1.0.0
+httpx>=0.27.0
+starlette>=0.41.0
+uvicorn[standard]>=0.32.0
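The starlette and uvicorn[standard] dependencies suggest the server is exposed over an HTTP/SSE transport rather than stdio. A hedged sketch of what the serving entry point could look like, assuming the FastMCP instance from the sketch above lives in a module named server (hypothetical) and that the SDK's sse_app() Starlette adapter is used; host and port are placeholders:

```python
import uvicorn

from server import mcp  # the FastMCP instance defined above (assumed module name)

if __name__ == "__main__":
    # sse_app() returns an ASGI (Starlette) app that uvicorn can serve directly.
    uvicorn.run(mcp.sse_app(), host="0.0.0.0", port=8000)
```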