ollama-mcp/requirements.txt
ai_approver 139a038505
Initial commit: Ollama MCP server
MCP server exposing local Ollama models via LiteLLM proxy to Claude Code.
Tools: query_local_model, review_code, summarize, generate_boilerplate, list_models.
Deployed to k8s ai-inference namespace via ArgoCD.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-21 17:33:56 +00:00


mcp>=1.0.0
httpx>=0.27.0
starlette>=0.41.0
uvicorn[standard]>=0.32.0
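
The tools listed in the commit message (e.g. `query_local_model`) ultimately forward prompts to the LiteLLM proxy, which speaks the OpenAI-compatible chat completions API. A minimal sketch of that call is below, using only the standard library; the proxy URL, service DNS name, and model name are illustrative assumptions, not taken from the repo.

```python
# Sketch of the HTTP call behind a query_local_model-style tool,
# assuming the LiteLLM proxy exposes an OpenAI-compatible
# /v1/chat/completions endpoint. URL and model are hypothetical.
import json
import urllib.request

# Hypothetical in-cluster service address for the LiteLLM proxy.
LITELLM_URL = "http://litellm.ai-inference.svc.cluster.local:4000"


def build_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def query_local_model(model: str, prompt: str) -> str:
    """POST the prompt to the proxy and return the model's reply text."""
    req = urllib.request.Request(
        f"{LITELLM_URL}/v1/chat/completions",
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

In the real server, a function like this would be registered as an MCP tool (the `mcp` package in the requirements provides the server framework, with `starlette`/`uvicorn` serving the HTTP transport and `httpx` as the async HTTP client).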