Tagged: Ollama

2 posts

Claude Code with Local LLMs and ANTHROPIC_BASE_URL: Ollama, LM Studio, llama.cpp, vLLM

April 29, 2026 · 16 min read · guides
Run Claude Code against a local LLM via ANTHROPIC_BASE_URL. Native Anthropic-style endpoints for Ollama, LM Studio, llama.cpp, and vLLM, with a 32K context floor.
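The redirect the post describes boils down to environment variables. A minimal sketch, assuming a local server listening on localhost:11434 (Ollama's default port); the URL and token values here are illustrative, not taken from the post:

```shell
# Sketch: point Claude Code at a local Anthropic-compatible server
# instead of Anthropic's hosted API. Substitute the host/port your
# local server (Ollama, LM Studio, llama.cpp, vLLM) actually uses.
export ANTHROPIC_BASE_URL="http://localhost:11434"
# Local servers typically ignore the token, but the variable must be set.
export ANTHROPIC_AUTH_TOKEN="not-used-by-local-servers"
# Then launch Claude Code as usual, e.g.: claude
```

See the full post for per-server endpoint details and the reasoning behind the 32K context floor.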
Docker Compose AI ML Development Stack: Local LLM, Vector DB, Full YAML

March 20, 2026 · 11 min read · blog
A complete Docker Compose AI/ML development stack: Ollama, Qdrant, Postgres, Redis, and LiteLLM. Copy-paste YAML, GPU support, and first-run commands for a local AI dev environment. Download the free AI Automation Checklist.
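The shape of the stack that post walks through can be sketched in a few lines of Compose YAML. This is a minimal illustration only, covering two of the five services; image names and ports are assumptions based on the projects' defaults (Ollama on 11434, Qdrant on 6333), not an excerpt from the post:

```yaml
# Minimal sketch: local LLM + vector DB services (assumed defaults).
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"        # Ollama's default API port
    volumes:
      - ollama:/root/.ollama # persist pulled models across restarts
  qdrant:
    image: qdrant/qdrant
    ports:
      - "6333:6333"          # Qdrant's default HTTP port
volumes:
  ollama: {}
```

The full post adds Postgres, Redis, and LiteLLM on top, plus GPU passthrough and first-run commands.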