Best Ollama models/settings for an 8GB VPS (CPU only, ARM)? Running into memory & looping issues.

reddit-localllama · www.reddit.com · 1 pt · 2 replies · 1d

Hi everyone, I'm trying to run a local LLM via Ollama on a Hetzner cax21 VPS (ARM64, 4 vCPUs, 8GB RAM, 80GB SSD). I have Ollama running successfully via Coolify.
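For reference, something like the Modelfile below is what I'm planning to try. The model tag and parameter values are just my guesses for an 8GB CPU-only box, not something I've confirmed works: a small quantized model, a reduced context window to keep RAM usage down, a thread count matching the vCPUs, and a repeat penalty to fight the looping.

```
# Hypothetical Modelfile — model tag and values are guesses, untested
FROM qwen2.5:3b
PARAMETER num_ctx 2048         # smaller context window to reduce memory use
PARAMETER num_thread 4         # match the cax21's 4 vCPUs
PARAMETER repeat_penalty 1.15  # discourage repetition/looping in output
```

Then I'd build and run it with `ollama create small-cpu -f Modelfile` followed by `ollama run small-cpu`. Does this look like a sane starting point, or are there better knobs for ARM CPU inference?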

ollama · qwen · gemma

