I was impressed with how quick it was to get a local LLM up and running with Ollama. It was as simple as downloading the ...