> Ollama is a convenient tool for running Llama models locally, and running it as a background service on macOS can be quite beneficial for…
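One common way to run a program as a background service on macOS is a launchd agent. The sketch below is a minimal, hypothetical property list for starting `ollama serve` at login; the label `com.example.ollama` and the binary path `/usr/local/bin/ollama` are assumptions (adjust the path to wherever Ollama is installed), not part of the original post.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Unique identifier for this agent (hypothetical example label) -->
    <key>Label</key>
    <string>com.example.ollama</string>
    <!-- Command to run: the Ollama server (path is an assumption) -->
    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/bin/ollama</string>
        <string>serve</string>
    </array>
    <!-- Start at login and restart if the process exits -->
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
```

Saved as `~/Library/LaunchAgents/com.example.ollama.plist`, the agent can be loaded with `launchctl load` (or `launchctl bootstrap` on newer macOS versions), after which the Ollama API is reachable on its default port, 11434.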