Yes, I'm using Ollama with CORS enabled, and the LLM responds correctly to direct prompts (i.e., not going through Shakespeare) sent to the localhost:11434 URL from any local terminal.
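
For reference, this is roughly the kind of direct-prompt check I mean (a minimal sketch against Ollama's /api/generate endpoint; "llama3" is just a placeholder for whichever model is actually pulled locally):

```python
import json
import urllib.request

# Minimal direct-prompt check against the local Ollama API.
# "llama3" is a placeholder model name; substitute the model pulled locally.
payload = json.dumps({
    "model": "llama3",
    "prompt": "Say hello in one sentence.",
    "stream": False,
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["response"])  # the model's completion text
```

A request like that comes back with a normal completion every time, so the model and the server itself seem fine.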
