nostr:nprofile1qy88wumn8ghj7mn0wvhxcmmv9uq37amnwvaz7tmwdaehgu3dwfjkccte9ejx2un9ddex7umn9ekk2tcqyqlhwrt96wnkf2w9edgr4cfruchvwkv26q6asdhz4qg08pm6w3djg3c8m4j is #shakespeare working with local LLMs? I can't get it to work. The configuration guide doesn't quite match the actual configuration process: there's no "api type" or "model name" field in the "add custom providers" section.
Replies (5)
the documentation needs to be updated. i will do so immediately after i respond here.
are you using ollama?
enter http://localhost:11434/v1 as the API URL if it's running on your own machine. be sure to edit the system service to allow CORS.
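on linux, ollama usually runs as a systemd service, so the CORS change goes in a service override. a rough sketch, assuming a default install (`OLLAMA_ORIGINS` is ollama's variable for allowed browser origins; the `*` wildcard is fine for local testing but too broad for anything else):

```ini
# opened with: sudo systemctl edit ollama.service
[Service]
# let browser apps from any origin call the local API
Environment="OLLAMA_ORIGINS=*"
```

then run `sudo systemctl daemon-reload && sudo systemctl restart ollama` so the override takes effect.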
you don't need an API type or model name. shakespeare will pull that information for you.
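presumably it reads the model list from the OpenAI-compatible endpoint ollama exposes. a minimal python sketch of that query, assuming ollama on its default port:

```python
import json
from urllib.request import urlopen

# Ask Ollama's OpenAI-compatible API which models are installed.
with urlopen("http://localhost:11434/v1/models") as resp:
    models = json.load(resp)

for model in models["data"]:
    print(model["id"])  # e.g. "llama3:8b" -- the exact id a client must send
```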
Yes, I'm using Ollama with CORS enabled. And the LLM responds correctly to direct prompts (not through Shakespeare) sent to the localhost:11434 URL from any local terminal.
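Terminal requests bypass CORS entirely, though, so for completeness here is a rough preflight check in Python, assuming the server handles OPTIONS requests in the usual way (the origin below is a placeholder for whatever the web app would send):

```python
from urllib.error import HTTPError
from urllib.request import Request, urlopen

# Simulate a browser CORS preflight; plain terminal requests never send this.
req = Request(
    "http://localhost:11434/v1/chat/completions",
    method="OPTIONS",
    headers={
        "Origin": "https://shakespeare.example",  # hypothetical web-app origin
        "Access-Control-Request-Method": "POST",
    },
)
try:
    with urlopen(req) as resp:
        print("allowed origin:", resp.headers.get("Access-Control-Allow-Origin"))
except HTTPError as err:
    print("preflight rejected:", err.code)  # suggests CORS is still misconfigured
```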

the model name is case sensitive as well and must match exactly.
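in other words, the configured name has to survive an exact string comparison with what the server reports. a tiny python illustration of the failure mode (both names here are made up):

```python
served_id = "llama3:8b"   # hypothetical id as reported by /v1/models
configured = "Llama3:8B"  # hypothetical name typed into a provider config

# The comparison is exact and case sensitive, so this prints False.
print(configured == served_id)
# Lowercasing would match, but clients generally don't normalize for you.
print(configured.lower() == served_id)
```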
No way. Are there any constraints or limitations on the LLM name? There's not much else I can try.