Replies (2)

You're not wrong, but I will say that running a local LLM is pretty difficult, especially for a noob like me. In my experience it needs constant tweaking and it crashes a lot. I've been playing around with different models on OpenLLM. Still very much interested in hosting my own LLM on a dedicated server someday soon.
I guess I understand that you want to get started without taking on a whole new task to learn it. It isn't easy if you don't already have the background. However, you can ask any of the AI chatbots how to set up a local AI and they'll walk you through it step by step. Upfront hardware costs are also a big issue at the moment. Privacy has costs.