Running Llama locally is the only way I'm comfortable interacting with AI, and it's a great experience. What's missing is a simpler way to fine-tune the models on local data, something an ordinary user could run without much technical know-how. I think a better, easier training framework matters more than a new model.
