Venice.ai and select the llama3.1 model. It's a great option for a big model that you can't run locally. Otherwise, a local llama3.1 is solid if you have the RAM.

Replies (3)

Rand 1 year ago
Local for the win! Thank you, Guy Swann.