Yes, but it needs a lot of fine-tuning, and the GUI is missing for non-technical users. I already optimized the prompt so I get better results, and I think I get better results with Llama 3 8B than with Mistral.
My learning is that it is really simple to run a local LLM and use, for example, the Nostr API.
Next example follows… but I know these are basics, probably nothing new for most here 😅
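In the meantime, here is a minimal sketch of the "run a local LLM" part. It assumes the model is served locally by Ollama (the post doesn't name the runner, so the endpoint, the `llama3` model tag, and the prompt are all assumptions) and just sends one prompt and prints the reply:

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint (assumption)

def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    """Send a single prompt to the locally running model and return its reply."""
    response = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]

if __name__ == "__main__":
    # Example: summarize a note fetched elsewhere (e.g. from a Nostr relay)
    note_text = "..."  # placeholder: replace with the note content you fetched
    print(ask_local_llm(f"Summarize this Nostr note in one sentence:\n{note_text}"))
```

The Nostr side (fetching or publishing events via a relay) is left out here, since that depends on which client library is used.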
Replies (1)
I'm not a technical guy, so not really for me yet, I guess.