GPT-OSS is a good open-source model and runs very fast for a 120B-parameter model.
But it was clearly created from synthetic data, hallucinates a lot, and was designed NOT to take business away from OpenAI. For example, it's not as good at coding / agentic workflows as open-source Chinese models like Qwen3-coder. For that, they sell GPT-5 (which is best-in-class IMO).
It is good at plenty of tasks though, so it's worth downloading and having around if you need it.
Or, try @routstr and pay a few sats for an anonymous chat.
ABVStudio
"Qwen3 Coder 30B A3B" is one of the largest LLMs I run locally at 36GB! But, it's faster than models half its size!
The secret is the "A3B" in the name, which means only 3 of the 30 billion parameters are "active" per token. This allows it to be SMARTER than smaller models, while being FASTER at the same time. Plug it into VS Code with Cline's agentic coding extension and get ๐ back ๐ to ๐ vibin'! #vibecoding
Isn't there a way to get a free Lightning address with Cashu? @calle
your_npub@cashu.me? Is that right? #asknostr