Do you use Opus directly or through @Maple AI? Ideally we would also run a local model, but Maple AI seems like a good compromise, at least better than linking all your data to your Anthropic account. I also use Anthropic directly, but I think I'll try Opus through the Maple AI Max subscription next month. I was also looking into local models, but Kimi 2.5 seems to be a resource hog, at least nothing you can run on < $10,000 hardware AFAICS. Maybe GLM 4.7, but that doesn't seem to be near Opus and Kimi level. I tried sonnet-4.5 for a few days, but the results were not that great, not worth the wasted back and forth. Still, I hope that as these things get more efficient and smarter, we'll be able to run a decent model locally on consumer hardware.