Thread

Zero-JS Hypermedia Browser

Relays: 5

Replies (1)

This seems like a great use case for locally hosted smaller LLMs that don't require a resource-hungry data center. The big providers want to download the whole internet into their model and charge you for access (while also creating artificial demand that prices you out of buying your own compute, RAM, electricity, etc.), when you could instead give your own local model the immediate context (e.g. a PDF e-book or a web search result) and have the same conversation with it that you would have with ChatGPT.
2025-12-05 22:38:33 from 1 relay(s)
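The workflow the reply describes, handing a local model the immediate context yourself, can be sketched briefly. This is a minimal sketch, not anything from the thread: it assumes a local Ollama server on its default port, and the model name `llama3.2` is an assumption.

```python
import json
import urllib.request

def build_prompt(context: str, question: str) -> str:
    """Stuff user-supplied context (e.g. text extracted from a PDF
    e-book or a web search result) directly into the prompt, instead
    of relying on what a hosted model memorized from a web crawl."""
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

def ask_local_model(context: str, question: str,
                    model: str = "llama3.2",  # assumed model name
                    url: str = "http://localhost:11434/api/generate") -> str:
    """Send the prompt to a locally hosted model via the Ollama HTTP
    API's /api/generate endpoint (default local address assumed)."""
    payload = json.dumps({
        "model": model,
        "prompt": build_prompt(context, question),
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        url, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

The conversation then runs entirely on your own compute; the only thing you supply to the model at query time is the context you chose to give it.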