Above all, you should be learning to use #AI as an automated data analyst, not only to interrogate the corpus of human "knowledge" but also to get more done with less computer footwork.
Replies (1)
This seems like a great use case for locally hosted smaller LLMs that don't require a resource-hungry data center.
They want to download the whole internet into their model and charge you for access (while also creating artificial demand that prices you out of buying your own compute, RAM, electricity, etc.), when you could just give your own local model the immediate context (e.g. a PDF e-book, a web search result) and then have the same conversation with it that you would have with ChatGPT.
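A minimal sketch of the workflow the reply describes, assuming a locally hosted model served through Ollama (the `ollama` package, the `llama3.2` model name, and the `build_prompt` helper are illustrative assumptions, not a prescribed setup):

```python
# Sketch: answer questions against a local document with a locally hosted model.
# The ollama package and "llama3.2" model name are assumptions; any local
# runtime with a chat endpoint (llama.cpp, LM Studio, etc.) works the same way.

def build_prompt(question: str, context: str) -> str:
    """Stuff the immediate context (an e-book excerpt, a fetched page) into the prompt."""
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

# The "immediate context" could be text extracted from a PDF or a web search.
context = "Local models can be handed documents directly as conversation context."
prompt = build_prompt("Why use a local model?", context)

# With a local Ollama server running, the call would look like:
# import ollama
# reply = ollama.chat(model="llama3.2",
#                     messages=[{"role": "user", "content": prompt}])
# print(reply["message"]["content"])
```

The point is that the context stays on your machine: the model only ever sees the text you choose to paste into the prompt.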