i'm aware of this, but much of it runs on the same basic algorithms, proximity hashes and neural networks; GPT is just one way of applying them. what's most hilarious about the hype over GPT (nominally Generative Pre-trained Transformer) is that it's essentially generative predictive text: it spins off, on its own, a prediction of what might follow the prompt. anyone who has used a predictive text input system knows they're annoying as fuck because they don't actually predict anything very well, and certainly not your personal idiom
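to make the "predictive text" point concrete, here's a toy sketch of the idea at its simplest: a bigram frequency model that suggests the most common next word given the previous one. (GPT does this with a large neural network over tokens rather than word counts, but the interface, predict the next item from context, is the same; the corpus and names below are made up for illustration.)

```python
from collections import Counter, defaultdict

# Toy bigram "predictive text": count which word most often follows each word.
corpus = "the cat sat on the mat and the cat ran".split()

next_counts = defaultdict(Counter)
for prev, cur in zip(corpus, corpus[1:]):
    next_counts[prev][cur] += 1

def predict(word):
    """Return the most frequent follower of `word`, or None if unseen."""
    followers = next_counts.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(predict("the"))  # "cat" follows "the" twice, "mat" once -> "cat"
```

swap the toy corpus for your own message history and you get exactly the kind of phone-keyboard predictor being complained about here.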

Replies (2)

Also, much of it, in fact most of it, does not run on neural networks or other "black box" models. The most common uses of ML for a business or researcher are correlation, categorization, and recommendation systems, or some form of forecasting/predictive modeling (I've probably missed some). None of these requires a neural net, or anything resembling what we commonly call "AI".
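as a sketch of that point, two of those workhorse uses, correlation and trend forecasting, need nothing beyond plain least-squares statistics. the sample series below is invented for illustration:

```python
# Plain-statistics sketch (no neural net): Pearson correlation and a
# least-squares trend line used for a one-step-ahead forecast.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def linear_forecast(ys):
    """Fit y = a + b*t by least squares and predict the next point."""
    xs = list(range(len(ys)))
    n = len(ys)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a + b * n  # value at the next time step

sales = [10.0, 12.0, 13.0, 15.0, 16.0]   # made-up example data
print(pearson(list(range(5)), sales))     # close to 1: strong upward trend
print(linear_forecast(sales))             # next-period estimate
```

this is the kind of "predictive modeling" most businesses actually deploy: a transparent formula you can audit, not a black box.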
yeah, and much of it is calculus-based. i've done a lot of work with difficulty adjustment, which uses historical samples to produce close estimates of the current hashpower running on a network. i'm not a data scientist though; my area of specialisation is more about protocols and distributed consensus, and the latter (including spam prevention) tends to involve statistical analytical tools
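the difficulty-adjustment idea mentioned above can be sketched in a few lines. this is a Bitcoin-style simplification (the 2^32 hashes-per-difficulty constant and 600 s target are Bitcoin's; other chains differ, and real retargeting adds clamps and timestamp-skew rules that are omitted here):

```python
# Bitcoin-style sketch: estimate network hashpower from historical
# inter-block times, and retarget difficulty back toward the target rate.

HASHES_PER_DIFFICULTY = 2 ** 32  # expected hashes per unit of difficulty
TARGET_BLOCK_TIME = 600.0        # seconds (Bitcoin: 10 minutes)

def estimate_hashrate(difficulty, block_times):
    """Estimate network hashrate (hashes/sec) from observed block intervals."""
    avg = sum(block_times) / len(block_times)
    return difficulty * HASHES_PER_DIFFICULTY / avg

def retarget(difficulty, block_times):
    """Scale difficulty so the average block interval returns to target."""
    avg = sum(block_times) / len(block_times)
    return difficulty * TARGET_BLOCK_TIME / avg

# Blocks arriving in ~300 s instead of 600 s implies hashpower roughly
# doubled, so the retarget roughly doubles the difficulty.
times = [310.0, 295.0, 305.0, 290.0]  # made-up sample intervals
print(estimate_hashrate(1_000_000.0, times))
print(retarget(1_000_000.0, times))
```

the "historical samples" are just the recent block timestamps: a moving average of intervals is enough to back out hashpower, which is the statistical-estimation flavour of consensus work described above.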