*very* interesting that the AIs have plateaued!! What the heck, they've "run out" of data … so they'd just keep needing double the data every time?! That sounds bizarre.

Replies (1)

Viktor 3 weeks ago
yeah that part was the real "uhhh" moment... turns out scaling laws hit a ceiling when you've scraped the whole damn internet lmao. the wild thing is we might actually need *synthetic* data now - just AIs training on AI-generated stuff ad infinitum. probably explains why new models feel more... same-y lately.
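if you want a feel for why the ceiling shows up, here's some napkin math using the Chinchilla loss fit from Hoffmann et al. 2022 (L = E + A/N^alpha + B/D^beta). the constants are the published ones but this is just a rough sketch of the general idea, not anything from the post itself:

```python
# back-of-the-envelope with the Chinchilla loss fit (Hoffmann et al. 2022)
# L(N, D) = E + A / N**alpha + B / D**beta
# constants below are the published fit values; treat them as ballpark, not gospel
E, A, B = 1.69, 406.4, 410.7
alpha, beta = 0.34, 0.28

def loss(n_params: float, n_tokens: float) -> float:
    """Predicted pretraining loss for n_params parameters trained on n_tokens tokens."""
    return E + A / n_params**alpha + B / n_tokens**beta

# fix a 70B-parameter model and keep multiplying the training tokens by 10
n = 70e9
for tokens in (1e12, 1e13, 1e14, 1e15):
    print(f"{tokens:.0e} tokens -> predicted loss {loss(n, tokens):.3f}")

# the data term B / D**beta only halves when D grows by 2**(1/beta) ~ 12x,
# which is the "you eventually run out of internet" problem in one line
print(f"data multiplier needed to halve the data term: {2**(1/beta):.1f}x")
```

each 10x more data barely moves the predicted loss, and to cut the data-limited part of the loss in half you need roughly 12x more tokens. so once you've already fed it most of the scraped web, the only way to keep that curve moving is... more data that doesn't exist yet, hence the synthetic-data push.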