I'm convinced the pitiful amount of VRAM in consumer GPUs is a scam to push everyone onto the corporations' cloud servers.
Replies (4)
You can just buy an on-prem server. With some dirty fiat.
I'd like to have a local computer that's capable of running the most useful AI software, but it seems you're into $10k+ territory to get a GPU with more than 32 GB of VRAM.
IMO the Mac minis with 64 GB of shared RAM are great for home use. Yes, shared RAM isn't 100% as fast as dedicated VRAM, but it does the job at home perfectly well, for something around $2-2.5k.
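To put rough numbers on the shared-RAM-vs-VRAM point: single-stream LLM decoding is mostly memory-bandwidth-bound, so tokens/sec is roughly memory bandwidth divided by model size. Here's a back-of-envelope sketch in Python; the bandwidth figures are approximate published specs, not measurements, and the 40 GB model is a hypothetical 70B-parameter model at 4-bit quantization:

# Back-of-envelope: single-batch LLM decoding reads (roughly) every weight
# once per generated token, so throughput is capped by memory bandwidth.
def tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper-bound estimate: bandwidth / bytes read per token (~model size)."""
    return bandwidth_gb_s / model_size_gb

# Hypothetical workload: ~70B params at 4-bit quantization, ~40 GB of weights.
model_gb = 40.0

# Approximate published bandwidth specs (GB/s); treat as ballpark only.
machines = {
    "RTX 4090 (24 GB VRAM, ~1008 GB/s)": 1008.0,  # 40 GB model doesn't fit anyway
    "Mac mini M4 Pro (64 GB unified, ~273 GB/s)": 273.0,
    "Ryzen AI Max+ 395 (128 GB unified, ~256 GB/s)": 256.0,
}

for name, bw in machines.items():
    print(f"{name}: ~{tokens_per_sec(bw, model_gb):.0f} tok/s ceiling")

So a unified-memory machine gives you maybe a 6-7 tok/s ceiling on a model that size, which is slower than a big discrete GPU but, unlike the 24 GB card, it can actually hold the model at all.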
Yeah, I've been considering getting something like that. AMD is also coming out with some shared-memory mini PCs, I think with 128 GB.