mozz@mbin.grits.dev · 4 months ago

It is, though. Most of the computing a company does on behalf of its customers can be handled by a small handful of web servers, all the way up until you reach Google's scale of operations. The reason is that the actual computation per request is measured in milliseconds on one core's share of a multicore CPU. AI requires dedicated computing hardware and runs for much longer than that, so the investment in equipment, and how much of it you need, is orders of magnitude larger. And training the model often takes a whole cluster or data center if you're going to be a serious AI company. You go from needing 10-20 computers even at Reddit's scale or something, to needing hundreds or thousands.
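A quick back-of-envelope sketch of that gap, in Python. Every number in it is an assumption picked for illustration (the request rate, per-request times, core count), not a measurement:

```python
# Illustrative assumptions: ~50 ms of CPU time per web request,
# ~5 s of dedicated GPU time per AI response.

PEAK_REQUESTS_PER_SEC = 2_000   # assumed peak load for a large-ish site

# Ordinary web serving: each request ties up one core for ~50 ms,
# so one core handles ~20 req/s and a 32-core box handles ~640 req/s.
CPU_SECONDS_PER_REQUEST = 0.05
CORES_PER_SERVER = 32
req_per_server = CORES_PER_SERVER / CPU_SECONDS_PER_REQUEST
web_servers = PEAK_REQUESTS_PER_SEC / req_per_server

# AI inference: each response monopolizes a GPU for ~5 s, so one GPU
# handles ~0.2 req/s and the same load needs thousands of them.
GPU_SECONDS_PER_REQUEST = 5.0
gpus = PEAK_REQUESTS_PER_SEC * GPU_SECONDS_PER_REQUEST

print(f"web servers needed: ~{web_servers:.0f}")   # ~3
print(f"GPUs needed:       ~{gpus:,.0f}")          # ~10,000
```

The exact figures don't matter; the point is that per-request compute time jumps by roughly two orders of magnitude, on hardware that's also far more expensive per unit.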

You're right that it's not some sort of magic computation that's harder or more expensive than other computation. It's just that it's unusual (until now) to build out a whole data center devoted to doing expensive pure computation on specialized hardware on behalf of your customers, and that's gonna have an impact on how much power your operation consumes.
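And the power side of that, with the same caveat that every figure is an assumed round number (server and GPU wattages, and a rough 2x multiplier for cooling and host overhead):

```python
# Assumed figures: ~300 W per commodity web server, ~700 W per
# data-center GPU, doubled for host/cooling/networking overhead.

WEB_SERVER_WATTS = 300
GPU_WATTS = 700
OVERHEAD_FACTOR = 2.0  # assumed PUE-style multiplier

web_fleet_kw = 20 * WEB_SERVER_WATTS / 1_000                  # ~6 kW
gpu_fleet_kw = 10_000 * GPU_WATTS * OVERHEAD_FACTOR / 1_000   # ~14,000 kW

print(f"20 web servers: ~{web_fleet_kw:.0f} kW")
print(f"10,000 GPUs:    ~{gpu_fleet_kw:,.0f} kW (~{gpu_fleet_kw / 1_000:.0f} MW)")
```

Single-digit kilowatts versus megawatts is the whole story: one of those fits in a closet, the other needs its own substation.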