• brucethemoose@lemmy.world

    The energy cost of inference is overstated. Small models, or “sparse” models like DeepSeek, are not expensive to run. Training is a one-time cost that still pales in comparison to, like, making aluminum.

    Doubly so once more inference moves on-device.

    Basically, only Altman and his tech bro acolytes want AI to be cost-prohibitive, so he can have a monopoly. Also, he’s full of shit, and everyone in the industry knows it.

    AI as it’s implemented has plenty of enshittification, but the energy cost is kind of a red herring.