@dave the problem here is that the previous uses of data centre capacity haven’t gone away just because LLMs arrived, so the extra capacity for LLMs is additional, not a swap of one thing for another.
@dave and also, yes, so much slop, although I think there’s a more general argument to be had there about the amount of energy (both literal and mental) we expend as a society doing things that we could just… not, and no one would really care.
@jon And that's the crux of what I'm saying (though your point about AI compute going on top is valid). We need to talk about the resources we burn for compute, but it's a more general question.
@dave broadly I think I agree, but I do see LLMs as being particularly egregious given how much is required to train them, plus whatever is needed for inference, all to (on the whole) spew out endless pages of shit nobody needed to read in the first place, which is then compounded by people using them to summarise the shit that was expanded earlier.
@dave I think I’ve convinced myself here that the fundamental issue with LLMs is that they’re solving the wrong problem. Everyone is drowning in dross, and instead of stopping and asking whether we should be, we invented a machine that does lossy compression on it, sometimes resulting in entirely the wrong message being received.