The generalised resource-use argument against AI feels spurious to me. Our real problem is overall datacentre consumption, not what those datacentres are computing.
Datacentres consume massive resources regardless of workload, and I'm willing to hear arguments about that. Yes, GPU workloads need more cooling, but GPU compute is happening elsewhere, too.
Where I start to sympathise with the argument is when it gets specific.
When you're burning those resources to generate valueless slop - images, videos, music that add nothing to the world and represent no true artistic expression - that's indefensible. But the same is true of any compute spent that way.
Burning resources for nothing in return isn't a workload problem; it's a broader ethics problem. And as for the more general question of consumption, yes, there are conversations to be had, but dragging AI into them is a red herring.