Pro-extinctionism (in favor of some “greater consciousness” that spreads across the stars) is a nontrivial minority view among AI people, including some “AI safety” leaders.
One of the reasons that I’m slightly less worried about a climate apocalypse is that there isn’t an equivalent group of people that sees the “inevitability” and concludes that it must be a moral good for the planet to warm 5 degrees. I’d argue that multiple degrees of warming is more inevitable than paperclips, but there’s a serious global effort to mitigate and avoid it anyway!
Given the OP’s general disposition towards AI in other comments, I’m not convinced. But I’m happy to admit that, absent proof, I was being uncharitable; if so, my bad.