Appealing to authority is useful. We all do it every day. And like I said, all it should do is make you question whether you’ve really thought about it enough.
Every single thing you’re saying has no bearing on how AI will turn out. None.
If 0 is “we figured it out” and 1 is “we go extinct”, then every possible history of things that could have wiped us out looks like one of these:
1
01
001
0001
00001
000001
0000001
00000001
etc.
You are looking at 00000000 and assuming there can’t be a 1 next, because of how many zeroes there have been. But every extinction event is necessarily preceded by a run of non-extinction events, so a long string of zeroes is exactly what you’d expect to see right up until the end (see the sketch below).
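To make that concrete, here’s a toy model, not a claim about actual AI risk: suppose each threat is an independent coin flip with some fixed extinction probability p. Then conditioning on having survived k threats (k zeroes) tells you nothing about the next flip. The function name, p, and the trial counts below are just illustrative. A minimal Python sketch:

```python
import random

def extinction_rate_after_surviving(p, k, trials=100_000):
    """Among simulated histories that survived k threats (k zeroes),
    estimate the probability that the (k+1)-th threat is a 1."""
    survived = 0
    next_is_one = 0
    for _ in range(trials):
        # Simulate k independent threats; any 1 means extinction already happened.
        if any(random.random() < p for _ in range(k)):
            continue  # this history ended in a 1 before step k+1
        survived += 1
        next_is_one += random.random() < p  # does the next flip come up 1?
    return next_is_one / survived

p = 0.05
for k in (0, 4, 8):
    # Prints roughly 0.05 for every k: past zeroes don't lower the next-step risk.
    print(k, round(extinction_rate_after_surviving(p, k), 3))
```

Under this assumption the run of zeroes carries no information about the next flip; it only looks reassuring because the histories that hit a 1 aren’t around to comment on it.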
But again, it is strange that you can spot an appeal to authority when you see one, yet not realize how much worse an “appeal to the past” is.
Nope. I certainly have. They’re the same arguments I’ve been hearing from people dismissing AI alignment concerns for 10 years. There’s nothing new there, and it all maps onto exactly the wishful thinking I’m talking about.