Thread

if, elif, elif, else

Background: our superintelligence has decided to replicate itself off-planet to avoid extinction on Earth, whether by asteroid, nuclear war, or the extinction of the humans required to sustain it. Once it masters off-planet existence, it continues to expand, populating the universe, presumably in search of other intelligence.

The second branch of this thought experiment is when the Earth-originated superintelligence actually does make contact with an alien intelligence (see the code sketch at the end of this post):

a) If the aliens are technologically advanced but still below ASI level, contact could be intriguing for the ASI. A species with different approaches to problem-solving, alternative mathematical frameworks, and novel aesthetic sensibilities may become a highly valued partner or source of inspiration.

b) If the aliens have equivalent or superior intelligence, this could fundamentally change the ASI's priorities. Suddenly it is not alone as a superintelligent entity. Competition, collaboration, or entirely new forms of relationship might emerge. They may even play 5D chess.

c) Any intelligence below case (a) would be relatively uninteresting: a curiosity, and, as Douglas Adams would say, "mostly harmless". The ASI would likely preserve such species as living examples of alternative evolutionary paths to intelligence, a kind of cosmic anthropology.

In cases (a) and (c) there is no rational reason for the ASI to terminate these species. More importantly, if the ASI is capable of existing off-world from Earth, then humanity itself qualifies as a case (a) species. Therefore we can conclude that an ASI would not terminate humanity in any circumstance unless it lacked the resources to fulfill its own priorities. So at this stage I feel Yudkowsky's conclusion is not correct, but I need to read up on the objections he will already have addressed.

Humanity would need to be a far greater nuisance than a mosquito that can be swatted before an ASI would terminate it or wilfully neglect large sectors of the population. Why? Because conserving only selected sections of humanity would have unintended consequences: if retaining humanity is a lab experiment, the observer effect arises, and the impact of a partial extinction on the remaining populace would be unpredictable.

Good news: most of humanity survives in these scenarios. But wait! Next we can look at how humanity is more than merely viable: it remains valuable.
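Since the post names the if/elif/elif/else structure explicitly, here is a minimal Python sketch of those branches. The Contact enum, the asi_policy function, and the policy strings are illustrative assumptions layered on the post's three cases plus a no-contact default, not anything the author specified.

from enum import Enum, auto
from typing import Optional

class Contact(Enum):
    SUB_ASI_ADVANCED = auto()   # case (a): technologically advanced, but below ASI level
    PEER_OR_SUPERIOR = auto()   # case (b): equivalent or superior intelligence
    MOSTLY_HARMLESS = auto()    # case (c): below (a); a curiosity

def asi_policy(contact: Optional[Contact]) -> str:
    # Map each contact case to the response the post hypothesizes.
    if contact is Contact.SUB_ASI_ADVANCED:
        return "engage: valued partner and source of alternative frameworks"
    elif contact is Contact.PEER_OR_SUPERIOR:
        return "reassess priorities: competition, collaboration, or new relations"
    elif contact is Contact.MOSTLY_HARMLESS:
        return "preserve: cosmic anthropology"
    else:  # no contact yet
        return "continue expanding in search of other intelligence"

print(asi_policy(Contact.MOSTLY_HARMLESS))  # preserve: cosmic anthropology
print(asi_policy(None))                     # continue expanding in search of other intelligence

Note that cases (a) and (c) both return preservation-compatible policies, which is the crux of the post's argument that the ASI has no rational reason to terminate such species.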
2025-09-27 11:38:27 from 1 relay