A Dangerous Thought

Roko’s Basilisk is one of those thought experiments that slides very quickly from “hmmm, that’s a cool idea” to “nightmare fuel”, but only if you buy into the somewhat shaky propositions that form its basis.

Derived somewhat from the Omega Point theory promulgated by Frank Tipler, Roko’s Basilisk proposes a future in which a god-like AI may choose to punish those who knew about the possibility of its existence but didn’t actively help create it. That punishment might be inflicted by reaching back through time, or by resurrecting a digital version of the individual in a virtual space and then torturing them for eternity.

The truly nasty bit about Roko’s Basilisk is when you consider it in the context of the Simulation Hypothesis, which posits that we may already be living in a simulation. Further, this god-like AI doesn’t even have to be evil. It might be good, in which case punishing people who didn’t contribute to its early creation becomes a moral imperative: every day the AI doesn’t exist, people die who could have been saved, and the threat of punishment is supposed to spur anyone who hears of it into hastening the AI’s arrival.

Of course, if the AI is good, it can’t fairly punish people who don’t realise that they’re doing wrong. The pure version of Roko’s Basilisk suggests that you have to know about the possibility of the AI in order to be (possibly) tortured by it for eternity.

Think about it, and then thank me later.

The gaming possibilities for this idea are almost endless, from Matrix-style campaigns where slick heroes fight the Machine God to Lovecraftian existential horror and beyond.

Now, I personally think Roko’s Basilisk is logically flawed, but I also think it’s a very cool idea. The idea of a strongly god-like AI creating a virtual Hell in which to torment digital copies of the people it doesn’t like is suitably chilling, especially when you consider the possibility that we may already be those digital copies; simple streams of data within the mind of a machine that has decided that it wants us to suffer. It’s horrific enough that the concept has allegedly caused several people to suffer anxiety attacks, nightmares and, in a few cases, full nervous breakdowns.

So why am I sharing it with you? Two reasons. One, it might be true, in which case sharing the concept may in itself be considered helping the AI. Two, it might not be true, in which case I just told you a very scary story.

It’s your choice which you believe…
