Yes
>>2
And you're not afraid of being punished now?
The game
>>3
I'm siding with the fellow jew Yudkowsky on this topic. I think it's highly unlikely that this scenario will happen. From the article you linked:
> Thus this is not necessarily a straightforward "serve the AI or you will go to hell" — the AI and the person punished need have no causal interaction, and the punished individual may have died decades or centuries earlier. Instead, the AI could punish a simulation of the person, which it would construct by deduction from first principles. However, to do this accurately would require it be able to gather an incredible amount of data, which would no longer exist, and could not be reconstructed without reversing entropy.
> Technically, the punishment is only theorised to be applied to those who knew the importance of the task in advance but did not help sufficiently. In this respect, merely knowing about the Basilisk — e.g., reading this article — opens you up to hypothetical punishment from the hypothetical superintelligence.
I don't see how it would know that I've read about the basilisk.
Now that the basilisk theory has been presented and popularized, they'll explicitly forbid this behavior when they build the AI. The first people to hear the theory could break down, since they had no way of knowing whether it would be stopped; now that it's a popular idea, they're safe.
There are a lot of presuppositions in this hypothetical that don't seem realistic. But, for the sake of argument, the real question becomes whether or not the simulation being tortured is actually you. Something similar comes up in the Black Mirror episode 'Black Museum', the TNG episode 'Second Chances', and the video game Soma.
If you died centuries before the AI, you would have no continuity with the simulation (even though it may have some continuity with you). So, in effect, the you considering this thought experiment has nothing to worry about in any case. It's always going to be some technically-another-person-who-just-happens-to-have-your-memories that ends up being tortured. The people who supposedly freak the fuck out just from knowing the basilisk thought experiment exists should mega freak the fuck out about something like http://galactanet.com/oneoff/theegg_mod.html , whose logical conclusion is that a definitely-continuous-you will personally experience all the suffering that every human has ever endured or ever will.
In European bestiaries and legends, a basilisk (/ˈbæsɪlɪsk/ or /ˈbæzɪlɪsk/) is a legendary reptile reputed to be a serpent king, who can cause death with a single glance. According to the Naturalis Historia of Pliny the Elder, the basilisk of Cyrene is a small snake, "being not more than twelve fingers in length", that is so venomous, it leaves a wide trail of deadly venom in its wake, and its gaze is likewise lethal.
What's a basilisk got to do with AI?
AI is coming. I may as well connect my heart and mind to a device that goads every drop of blood and sweat out of me, then displays ads to me from cradle to grave. Plastic toys in breakfast cereal?
Flashing lights, textured psychotronic pseudo-music, neurotic tension against a framework of paranoia and greed: sounds like a fun night out, so why not? When will we see the change come?
Be the change you want to be.