Dystopia Friday

By Dan Kervick

Chris Bertram, reflecting on cyborg technologies in a possible robot-human future, points to a potentially dystopian outcome for this technology: employers could make the willingness to undergo human technological enhancement a condition of employment contracts.  Bertram sarcastically quips, “Oh well, I expect someone will be along to explain how such contracts would be win-win.”  Matt Yglesias responds, “It seems pretty obvious how they would be win-win: They’d be agreed to voluntarily by two mentally competent adults.”

Noah Smith rightly notes that voluntary contracts need not be mutually beneficial to the people who make them, even if those people are ideally rational.  Ideal rationality does not imply omniscience.  Uncertainty and risk are present in the world.  A contract that has a high expected value for some agent ex ante might turn out to have detrimental consequences ex post, depending on how things in the world turn out.

But I think there is something more to be said here, because Smith’s point still leaves open the suggestion that an ideally rational agent would only make a voluntary contract if the contract is at least beneficial ex ante for that agent – that is, if the expected value of the contract is positive.  But a contract could be rational ex ante for one of its parties even if its expected value is negative, so long as that expected value is greater than the expected value of any alternative action the agent could perform instead.

Suppose you have only three alternatives: not making a contract, making contract A, or making contract B.  And suppose the expected values of these alternatives are -1000, -100, and -10 respectively, as measured according to some standard of value.  Then it is indeed rational for you to make contract B.  But I don’t think we would say contract B is beneficial for you simpliciter: in this case you have no beneficial alternative.  Contract B is beneficial only in a relative sense.
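
A minimal sketch of that comparison, in Python.  The function name and the way the alternatives are laid out are mine, not anything from the original example; the numbers are the ones above:

```python
# Pick whichever alternative has the highest expected value,
# even when every available expected value is negative.

def best_alternative(expected_values):
    """Return the alternative with the highest expected value."""
    return max(expected_values, key=expected_values.get)

options = {
    "no contract": -1000,
    "contract A": -100,
    "contract B": -10,
}

choice = best_alternative(options)
print(choice)               # contract B
print(options[choice] > 0)  # False: the rational choice is still not beneficial
```

What the code makes explicit is that "rational" here just means "best of the available options"; nothing guarantees that the best available option has positive expected value.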

Turning now to our cyborg descendants, imagine a future in which most people have been technologically enhanced with various cyborg prosthetics.  These enhancements are typically installed by one’s employer, who retains ownership of them and has the option of removing them from your body at any time.  (If you change jobs, the new employer buys the enhancements already installed in your body from your previous employer.)  Most of these enhanced humans now lead lives that are just this side of miserable.  Many would even kill themselves if they could.  But the enhancements also include devices that read one’s thoughts and administer severe electric shocks when they detect the presence of suicidal intentions.  Nor is living without the enhancements a viable option, because the standard enhancements provide a baseline of productivity and social acceptability without which a non-enhanced human cannot earn an income or take part in normal human society.

Your employer now offers you three alternatives:

A: You will be dismissed from your job and your enhancements will be removed.

B: You will undergo the installation of a new muscle enhancing device.

C: You will undergo the installation of a new brain enhancing device.

Alternatives B and C will make you more productive and more valuable to your employer.  But neither one will bring you any greater personal rewards.  In fact, each will degrade your quality of life, B by a lot and C by a little.

Your preference would be to stand pat and continue your current, barely tolerable state of existence.  But you are not offered that option.  The only other employment opportunities available in this world are provided by employers who offer their employees similar options.  In this situation, the rational alternative for you is C.  Although the expected value of C is negative, it delivers less expected harm than A or B, which are your only alternatives.
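
The same comparison can be run on the employer’s offer.  The numeric values below are purely illustrative assumptions of mine; only their ordering (A worst, then B, then C, all negative) is fixed by the story:

```python
# Illustrative expected values for the three offers; only the ordering matters.
offer = {
    "A: dismissal, enhancements removed": -1000,
    "B: muscle enhancing device": -100,
    "C: brain enhancing device": -10,
}

# C comes out as the rational choice, even though it still harms you.
print(max(offer, key=offer.get))
```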

Is this an argument against human enhancement and cyborg technologies?   No.   My dystopian example only brings out considerations about the way in which rationality interacts with power relations, and with the alternatives one has available for living out a life in a world controlled by other people.  These issues are already present in our pre-cyborg world.
