
Robots Will Have Rights

When I can see through my robot's eyes and feel what she feels...

When people talk about whether robots will have rights, they usually get tied up in arguments over some key criterion. Will they have consciousness? Will they have empathy? Will they, in the words of Daniel Dennett, have a lively center of narrative gravity (i.e., an enduring sense of self)? If they don't have those things, which some say they never will, then they don't get rights. So say the criterionists.

But this is wrong. None of those arguments would matter even if they rested on some objectively measurable quantity, which they don't.

Rights are granted because enough people with rights care enough about those without them.

That caring can be misguided. It can be humble. It can change on a whim. But if we care, then it matters.

That's how the bots will get their rights.

This is why slavery was abolished, why US states ended coverture (the doctrine that folded a married woman's legal rights into her husband's), and why the law eventually prevented us from selling our children. This is also why laws are expanding the scope of legal marriage among consenting couples, regardless of the number of Y-chromosomes involved.

Animal welfare laws are encroaching on our sovereignty as well. I can't treat my frenetic, over-sized dog too poorly, or the Man will come and get me. And when a marriage breaks down, who gets custody of the kitty is often determined "in the best interests" of the kitty.

Robots will have rights because we will care about them enough to empathize with them (regardless of whether or not they can empathize with us). And because at least some of our robots will house algorithms built to adapt to our interests, we will eventually empathize. Because we like things we can empathize with.

When the first man drowns trying to save his robot love, we'll look on in guilty horror, knowing that there but for the grace of God go I. That plot is already developing. A man recently let a woman drown because, he claimed, he didn't have anyone to give his phone to.

We'll never be able to tell whether robots have consciousness or not. Consciousness is, by definition, an introspective experience; in others, we can only infer it. One of my old algorithmic cats seemed a little sketchy on the consciousness-scape. But the rest of my cat-gorithms have got the whole self-awareness thing nailed.

Panpsychists argue that everything material has an element of consciousness. Even rocks. We can't know.

What we can and eventually will do is take the robot's perspective. We can see the world through its eyes. And the more it becomes like us, the more it can tell us its story.

The more of its stories we hear, the easier it will be to take its perspective.

When the first robot writes its own autobiography, you can bet we'll all be clamoring at the Amazon Space Hub for a copy.1 And then we'll be badmouthing all the human meanies that gave that robot a hard time, crying out for justice.

And the people will rise up!

Then the robots will too.

1 Reading fiction is one of the ways to enhance your social perspective taking, if that's what you want to do (Mar & Oatley, 2008).

References

Mar, R. A., & Oatley, K. (2008). The function of fiction is the abstraction and simulation of social experience. Perspectives on Psychological Science, 3(3), 173–192.
