Antisubstratism for PlayStation 4?

by a friend
Published: 9 Nov. 2013

"Other things being equal, conscious beings of equivalent sentience deserve equal care and respect." --David Pearce

If conscious machines were created, how should we treat them?

In his 2003 book Being No One, philosopher Thomas Metzinger envisions the lack of care that might exist for sentient robots:

What would you say if someone came along and said, "Hey, we want to genetically engineer mentally retarded human infants! For reasons of scientific progress we need infants with certain cognitive and emotional deficits in order to study their postnatal psychological development — we urgently need some funding for this important and innovative kind of research!" You would certainly think this was not only an absurd and appalling but also a dangerous idea. It would hopefully not pass any ethics committee in the democratic world. However, what today's ethics committees don't see is how the first machines satisfying a minimally sufficient set of constraints for conscious experience could be just like such mentally retarded infants. They would suffer from all kinds of functional and representational deficits too. But they would now also subjectively experience those deficits. In addition, they would have no political lobby — no representatives in any ethics committee.

Is the substrate, the material that someone is made from, a morally relevant factor? Suffering is suffering, and it wouldn’t make sense to care less about the suffering of artificial organisms just because they are artificial. Wouldn't a lack of respect for conscious beings who are made of silicon be just as arbitrary and immoral as discriminating against others based on their sex or race? This would be “substratism”: discrimination based on the substrate that one is made of.

Still, maybe we don’t need to be very worried about future conscious machines being mistreated. Once it’s recognised that they are conscious they’ll be treated well, right?

Maybe not.

A robot that can talk and is generally quite similar to humans probably would be respected, but a simulated organism that is very dissimilar from humans might attract much less compassion. We only have to look to how we currently treat non-human animals to see how easily our empathy for conscious beings who aren’t quite like us can disappear.

David Cage, founder of Quantic Dream, the makers of Beyond: Two Souls and Heavy Rain (a critically acclaimed success that sold over two and a half million copies), was influenced by futurist Ray Kurzweil's "The Singularity Is Near" and imagines a different (and perhaps unlikely) future:

Reading through the comments on YouTube, you can see that viewers are powerfully affected by the video (a further argument that human-esque conscious robots would quickly be considered morally relevant?), and with well over one million views, Quantic Dream have probably put the idea of morally relevant machines securely in a few more biological brains. But that's not all: Quantic Dream have registered the domain http://singularityps4.com/, suggesting that a future game might involve similar themes.