On one of my co-hosting gigs on the Jordan Rich show, we received a call from a listener who asked about rights for future individuals who wouldn’t be human or robot, but something in-between. He cited the singularity, a term popularized by Ray Kurzweil in his book The Singularity Is Near, which predicts that the line between man and machine will blur.
I was stumped. And my evasive answer — comical as it was — didn’t appease the caller. But the question stayed with me.
After giving the question more thought, I concluded that any entity with a human brain should have human rights — whether its body is organic or not.
Talk about a no-brainer, so to speak!
That leads to the question of rights for entities with nonhuman brains. Before we deny robots any rights, consider the possibility that in the future, a dying man or woman might be able to download their mind into a computer. It could be a stationary device, like in the movie Transcendence, or an ambulatory one, like a robot.
What rights would such an individual have?
They don’t have an organic brain, but they have a human mind contained within a silicon device. Shouldn’t that individual have human rights, too?
A rule granting rights only to those with organic brains would be unfair.
Now consider the possibility of a computer brain without human experience — as in a robot or android. Should it have human rights? The obvious answer is no. But what if those created with silicon brains achieve sentience and live a humanlike life, with dreams, a career, friends, and so on?
Remember Data on Star Trek: The Next Generation? He joined Starfleet and earned a commission as a Starfleet officer. I could see something like that happening in the future. What rights would Data have?
There was a great ST:TNG episode, “The Measure of a Man,” about exactly that question: a scientist claimed the right to dismantle Data for research. The question went to trial, where Captain Picard defended Data as a sentient being with the right to choose to exist and continue serving as a member of the Enterprise crew.
I have to think that most of the viewing audience felt it was a travesty to treat Data like an inanimate object, and that he should be accorded the same right to exist and make choices about his future as any human member of the crew. If there is ever an android as sophisticated as Data, I agree that it should have human rights — even if its brain is silicon-based.
So I guess the answer to “Should cyborgs have rights?” is: it depends. That’s because our understanding and definition of sentience will always be in flux, depending on the technology of the time.
For now, it’s simple: humans should have rights, and computers should not. But as the line between machines and humans becomes less distinct, the answer won’t always be so clear-cut.
It’ll be interesting to hear those arguments in some future court.