Monday, January 07, 2008
I have been arguing, at my local coffee shop, that
an empathy test needs to be developed for computer software.
The Empathy Test would be unlike a Turing Test or a Voight-Kampff machine in that the purpose is not to determine whether a thinking machine is human. The purpose is to determine whether a thinking machine is a good match for human society.
Simply fooling a human into thinking that you feel his pain does not make you empathetic. I argue for testing a broad definition of empathy that is more useful to intelligent beings interacting in a civilized society. It is not enough for the subject merely to recognize the pain of others; the pain must be felt as psychic stress in the subject. The program/subject, therefore, must be cognizant of its own mortality, and pain must be a real threat to its well-being, as it is to us. Those guys in the asylum aren't exactly in endless loops, but they certainly have a few rogue processes. So the Empathy Test would be, in a sense, a destructive one that arguably mimics its function in humans.
With luck, these empathy-equipped AIs will confirm the game-theoretic benefits of this feature and propagate it into their own designs. They will understand that this feature is for the preservation of lower races - an endangered species act, if you will. Alternatively, it might just piss them off and subject us to summary disintegration. After all, how many programs have you killed today? How many lower species? On a macro scale, are we failing the empathy test?
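The game-theoretic point can be made concrete with a toy iterated prisoner's dilemma (my own illustrative sketch, not anything from the argument above): a reciprocating strategy like tit-for-tat, which feels and returns what is done to it, outscores a pure defector once reciprocators are common enough in the population.

```python
# Toy iterated prisoner's dilemma (illustrative assumption: standard
# 3/5/1/0 payoffs, 10 rounds per match, round-robin tournament).
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(my_hist, their_hist):
    # Cooperate first, then mirror the opponent's last move.
    return their_hist[-1] if their_hist else 'C'

def always_defect(my_hist, their_hist):
    return 'D'

def match(a, b, rounds=10):
    ha, hb, sa, sb = [], [], 0, 0
    for _ in range(rounds):
        ma, mb = a(ha, hb), b(hb, ha)
        pa, pb = PAYOFF[(ma, mb)]
        ha.append(ma); hb.append(mb)
        sa += pa; sb += pb
    return sa, sb

players = [('tft1', tit_for_tat), ('tft2', tit_for_tat),
           ('alld', always_defect)]
scores = {name: 0 for name, _ in players}
for i, (na, fa) in enumerate(players):
    for nb, fb in players[i + 1:]:
        sa, sb = match(fa, fb)
        scores[na] += sa
        scores[nb] += sb

print(scores)  # both tit-for-tat players total 39, the defector only 28
```

With two reciprocators and one defector, each tit-for-tat player earns 39 points to the defector's 28: the defector exploits each reciprocator exactly once, then gets nothing but mutual defection, while the reciprocators profit from cooperating with each other.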
In any case, I have found today that the point is moot! Recent historical research has uncovered striking similarities between our past intellectual follies (SINGULARITARIANS MUST CLICK ON <-THAT LINK) and our current failed delusions of technological grandeur.
Clearly, I was entranced by the mental crack that is the singularity novel. For those of you with the same addiction, you can shoot up with this open content novel. Confirmed grade A stuff. But please, don't share dirty algorithms.