Robolove – Robot Machines as Companions

Robot companions are being used in Japan and the US for elderly patients in nursing homes. They take advantage of our innate tendency to develop affection for things that are cute and appear to respond positively to us.

Paro is a robot modeled after a baby harp seal. It trills and paddles when petted, blinks when the lights go up, opens its eyes at loud noises and yelps when handled roughly or held upside down. Two microprocessors under its artificial white fur adjust its behavior based on information from dozens of hidden sensors that monitor sound, light, temperature and touch. It perks up at the sound of its name, praise and, over time, the words it hears frequently.

But this raises the question of whether such companionship is a form of deception. Should we be giving these patients robot companions, or real ones? What do you think?

via www.nytimes.com

3 Comments on this post

  1. Is deception always wrong? If so, why? If not, when is it OK? I’m going to take a utilitarian view on this and say that deception is only “wrong” to the extent that it is a form of corruption: it erodes our faith in each other’s sincerity. That’s important, however, so we need to take a dim view of deception in general, approving of it only in cases where (i) the net direct benefit is sufficiently clear, and (ii) the actual corrupting effect of the deception is small or nonexistent. Both conditions seem to be satisfied in this case.

    The question of whether this actually is a form of deception is, in my opinion, marginal. There does seem to be an element of deception, but it’s not as if we are actually telling the person concerned that we’ve presented him/her with a sentient being. Do we deceive children when we give them teddy bears? If so, the corrupting influence would seem to be a more serious concern in this case…

  2. Maybe the rot set in way back, when people stopped listening to storytellers and began reading books. Since then it’s been technology wins all the way, with ever more alienation from personally experienced reality. So the ethical issues are really deep.

  3. In some ways I find this a more interesting consideration than worrying about whether it is a form of deception. I don’t think I’m as anti-technology as Jerome appears to be, and I’m not sure that technology-enhanced reality is any less “personally experienced” than other types of reality, but I do agree that a side effect of technology is that it tends to alienate us from more natural (including inter-personal) experiences. Getting to grips with this is, in my view, a really important challenge for humanity.
