Radiolab

Furbidden Knowledge

Furby with Tim Howard (Lynn Levy/WNYC)

In 1999, Freedom Baird was in grad school, and Furbies--those furry little robot toys that talk to you and tell you to play with them--were all the rage. So Freedom, who was thinking about becoming a mom someday, decided to get a little practice by adopting two gerbils and one Furby. And that led to a chance discovery...and an idea for an experiment that Freedom believed could serve as a kind of emotional Turing test, a way to ask whether machines are more alive than dolls.

In order to test Freedom's idea, we gathered up a Barbie, a hamster named Gerbie, and a Furby. Then, we invited five brave kids into the studio: Taro Higashi Zimmerman, Luisa Tripoli-Krasnow, Sadie Kathryn McGearey, Olivia Tate McGearey, Lila Cipolla, and Turin Cipolla.

We ran our results by Caleb Chung, the man who created Furby. And according to Caleb, the reason Furby gets under our skin is simple...but Jad and Robert aren't ready to buy his explanation. Sherry Turkle returns to help us think about what's going on.

Gerbie the hamster

Gerbie the hamster, who lives happily with Lila and Turin and has been renamed June.

Guests:

Freedom Baird, Caleb Chung and Sherry Turkle

Comments [19]

Brooke

Thanks for sharing your thoughts about Radiolab. Regards Brooke

Mar. 07 2013 07:39 AM
hunter

I'm a kitty like a boss

Oct. 17 2011 10:38 PM
James Latane

For anyone interested in the idea of robots becoming humanlike, I recommend the short story "Helen O'Loy" by Lester Del Rey. It's a science fiction story written in 1938, and can be found in "The Science Fiction Hall of Fame". It really makes you think about what makes humans different from robots, if anything.

Sep. 30 2011 01:37 AM
Angela from Columbia, MO

Star Trek on the very subtle differences between how artificial intelligence *feels* and the way flesh-and-blood humans do:

"As I experience certain sensory input patterns my mental pathways become accustomed to them. The inputs eventually are anticipated and even 'missed' when absent." -- Troi, to Riker, re: Data's definition of friendship

Sep. 12 2011 01:38 PM
Irina

This segment makes me think of a show from Being from American Public Media. The guest was recounting an experience with her young daughter (or granddaughter, I don't remember exactly the relationship or the age) where her daughter commented that a robot turtle - vs. a still (unmoving) real living turtle - would be "alive enough" to satisfy her experience of the turtle...and now I'm listening further to this Radiolab segment and researching the Being show I'm talking about and Sherry Turkle was the guest on that show as she is on this one. So, there you are.

Aug. 31 2011 05:29 PM
Greg Jackson from Australia

When my daughter was about five years younger, she chose a Furby, buying it with her own birthday money. She was aware of what the toy did, but when she began playing with it, the Furby really started to freak her out. She put it aside for quite a long time and went back to it, but still did not take to the toy. She still has her Furby, which sits on her well-used bookshelf.

Jul. 06 2011 05:30 AM

In the philosophical argument towards the end of the Furbie segment, I side with Caleb for the same reasons. I don't like how Jad explained the deal though, so here are the aforementioned reasons in my own words:
First, think about computers. Silicon chips on circuit boards sending pulses of electricity through wires and minuscule transistors. Then, think about brains. Throbbing clumps of tissue sending chemical and electrical signals through intricate webs of neurons. The human brain may be more complicated than a computer, but silicon or carbon, they both run on the same principle: manipulating matter and energy to solve problems and perform tasks. How, then, can one be more limited? As much as we might think (or wish) so, emotions, feelings, personalities and thoughts are not some mystical substance or entity that can only emerge out of an authentic and "truly alive" mind. Emotions and thoughts are simply complex chemical and electrical signals transmitted throughout the neurons in our heads. Computers can do that, no matter how complex. He's not saying that "If it's simulated well enough, it's something like love," but rather that if it's simulated well enough, if you think about it logically and scientifically, it is undeniably no different from love.

Jun. 26 2011 02:35 PM
Bebarce El-Tayib from Clifton, NJ

My 1 year old daughter will push over her Elmo Live. When he falls over he says "Uh oh, Elmo fell down. Can you help elmo up please?" When she picks him up he says "Aw, thank you. You're Elmo's best friend."

Now this toy has numerous interactions you can make with it, from touching its tummy, foot, head, or back. And so on. But the one she likes best, is pushing Elmo over so that he calls for help.

Soon after she would start to pretend to fall, or pretend to have her foot stuck under the couch. We would have to come and help her up, or pull her foot out.

It seems like she is learning that at times, we are unable to help ourselves, and must rely on others. While that seems like something that is inherently known, much like Clever-bot, we all start pretty close to Zero.

What's fascinating, is that she is coming to understand a human emotion, by mimicking a robot.

Jun. 23 2011 12:06 PM
Corey from La Jolla, CA

I'm with Dave.

Consider the definition of psychopathy. It is a personality disorder characterized by an abnormal lack of empathy. A highly functional psychopath could portray such emotions while never really feeling or understanding them.

Like..a....(gasp!!) FURBY!!!

Sorry for the dramatic example, but I just listened to "The Psychopath Test" on TAL.

Jun. 16 2011 07:10 PM
dave

The furby guy was wrong.

Just as I can say I am angry, without actually being angry, Furby is saying it is scared, without actually being scared.

There is a difference between acting out the outputs of a feeling and actually having the feeling.

Acting out the feeling does not mean you have the feeling.

Jun. 16 2011 05:08 PM
thomas from California

People often underestimate the complexity of even the simplest life forms. Multi-celled life forms are mind-bogglingly complex: try to imagine a machine that can grow and physically adapt to its immediate environment, and make copies of itself by interacting with other individuals of its type. The offspring mutate and further adapt - and this is a tiny fraction of the enormous complexities robots cannot come close to (yet).
Nevertheless, I am waiting for my Furby to arrive from eBay, and I'm trying to get a chat bot working (ProgramE). Yeah, I have thought a lot about this and even created an A-Life simulator in Java:
http://thomasfonseca.com/Welcome_files/Environment.html

Jun. 15 2011 11:51 AM
Trish from Canada

The Furby experiment bugs me a little. They needed to switch up the order to get a real read on the time spent upside down. I think making them hold Gerbie after the Furby would have affected the times.

Jun. 15 2011 09:43 AM
Giacomo Cerlesi from Milan, Italy

Listening to this episode made me think about how lifeless things can really influence the features that characterize us as humans, such as feelings and emotions. I'm not saying that a toy is ''alive'' and has a sort of power over us, but as long as it modifies our mental structures, even for a few seconds, it stops being a mere thing; I guess it becomes something more, due to the reflection it has created between it and the person involved in this sort of relationship. That's the kind of relationship between children and toys: they help kids grow, so the relationship is not just physical, but firstly emotional. My question is: if things have a power over us, do they take it from us (so it's all in our minds), or do they have a sort of personal energy we're not aware of? Sorry, I know this is very messy and weird...

G

Jun. 11 2011 05:37 AM
Ben from Sydney

I think there should be a distinction made between appearance and reality. Just because there are programs that give the appearance of emotion does not mean that they actually feel emotion. To lose this separation would be like considering that actors in movies feel (to the fullest extent) the emotions of their characters. Just because we observe that something looks like it has emotions doesn't mean it actually has feelings. http://en.wikipedia.org/wiki/Duck_typing#Criticism
It's just a machine doing a really good impersonation of a human being.

Jun. 11 2011 04:06 AM
aman from Dallas

When considering what constitutes alive, I wonder if the complexities of some insects are any more or less complex than some of the programs that mimic intelligence... We speak of feelings, but biologically they are simply chemical reactions to our environment. So aside from the biological definition, what is "alive" and what is "intelligence"?

Food for thought...

Jun. 11 2011 02:01 AM
Chris Fernandes from Mattapoisett, MA

Listening to Caleb Chung talk about how machines like Furby and humans are both alive made me think about B.F. Skinner and behaviorism. Technically, Mr. Chung is accurate when suggesting that machines are just as alive as humans. From a behaviorist perspective, love, hate, thinking, freedom, etc., are conceptual labels that we use to describe a behavioral set. They are constructs. If this is the case, then any programmer could replicate a behavioral set to produce the construct. Check out B.F. Skinner's book "Beyond Freedom and Dignity" to get a better outlook. Lastly, I'd like to challenge Mr. Krulwich's assertion that interacting with a robot isn't "real." If a human being is programmed to respond or feel because of past behavioral reinforcement histories, is that any more real than a computer-generated response doing the same thing? Our capacity to feel that something is "real" does not lie outside of ourselves but rather is interpreted internally!

Thanks for a great show!

Chris Fernandes, LMHC
Behavioral Psychotherapist

Jun. 05 2011 12:56 PM
Emperor XLII from Austin, TX

This brought to mind something I was reading just this weekend, in a story about where to draw the line between mimicry and reality (sorry to quote so much, but it is a very nice story :).

Klapaucius: "In that box kingdom, doesn't a journey from the capital to one of the corners take months—for those inhabitants? And don't they suffer, don't they know the burden of labor, don't they die?"
Trurl: "Now just a minute, you know yourself that all these processes take place only because I programmed them, and so they aren't genuine. Not an illusion, since they have reality, though purely as certain microscopic phenomena, which I produced by manipulating atoms."
Klapaucius: "And are not we as well, if you examine us, the result of subatomic collisions and the interplay of particles, though we ourselves perceive those molecular cartwheels as fear, longing, or meditation? You say there's no way of knowing whether Excelsius' subjects groan, when beaten, purely because of the electrons hopping about inside—like wheels grinding out the mimicry of a voice—or whether they really groan, that is, because they honestly experience the pain? A pretty distinction, this! No, Trurl, a sufferer is not one who hands you his suffering, that you may touch it, weigh it, bite it like a coin; a sufferer is one who behaves like a sufferer! Prove that you only _imitated_ suffering, and did not _create_ it!"
Trurl: "You know perfectly well that's impossible. Even before I took my instruments in hand, when the box was still empty, I had to anticipate the possibility of precisely such a proof—in order to rule it out. Anything that would have destroyed in the littlest way the illusion of complete reality, would have also destroyed the importance, the dignity of governing, and turned it into nothing but a mechanical game."
Klapaucius: "I understand, I understand all too well! Your intentions were the noblest—you only sought to construct a kingdom as lifelike as possible, so similar to a real kingdom, that no one, absolutely no one, could ever tell the difference, and in this, I am afraid, you were successful! Don't you see, when the imitation is perfect, the semblance becomes the truth, the pretense a reality!"

- from "The Seventh Sally" in The Cyberiad, by Stanislaw Lem

Jun. 02 2011 07:48 PM
max

I think his last name is Chung, not Chun. (http://vimeo.com/5974298)

Jun. 02 2011 07:18 AM
