
The Trust Engineers

Monday, February 09, 2015 - 08:01 PM

(Photo Credit: Amy Pearl)

When we talk online, things can go south fast. But they don’t have to. Today, we meet a group of social engineers who are convinced that tiny changes in wording can make the online world a kinder, gentler place. So long as we agree to be their lab rats.

Ok, yeah, we’re talking about Facebook. Because Facebook, or something like it, is more and more the way we share and like, and gossip and gripe. And because it's so big, Facebook has created a laboratory of human behavior the likes of which we’ve never seen. We peek into the work of Arturo Bejar and a team of researchers who are tweaking our online experience, bit by bit, to try to make the world a better place. And along the way we can’t help but wonder whether that’s possible, or even a good idea.





Comments [98]

Chase from Virginia

Awesome content here. Just listened to the most recent episode.
If you EVER have a need for a behavior profiler/engineer on the show, I'd love to be on. I teach body language, profiling, interrogation, lie-detection and behavior engineering. I'd love to be a part of what you've created.


Dec. 25 2015 09:57 PM
Laura from elsewhere

I'm flabbergasted at how willingly people give up their personal information to large corporate entities like Facebook in this day and age. Frightening sign of the times.

Nov. 13 2015 01:10 AM

While Kate Crawford seems like a lovely gal from her profile and comments, going to Microsoft for comments on abuse of market position is absurd. In recent memory, they lost anti-competitive lawsuits both in the US and in Europe (and didn't live by the terms of the settlement, so it was extended). They have long been known for offering shady rebates to OEMs to lock out competing OSes, and for embracing open standards only to quickly corrupt them so Windows clients are locked into talking with Windows servers.

To me it reinforces the maxim that power corrupts, and the truism that the true goal of every business is to make as much money as possible, which almost always works against the consumer when taken to its logical endpoint.

Anyone who expected cautious and considered actions from Facebook is naive. But there's hypocrisy in the '24-hour media' being shocked that anyone would try to sway people's emotions, when the reporting of the story itself is just that - tailored to shock and outrage viewers with hyperbole. Forget Facebook, I'd place a bet that news stories have caused people to commit suicide.

Jun. 10 2015 12:09 AM
Reuben from MI, USA

I found it really interesting how strongly the media took the news. After hearing the first part of the episode, I was definitely on board with the trust engineers, thinking that what they were doing made some sense. It was crazy to hear the media though as it exploded, and how totally different and skewed their perception was. They so quickly demonized it and puffed it up, and it really made me think about how I look at stories showing up nowadays.

Jun. 09 2015 08:43 PM
Carson from Qingdao, CN

At around 23 minutes, Kate Crawford says, "Who are we to decide" who is compassionate. Not sure if that last part is her exact wording. It reminds me of A Clockwork Orange.

Jun. 05 2015 09:47 PM

This story makes clear the need to update the human subjects consent process.

May. 29 2015 04:57 PM
Leslie from New Zealand

Someone from Microsoft, accusing another company of abuse of power? That's the worst case of hypocrisy I've ever heard.

May. 05 2015 03:02 PM
Mary Walther from Houston, TX

I loved this but did I hear incorrectly - were there only men participating in the Friday gathering of "Trust Engineers?"

Apr. 29 2015 05:36 PM
Anna Chaucer

This was eye-opening to say the least. There are pros and cons to experiments like the ones described. Overall, careful consideration should be put into experiments using social media.

Apr. 20 2015 06:27 PM
Kenneth from chicago

I am perfectly OK with sites like Facebook collecting and experimenting with what I post. The truth is, this site and sites like it aren't my creations, and when we have people shirking the responsibility of even asking a hypothetical friend to remove a hypothetical photo, it is the responsibility of the moderator of a platform to experiment with ways to moderate. Outside of social media, people do this to other people: pastors, bosses, teachers, professors, team captains and so many more. At the end of the day, however, if you aren't willing to learn how to make it yourself (your own website about you, with friends subscribed to your daily status update or whatever), you have to trust those who know. I get that you're in effect entrusting your emotional well-being... however, it needs to be questioning/curious trust. And for those whose trust may mostly be blind trust, that sounds like a personal problem. Blind trust is just wrong...

Apr. 16 2015 07:58 AM
Oscar Rosseau

This was a very interesting podcast. It wasn't too surprising to me that they perform experiments on their customers, since most companies do in order to provide better products. It makes sense that Facebook would try to make their customers' experiences better, plus it sounds like these experiments might be interesting to psychologists as well.

Apr. 13 2015 09:14 PM
Andrew Zolli from Brooklyn, NY

Hi everyone - this is Andrew Zolli, one of the principal folks behind this piece. I thought you might enjoy this link:

In which the network social scientist Duncan Watts argues that it would be unethical for Facebook NOT to experiment on its users. It's well worth a read.

Apr. 13 2015 11:03 AM
b lee from Orange County, CA

One application of FB's research into human online interactions could do a lot of good and possibly save lives.

Imagine if there were a way to determine at what point a person reading political and violent extremist material online becomes receptive to those destructive methods, to the point that they want to become associated with the people who generate that material. Furthermore, at what point does a person decide to actively seek physical and/or financial associations with those people? What language and what behavior online creates a receptive state of mind in the reader of those websites? When do those readers become willing to give up their lives and their family relationships by replacing them with relationships with perpetrators of violence?

If those questions can be objectively answered, then those associations could possibly be interrupted and curtailed by the intervention of a family member, not necessarily law enforcement or secret police.

Apr. 12 2015 08:11 PM
Sir Lancelot from My bed

Facebook is probably the best thing that has happened to social experiments. I remember facing those questions when reporting a photo, but I had no idea that I was a tool in a mass research study. Also, I think it is interesting to note that a simple change in words or punctuation can make such a profound difference in someone's choice. All I have to say is, I want in.

Apr. 06 2015 11:41 PM
Aldous T Chrinchton from America

It's kinda weird that people are analyzing how we interact on social media but if it helps people to stop being so mean I'm down.

Apr. 06 2015 10:32 PM
Alice A. Keats from FLorida

This podcast was relatively humorous! I also think it's interesting that humans physically have to see a reported Facebook image to judge whether to take it down or not. Society is crazy, and people get crazier and more electronically dependent day by day. I feel very bad for the people who had to glance at every reported picture after Christmas. Their eyes must've burned after every day at work from staring at all of the dumb pictures that "should" be taken down. It's a waste of these poor people's time to have to review each individual photo when the photo isn't even report-worthy.

Apr. 06 2015 10:13 PM
Huxley T Wilder from United States

Oh, Facebook. Everyone knows how crazy Facebook can get with keyboard warriors and 'trolls.' People get angry and report stuff for stupid reasons, like not liking how they look. Personally, I don't like to use Facebook anymore, but Facebook became so popular because they find ways to forge trust. I think this speaks to the fact that people like to displace their feelings - to detach themselves from negative emotions. Facebook is huge, and I do think that it will be (and has been) used for the social sciences. It's mind-boggling!

Apr. 06 2015 09:31 PM
Hung from Hanoi

How can I get a transcript for this podcast?

Apr. 04 2015 12:12 AM
Agatha Y. Coleridge

This podcast was very interesting, but hearing about how you could possibly be in an experiment could be slightly discomforting. However, there are advantages and disadvantages to this technology. It could be a good way to expand research in the social sciences and make technology more efficient, but it could also be an issue for people's privacy. No matter how many privacy settings you set up to prevent hackers, you still won't be able to prevent Facebook workers from going into your account and seeing your personal information. If you are going to set up an account, though, you should prepare yourself for the possibility of these events, and you shouldn't be putting things on your account that you wouldn't want anybody else to see.

Mar. 30 2015 08:03 PM
Milo C Rousseau from Florida

I love hearing statistics about people. All of the little 4% this and 2% more that are just so weird and interesting. I don't blame the people at Facebook for trying some of these experiments. The temptation is just too great. I understand that there has to be a point at which these stop though. I also think it says a lot about how we interact online that we don't really want to confront people without any help. The car analogy made a surprising amount of sense on this subject, too.

Mar. 30 2015 06:59 PM
Harriet Truman

This podcast is completely right: people need to watch how they say things over social media. One word could be changed and taken a completely wrong way. It's cool how the scientists experimented through social media to study language and wording. If people are on social media so much, people might as well take advantage of it. This really goes in depth about language and how it is affected by social media. It proves that face-to-face communication is vital to understand someone's true meaning.

Mar. 30 2015 03:48 PM
Becky S. Gatsby from Florida

People these days are overly obsessed with social media and the internet in general. Miscommunication with people is so easy! On top of that, the FBI is always looking at what you're doing, and if you say something even slightly suspicious then you can be in a lot of trouble. That is quite scary.

Mar. 30 2015 10:46 AM
Anna B. Silverstein

This was interesting. While I find it a little discomforting that there is so much unknown access to private accounts, I can't say I'm particularly surprised. FB is a good way to get a varying sample for particular studies, so it certainly makes a good tool. Overall, I agree that people often need a nudge to communicate effectively.

Mar. 29 2015 06:26 PM
Rudyard L. Kevoac from Florida

I wonder if future social media sites will utilize these tactics. It seems like using face-to-face video communication would be much simpler.

Mar. 29 2015 02:03 PM
Anna A. Dickinson from Oviedo, FL

In this generation, technology has taken over. It has its advantages, for sure- speedy communication, worldwide interaction, and social updates. But it can be incredibly easy to be misunderstood. This podcast definitely emphasized one of the biggest problems- miscommunication. Without seeing a person physically, much of the fundamental communication is lost. Even altering just one word in a sentence can completely alter how it is received. Technology and social media must be dealt with carefully. Be smart, and think twice before posting.

Mar. 28 2015 10:17 PM

Oh good lord.
Virtually every company that sells us stuff does experiments on us! They change things and monitor how we respond with our attention and our buying.
McDonald's tweaks the amount of sugar in the ketchup.
Grocery stores mess with product placement.
Department stores monitor our purchases and smart-market to us.
Airlines experiment with different ways to get us on and off the plane.
etc etc etc etc etc
Why are we so shocked that FB is acting like a company??

Mar. 24 2015 08:02 AM
Jane J. Asimov from oviedo FL

I was compelled to listen to this podcast because I really like learning about social media. I thought it would just be a typical discussion about how much social media has advanced and how destructive it is, blah blah blah, but I was pleasantly surprised that this podcast was completely different. This discussion was all about language. I learned that language plays a huge role in social media, and that adding one simple word to a sentence can increase the response rate by a drastic number. Language in social media actually affects psychology more than anything. This proves how important and vital language is in our daily lives. Facebook is far more than a basic social media site. It's now a poll, a study, a huge population of people that offer a vast amount of responses for experiments. This podcast was one of the best ones yet.

Mar. 24 2015 12:00 AM
Alice L. Havisham from Oviedo, FL

This podcast was very interesting, and I enjoyed hearing about a current and common issue that I would never have thought about. There was a lot of data revealed to demonstrate the difference the wordings made. I wish there was a visual for when they visited the Facebook data meeting. It was intriguing to hear what the other scientists had to say, and it was a clever, comical conversation. The way the podcast began was creative, as was the way they presented the data and the big story about "lab rats." I do not think we were manipulated the way it was suggested, but the testing was needed, because otherwise they would not be able to test something with such an ample number of subjects. I think it is cool because we are helping scientists and giving them data to make new discoveries. I loved the title "Trust Engineers" as well, because it is true that they care and we need to see it from both sides. It was interesting how there was an insert from Louis C.K., the comedian.

Mar. 23 2015 11:27 PM
Catniss J. Plath

Sometimes it can be very hard to really understand the tone of a message when it has been sent through a text or email. It may come across rude or sarcastic to the receiver, when the sender did not have any sort of negative tone. That's when it may become a bad idea to try to solve a problem virtually. It's not that surprising to me that we are experimented on through social media, it's expected. With all the participants, who wouldn't have thought to study people in social media? It makes it easier.

Mar. 23 2015 10:43 PM
Catniss S. Vonnegut from Oviedo

The amount of people that use Facebook is kind of mind boggling and the amount that they post is even more so. People reporting things for false reasons is sort of dumb and caused a whole lot of work for no reason. This podcast really makes me think about how adults always say that things you post on the Internet are there forever. I feel that people are really lazy and they just want others to do things for them. A lot of people are afraid of confrontation and need help to start confrontations.

Mar. 23 2015 07:06 PM
Alice Z. Lovecraft from Florida

I think it's actually pretty cool that they use Facebook for experiments. I see no problem with that at all. It's a tool that connects people and gathers tons of information. Why not use it? It's so silly that people got so upset. It had almost no effect on people. If it really upsets someone so much, get off the website. There is no reason to act like a five-year-old about it. Oh boo-hoo, they used my Facebook feed to see how it affects me emotionally. It's a website. Deal with it. Nothing on Facebook is really important. It should not be a very big part of anyone's life. I think it's good that they put it to use.

Mar. 23 2015 06:40 PM
Lorelei M. Coleridge

This was really interesting. Honestly, my solution would be that people just need to quit freaking out about social media, but they made a really cool discovery about how people think. Their solution was clever, and I'm glad to feel less alone in my discomfort with composing my own written communications. I basically try to make my mother write all my emails for me. That so many others prefer pre-written communications as well is comforting. The effectiveness-of-phrases data would actually be really useful for awkward people like me to improve written messages, as well as for social media companies, if they released it to the public. I think that Facebook should be more open as far as "experiments" go, and make participation optional, but I also think that the media can get a bit hysterical. However, I've never depended on social media enough to feel more than a general sense of dread about its deviousness. I'm honestly not too surprised by anything they may be up to, though. It's all just this weird mixture of threatening, hopeful, and completely, utterly ridiculous in its profound influence. Like the Internet as a whole.

Mar. 23 2015 06:08 PM
Thomas Mogelberg from Georgia USA (but from Denmark)

Hello out there!!

Is the research that the social scientists conduct at Facebook published anywhere and accessible to the public?

Mar. 23 2015 02:08 PM
Sastelise from Planet Earth

The way this story was presented is disappointing because it doesn’t tackle the central issue of whether FB users should be obligated to communicate with their “friends” when offensive material is posted on the site. I’m sure there are countless situations where friends are not really friends, or where embarrassing photos are taken surreptitiously. Because FB is a platform with such mass exposure, a more nuanced approach is needed than simply enabling users to “work it out” amongst themselves. FB has an obligation to protect users’ privacy at the same time that it has an obligation to avoid censorship. I’m not confident that FB cares about or has developed a good way to navigate this fine line, and this story completely misses the opportunity to address this issue. Thumbs down to the carelessness with which this topic is handled by Radiolab.

Mar. 18 2015 04:18 PM
cherlyn curtis from Indiana

Maybe we should have someone reread what we write.....since we obviously don't know how to communicate with each other without offense! Just a thought!

Mar. 14 2015 03:32 PM
Peter Q from Canada

How much did Facebook pay Radiolab for this cloaked advertisement?

Mar. 04 2015 02:36 PM
C R. Sab

What a great piece, but as someone who worked in social science research, I was surprised by the conspicuous absence of any discussion of the standardized and relatively uniform human subjects protections in social science research ethics. Among the most sacrosanct of these ethics standards is the requirement for fully informed consent unless withholding it is critical for the experiment, and significant safety nets where consent is not obtained beforehand.

You shared a sound bite of Arturo Bejar sounding genuinely baffled (around 24:00) about the blowback, saying "we really care about the people who use FB."

Unless this tone was an effect of the editing, that comment makes it seem as though he's completely oblivious to the existence of standards for social research with human subjects.

Or, maybe worse, he thinks that standards of research ethics only apply to people who don't care, and that it's okay to totally disregard them because he "cares"?

That bit of tape put my jaw on the floor at his hubris, as he went on to acknowledge that while he would reconsider how he ran studies in the future, the only reason was that the opinions of FB users matter, NOT because he had come to any awareness that there might have been an ethics violation.

You know who else cared deeply about people?

Every one of these researchers:

And, I'd add this guy:

Great story, but failing to mention that a standard exists in social research was a massive oversight. The many academics interviewed who work with them should have known better; or, if they did and there was an oversight board in place, you should have mentioned it.

Mar. 01 2015 02:10 PM

One of the more thought-provoking episodes of Radiolab! I'm glad so many people have pointed out that private companies are constantly performing social engineering on unsuspecting people (to increase profits). But I think there is probably a lot of social engineering being done by NGOs and non-profits as well. Would people be angry about being "manipulated" into voting or getting tested for AIDS or whatever if a relevant non-profit was sending out the Facebook newsfeed posts? I also agree it's just plain cool to have this amount and type of social science data.

On the other hand, it is a bit scary to imagine how much power Facebook could subtly wield over the mindstates of a sizable chunk of humanity.

Feb. 24 2015 11:23 PM
John How from Terrace, B.C., Canada

I wonder if other people see Silicon Valley as being, for the social sciences, what Los Alamos was for the 'hard' sciences?

Feb. 24 2015 12:27 PM

Love the show! What I gathered from this episode is that we as a society are so self-centered. Reporting pictures because we do not look appealing in them is interesting: we have no problem being online bullies, yet we are reluctant to start a conversation and simply reach out to our friends and ask them to take the picture down.

Feb. 24 2015 11:27 AM
cwatanabe from Japan

I don't mind being a lab rat if it's for the good.

Feb. 24 2015 05:12 AM

Dear RL staff and listeners,

I had my Media class listen to your last program and then make comments to try to engage themselves and others in debate. I don't think it had the impact I was expecting. Radiolab, in my mind, presents information in such a nice and digestible way. However, over the last 5 years of forcing (I would like to say exposing) students to listen to these programs, I have found that the majority of students cannot process or take in information the way that Radiolab presents it. I'm not an old man, but I feel like an old man when I present Radiolab as an excited listener, only to have it fall so flat upon my students' brains.

And that's all I have to say about that.

Feb. 23 2015 11:54 PM

Retyping my blog since it vanished somewhere in the internet. urgh.

This sadly caught my attention… because I’m also a daily Facebook user. If it were Twitter or Tumblr, I couldn't really care less. Facebook now has over 1 billion users throughout the whole world, which led researchers to experiment with and manipulate online users' emotions. “Trust Engineers” - who came up with that name… pfft.

The ridiculous part is people FREAKING OUT THAT THEY WERE BEING MANIPULATED... considering themselves “lab rats.” What a horrid generation we live in.. YES, this “incident” was in articles, on television, and in newspapers... but HELLO? Is this a big deal? WOW.
You’re being hypocritical, because you’re on Facebook and that’s your fault, not theirs.
There is manipulation everywhere… look at commercials, ads, products; people will control and do whatever they can for you to buy their PRODUCTS. They are taking your money… and you complain about some social media site controlling and taking over your "space?" PLEASE.
There are other things to worry about in this world.

Feb. 23 2015 09:50 PM
Hali Barefoot from Japan

I find it pathetic how easily people nowadays are offended, and how they seem to find the need to get involved in whoever's business. Unless a post or a photo truly does offend you or your culture, why take the time to report it? Especially with a misleading claim. On the other hand, I understand that a photo of you could potentially be viewed by a large audience, so you may feel threatened once it is on the internet.

It is interesting how saying "sorry" was less effective when it came to asking for a favour, because doing so shifted the responsibility back to you. I guess being polite doesn't always work out in certain situations. Lastly, it is weird to think that a huge online community used us, the audience or users, as lab rats. Unknowingly, we were part of a project and helped it to improve.

Feb. 23 2015 09:46 PM
Aaron Gustafson from Chattanooga, TN

Some thoughts on form design, triggered by this episode:

Feb. 23 2015 01:06 PM

So A/B Testing turns people into lab rats? This article is so anti-science that it's scary.

Oh my god someone is conducting an experiment without me knowing about it. That's evil! Unethical!

If you're fortunate, everything you touch and taste has been the product of blind, double-blind experimentation.

Feb. 23 2015 12:13 PM
BPM from Brooklyn

RK has never been on Reddit if he thinks people don't need nudges to be more civil with each other online!

Feb. 22 2015 01:13 PM
maya from Japan

I don’t think Facebook is doing anything wrong. Listening to the headlines of multiple news reports, it sounds like people are exaggerating. But of course, people are going to exaggerate to get views. There are people who are willing to report one of their friends to Facebook because they’re embarrassed about a photo… Did people actually forget their social skills? They need some random person from Facebook to choose words for them to ask their friends to take a picture down? What, are your fingers paralyzed? Probably not. If you were able to click the report button on Facebook, then you're able to take 10 seconds and write a quick message to your friend. If I can write a 150+ word response to this for homework, then you can do it too.
If you’ve had a Facebook account and still didn’t know that Facebook is tricking you or testing you or keeping track of every little movement you make, you must be new to the internet. Any social media site will do this, I’m guessing, and if you don’t like it then don’t make an account. We’re all lab rats to someone at some point, right?

Feb. 22 2015 09:03 AM

"Everyone wants to know what's going on in society, but no one wants to be a part of the testing. These types of experiments and this type of data collection are really the only way we CAN find out what is going on."

Society & social analysis wasn't that bad WITHOUT Facebook. Same as with smartphones: we THINK we need them, but we don't REALLY need them. Arguably, they make certain things & tasks a bit more convenient – but they're not mandatory. And they're certainly not a necessity.

Feb. 20 2015 08:38 PM

This is why I don't have any social media accounts. If I want to keep in touch with someone, I just text or message them. Having a middleman in your communication is not good. Facebook is a for-profit company; when you give a company that much power to manipulate your information, they will use it for their own goals.

Feb. 20 2015 01:38 PM

Everyone wants to know what's going on in society, but no one wants to be a part of the testing. These types of experiments and this type of data collection are really the only way we CAN find out what is going on.

Yes, it is kind of off-putting that I am currently involved in experiments without my knowledge, but can you imagine the alternative? If Facebook had asked me whether I wanted to be part of their social experiments, where they would mess with my newsfeed and observe my reaction, of course I probably would have said no, and so would most other people. The people who said yes would be a certain kind of people, who either do not care about privacy or are heavily invested in social science and understand the benefits. Sampling from those people only would heavily bias the results and probably invalidate them.

Feb. 20 2015 11:24 AM

I agree with Sam. People do not want to answer your surveys, and they treat you like a telemarketer when you call. The people who do answer are often a very particular type of person, which biases the results. But so many people use Facebook and other social networks, and the people who do are much more varied than those who answer phone or mail surveys. Also, compare the cost of collecting and compiling data from social networks (the touch of a button, maybe designing an algorithm to collect it, or if you are working with Twitter feeds then using already-designed data mining techniques; nothing difficult, costly, or time consuming) versus the cost of doing the same with phone or mail surveys (the time or employee cost of the people calling, the paper and postage cost of mail surveys, plus the time cost of entering the data from a form into statistical software). The sample-size-to-dollar ratio is astounding, and you are not even bothering people by asking them to fill out a survey.

Feb. 20 2015 11:17 AM

Does anyone know what to call this type of research? Or know of any companies besides Facebook that conduct it? I am a Sociology and Math major with a minor in Computer Science. This type of research sounds incredibly interesting to me and I think I could do incredible things with it once I have some experience.

Society is a complex system, and when viewed as a whole it is full of people who behave in semi-probabilistic ways (in that they are probabilistic to predict, not that people do not make their own choices). The trouble with social science is that the probabilistic nature of society makes it very difficult to find significance or to generalize without an incredibly large sample size. In addition, large sample sizes are very hard and expensive to get, especially random ones. I think this method of collecting huge samples of data in a way that is mostly random is ingenious and incredibly useful. This is the future of social science. I want to be a part of it.

Feb. 20 2015 11:16 AM
Reina from japan:)

I actually don't care if you stalk me or call me a lab rat. It is not much of a surprise; a company this huge wouldn't do anything unexpected. It's just hard to believe there are people who were actually shocked and depressed on discovering the fact that they were being stalked. What? Seriously? Huge companies just want to increase their user numbers and just want money, money, money! Honestly, I don't think companies actually care about us.
I was also surprised at how we can all be so selfish as to ask Facebook to take down photos instead of asking the people who posted them, while at the same time saying how bad Facebook is.
This was an interesting radio program. I haven't listened to the radio for ages, so it sort of felt nice listening to it.

Feb. 19 2015 09:47 PM

Listening to this Radiolab made me realize how ridiculous people are. It should be common knowledge that a big company like Facebook would do something like this. How can people be so blind as to think that their Facebook accounts would not be watched and/or altered too? Big companies always want to improve their products, so why do people think that just because the product is a website online, they are safe and not being used in an experiment?

Feb. 19 2015 07:50 PM

Listening to this program, I was a bit surprised, because I never thought such a popular social networking site like Facebook was watching what its users prefer, what they are doing, and many other private things.

Feb. 19 2015 05:01 AM
Pete Zicato from The Burbs

If I were sending a comment to a friend, asking to have a picture taken down, I would be glad to have some ready-made verbiage known to be persuasive. Not everyone is good at that sort of thing.

Feb. 18 2015 06:50 PM
Jacob Pierce from Bakersfield, CA

I wrote and deleted a few long paragraphs here about my thoughts and feelings on this episode. But everything I wrote or feel would be so much better said in a conversation and comment sections aren't really the best place for conversations.

So, instead, I decided I'll leave a comment...

Radiolab is incredible. I love the show. It helps make running for exercise a tolerable experience. Thank you all so much!

Feb. 18 2015 02:15 PM
AreX Togashi from Japan

Listening to this episode amazed me at how people "use" social media, both the users and the creators. Since the internet has become a major part of people's lives, understanding what goes on in the background will definitely give you more control over your internet usage. Nowhere is private.

Feb. 18 2015 10:00 AM

I really don't care. Do whatever you must to improve whatever needs improving when it comes to (online) communication.

"Please stay on topic, be civil, and be brief. By leaving a comment, you agree to our Privacy Policy and Terms Of Use."

I'm not gonna read this...

Feb. 18 2015 07:28 AM


How do you know if they are telling the truth? What is the truth? I don't even know that you are Mike from Japan. You could be an alien from Georgia. What is the truth? Am I the truth?

Feb. 17 2015 11:19 PM
Mike from Japan

Why would people care so much about the truth when something is working perfectly fine? All they are doing is finding the most efficient way of getting people's unwanted photos removed. This requires testing, and it was almost inevitable that they would do it this way, with just a little fun.

Feb. 17 2015 10:12 PM
SRJ from Japan

Man, you can see how selfish humans are. Not everybody, but some people just hate others. They just don't like what others do, so they want to use their power to force them down. Facebook is managing it really well.

I am just amazed at the number of Facebook accounts. Wow, more than there are Christians in the world… Just think about it! We are like dust in the universe, and every single speck has a name and is signed up to something it can use to communicate! This is why I love human inventions.

Feb. 17 2015 10:11 PM
Leon Manjyoume from Japan

One thought that came to mind whilst listening to this episode was the sheer breadth of topics Radiolab covers! Even though it's still about science, the topics can connect even with those who aren't scientists!
Thank you :D

Feb. 17 2015 10:03 PM

What do you expect when it comes to large tech corporations like Facebook using your personal data in their studies? They need to be able to improve their site, and you can't do that without gathering market research.

Feb. 17 2015 09:51 PM
Shawn from the Internet

As I ponder what it is Radiolab is trying to get across, I remember a peculiar thing that the economically aware (or those I deem so) keep telling me: why do people say that the next job I have will probably be one that hasn't even been invented yet? Then it hits me, like a baseball bat hitting a home run. I wouldn't mind being a trust engineer, creating and destroying social circles, though that power should without a doubt be wielded by responsible, trustworthy, honest individuals.

Feb. 17 2015 09:35 PM
Anthony Bailey from UK

"COMMON SENSE! [...] COMMON SENSE!! [...] where has the common sense gone?"

Well, when we actually run properly designed experiments to learn about the large population of users that visit a very popular website, common sense often turns out to be quite wrong. We think we understand people, and we find that we do not; in my experience these data-driven findings are very often surprising. Your guesses as to the reasons in these cases might be right, but I'd want to A/B test to see.
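For readers curious what "I'd want to A/B test" involves in practice, here is a minimal sketch of the significance check behind comparing two message wordings. The counts are made up for illustration (the episode doesn't give raw numbers), and the helper function is hypothetical, not anything Facebook has published:

```python
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for the difference between two proportions,
    e.g. the message-send rates under two different wordings."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pool both groups to estimate the shared rate under the null hypothesis.
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 780 of 1000 users sent the message under wording A,
# 500 of 1000 under wording B.
z, p = two_proportion_z(780, 1000, 500, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With samples in the millions, even tiny wording effects become statistically detectable, which is exactly why "common sense" guesses so often lose to the data.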

Feb. 17 2015 08:21 PM
Richard from United States

I enjoyed the episode, but found it quite offensive that Radiolab would interview and quote someone from Microsoft, regardless of their credentials or expertise. Microsoft has a few decades to go before the taint of their own antisocial behavior will be washed away, and until then they have no business commenting on the morality of other corporations.

There are plenty of independent scholars that could have been interviewed.

Radiolab screwed up on this.

Feb. 16 2015 11:41 PM
Andrew Schultz from Chicago, IL

A surprising bit of innumeracy in this episode. Jad quoted an increase from 50% to 78% as a 28% increase. No, it's a 28 percentage point increase, but a 56% increase (28/50) in the number of people using the message to take down a photo on Facebook.
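The distinction is easy to verify with the figures quoted in the episode (a 50% send rate rising to 78%):

```python
# Percentage points vs. percent change, using the episode's figures:
# 50% of users sent the takedown message before, 78% after.
before, after = 0.50, 0.78

point_increase = (after - before) * 100        # difference in percentage points
relative_increase = (after - before) / before  # change relative to the baseline

print(f"{point_increase:.0f} percentage points")     # 28 percentage points
print(f"{relative_increase:.0%} relative increase")  # 56% relative increase
```

The two numbers answer different questions: "by how much did the rate move?" versus "how much bigger is the new rate than the old one?"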

This reminds me, too, of an older crime, Robert Krulwich's referring to one of the planet's polar regions as "Artic." Shocking for a former science correspondent!

But these are quibbles for an otherwise uniformly excellent program.

Feb. 16 2015 05:35 PM
Nicole from Portland from PDX, OR

The interesting thing that came up for me around this podcast was the offense and shock people carried from the discovery of being involved, without consent, in Facebook's social experiment. What that upset failed to recognize is that marketing is based on this same social experiment and, as everyone is put into "target markets" or suggested to try "such and such" website, they are being experimented on without consent moment by moment in the name of capitalism. This is what the internet has been transitioning to as it has gone from the World Wide Web into a consumer's paradise...sweet dreams, my children...

Feb. 16 2015 04:55 PM
Cara from Boston

I believe users may have been more likely to choose the rudely phrased messages on Facebook because that seems to be the evolution of online communication. Listen to the Invisibilia episode "Our Computers, Ourselves," specifically the story about Pete's Facebook page about the train (M line, N line ???). Cyberbullying is more and more common. Just read the comments section on just about any post and you will find more snarky, mean-spirited responses than not. I guess it's what comes from lessening human interaction.

Feb. 16 2015 02:54 PM
Chris from Brooklyn, NY

Interesting program on trust and social media. Towards the end of the program you debated whether the prompt from Facebook suggesting contacting the user would lead to a discussion about the image being taken down and a better potential resolution. I think you could have taken this one step further: would this prompt affect future interactions between the two people? Would it lead to a better connection both online and off, regardless of whether any significant conversation came from the prompt?

Feb. 16 2015 12:56 PM
Philip Snead from United States

This is an important discussion. Again and again we see that - whatever people find they *can* do with technology, they (or at least many of them) *will* do. Undoubtedly Facebook is far more conscientious in exercising this prerogative than most online entities ... about which we may never learn.

Clearly this power extends to conducting "opaque" social research. By using communications technology - even including means as primitive as the U.S. Mail - we give up aspects of privacy that include the Rumsfeldian known-knowns, unknown-knowns, known-unknowns, and unknown-unknowns. So we don't just give up privacy, we erode our ability to protect ourselves from malice and misunderstanding. In essence we *cede* this power in exchange for the unstudied and unassessed benefit, convenience and pleasure we extract from use of these technological means.

My generation (Krulwich's and older) senses this as a highly dangerous corrosion of human existence. But I sense that younger generations are significantly more at ease with this exchange. And I think that may prove adaptive over the next decades, because I believe that the evolution of society in currently privileged societies (like that of the U.S.) will shift away from the primacy of personal freedoms, and toward prioritization of relatively limited individual prerogatives circumscribed by increasingly capricious public life.

Most of my life I've regarded privacy from a political viewpoint. In my dotage I'm coming more to see it as a perhaps-inevitable existential corollary of increasing population, inflexible expectations and decreasing resource availability. We simply have less and less personal space in which to exercise our individual fear and greed.

Feb. 16 2015 08:14 AM
steve from England

I have never commented on a podcast, yet your segment on trust engineers struck me as the latest example of how easy it is to argue that social manipulation by internet companies is acceptable. Your program showed that manipulation of voting has already happened, yet argued that because internet companies perform a useful social service and no longer call themselves "trust engineers," we should accept that their actions are reasonable.

If proof existed that a politician, say the President of the United States, had been responsible for manipulating voting, and this had resulted in a significant number of votes being cast, would we be sitting here saying perhaps this is a good thing?

Why do we think something our elected representatives could never justify is fine if done by unelected companies?

Feb. 14 2015 09:49 PM
Craig from Chula Vista Ca

I think most people are missing the point of the prompt. We live our lives in a delicate balance of social acceptability. Everything we do or say is crafted to be socially acceptable.

When encountering something new--such as requesting the takedown of a photo on Facebook--we can be confused as to what is socially acceptable in this new situation. A prompt indicates that a prominent "social entity" has sanctioned an acceptable response. Using this response is therefore "safe."

Feb. 13 2015 09:53 AM

Great episode, I am reminded of the Milgram experiments. What are the ethics approvals like for experiments of this nature? And how are we going to look back at this in 20 years? I'd ask further questions, but my phone just dinged and I really must answer it.

Feb. 13 2015 06:26 AM
Kevin from United States

It seems I'm in the minority, but knowing that I'm a part of numerous, massive social experiments actually makes me happier with Facebook. Yes, it's almost certain that they're only doing them to increase profit, but it still contributes to the overall growth of human knowledge. Turns out I'm a fan of that. What I do have a problem with is Facebook looking at my statuses, likes, etc and making money off them via targeted ads, which is why I don't have a Facebook anymore.

Great episode as always guys, keep it up!

Feb. 12 2015 09:42 PM
Paul from Ohio

Good episode, but nothing surprising. All users of Facebook should expect to be their lab rats. "If you're not paying for it, you're the product."

Feb. 12 2015 08:15 PM
Andrew Chinnici from Seattle, WA

The late analogy to early automobile signaling and traffic lights reminded me of this old video about road etiquette in Japan:

It seems that most Japanese drivers, when another driver allows them to merge into traffic, use a few blinks of their hazard lights to say "thank you." This is an awesome social norm that deserves more widespread adoption, as does the tradition in St. Louis, MO, of trick-or-treaters telling jokes when going door to door for candy on Halloween.

Both of these things should happen everywhere.

Feb. 12 2015 02:54 PM
evo soria from chicago, IL

I found this to be one of the most pointless episodes ever. You don't need hundreds of scientists to realize common-sense stuff. Wow, peer pressure works online just like it does in real life, so if we show someone how all their friends are voting, maybe they will vote too! Groundbreaking! Oh wow, people are more likely to tell someone to take down a photo when they can hide behind the fact that, hey, I didn't write this message, this is just how Facebook writes it. That has nothing to do with the wording. It's the simple fact that, hey, this is how the program works, these aren't my actual words. COMMON SENSE!

And then for homeboy Arturo to say at the end that he's just trying to make the Facebook experience as true as possible to the actual experience of talking to a friend or family member. GET REAL! No one wants to be their actual selves online; they want to be the most interesting, well-groomed version of themselves. It's depressing to know they have to spend all this money on these studies and still totally miss the obvious facts about what their product actually does for people.

And then for the media to get upset that Facebook is using its database of users to make its product better by using them as "lab rats" was, again, the dumbest pseudo news story. OF COURSE a brand is going to use its consumers to make the brand better. You don't think McDonald's is running different emotional campaigns to see how its consumers react? They tried the healthy-science route and it didn't work, so let's go back to the we-are-tasty-and-not-healthy route. Again, COMMON SENSE!! In a society based on capitalism we are all lab rats from the moment we wake up until we go to sleep. Every dollar you spend is data for whoever wants to take notice. Every link you click works the same way. This data is how these brands can charge millions of dollars for ad space. This is how the game works. Get used to it. And Facebook has to stop pretending it's above it all, not just making money but helping the human condition by tapping into our hearts and blah blah blah. If you actually asked people what they wanted out of their Facebook experience, you know what they would say: why isn't there a dislike button?

Where has the common sense gone?

Feb. 12 2015 02:10 PM

Interesting show...FB's suggestion to contact the user I was reporting was exactly why I quit the site! A friend of mine had been tagged in his friend's album of an open wound (obtained playing sports). So on my feed I had all of this blood and all from someone I didn't even know. I reported it and was asked if I wanted to contact the person directly. I was very hesitant but thought I'd take the mature route. In response, that user wrote a very threatening and harassing post on my wall. I again reported, this time going to FB, and FB replied that the open wound was not against its policy. So I quit. And my life has become a lot better because of it.

Feb. 12 2015 11:56 AM

Great show! I found it very funny how at Christmas time there was a big increase in "reports," and very interesting how a little word like "its" made such a difference. I find it really funny (if that is the right word) how people act shocked when FB uses data shared on FB or learns from the actions of its users. Seriously??? What do you think advertising agencies do all the time with the information they get from what you purchase, where you purchase, and how often, or from the magazines and catalogs you subscribe to? Do more people buy from the catalog with the model holding a puppy or without the puppy? It's all about testing against the data available.

Anyway - keep up the amazing work - I LOVE the way you produce your show, amazing editing, THANKS!!

Feb. 11 2015 10:15 PM

Robert, if you guys actually read these comments: I think your birthday card analogy is inadequate, because there's no real necessity for the card receiver to respond. If the birthday card said, "I want to take you out for dinner sometime. Let me know where you want to go," then it would be more analogous. I know this might sound super pedantic, but it's true. You're not asking for someone's assistance, or really for any response at all, by sending them a birthday card. You are with the Facebook picture deletion tool. There's a pull there to respond.

Anyway, I'm a big fan of the show, keep up the good work, and I hope you two and your crew have a wonderful day.

And if you're ever looking for a pedant to put on your staff... let me know :)


Feb. 11 2015 04:08 PM
John I.G. from Netherlands

This podcast is a little misleading.

The uproar was against Facebook manipulating which messages people saw in their news feeds, so that users saw either more negative or more positive messages.
The emotional manipulation wasn't the standard A/B testing every single online company uses; it was outright experimenting on people by creating an environment that was more positive or negative, uplifting or depressing.

All to see if people would be more "engaged," another good word for addicted. Can we make someone use Facebook more if we artificially filter the emotions in their environment? That was the problem.

Feb. 11 2015 02:24 PM
Bazilisk from United States

It sounds like the musician is called "Mooninite." Google "Mooninite musician" to see their various social network pages; googling for Mooninite alone will mostly bring up information on a cartoon character.

Feb. 11 2015 12:26 PM

What was the bands name he mentioned at the end?

Feb. 11 2015 11:54 AM
Nate Martin from Lancaster, PA

Concerning the Facebook voting experiment. Is there any information about how many of those that clicked the "I Voted" actually voted? Or, is there another phenomenon where in order to fit in with the Facebook crowd people would simply lie about voting?

Feb. 11 2015 09:46 AM
David from Connecticut

Great program.
I was reminded of a similar piece of social engineering (perhaps a subset) by Facebook that Arturo described at a public seminar I attended a year ago. This one had to do with bullying. FB did a similarly in-depth analysis of bullying complaints and made it easier for the victim to interact directly and successfully with the perceived bully. Often the "bully" had no idea they were bullying, and they ended up apologizing and taking down the offending post. I can only imagine that the victim felt more empowered to speak up for themselves in the future.

Feb. 11 2015 09:13 AM

Not using FB probably made it easy for me to laugh at several points. That said, there are some funny aspects to this, regardless.
One, a little like with the NSA, is the "folks, what did you think/expect??!" thing...
Second, if that particular "scandal" unfolded as described in the podcast, it was a funny development: from too many Christmas photos, to needing to deal with them, to offering politically correct options, to worldwide "outrage"... which kind of makes it plausible. It would be much weirder if things didn't at times (and even more often than that) play out like this.
All this said, and the concern of users appreciated, as the story continued and the reacting-to-reactions was elaborated on, before any mention of "scandal," the thing that inevitably popped into my head was: "Hm, guess we've discovered a thing FB is good for after all: social studies :) :)" And how do such studies work better, with those studied fully educated about the study, or unaware? Questions of ethics aside, uhm... :)
Isaac from Moscow, I'm not at all surprised that "sorry" just doesn't work, or isn't popular in such cases, and I don't find it disappointing either. To start with, the word should be treated as a somewhat sacred one, due to overuse. Overuse inevitably diminishes the genuine content of a word. "Sorry," "please" and "love" lead the words abused by overuse (and, not surprisingly, in many languages); one needs to be very attentive to see whether the word is actually meant the way it ought to be. "Sorry" should be used when one means it, and only then; I doubt many here would mean it. If we feel the need to be civil, most languages include an equivalent of "excuse me."

Feb. 11 2015 04:32 AM
MC from Los Angeles

Thanks for another great show, RL. Having noticed that FB and I exist on parallel moral universes, I have found myself opting out of its "services" more and more, despite the fact that I have no replacement hub for keeping in touch with people who span continents and decades of my life. Bully for the researchers who now have a gazillion test subjects. As a researcher myself, I hardly think that a "terms of use" contract that changes subtly every fortnight and would require each FB user to employ a floor of lawyers to decipher should count as a mechanism for informed consent.

What I don't understand is -who are these people who can't just tell their acquaintances to take photos down? I have done so more times than I can count. I also recall a period of time when a high school friend had gotten into the habit of scanning tween and teen photos of a whole lot of people, and began posting them. The whole lot of us put a stop to that almost immediately the old fashioned way - by telling him not to continue. I can't believe there is/was a whole FB unit devoted to catering to dysfunctional communicators. Ech.

I should feel better now that you have laid bare this FB mess, but as is usually the case when I think too deeply about FB, I just get itchy.

Feb. 11 2015 02:57 AM
Clare from Portland, OR

Did Facebook increase voter turnout, or simply the number of people who clicked the "I voted" button? My guess is a little bit of both.

Feb. 11 2015 02:26 AM
scorpio6 from who cares

Seriously? The presumptuousness of this whole Facebook mentality is off-putting.
Oh, how self-important these people think themselves,
"nudging us to interact," when the truth is far less interesting: the masses stare into glowing screens making themselves seem awesome, but meanwhile they sit alone and in the dark, staring at those glowing screens...
This is not social; it's the opposite of social.

Feb. 11 2015 12:24 AM
Stacy from Melbourne

Sorry, who was the music by? It was said a little too quickly.

Feb. 10 2015 08:28 PM
Allyson from Kansas City, Missouri

Hi! I am a (non-evil) greeting card designer for Hallmark, actually. Ha! Anyway, love the show, as always :)

I have a theory... I think those who are more expressive in their faces and vocal intonation in real life tend to use emoticons more frequently in emails, texts, and Facebook messages than those who are less expressive in person. Has anyone else found this to be true? Just an interesting thought I had after listening to this week's episode!

Feb. 10 2015 02:54 PM
Arzoo from San Francisco, CA

The music for this episode was on point.

Feb. 10 2015 01:39 PM
Jill from Oberlin, Ohio

Hello from Oberlin!

It occurred to me just how often you have rat studies in your stories. I think it would be cool to have a show about this... the contribution of the rat to our understanding of behaviors and things unseen. A "Respect the Rat" sort of thing. But maybe that is for Invisibilia?

Always, I love listening and I am so thrilled you are doing this work.

Feb. 10 2015 12:28 PM
Alan Dorman

Facebook won't let you take down photos of yourself? Creepy.

Feb. 10 2015 12:20 PM
Isaac from Moscow, Russia

I wasn't surprised at all that using the word "sorry" is actually less effective. Speaking from experience: when I was working in a coffee shop and customers would throw their tantrums, I realized that saying "sorry" (especially when I hadn't done anything wrong) only made their behavior worse. It seems that saying "sorry" just makes people think their behavior is justified.

Speaking from the other side of the counter, when I call Verizon and they say "I'm sorry you're having trouble with that" it just seems unnecessary and, to me, insincere. Just like an apology in a generic Facebook message would seem unnecessary and insincere.

I love you radiolab! Can I call in and read for the sponsors? :)

Feb. 10 2015 12:17 PM
